Mind-Reading Technology Has Arrived



Thanks to Sigal Samuel for this article: For a few years now, I’ve been writing articles on neurotechnology with downright Orwellian headlines. Headlines that warn “Facebook is building tech to read your mind” and “Brain-reading tech is coming.” Well, the technology is no longer just “coming.” It’s here. With the help of AI, scientists from the University of Texas at Austin have developed a technique that can translate people’s brain activity, like the unspoken thoughts swirling through our minds, into actual speech, according to a study published in Nature Neuroscience.

In the past, researchers have shown that they can decode unspoken language by implanting electrodes in the brain and then using an algorithm that reads the brain’s activity and translates it into text on a computer screen. But that approach is very invasive, requiring surgery. It appealed only to a subset of patients, like those with paralysis, for whom the benefits were worth the costs. So researchers also developed techniques that didn’t involve surgical implants. They were good enough to decode basic brain states, like fatigue, or very short phrases — but not much more.

Now we’ve got a non-invasive brain-computer interface (BCI) that can decode continuous language from the brain, so somebody else can read the general gist of what we’re thinking even if we haven’t uttered a single word. How is that possible? It comes down to the marriage of two technologies: fMRI scans, which measure blood flow to different areas of the brain, and large AI language models, similar to the now-infamous ChatGPT.

In the University of Texas study, three participants listened to 16 hours of storytelling podcasts like The Moth while scientists used an fMRI machine to track changes in blood flow in their brains. That data allowed the scientists, using an AI model, to associate each phrase with the pattern of brain activity that hearing it evoked in each person.
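For readers curious what that kind of association might look like computationally, here is a minimal, hypothetical sketch in Python. It is not the study’s actual pipeline: it uses synthetic numbers in place of real fMRI recordings and language-model embeddings, and scikit-learn’s ridge regression as a stand-in for whatever model the researchers used. The idea it illustrates is learning a mapping from phrase representations to brain responses, then decoding a new scan by asking which candidate phrase’s predicted response best matches it.

```python
# Conceptual sketch only (not the study's code): learn a mapping between
# phrase embeddings and fMRI responses, then score candidate phrases by how
# well their predicted brain response matches a newly observed scan.
# All data below is synthetic; real embeddings would come from a large
# language model and real responses from preprocessed fMRI recordings.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_train_phrases = 500   # phrases heard during hours of podcast listening
embed_dim = 768         # dimensionality of the phrase embedding (assumed)
n_voxels = 2000         # number of fMRI voxels tracked per participant (assumed)

# Stand-ins for real data: embeddings of heard phrases and the blood-flow
# responses they evoked in one participant's brain.
phrase_embeddings = rng.normal(size=(n_train_phrases, embed_dim))
true_mapping = rng.normal(size=(embed_dim, n_voxels)) * 0.1
brain_responses = phrase_embeddings @ true_mapping + rng.normal(
    scale=0.5, size=(n_train_phrases, n_voxels)
)

# Fit a model that predicts a brain response from a phrase embedding.
encoder = Ridge(alpha=10.0)
encoder.fit(phrase_embeddings, brain_responses)

# Decoding the gist of a new scan: compare the observed response against the
# responses predicted for a set of candidate phrases and pick the best match.
candidate_embeddings = rng.normal(size=(50, embed_dim))
candidate_embeddings[7] = phrase_embeddings[3]  # pretend one candidate matches
observed_response = brain_responses[3]

predicted = encoder.predict(candidate_embeddings)
scores = predicted @ observed_response  # similarity of predicted vs. observed
print(f"Best-matching candidate phrase index: {int(np.argmax(scores))}")
```

The real system is far more sophisticated, working with continuous narration rather than isolated phrases, but the core move is the same: once the model knows what a person’s brain tends to do when it hears particular language, it can work backward from a scan to the language most likely behind it.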

Acknowledgement and thanks to: Sigal Samuel | Vox
May 8, 2023