The Brain Implant That Listens Only When You Say the Secret Word

Can a single imagined phrase be the key that both unlocks and protects a mind-reading machine? In a new paper published in Cell, scientists have demonstrated a brain–computer interface (BCI) that can decode an individual’s inner speech with 74% accuracy, but only when the user mentally whispers a predetermined password. Without that mental “wake word,” the system stays dormant, a safeguard against the unwanted decoding of private thoughts.

Image credit: eurekalert.org

The research, led by Erin Kunz at Stanford University, builds on decades of work in neural speech decoding. Previous BCIs asked users to attempt speaking out loud, a draining activity for many individuals with paralysis or motor neuron disease. This system instead targets inner speech: the silent, language-based thought that arises without any attempt to move. “If you just have to think about speech instead of actually trying to speak, it’s potentially easier and faster for people,” co-lead author Benyamin Meschede-Krasa said.

To record these fleeting signals, neurosurgeons implanted microelectrode arrays, each smaller than a baby aspirin, into the motor cortex of four participants. That region of the brain governs voluntary movement, including the fine motor commands needed to articulate speech. The neural recordings showed that attempted and imagined speech engaged overlapping cortical areas and elicited similar spatiotemporal patterns, although inner-speech signals were measurably weaker.

Decoding started with the most basic units of language. The researchers trained AI models to identify phonemes from high-resolution neural data, then employed large-vocabulary language models to assemble them into words and sentences in real time. The vocabulary spanned 125,000 words, and the system reached its peak sentence-level accuracy of 74% when participants were instructed to focus on particular phrases.
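The two-stage idea described above, a phoneme classifier whose outputs are rescored by a language model, can be sketched in miniature. Everything here is an illustrative assumption: the tiny two-word lexicon, the made-up phoneme posteriors, and the unigram prior stand in for the paper’s actual classifiers and large-vocabulary models.

```python
import math

# Hypothetical per-timestep phoneme posteriors from a neural classifier.
phoneme_probs = [
    {"HH": 0.7, "K": 0.2, "AH": 0.1},   # step 1
    {"AH": 0.6, "EH": 0.3, "L": 0.1},   # step 2
    {"L": 0.8, "OW": 0.1, "R": 0.1},    # step 3
    {"OW": 0.75, "L": 0.15, "P": 0.1},  # step 4
]

# Tiny illustrative lexicon (word -> phoneme sequence) and language-model priors.
lexicon = {"hello": ["HH", "AH", "L", "OW"], "help": ["HH", "EH", "L", "P"]}
lm_prior = {"hello": 0.6, "help": 0.4}

def word_score(word):
    """Combine phoneme-classifier log-probabilities with a language-model prior."""
    phones = lexicon[word]
    if len(phones) != len(phoneme_probs):
        return float("-inf")
    log_p = sum(math.log(step.get(ph, 1e-6))
                for step, ph in zip(phoneme_probs, phones))
    return log_p + math.log(lm_prior[word])

best = max(lexicon, key=word_score)  # the language model breaks acoustic ties
```

In a real system the rescoring runs over beams of candidate sentences rather than isolated words, but the principle is the same: weak, noisy phoneme evidence is disambiguated by linguistic context.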

The password security system borrows from neural keyword spotting, a technique inspired by voice assistants such as Siri and Alexa. Rather than listening for acoustic features, the BCI watches for a specific neural signature, in this case the imagined phrase “chitty chitty bang bang,” before engaging the decoder. The mental password was identified with more than 98% accuracy, effectively blocking accidental “leakage” of uncontrolled thoughts.
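The gating logic reduces to a small state machine: the decoder emits nothing until a password classifier reports high confidence, and only then does output flow. The class, threshold, and confidence stream below are illustrative assumptions, not the study’s implementation.

```python
PASSWORD_THRESHOLD = 0.98  # the study reports >98% password-detection accuracy

class GatedDecoder:
    """Suppress all decoded output until the mental password is detected."""

    def __init__(self, threshold=PASSWORD_THRESHOLD):
        self.threshold = threshold
        self.armed = False  # decoder engages only after the password

    def process(self, password_confidence, decoded_text):
        """Return decoded text only once the gate is armed; else None."""
        if not self.armed:
            if password_confidence >= self.threshold:
                self.armed = True  # mental password detected: wake up
            return None  # dormant: stray thoughts never leave the device
        return decoded_text

gate = GatedDecoder()
leak = gate.process(0.10, "private thought")   # stays dormant, returns None
gate.process(0.99, "password frame")           # arming step, still returns None
out = gate.process(0.0, "hello world")         # gate open: text passes through
```

The design choice worth noting is that suppression happens before any text is exposed, so an accidental burst of inner speech is discarded rather than merely hidden.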

Signal processing was essential to this performance. Activity in the high-gamma band (70–110 Hz), previously shown to represent fine articulatory movement, was extracted from the motor cortex signals. Combined with temporal pattern analysis, these features enabled the AI to distinguish phonemes even under the low signal-to-noise ratio of imagined speech. Similar techniques have been applied in high-density electrocorticography to decode consonant–vowel syllables with millisecond precision.
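Isolating the high-gamma band can be sketched with a simple FFT band mask. This is a minimal sketch under stated assumptions (a 1 kHz sampling rate and an idealized brick-wall filter); the paper’s actual feature-extraction pipeline is not specified here.

```python
import numpy as np

FS = 1000.0  # assumed sampling rate in Hz

def high_gamma_power(signal, fs=FS, band=(70.0, 110.0)):
    """Band-limit a signal to 70-110 Hz with an FFT mask; return mean power."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    band_limited = np.fft.irfft(spectrum * mask, n=len(signal))
    return float(np.mean(band_limited ** 2))

t = np.arange(0, 1.0, 1.0 / FS)
in_band = np.sin(2 * np.pi * 90 * t)    # 90 Hz tone: inside high-gamma
out_band = np.sin(2 * np.pi * 10 * t)   # 10 Hz tone: outside the band
```

Calling `high_gamma_power(in_band)` preserves nearly all of the 90 Hz tone’s power, while `high_gamma_power(out_band)` is close to zero, which is the filtering behavior that lets downstream classifiers focus on articulation-related activity.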

From an engineering standpoint, the research combines several advances: invasive intracortical recording for high temporal and spatial resolution, machine-learning pipelines optimized for phoneme classification, and real-time sentence reconstruction via language modeling. The password layer adds a privacy-preserving control channel, something widely debated in BCI ethics but not previously demonstrated in a functional system.

The privacy consequences are real. As Frank Willett, senior author and co-director of Stanford’s Neural Prosthetics Translational Laboratory, explained, “The existence of inner speech in motor regions of the brain raises the possibility that it could accidentally ‘leak out.’” By requiring an explicit, low-probability mental command to start decoding, the system mitigates one of the biggest anxieties in neural interface design: protecting the privacy of thought.

Future versions will likely extend beyond the motor cortex to areas such as the superior temporal gyrus, which may carry richer representations of imagined speech content. Hardware advances, including implantable wireless arrays that record from more neurons, are also on the horizon, promising better accuracy and comfort. For now, the work shows that a BCI can not only turn silent thoughts into words but do so under the direct control of the user, a step toward practical, secure communication devices for people who cannot speak.
