AI Translates Imagined Speech into Spoken Words for Paralysis Patients
A new brain-computer interface (BCI) developed by Stanford researchers allows individuals with paralysis to communicate by simply imagining speech. Unlike older systems that required users to physically attempt to speak, this one detects the neural activity associated with silent, inner speech, reducing the effort required of users. The team trained an AI model to decode these signals against a 125,000-word vocabulary, achieving 74% accuracy in early tests.
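At a high level, decoders of this kind map recorded neural activity to phoneme or word probabilities and then use a language model to select the most likely sentence from a large vocabulary. The sketch below is purely illustrative: the feature dimensions, phoneme set, toy lexicon, and greedy decoding scheme are assumptions, not the Stanford team's actual architecture.

```python
# Illustrative sketch only; dimensions, phoneme set, and lexicon are invented.
import numpy as np

rng = np.random.default_rng(0)

# Pretend neural features: one 256-dim vector per time bin of imagined speech.
neural_features = rng.normal(size=(50, 256))

PHONEMES = ["HH", "EH", "L", "OW", "_"]          # "_" stands in for silence
W = rng.normal(size=(256, len(PHONEMES)))        # toy linear "decoder" weights

def decode_phonemes(features):
    """Greedy per-bin phoneme decoding, collapsing repeats and silence."""
    logits = features @ W
    ids = logits.argmax(axis=1)
    out, prev = [], None
    for i in ids:
        if i != prev and PHONEMES[i] != "_":
            out.append(PHONEMES[i])
        prev = i
    return out

# A real system would rescore candidates with a language model over a large
# (e.g., 125,000-word) vocabulary; here we just match against a toy lexicon.
LEXICON = {"hello": ["HH", "EH", "L", "OW"], "low": ["L", "OW"]}

def to_word(phones):
    for word, pron in LEXICON.items():
        if phones[-len(pron):] == pron:
            return word
    return "<unknown>"

print(to_word(decode_phonemes(neural_features)))
```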
The system includes a privacy feature activated by the mental password "Chitty Chitty Bang Bang," which was detected with 98% accuracy. Participants, all of whom had ALS or brainstem strokes, preferred this method for its speed and ease. However, the system still struggles to distinguish intended speech from private thoughts, raising ethical considerations.
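The password effectively acts as a gate: decoded output is suppressed until the system recognizes the unlocking phrase with high confidence. The following is a minimal, hypothetical sketch of that gating logic; the detector, threshold, and function names are invented for illustration and are not the study's implementation.

```python
# Hypothetical gating logic: the password phrase comes from the article, but the
# detector, confidence threshold, and function names are invented.
PASSWORD = "chitty chitty bang bang"
UNLOCK_THRESHOLD = 0.98  # assumed cutoff; the article's 98% is detection accuracy, not a threshold

def password_score(decoded_text):
    """Stand-in for a classifier scoring how closely the imagined phrase matches the password."""
    return 1.0 if decoded_text.strip().lower() == PASSWORD else 0.0

def gate_output(decoded_text, unlocked):
    """Suppress decoded inner speech until the password has been detected."""
    if not unlocked:
        if password_score(decoded_text) >= UNLOCK_THRESHOLD:
            return None, True   # unlock, but do not voice the password itself
        return None, False      # stay silent while locked
    return decoded_text, True

unlocked = False
for attempt in ["i am thirsty", "chitty chitty bang bang", "i am thirsty"]:
    spoken, unlocked = gate_output(attempt, unlocked)
    print(spoken)  # prints None, None, then "i am thirsty"
```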
While the technology is still evolving, researchers emphasize its potential to restore communication for people with severe paralysis. The work is conducted under federal regulations governing medical ethics, which the team says address concerns about unintended "mind-reading." Future improvements aim to raise accuracy and sharpen the boundary between shared and private thoughts.