New brain-computer interface allows man with ALS to ‘speak’ again (2024)

Neurological Health | August 14, 2024

By Nadine A Yehya

(SACRAMENTO)

A new brain-computer interface (BCI) developed at UC Davis Health translates brain signals into speech with up to 97% accuracy — the most accurate system of its kind.

The researchers implanted sensors in the brain of a man with severely impaired speech due to amyotrophic lateral sclerosis (ALS). The man was able to communicate his intended speech within minutes of activating the system.

A study about this work was published today in the New England Journal of Medicine.

ALS, also known as Lou Gehrig's disease, affects the nerve cells that control movement throughout the body. The disease leads to a gradual loss of the ability to stand, walk and use one’s hands. It can also cause a person to lose control of the muscles used to speak, leading to a loss of understandable speech.

The new technology is being developed to restore communication for people who can’t speak due to paralysis or neurological conditions like ALS. It interprets brain signals when the user tries to speak and turns them into text that is ‘spoken’ aloud by the computer.

“Our BCI technology helped a man with paralysis to communicate with friends, families and caregivers,” said UC Davis neurosurgeon David Brandman. “Our paper demonstrates the most accurate speech neuroprosthesis (device) ever reported.”

Brandman is the co-principal investigator and co-senior author of this study. He is an assistant professor in the UC Davis Department of Neurological Surgery and co-director of the UC Davis Neuroprosthetics Lab.

The new BCI breaks the communication barrier

When someone tries to speak, the new BCI device transforms their brain activity into text on a computer screen. The computer can then read the text out loud.

To develop the system, the team enrolled Casey Harrell, a 45-year-old man with ALS, in the BrainGate clinical trial. At the time of his enrollment, Harrell had weakness in his arms and legs (tetraparesis). His speech was very hard to understand (dysarthria) and required others to help interpret for him.

In July 2023, Brandman implanted the investigational BCI device. He placed four microelectrode arrays into the left precentral gyrus, a brain region responsible for coordinating speech. The arrays are designed to record the brain activity from 256 cortical electrodes.

“We’re really detecting their attempt to move their muscles and talk,” explained neuroscientist Sergey Stavisky. Stavisky is an assistant professor in the Department of Neurological Surgery. He is the co-director of the UC Davis Neuroprosthetics Lab and co-principal investigator of the study. “We are recording from the part of the brain that’s trying to send these commands to the muscles. And we are basically listening into that, and we’re translating those patterns of brain activity into a phoneme — like a syllable or the unit of speech — and then the words they’re trying to say.”
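
To make the pipeline Stavisky describes more concrete, the sketch below collapses per-time-bin phoneme probabilities into a word. It is only an illustration: the phoneme probabilities, the tiny vocabulary and the greedy decoding are invented stand-ins, not the trained machine-learning decoder used in the study.

    # Illustrative sketch only: greedy, CTC-style collapse of per-time-bin
    # phoneme probabilities into a word, loosely mirroring the "brain activity
    # -> phonemes -> words" pipeline described above. The probabilities and
    # vocabulary are made up; the study uses a trained machine-learning model.
    import numpy as np

    PHONEMES = ["<blank>", "HH", "EH", "L", "OW", "W", "ER", "D"]
    VOCAB = {("HH", "EH", "L", "OW"): "hello",
             ("W", "ER", "L", "D"): "world"}

    def greedy_phoneme_path(probs):
        """Argmax each time bin, merge consecutive repeats, drop blanks."""
        best = probs.argmax(axis=1)
        seq, prev = [], None
        for idx in best:
            if idx != prev and PHONEMES[idx] != "<blank>":
                seq.append(PHONEMES[idx])
            prev = idx
        return seq

    def phonemes_to_word(seq):
        """Exact dictionary lookup; a real system scores against a language model."""
        return VOCAB.get(tuple(seq), "<unknown>")

    # Hypothetical decoder output: 6 time bins whose most likely phonemes
    # read HH, EH, L, OW, OW, <blank>.
    rng = np.random.default_rng(0)
    probs = rng.uniform(0.0, 0.1, size=(6, len(PHONEMES)))
    for t, ph in enumerate(["HH", "EH", "L", "OW", "OW", "<blank>"]):
        probs[t, PHONEMES.index(ph)] = 1.0

    print(phonemes_to_word(greedy_phoneme_path(probs)))  # -> hello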

Faster training, better results

Despite recent advances in BCI technology, efforts to enable communication have been slow and prone to errors. This is because the machine-learning programs that interpreted brain signals required large amounts of training time and data to perform well.

“Previous speech BCI systems had frequent word errors. This made it difficult for the user to be understood consistently and was a barrier to communication,” Brandman explained. “Our objective was to develop a system that empowered someone to be understood whenever they wanted to speak.”

Harrell used the system in both prompted and spontaneous conversational settings. In both cases, speech decoding happened in real time, with continuous system updates to keep it working accurately.
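
The loop described above can be sketched schematically: decode each incoming chunk of neural features right away, and periodically fold recently confirmed text back into the decoder so it stays calibrated. Everything below (the table-lookup "decoder", the feature stream, the update rule) is an invented placeholder for illustration, not the study's system.

    # Schematic of a real-time decode-and-recalibrate loop; all functions and
    # data here are invented placeholders.
    from collections import deque

    def decode_chunk(decoder, features):
        """Placeholder decoder: look the feature vector up in a table."""
        return decoder.get(features, "<unsure>")

    def recalibrate(decoder, recent_pairs):
        """Placeholder update: absorb recent (features, confirmed text) pairs."""
        decoder.update(dict(recent_pairs))
        return decoder

    decoder = {}                   # stands in for a trained decoding model
    recent = deque(maxlen=50)      # rolling window of confirmed pairs
    stream = [((0, 1), "hello"), ((1, 0), "there"), ((0, 1), "hello")]

    for step, (features, confirmed) in enumerate(stream, start=1):
        print(decode_chunk(decoder, features))   # shown on screen in real time
        recent.append((features, confirmed))
        if step % 2 == 0:                        # periodic recalibration
            decoder = recalibrate(decoder, recent)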

The decoded words were shown on a screen. Amazingly, they were read aloud in a voice that sounded like Harrell’s before he had ALS. The voice was composed using software trained with existing audio samples of his pre-ALS voice.

Video: New Brain-Computer Interface Allows Man with ALS to ‘Speak’ Again

At the first speech data training session, the system took 30 minutes to achieve 99.6% word accuracy with a 50-word vocabulary.

“The first time we tried the system, he cried with joy as the words he was trying to say correctly appeared on-screen. We all did,” Stavisky said.

In the second session, the size of the potential vocabulary increased to 125,000 words. With just an additional 1.4 hours of training data, the BCI achieved a 90.2% word accuracy with this greatly expanded vocabulary. After continued data collection, the BCI has maintained 97.5% accuracy.
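
Word-accuracy figures like these are typically computed as one minus the word error rate: the word-level edit distance between the decoded sentence and the intended sentence, divided by the number of intended words. A minimal worked example (the sentences are made up, not data from the study):

    # Word accuracy = 1 - word error rate, where WER counts substitutions,
    # insertions and deletions at the word level. Example sentences are invented.

    def word_edit_distance(ref, hyp):
        """Levenshtein distance between two word sequences."""
        d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            d[i][0] = i
        for j in range(len(hyp) + 1):
            d[0][j] = j
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,          # deletion
                              d[i][j - 1] + 1,          # insertion
                              d[i - 1][j - 1] + cost)   # substitution / match
        return d[len(ref)][len(hyp)]

    intended = "i would like some water please".split()
    decoded = "i would like some water pleased".split()

    wer = word_edit_distance(intended, decoded) / len(intended)
    print(f"word accuracy = {1 - wer:.1%}")   # 5 of 6 words correct -> 83.3%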

“At this point, we can decode what Casey is trying to say correctly about 97% of the time, which is better than many commercially available smartphone applications that try to interpret a person’s voice,” Brandman said. “This technology is transformative because it provides hope for people who want to speak but can’t. I hope that technology like this speech BCI will help future patients speak with their family and friends.”

The study reports on 84 data collection sessions over 32 weeks. In total, Harrell used the speech BCI in self-paced conversations for over 248 hours to communicate in person and over video chat.

“Not being able to communicate is so frustrating and demoralizing. It is like you are trapped,” Harrell said. “Something like this technology will help people back into life and society.”

“It has been immensely rewarding to see Casey regain his ability to speak with his family and friends through this technology,” said the study’s lead author, Nicholas Card. Card is a postdoctoral scholar in the UC Davis Department of Neurological Surgery.

“Casey and our other BrainGate participants are truly extraordinary. They deserve tremendous credit for joining these early clinical trials. They do this not because they’re hoping to gain any personal benefit, but to help us develop a system that will restore communication and mobility for other people with paralysis,” said co-author and BrainGate trial sponsor-investigator Leigh Hochberg. Hochberg is a neurologist and neuroscientist at Massachusetts General Hospital, Brown University and the VA Providence Healthcare System.

Brandman is the site-responsible principal investigator of the BrainGate2 clinical trial. The trial is enrolling participants. To learn more about the study, visit braingate.org or contact braingate@ucdavis.edu.

A complete list of coauthors and funders is available in the article.

Caution: Investigational device. Limited by Federal law to investigational use.

Additional readings

  • Read the study
  • Neurological surgery researcher Sergey Stavisky and team awarded $3.5 million in grants
  • Clinical Trial Aims to Develop New Methods to Restore Speech with Brain-Computer Interface
  • Video: New Brain-Computer Interface Allows Man with ALS to ‘Speak’ Again

FAQs

Is the brain-computer interface real?

Brain-computer interfaces are devices that process brain activity and send signals to external software, allowing a user to control devices with their thoughts. With BCI technology, scientists envision a day when patients with paralysis, muscle atrophy and other conditions could regain motor functions.

How much does a brain-computer interface cost?

Prices vary widely by type. Consumer EEG-based headsets such as the MindRove Arc have been listed at around Rs 112,000 (roughly US $1,350) per unit, while implanted research systems like the one described in this article are investigational devices that are not sold commercially.

What is another name for a brain-computer interface?

A brain–computer interface (BCI), sometimes called a brain–machine interface (BMI), is a direct communication link between the brain's electrical activity and an external device, most commonly a computer or robotic limb.

Did Stephen Hawking use a brain-computer interface?

Not even Stephen Hawking used the kind of sci-fi communication interface that University of Kansas neuroscientist Jonathan Brumberg is developing. Hawking used a cheek muscle to control his voice device.

What are the side effects of a brain-computer interface?

The implantation of invasive BCI devices carries inherent medical risks like infection or brain tissue damage. Even non-invasive techniques may pose health concerns, such as the effects of long-term exposure to electromagnetic fields. BCIs have also been shown to lead to high cognitive fatigue.

How close are we to practical brain-computer interfaces?

Amid ongoing human clinical trials, there is still a long way to go before neural chips are commonplace in clinics. It may take close to a decade for BCIs to reach the market, although clinical trials are advancing quickly.

What does a brain-computer interface do?

A brain computer interface (BCI) is a system that determines functional intent - the desire to change, move, control, or interact with something in your environment - directly from your brain activity. In other words, BCIs allow you to control an application or a device using only your mind.
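
As a bare-bones illustration of "functional intent in, action out," a BCI application might issue a command only when a decoded intent score crosses a confidence threshold. The score values, threshold and command names below are invented for the example; real systems use trained classifiers rather than a fixed cutoff.

    # Toy mapping from a decoded neural intent score to an application command.
    def to_command(intent_score, threshold=0.7):
        return "MOVE_CURSOR_RIGHT" if intent_score >= threshold else "IDLE"

    for score in (0.12, 0.45, 0.83):     # hypothetical decoded intent scores
        print(f"{score:.2f} -> {to_command(score)}")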

Who owns Neuralink?

Neuralink is a neurotechnology company founded by Elon Musk that's building an implantable, brain-computer interface capable of translating thought into action. Launched in 2016, the private venture claims its neural device will allow people with paraplegia to regain movement and restore vision to those born blind.

What is an example of a brain-computer interface?

For example, a person with paralysis could use an implanted BCI that is attached to specific neurons to regain precise control of a limb. Implanted BCIs measure signals directly from the brain, reducing interference from other tissue. However, they pose surgical risks, such as infection and rejection.

Did Stephen Hawking use his eyes to talk?

No; he used his cheek. A sensor mounted on his glasses used an infrared beam to detect twitches of his cheek muscle, which he used to select words on his speech device. Hawking could express five to ten words per minute using this technology.

Did Stephen Hawking believe in artificial intelligence?

The late Stephen Hawking was a major voice in the debate about how humanity can benefit from artificial intelligence. Hawking made no secret of his fears that thinking machines could one day take charge. He went as far as predicting that future developments in AI “could spell the end of the human race.”

Is it possible to connect your brain to a computer?

Yes. One approach uses electrocorticography (ECoG): electrodes in the form of metal discs are placed directly on the surface of the brain to pick up signals. They connect wirelessly to a receiver, which in turn connects to a computer.

How accurate is a brain-computer interface?

Accuracy varies by system and user. The speech BCI described in this article reached about 97% word accuracy. In studies testing a P300-based BCI for assessment with the Coma Recovery Scale-Revised (CRS-R; n = 6), classification accuracy above chance ranged from 30.8% to 78.6%, showing that such systems can detect abilities missed by behavioral CRS-R assessment.
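
For readers curious what "classification accuracy above chance" means in a P300 setting: each EEG epoch is labeled target or non-target and a simple classifier is scored against the 50% chance level. The sketch below uses synthetic epochs and scikit-learn's linear discriminant analysis purely to show the shape of that calculation; it is not how the cited studies were run.

    # P300-style target vs. non-target classification on synthetic epochs.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    n_epochs, n_features = 200, 32               # e.g., 32 time samples per epoch
    labels = rng.integers(0, 2, size=n_epochs)   # 1 = target, 0 = non-target
    epochs = rng.normal(size=(n_epochs, n_features))
    epochs[labels == 1, 10:16] += 1.0            # crude stand-in for a P300 bump

    scores = cross_val_score(LinearDiscriminantAnalysis(), epochs, labels, cv=5)
    print(f"classification accuracy: {scores.mean():.1%} (chance is about 50%)")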

Is whole brain emulation possible?

The technology for mapping and simulating an entire human brain is not currently available, but it appears to be possible in principle. This means that whole brain emulation is likely not a question of "if" but "when." Therefore, it is imperative to discuss the ethics and potential ramifications of the technology before it is developed.

Is the brain really like a computer?

Perhaps a less misleading term is “computation.” The brain might not be a computer, because it is not literally programmable, and it might not literally run algorithms, but it certainly computes: for example, it can transform sound waves captured at the ears into the spatial position of a sound source.
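
The sound-localization example is easy to make concrete. In a simple far-field model, a source off to one side reaches the nearer ear earlier by roughly d * sin(azimuth) / c, where d is the distance between the ears and c the speed of sound; the brain exploits that sub-millisecond interaural time difference. The numbers below are approximate, for illustration only.

    # Interaural time difference under the simple far-field model.
    import math

    d = 0.21    # approximate ear-to-ear distance in metres
    c = 343.0   # speed of sound in air, m/s

    for azimuth_deg in (0, 30, 90):
        itd_us = d * math.sin(math.radians(azimuth_deg)) / c * 1e6
        print(f"source at {azimuth_deg:>2} degrees -> delay of ~{itd_us:.0f} microseconds")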
