Brain implant helps paralyzed patient communicate by translating thoughts into full sentences

As brain-computer interface technology rapidly progresses, we’re closer than ever to restoring speech, movement and other daily activities in people paralyzed by stroke, disease or physical injury—relying solely on their own brain signals.

Perhaps the closest yet to restoring natural communication comes from researchers at the University of California, San Francisco, who have developed a “neuroprosthesis” that, once trained on an individual patient’s brain patterns, can translate their attempted speech into full sentences of text.

A study of their system—funded by Facebook and the National Institutes of Health and published in the New England Journal of Medicine—found it was able to decode entire sentences at once, at a median rate of about 15 words per minute and with an average accuracy of roughly 75%, or a word error rate of just over 25%.
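
For context, word error rate is the standard yardstick for this kind of decoder: the number of word-level substitutions, insertions and deletions needed to turn the decoded sentence into the intended one, divided by the length of the intended sentence. Here is a minimal sketch of that calculation in Python, for illustration only; it is not the study’s evaluation code, and the example sentences are made up.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance divided by the reference length.

    Illustrative only -- not the evaluation code from the UCSF study.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance, counted over whole words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + sub)    # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# One wrong word out of four -> 25% word error rate.
print(word_error_rate("i am very good", "i am very thirsty"))  # 0.25
```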

RELATED: FDA finalizes 'leapfrog' guidance on use of thought-controlled tech in paralysis patients, amputees

To get to that level, the researchers spent more than a year and a half training the system to recognize the brain signals of a single paralyzed patient, a man who lost his ability to communicate and much of his movement following a stroke in his brain stem more than 15 years ago.

After a high-density electrode array was implanted over his speech motor cortex, the patient was tasked with repeatedly attempting to say 50 words on a list of basic terms like water, family and good. Those attempts took place over the course of 48 sessions, totaling 22 hours of training.

From there, deep learning algorithms analyzed the recordings of the training sessions to pick up on patterns in the patient’s brain signals as he attempted to say each word.
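
In essence, that step amounts to fitting a classifier that maps a window of multichannel neural activity, recorded while the patient attempts a word, to one of the 50 vocabulary items. The snippet below is a heavily simplified stand-in on synthetic data with made-up dimensions; the study itself used deep learning models on the implanted array’s recordings, not this toy logistic-regression setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical dimensions -- placeholders, not the study's actual configuration.
N_CHANNELS = 128   # electrodes in the implanted array
N_WORDS = 50       # size of the vocabulary list
N_TRIALS = 1000    # attempted-word repetitions across the training sessions

rng = np.random.default_rng(0)

# Synthetic stand-in data: one feature vector per attempted word (e.g., average
# activity per channel during the attempt), paired with the index of that word.
X = rng.normal(size=(N_TRIALS, N_CHANNELS))
y = rng.integers(0, N_WORDS, size=N_TRIALS)

# Fit a simple classifier mapping neural features to one of the 50 words.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# At inference time, each new window yields a probability over the vocabulary,
# which a language model can then re-weight (see the sketch further below).
word_probs = clf.predict_proba(X[:1])   # shape: (1, 50)
print(word_probs.argmax())
```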

The resulting neural network was combined with a natural-language model that predicts probable next words in a sentence, allowing the system to produce entire sentences more quickly and accurately and to “auto-correct” obvious errors along the way.
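
The general idea—re-weighting the classifier’s per-word probabilities with a language model’s sense of which word is likely to come next—can be sketched as a small beam search. This is a toy illustration of that principle, not the study’s actual decoder; the vocabulary, probabilities and weighting below are all made up.

```python
import numpy as np

def decode_sentence(word_probs, lm_bigram, vocab, beam_width=5, lm_weight=1.0):
    """Beam search combining classifier probabilities with a bigram language
    model. Toy illustration only -- not the decoder from the UCSF study.

    word_probs: (n_steps, n_words) classifier probabilities, one row per
                attempted word in the sentence
    lm_bigram:  (n_words, n_words) matrix, lm_bigram[i, j] = P(word j | word i)
    """
    beams = [([], 0.0)]  # (partial word-index sequence, log score)
    for step_probs in word_probs:
        candidates = []
        for seq, score in beams:
            for w, p in enumerate(step_probs):
                # Language-model prior for word w given the previous word.
                lm_p = lm_bigram[seq[-1], w] if seq else 1.0 / len(vocab)
                candidates.append(
                    (seq + [w],
                     score + np.log(p + 1e-12) + lm_weight * np.log(lm_p + 1e-12)))
        # Keep only the highest-scoring partial sentences.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    best_seq, _ = beams[0]
    return " ".join(vocab[w] for w in best_seq)

# Made-up example: four stand-in words, three attempted words in the sentence.
vocab = ["i", "am", "thirsty", "good"]
rng = np.random.default_rng(1)
classifier_probs = rng.dirichlet(np.ones(len(vocab)), size=3)
bigram_lm = rng.dirichlet(np.ones(len(vocab)), size=len(vocab))
print(decode_sentence(classifier_probs, bigram_lm, vocab))
```

Turning up the language-model weight leans more heavily on sentence context, which is the intuition behind the “auto-correct” behavior described above: a word the classifier is unsure about can be overruled when it doesn’t fit the rest of the sentence.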

RELATED: FDA approves wireless brace that uses brainwaves to improve hand function in stroke patients

That’s a major leap forward from most other systems allowing nonverbal patients to communicate, which typically operate on a letter-by-letter basis.

The core difference between the two methods stems from the neuroprosthesis’ ability to pick up on brain signals sent to the vocal tract, rather than those sent to the arm or hand to type out a word or control a cursor.

“To our knowledge, this is the first successful demonstration of direct decoding of full words from the brain activity of someone who is paralyzed and cannot speak,” said Edward Chang, a UCSF neurosurgeon and the senior author on the study. “Going straight to words, as we’re doing here, has great advantages because it’s closer to how we normally speak.”

Moving forward, bolstered by the success of this first attempt, the researchers said they’ll expand the study to include other severely paralyzed participants. They’re also hoping to broaden the vocabulary list and increase the system’s words-per-minute rate.

RELATED: Brain-computer interface allowing 'locked-in' ALS patients to communicate earns European approval

The race is on to develop systems that restore communication to those unable to speak. Just last month, for example, a device designed to translate the thoughts of nonverbal ALS patients into speech was granted a CE mark, clearing it for use in Europe.

The NeuroKey, developed by the Swiss nonprofit Wyss Center for Bio and Neuroengineering, uses electrodes implanted on the brain to capture neurological signals and send them to a computer. There, an algorithm parses out which letter—or which option on a basic yes/no prompt—the patient intended to choose, allowing them to slowly but surely piece together words and sentences.
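
The article doesn’t detail NeuroKey’s decoding algorithm, but the letter-by-letter principle can be illustrated generically: if a system can reliably read out a single yes/no decision from brain signals, repeatedly halving the alphabet is enough to spell any word. The following is a hypothetical sketch of that idea, not the Wyss Center’s implementation.

```python
# Generic illustration of spelling from yes/no decisions alone -- not
# NeuroKey's actual algorithm.
ALPHABET = list("abcdefghijklmnopqrstuvwxyz")

def select_letter(answer_yes_no):
    """Narrow the alphabet down to one letter using repeated yes/no prompts.

    answer_yes_no(letters) should return True if the intended letter is among
    `letters`. In a real system that answer would come from decoding the
    patient's brain signals; here it can be any callable.
    """
    candidates = ALPHABET
    while len(candidates) > 1:
        first_half = candidates[: len(candidates) // 2]
        candidates = first_half if answer_yes_no(first_half) else candidates[len(first_half):]
    return candidates[0]

# Simulate a patient spelling "water"; the lambda stands in for the decoder.
word = "".join(select_letter(lambda half, c=c: c in half) for c in "water")
print(word)  # water
```

Roughly five yes/no decisions per letter is why spellers of this kind are slow but dependable, matching the “slowly but surely” pace described above.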

The platform is designed to be easily adaptable to any nonverbal ALS patient, with family members and caregivers able to calibrate it themselves for home use. NeuroKey’s developers are also currently working on integrating the system with the Wyss Center’s ABILITY wearable device to eventually enable physical movement using only the brain signals of paralyzed users.