
Neuroscientists Decode Brain Speech Signals Into Written Text

Doctors have turned the brain signals for speech into written sentences in a research project that aims to transform how patients with severe disabilities communicate in the future.


The breakthrough is the first to demonstrate how the intention to say specific words can be extracted from brain activity and converted into text rapidly enough to keep pace with natural conversation.

In its current form, the brain-reading software works only for certain sentences it has been trained on, but scientists believe it is a stepping stone towards a more powerful system that can decode in real time the words a person intends to say.

“To date there is no speech prosthetic system that allows users to have interactions on the rapid timescale of a human conversation,” said Edward Chang, a neurosurgeon and lead researcher on the study published in the journal Nature.

The work, funded by Facebook, was possible thanks to three epilepsy patients who were about to have neurosurgery for their condition. Before their operations went ahead, all three had a small patch of tiny electrodes placed directly on the brain for at least a week to map the origins of their seizures.

During their stay in hospital, the patients, all of whom could speak normally, agreed to take part in Chang’s research. He used the electrodes to record brain activity while each patient was asked nine set questions and read aloud from a list of 24 potential responses.
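Because the decoder only has to choose among a fixed set of trained questions and responses, the core task can be framed as multi-class classification over a closed vocabulary. The sketch below is a loose illustration of that framing rather than the study’s actual pipeline; the feature layout (one activity summary per electrode), the choice of classifier, and all of the data are assumptions made purely for the example.

```python
# Illustrative sketch only (not the study's pipeline): decoding which of a fixed
# set of 24 responses was spoken, from per-trial neural features such as averaged
# per-electrode activity. Shapes, features, and the classifier are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_electrodes = 480, 128   # hypothetical trial count and channel count
n_responses = 24                    # the closed answer set described in the article

# Placeholder data standing in for (neural features, spoken-response label) pairs.
X = rng.normal(size=(n_trials, n_electrodes))
y = rng.integers(0, n_responses, size=n_trials)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Since the vocabulary is restricted to sentences the decoder was trained on,
# the problem reduces to multi-class classification over that closed set.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("held-out accuracy on placeholder data:", clf.score(X_test, y_test))
```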

Despite the breakthrough, there are hurdles ahead. One challenge is to improve the software so it can translate brain signals into more varied speech on the fly. This will require algorithms trained on a huge amount of spoken language and corresponding brain signal data, which may vary from patient to patient.
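One way to picture the patient-to-patient variation mentioned above is that each person would need a decoder fitted to their own paired recordings of brain activity and speech. The sketch below assumes hypothetical placeholder data and a simple classifier purely to illustrate that per-patient training pattern; none of the names or numbers come from the study.

```python
# Minimal sketch of per-patient training: because brain signals differ between
# individuals, a separate decoder is fitted on each patient's own paired
# (neural features, spoken words) recordings. All data here are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_placeholder_recordings(n_trials=200, n_electrodes=128, n_labels=24):
    """Stand-in for one patient's neural features and transcript labels."""
    features = rng.normal(size=(n_trials, n_electrodes))
    labels = rng.integers(0, n_labels, size=n_trials)
    return features, labels

patients = {
    "patient_1": make_placeholder_recordings(),
    "patient_2": make_placeholder_recordings(),
    "patient_3": make_placeholder_recordings(),
}

decoders = {}
for name, (features, labels) in patients.items():
    # One model per patient; pooling across patients is harder because the
    # signal patterns for the same words need not line up across individuals.
    model = LogisticRegression(max_iter=1000).fit(features, labels)
    decoders[name] = model
    print(name, "training accuracy on placeholder data:",
          round(model.score(features, labels), 2))
```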

While the work is still in its infancy, Winston Chiong, a neuroethicist at UCSF who was not involved in the latest study, said it was important to debate the ethical issues such systems might raise in the future. For example, could a “speech neuroprosthesis” unintentionally reveal people’s most private thoughts?

Chang said that decoding what someone was openly trying to say was hard enough, and that extracting their inner thoughts was virtually impossible. “I have no interest in developing a technology to find out what people are thinking, even if it were possible,” he said.

“But if someone wants to communicate and can’t, I think we have a responsibility as scientists and clinicians to restore that most fundamental human ability.” It’s a similar concept to AI text to speech, but this time the signals come directly from the brain.
