A new AI system that converts brain activity into text could transform communication for people who can’t speak or type.
The mind-reading machine uses brain implants to record neural activity while someone speaks.
An algorithm converts those recordings into a string of numbers, which a second system then translates into a sequence of words.
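The study itself used recurrent neural networks for both stages; as a loose, purely illustrative sketch of the data flow only (every name and value below is invented for the example, not taken from the research), the two-stage pipeline looks like this:

```python
# Purely illustrative two-stage pipeline: stage 1 compresses a "brain
# recording" (here just a list of floats) into a short string of numbers;
# stage 2 translates that numeric code into words via a lookup table.
# The real system learns both stages with neural networks.

def encode(recording):
    """Stage 1: compress raw samples into a short numeric code."""
    # Toy encoding: average each pair of samples and round to an integer.
    return [round((a + b) / 2) for a, b in zip(recording[::2], recording[1::2])]

# Hypothetical codebook mapping numeric codes to words.
CODEBOOK = {1: "tina", 2: "turner", 3: "is", 4: "a", 5: "pop", 6: "singer"}

def decode(codes):
    """Stage 2: translate the numeric code into a sequence of words."""
    return [CODEBOOK.get(c, "<unk>") for c in codes]

recording = [0.9, 1.1, 1.8, 2.2, 2.9, 3.1, 3.8, 4.2, 4.9, 5.1, 5.8, 6.2]
print(" ".join(decode(encode(recording))))  # tina turner is a pop singer
```

The point of the intermediate numeric string is that it decouples the two problems: one model learns what the neurons encode, while another learns how number sequences map onto language.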
For one participant, just 3% of sentences needed to be corrected.
This made the AI more accurate than professional human transcribers, whose error rates have been recorded at 5%.
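Transcription error rates like these are conventionally scored as word error rate: the minimum number of word substitutions, insertions, and deletions needed to turn the output into the reference sentence, divided by the reference length. A minimal sketch of that metric (a standard word-level edit distance, not code from the study):

```python
def word_error_rate(reference, hypothesis):
    """Word error rate: word-level edit distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One wrong word in a five-word sentence gives a 20% word error rate.
print(word_error_rate("the oasis was a mirage", "the oasis was a mirror"))  # 0.2
```

On this scale, a 3% error rate means roughly one wrong word in every thirty or so.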
The system was developed by researchers from the University of California, San Francisco, who revealed their findings in the journal Nature Neuroscience.
Turning brain activity into text
The researchers tested their system on four people with brain implants that monitor their epileptic seizures.
The participants were asked to read a range of simple sentences aloud, including “Tina Turner is a pop singer” and “the oasis was a mirage.”
As they spoke, their brain signals were fed into a computer and decoded into words.
The system sometimes struggled to convert the words into coherent sentences. For example, it translated “those musicians harmonize marvelously” into “the spinach was a famous singer.”
But the AI was still more accurate than other speech transcription systems.
The researchers stressed that the technology currently works only when someone is speaking aloud.
However, it could eventually be adapted to translate the intended speech of people who can’t communicate verbally, such as those with locked-in syndrome, a neurological disorder that causes paralysis.