The brain may not be able to speak quite yet, but researchers at the University of California, Berkeley are working to correlate speech with the brain's neural activity.
Using algorithms to interpret the neural activity, they created a graphical representation of the data, which they later converted back to audible speech.
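To make the pipeline concrete, here is a minimal, purely illustrative sketch of that kind of decode-and-resynthesise approach: fit a simple decoder mapping neural features to a spectrogram (the "graphical representation"), then invert the predicted spectrogram back to a waveform. This is not the Berkeley team's actual method; the linear decoder, Griffin-Lim inversion, channel count, and all parameters below are assumptions chosen for brevity, and the data here are random stand-ins.

```python
# Illustrative sketch only -- not the published pipeline.
import numpy as np
import librosa
import soundfile as sf
from sklearn.linear_model import Ridge

SR = 16000      # audio sample rate (assumed)
N_FFT = 512     # STFT size (assumed)
HOP = 128       # hop length (assumed)

# Stand-in data: a real experiment would use cortical recordings
# aligned with the audio the participant heard.
rng = np.random.default_rng(0)
audio = rng.standard_normal(SR * 2)                               # 2 s placeholder audio
spec = np.abs(librosa.stft(audio, n_fft=N_FFT, hop_length=HOP))   # target spectrogram
n_frames = spec.shape[1]
neural = rng.standard_normal((n_frames, 64))                      # fake 64-channel neural features

# 1. Fit a linear decoder: neural features -> spectrogram frames.
decoder = Ridge(alpha=1.0)
decoder.fit(neural, spec.T)                                       # one regression per frequency bin

# 2. Predict a spectrogram from neural activity (magnitudes must be non-negative).
pred_spec = np.clip(decoder.predict(neural), 0, None).T

# 3. Invert the predicted spectrogram back to audible speech.
recon = librosa.griffinlim(pred_spec, n_iter=32, hop_length=HOP, n_fft=N_FFT)
sf.write("reconstruction.wav", recon, SR)
```

With real recordings, the decoder would be trained on one set of words and evaluated on held-out words, which is the kind of computer analysis that lets individual words be identified even when the audio sounds rough to a human listener.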
"People listening to the audio replays may be able to pull out coarse similarities between the real word and the constructed words," says Pasley. When New Scientist listened to the words, they could just about make out "Waldo" and "structure". However, its fidelity was sufficient for the team to identify individual words using computer analysis.
Is reading minds just around the corner? Not likely, but the researchers hope to help people like Erik Ramsey, who is paralysed.
To read the full story, see "Telepathy machine reconstructs speech from brainwaves" by Helen Thomson.