Meta built an AI that can guess the words you’re hearing
Decoding brainwaves may help people who are unable to speak communicate again. It could also provide new ways for humans and computers to interact. Researchers at Meta have demonstrated that they can now identify the words someone is hearing using recordings from non-invasive brain scans.
Over the last few decades, scientists have greatly improved their ability to monitor brain activity, developing a range of brain-computer interface (BCI) technologies that can offer a glimpse into what a person is thinking and feeling.
The most impressive results have come from AI systems that learn to interpret brain signals. Such systems have decoded sentences from neural activity with 97 percent accuracy and translated handwriting into text.