BCI allows paralysed woman to speak and express emotions through avatar

In a groundbreaking development, researchers at UC San Francisco and UC Berkeley have created a Brain-Computer Interface (BCI) that enables a severely paralysed woman to communicate through a digital avatar, marking the first synthesis of speech and facial expressions directly from brain signals. The system converts brain activity into text at nearly 80 words per minute, far faster than existing communication aids, and decodes those signals into synthesised speech and facial animations, offering a more natural means of communication for people with paralysis. Rather than recognising whole words, it identifies phonemes, the sub-units of speech, which improves both speed and accuracy.

The avatar's voice is personalised to resemble the user's voice before her injury, and its facial expressions are animated from her brain signals. The researchers' goal is an FDA-approved system that restores speech from brain signals, potentially transforming the lives of people with severe paralysis and enhancing their independence and social interactions.
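Why does decoding phonemes rather than whole words help? A few dozen phonemes can be recombined to cover an essentially open vocabulary, so the decoder only has to distinguish a small set of sound units while still being able to produce any word. As a purely illustrative sketch (not the researchers' actual pipeline), the snippet below maps a decoded phoneme sequence onto words using a toy lexicon; the phoneme symbols, lexicon, and greedy matching scheme here are hypothetical.

```python
from typing import List

# Hypothetical mini-lexicon mapping phoneme sequences to words (ARPAbet-style symbols).
# A real system would use a large pronunciation dictionary and a language model.
LEXICON = {
    ("HH", "EH", "L", "OW"): "hello",
    ("W", "ER", "L", "D"): "world",
    ("Y", "EH", "S"): "yes",
    ("N", "OW"): "no",
}


def phonemes_to_words(phonemes: List[str]) -> List[str]:
    """Greedily match the longest known phoneme sequence at each position."""
    words, i = [], 0
    while i < len(phonemes):
        for length in range(len(phonemes) - i, 0, -1):  # try the longest match first
            chunk = tuple(phonemes[i:i + length])
            if chunk in LEXICON:
                words.append(LEXICON[chunk])
                i += length
                break
        else:
            # Phoneme run not in the lexicon: skip one symbol rather than fail outright.
            i += 1
    return words


if __name__ == "__main__":
    decoded = ["HH", "EH", "L", "OW", "W", "ER", "L", "D"]
    print(phonemes_to_words(decoded))  # ['hello', 'world']
```

The point of the sketch is the size of the output space: the decoder only ever chooses among a handful of phoneme symbols, yet the lexicon lookup turns those choices into whole words, which is what makes phoneme-level decoding both fast and flexible.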

Read more from Neuroscience News