An AI can read words in brain signals

Researchers at UCSF have trained an algorithm to parse meaning from neural activity.

Image source: ESB Professional/Shutterstock/Big Think
  • Participants' neural activity is collected as they speak 50 sentences.
  • A machine-learning algorithm develops a prediction of what the collected data means.
  • The system's accuracy varies, but the results are promising.

It's just a start, but it's pretty exciting: a system that translates brain activity into text. For people unable to speak, such as those with locked-in syndrome, this would be a life-changer.

Right now, it's a bit like seeing through a heavy fog, but researchers at the Chang Lab at the University of California, San Francisco have trained a machine-learning algorithm to extract meaning from neural data.

Joseph Makin, co-author of this research, tells The Guardian, "We are not there yet, but we think this could be the basis of a speech prosthesis."

      The research is published in the journal Nature Neuroscience.

      Eavesdropping

      Image source: Teeradej/Shutterstock

To train their AI, Makin and co-author Edward F. Chang "listened in" on the neural activity of four participants. Each participant, an epilepsy patient, already had brain electrodes implanted for seizure monitoring.

The participants were given 50 sentences to read aloud at least three times each. As they read, the researchers collected neural data. (Audio recordings were also made.)

      The study lists a handful of the sentences the participants recited, among them:

      • "Those musicians harmonize marvelously."
      • "She wore warm fleecy woolen overalls."
      • "Those thieves stole thirty jewels."
      • "There is chaos in the kitchen."

The algorithm's task was to analyze the collected neural data and predict what was being said when the data was generated. (Data associated with non-verbal sounds captured in the participants' audio recordings was factored out first.)

The researchers' algorithm learned pretty quickly to predict the words associated with chunks of neural data. It predicted that the data generated when "A little bird is watching the commotion" was spoken meant "The little bird is watching watching the commotion," quite close, while "The ladder was used to rescue the cat and the man" was predicted as "Which ladder will be used to rescue the cat and the man."
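
Conceptually, the task resembles machine translation: an encoder network compresses the sequence of neural signals into a summary, and a decoder network unrolls that summary into a sequence of words. Here is a minimal sketch of that idea in Python with PyTorch; the layer types, sizes, and training loop are illustrative assumptions, not the study's actual architecture.

```python
import torch
import torch.nn as nn

# Illustrative sizes -- assumptions, not the study's actual parameters.
N_CHANNELS = 256   # neural feature channels per time step
HIDDEN = 128
VOCAB = 250        # the study restricted sentences to ~250 distinct words

class NeuralToText(nn.Module):
    """Toy encoder-decoder: neural feature sequence -> word-ID sequence."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.GRU(N_CHANNELS, HIDDEN, batch_first=True)
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.decoder = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, neural, words):
        # Encode the whole neural recording into a summary state.
        _, state = self.encoder(neural)            # (1, batch, HIDDEN)
        # Teacher-forced decoding: predict each word from the previous one.
        dec_in = self.embed(words[:, :-1])
        dec_out, _ = self.decoder(dec_in, state)
        return self.out(dec_out)                   # (batch, len-1, VOCAB)

# One gradient step on fake data, just to show the shape of training.
model = NeuralToText()
neural = torch.randn(8, 100, N_CHANNELS)           # 8 trials, 100 time steps
words = torch.randint(0, VOCAB, (8, 12))           # 8 sentences of 12 word IDs
logits = model(neural, words)
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), words[:, 1:].reshape(-1))
loss.backward()
```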

The accuracy varied from participant to participant. Makin and Chang found that an algorithm already trained on one participant had a head start when trained for another, suggesting that training the AI could get easier with repeated use.
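
One way to picture that head start, reusing the toy NeuralToText model from the sketch above: copy the trained weights from one participant's model into the next participant's, leaving only the participant-specific pieces to learn from scratch. Which layers transfer is purely an assumption here; the snippet just shows the warm-start mechanics.

```python
# Warm-starting participant B's model from participant A's -- an
# illustrative sketch, not the study's actual transfer procedure.
model_a = NeuralToText()
# ... imagine model_a has been trained on participant A's data ...

model_b = NeuralToText()
state = model_a.state_dict()
# Copy everything except the encoder, whose input weights are tied to
# participant A's particular electrode layout (an assumption here).
transferable = {k: v for k, v in state.items()
                if not k.startswith("encoder")}
model_b.load_state_dict(transferable, strict=False)
# model_b can now train faster: the decoder already "knows" the sentence
# set, and only the neural encoder must be learned anew.
```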

The Guardian spoke with expert Christian Herff, who found the system impressive for needing less than 40 minutes of training data per participant, rather than the far larger amounts required by other attempts to derive text from neural data. He says, "By doing so they achieve levels of accuracy that haven't been achieved so far."

Previous attempts to derive speech from neural activity focused on the phonemes from which spoken words are built; Makin and Chang focused on whole words instead. While there are far more words than phonemes, which makes the problem harder in one respect, the study notes that "the production of any particular phoneme in continuous speech is strongly influenced by the phonemes preceding it, which decreases its distinguishability." To keep the word-based approach tractable, the researchers limited the sentences to a total vocabulary of just 250 words.

      Through the neural fog

      Image source: whitehoune/Shutterstock/Big Think

Clearly, though, there's room for improvement. The AI also predicted that "Those musicians harmonize marvelously" meant "The spinach was a famous singer." "She wore warm fleecy woolen overalls" was mis-predicted as "The oasis was a mirage." "Those thieves stole thirty jewels" was misconstrued as "Which theatre shows mother goose," and the algorithm predicted that the data for "There is chaos in the kitchen" meant "There is is helping him steal a cookie."

Of course, the vocabulary involved in this research is limited, as is the set of sentences. "If you try to go outside the [50 sentences used] the decoding gets much worse," notes Makin, citing the study's limitations. Another obvious caveat: the AI was trained on sentences spoken aloud by each participant, an impossibility for locked-in patients.

Still, the research by Makin and Chang is encouraging. Predictions for one participant required correction of just 3 percent of the words, actually better than the 5 percent error rate found in human transcriptions.
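
That 3 percent figure is a word error rate: the fraction of insertions, deletions, and substitutions needed to turn a predicted sentence into the reference. It can be computed with a standard edit-distance routine; the function below is a generic implementation, not the authors' code.

```python
def word_error_rate(reference, hypothesis):
    """Edit distance between word lists, divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming table for Levenshtein distance.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One of the study's near-misses quoted above:
print(word_error_rate("a little bird is watching the commotion",
                      "the little bird is watching watching the commotion"))
# -> ~0.29 (one substitution plus one insertion over a 7-word reference)
```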
