Tuesday, 2 May 2023

Scientists say they made a mind-reading AI that can turn brain scans into a readout of your thoughts

A researcher places a grid on the head of a participant in blue scrubs, who is lying down ready to enter an fMRI machine.
A researcher prepping a participant to enter an fMRI machine at the Biomedical Imaging Center at the University of Texas at Austin. Researchers say they were able to read a story from participants' brain activity.
  • Scientists say they trained AI to recreate a story from a brain scan. 
  • Participants lay in an fMRI machine while listening to, watching, or imagining a story. 
  • The AI was able to read their brain activity and recreate the story accurately, per the study. 

Scientists used AI to recreate a whole story from people's brain scans alone, per a study published Monday.

Participants were asked to listen to, watch, or imagine a story while inside a brain-scanning machine called an fMRI, per the study from researchers at the University of Texas at Austin.

The AI was able to accurately predict what the story was about by reading only the participants' brain activity, per the study. 

This type of technology could one day help people who have lost the ability to communicate, the scientists on the project said. 

"For a noninvasive method, this is a real leap forward compared to what's been done before, which is typically single words or short sentences," Alex Huth, an assistant professor of neuroscience and computer science at UT Austin, said in a press release.

The story doesn't come out exactly like it was told

A diagram shows the text that was input, a scan of a participant's brain activity, and the AI's output.
An annotated diagram shows how the AI reads brain activity and generates a story.

The AI was able to accurately recreate stories that participants were listening to, watching, or imagining, per the study, which was published in the peer-reviewed journal Nature Neuroscience. 

The AI did not reproduce the story word for word. Instead, it picked up on the concepts being triggered in the participants' minds and produced an approximation. It also made some mistakes.

For instance, one participant was hearing this:  "I didn't know whether to scream, cry or run away. Instead, I said, 'Leave me alone!'"

Their brain activity was decoded as "Started to scream and cry, and then she just said, 'I told you to leave me alone'," per the press release, which muddles the context of the screaming and crying and who was speaking.

Researchers also asked participants to watch a video with no sound. They found that their AI decoder was able to capture the gist of the video from participants' brain activity alone. 

The AI has to interpret the brain scans 

A figure compares the text a participant heard with the text decoded from their brain activity.

Some of the differences between the original story and the story decoded from the brain activity may be due to the model itself.

Every brain is different, so to train the computer the scientists first showed it which patterns of brain activity appeared when the participant was thinking about a particular word. 

But it's not a one-to-one mapping. When the computer read the brain activity, it would offer a few suggestions for words the participant may have been thinking of. 

The scientists then used a transformer model, the same type of model used by ChatGPT, to make sense of the words the computer produced.
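
At a high level, the process described above can be pictured as a loop in which a language model proposes candidate words and a participant-specific model scores each candidate against the recorded brain activity. The sketch below is only a minimal illustration of that idea, not the researchers' code; `propose_continuations` and `score_against_brain` are hypothetical stand-ins for the transformer and the per-participant brain model, and the scoring here is a toy placeholder.

```python
import random

# Hypothetical stand-in for the transformer language model: proposes a few
# plausible next words given the text decoded so far.
def propose_continuations(decoded_so_far, n_candidates=3):
    vocabulary = ["scream", "cry", "run", "leave", "me", "alone", "said"]
    return random.sample(vocabulary, n_candidates)

# Hypothetical stand-in for the participant-specific model: scores how well a
# candidate transcript matches a measured brain response (toy heuristic only).
def score_against_brain(candidate_text, brain_scan):
    return -abs(len(candidate_text) - len(brain_scan))

def decode_story(brain_scans):
    """Greedy version of the propose-and-score loop sketched above."""
    decoded = []
    for scan in brain_scans:
        candidates = propose_continuations(decoded)
        # Keep the candidate whose predicted response best matches this scan.
        best = max(
            candidates,
            key=lambda word: score_against_brain(" ".join(decoded + [word]), scan),
        )
        decoded.append(best)
    return " ".join(decoded)

if __name__ == "__main__":
    fake_scans = ["x" * 10, "x" * 15, "x" * 20]  # placeholder "scans"
    print(decode_story(fake_scans))
```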

"We're getting the model to decode continuous language for extended periods of time with complicated ideas," said Huth. 

The technique can't break into private thoughts

A picture taken from inside the scanner shows researchers peering in
Scientists used this brain scanner to collect the brain activity data fed to the AI.

While the technique was able to read minds, it couldn't be used to break into private thoughts or for interrogation, the scientists said in a press release. 

The machine was easily fooled. When scientists asked participants to think about another story, to count, or to think about animal names while they were listening to a story, the decoder's accuracy dropped, per the press release. 

"We take very seriously the concerns that it could be used for bad purposes and have worked to avoid that," said study author Jerry Tang, a doctoral student in computer science, in the press release. "We want to make sure people only use these types of technologies when they want to and that it helps them."

Unlike other techniques, this method doesn't use brain implants

The man with locked-in syndrome, who has not been named and is referred to as K1, is shown in his bed at home.

Previous studies have claimed to read the minds of patients with locked-in syndrome. This was the case for a 34-year-old man who was completely paralyzed and had lost even the ability to move his eyes.

Per a previous study from the Wyss Center for Bio and Neuroengineering and the University of Tübingen in Germany, the man was able to communicate and even ask for music and beer. 

But these previous studies have typically relied on surgically inserted brain implants. 

The difference here is that this method relies only on brain scans, which are not invasive.

Of course, that means that for now, the technique requires a huge machine and can't be used outside of a lab. But the scientists hope the method could become more portable in the future with more compact technology, per the press release.

Read the original article on Business Insider


from Business Insider https://ift.tt/mVnpqCb
