Call it Shazam for brains. Researchers claim in a new study that they were able to use artificial intelligence to identify the songs people were listening to by reading their brainwaves.
Scientists recorded brain signals with sensors and then used computer algorithms to determine which song was being played. The study is the latest in a growing number of projects to decode human brain waves using computers. Such efforts are approaching the point of practical usefulness, experts say.
“Are we able to decode neural representations in a way that is of practical value to people?” Harvard neurology researcher Richard Hakim said in a telephone interview. “The answer is, we kind of are.”
In the new study, Derek Lomas of Delft University of Technology and his colleagues asked 20 people to listen to 12 songs through headphones.
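The article does not detail the team's methods, but at a high level such studies typically extract features from the recorded brain signals and train a classifier to predict which of the songs was playing. The sketch below illustrates that general idea on simulated data; the channel count, frequency bands, and logistic-regression classifier are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative sketch only: a generic "which song was playing?" classifier on
# simulated EEG-like features. Shapes, band counts, and the classifier choice
# are assumptions for demonstration, not the study's actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_songs = 12            # songs each listener heard, as reported in the study
n_trials_per_song = 20  # hypothetical number of listening trials per song
n_channels = 32         # hypothetical EEG channel count
n_bands = 5             # hypothetical frequency bands (delta through gamma)

# Fake per-trial features: band power per channel, with a weak song-specific signature.
song_templates = rng.normal(size=(n_songs, n_channels * n_bands))
X, y = [], []
for song in range(n_songs):
    for _ in range(n_trials_per_song):
        X.append(song_templates[song] + rng.normal(scale=2.0, size=n_channels * n_bands))
        y.append(song)
X, y = np.array(X), np.array(y)

# Standardize features, then fit a multiclass logistic-regression classifier
# and estimate how often it identifies the correct song.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} (chance = {1 / n_songs:.2f})")
```

Accuracy well above chance (here, 1 in 12) is the kind of evidence such studies use to argue that the brain signals carry information about the music being heard.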