Scientists Invent A Mind-Reading Machine That Can Read Your Thought Process
A team from the University of Oregon has developed a system that can read people's thoughts via brain scans and rebuild the faces they were visualising in their heads.
The study, led by Brice Kuhl and Hongmi Lee from the University of Oregon, used artificial intelligence (AI) that analysed brain activity in an attempt to reconstruct the faces participants were viewing. It's not an exact science, but the AI did get close.

“We can take someone's memory – which is typically something internal and private – and we can pull it out from their brains,” Kuhl told Vox.

“Some people use different definitions of mind reading, but certainly, that's getting close,” he added.
Kuhl and his colleague Lee recently published a paper in The Journal of Neuroscience with a conclusion straight out of science fiction: the pair created images directly from memories using an MRI machine, some machine learning software, and a few hapless human guinea pigs.
Kuhl's method involves putting test participants in an MRI machine and showing them several hundred faces. The program has access to real-time MRI data from the machine, as well as a set of 300 numbers describing each face, covering everything from skin tone to eye position.
An MRI detects the movement of blood around the brain, which is taken as a proxy for brain activity, so the program analyses these blood-flow patterns in response to each face to learn how a particular brain reacts to known stimuli.
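In machine-learning terms, this training phase amounts to fitting a model that maps each pattern of brain activity to the 300 numbers describing the face that produced it. Below is a minimal sketch of that idea in Python, assuming the MRI responses are flattened into one voxel vector per face and using ridge regression; the variable names, data shapes and choice of regression are illustrative stand-ins, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical stand-in data:
#   brain_responses: one row of voxel activity per face shown (n_faces x n_voxels)
#   face_codes:      the 300 numbers describing each face     (n_faces x 300)
rng = np.random.default_rng(0)
n_faces, n_voxels, n_features = 300, 5000, 300
brain_responses = rng.standard_normal((n_faces, n_voxels))
face_codes = rng.standard_normal((n_faces, n_features))

# Learn a linear mapping from brain activity to face descriptors.
# Ridge regularisation keeps the fit stable when voxels outnumber faces.
decoder = Ridge(alpha=10.0)
decoder.fit(brain_responses, face_codes)
```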
With a few hundred training examples incorporated into its algorithm, the AI was put to the test. The participants were again shown a face, but this time the program wasn't given the numbers describing it. The only thing it had to go on was the MRI data describing brain activity as the person saw the face.
Having learned over time to match patterns of blood flow to the facial features seen by the subjects in the MRI machine, the AI splutters out rough images of what they are seeing.
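Concretely, the test phase relies on the trained mapping alone: given a new brain scan, predict the 300 descriptors and turn them back into an image. The sketch below continues the hypothetical names from the training sketch above and assumes the descriptors are coefficients in a linear, eigenface-style face basis; that basis is an assumption made for illustration, not the paper's actual reconstruction method.

```python
# Predict the 300 face descriptors for an unseen brain scan.
new_scan = rng.standard_normal((1, n_voxels))   # activity recorded while viewing an unknown face
predicted_codes = decoder.predict(new_scan)     # shape (1, 300)

# Illustrative reconstruction: treat the descriptors as coefficients in a
# linear face basis (a mean face plus 300 component images), eigenface-style.
image_size = 64 * 64
mean_face = rng.standard_normal(image_size)                  # stand-in for a real average face image
face_basis = rng.standard_normal((n_features, image_size))   # stand-in component images

reconstruction = (mean_face + predicted_codes[0] @ face_basis).reshape(64, 64)
```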
The resulting images have a dream-like quality, with blurred edges and shifts in skin tone; they look like the kind of vague renderings a person might see in their own mind. But the machine often gets the facial expressions right, and other features, such as an upturned nose or an arched eyebrow, seem to translate faithfully from mind to machine.
To put some hard numbers on the program's success rate, Kuhl and Lee showed the strange reconstructions to another set of participants and asked them questions relating to skin tone, gender and emotion. As you might expect, the group's answers matched the original faces at a rate higher than random chance, suggesting the AI's interpretations capture genuinely useful information.

As Vox reported: “[The researchers] showed these reconstructed images to a separate group of online survey respondents and asked simple questions like, 'Is this male or female?' 'Is this person happy or sad?' and 'Is their skin colour light or dark?' To a degree greater than chance, the responses checked out. These basic details of the faces can be gleaned from mind reading.”
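To see what "a degree greater than chance" means in practice, here is a minimal sketch of that kind of check: tally the raters' yes/no answers against the true attributes of the original faces and compare the hit rate with the 50% expected from guessing, using a one-sided binomial test. The numbers below are made up for illustration; they are not the study's data.

```python
import numpy as np
from scipy.stats import binomtest

# Made-up example: 200 yes/no judgements (e.g. "male or female?") from raters,
# scored against the attribute of the original face behind each reconstruction.
rng = np.random.default_rng(1)
true_attributes = rng.integers(0, 2, size=200)
# Simulate raters who agree with the original face about 70% of the time.
rater_answers = np.where(rng.random(200) < 0.7, true_attributes, 1 - true_attributes)

hits = int((rater_answers == true_attributes).sum())
result = binomtest(hits, n=200, p=0.5, alternative="greater")
print(f"accuracy = {hits / 200:.2f}, p-value vs. chance = {result.pvalue:.4f}")
```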
Kuhl told Vox that the image fidelity can be improved with more sessions, as the AI collects more data and gains more experience in matching brain activity to the details of images seen by test subjects.
The team is now working on an even tougher task: having participants look at a face, hold it in their memory, and then getting the AI to reconstruct it based on the person's memory of what the face looked like.

“I don't want to put a cap on it,” Kuhl said. “We can do better.”
He also made it clear that his machines can't read people's minds without their cooperation.

“You need someone to play ball,” Kuhl said. “You can't extract someone's memory if they are not remembering it, and people most of the time are in control of their memories.”

The study has been published in The Journal of Neuroscience.