AI Trained To Read People's Minds Using Brain Scans
Science/Medical/Technology
Friday 14th April 2023
A team of Japanese researchers has trained an AI to work out what a person is seeing by reading that person's brain activity.
Researchers at Osaka University showed four volunteer subjects a series of 10,000 images while each individual's brain activity was recorded by an fMRI machine.
The machine allowed the researchers to map small changes in blood flow in each individual's brain and see which areas "light up" in response to the different images.
The process was repeated a further two times so that the AI could map the data and compare each subject's responses. Once the researchers were satisfied that the AI was sufficiently trained, the subjects were shown an image of an object or scene that was not among the original 10,000 images while connected to the fMRI machine.
The outcome was shockingly accurate: the AI was able to read the individual's brain scan and recreate a remarkably accurate image of what the person was viewing.
Yu Takagi, one of the team's researchers, said:
"The most interesting part of our research is that the diffusion model—so-called image-generating AI which [...] was not created to understand the brain—predicts brain activity well and can be used to reconstruct visual experiences from the brain,"
"When we see things, visual information captured by the retina is processed in a brain region called the visual cortex located in the occipital lobe,"
"When we imagine an image, similar brain regions are activated. It is [therefore] possible to apply our technique to brain activity during imagination."
The results of the team's research bring to mind George Orwell's '1984' and the 'thought police'.
With the results being so accurate at this early stage, how much further are the researchers willing to go, and will they be closely watched by government agencies wanting to know what their citizens are thinking?
From a law-enforcement perspective the uses are obvious: you could catch a person before they have committed a crime. But where do we draw the line when someone thinks something they would never act on? In the eyes of any future law enforcement, the crime has already been committed just by thinking it... Minority Report.
With the drive to see what AI can be used for, and everyone wanting to be the first to do amazing things, it may only be a matter of time until someone pushes the AI just that bit too far and cannot undo or stop what has been created.