One common challenge for people with autism is difficulty interpreting facial expressions, which can make it hard to read social cues at home, at school, or at work. Researchers at MIT have now used an AI model to help explain why this happens.
A paper published on Wednesday in The Journal of Neuroscience found that neurotypical adults (those not displaying autistic characteristics) and adults with autism might have key differences in a brain region called the inferior temporal (IT) cortex. These differences could determine how well they detect emotions in facial expressions.
“For visual behaviors, the study suggests that [the IT cortex] plays a strong role,” Kohitij Kar, a neuroscientist at MIT and author of the study, told The Daily Beast. “It might not be the only one. Other regions, such as the amygdala, have also been strongly implicated. But these studies illustrate how having good [AI models] of the brain will be key to identifying those regions as well.”
Kar’s neural network draws on a previous experiment conducted by other researchers. In that study, AI-generated pictures of faces displaying emotions ranging from fearful to happy were shown to autistic adults and neurotypical adults. The volunteers judged whether each face was happy, and the autistic adults required a much clearer indication of happiness (e.g., a bigger smile) than the neurotypical participants before reporting a face as happy.
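That threshold difference is the kind of thing typically measured with a psychometric curve: plot the fraction of “happy” judgments against the morph level of the face (from clearly fearful to clearly happy) and find where each group crosses 50%. Here is a minimal sketch of that analysis; the numbers are invented for illustration and are not the study’s data.

```python
# Hypothetical response rates: fraction of "happy" judgments at each
# morph level (0 = clearly fearful, 1 = clearly happy). Illustrative
# numbers only -- not taken from the study.
morph_levels = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
neurotypical = [0.05, 0.15, 0.55, 0.85, 0.95, 1.00]
autistic     = [0.02, 0.05, 0.20, 0.55, 0.90, 1.00]

def threshold(levels, p_happy, criterion=0.5):
    """Linearly interpolate the morph level where judgments cross `criterion`."""
    points = list(zip(levels, p_happy))
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if y0 < criterion <= y1:
            return x0 + (criterion - y0) * (x1 - x0) / (y1 - y0)
    return None

nt = threshold(morph_levels, neurotypical)
asd = threshold(morph_levels, autistic)
print(f"neurotypical 50% threshold: {nt:.2f}")
print(f"autistic 50% threshold:     {asd:.2f}")
```

With these made-up numbers the autistic group’s threshold lands higher on the morph axis, which is the pattern the article describes: a clearer smile is needed before the face is judged happy.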
Kar then fed the data from that experiment into an AI model built to approximately mimic the layered visual processing system of the human brain. He found that his neural network could recognize facial emotions about as well as the neurotypical participants could. He then removed the network’s layers and retested it, working back to the final layer, which past research has suggested is similar to the IT cortex. In that condition, the AI no longer performed as well as the neurotypical adults and instead behaved more like the autistic adults.
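The real model is a deep convolutional network trained on images; as a purely illustrative sketch of the lesioning logic, the toy network below (random weights, stdlib only, nothing from the study) shows the mechanics of reading out behavior from a network truncated at different depths.

```python
import random

random.seed(0)

def make_layer(n_in, n_out):
    """Random weight matrix standing in for a trained layer."""
    return [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]

def apply_layer(weights, x):
    """Linear map followed by ReLU, the basic unit of the toy network."""
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in weights]

# A toy four-layer network; in the study, the deep network's final
# stage is the one likened to the primate IT cortex.
sizes = [8, 16, 16, 8, 4]
network = [make_layer(a, b) for a, b in zip(sizes, sizes[1:])]

def readout(x, depth):
    """The 'lesioning' step: run the input through only the first
    `depth` layers and read behavior from whatever remains."""
    for layer in network[:depth]:
        x = apply_layer(layer, x)
    return x

image = [random.uniform(0, 1) for _ in range(8)]
for depth in range(len(network), 0, -1):
    features = readout(image, depth)
    print(f"reading out after layer {depth}: {len(features)} features")
```

In the actual study, the comparison at each depth is against human behavioral data, and only the readout from the deepest, IT-like stage reproduces the neurotypical pattern.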
This suggests that this part of the brain, which sits near the end of the visual processing pipeline, could be central to recognizing emotions in faces. The study could lay the groundwork for better ways to diagnose autism, and Kar adds that it might also help in developing engaging media and educational tools for autistic children.
“Autistic kids sometimes rely heavily on visual cues for learning and instructions,” Kar explained. “Having an accurate model where you can feed in images, and the models tell you, ‘This will work best, and this won’t’ can be very useful for that purpose. Any visual content like movies, cartoons, and educational content can be optimized using such models to maximally communicate with, benefit and nurture autistic individuals.”