Why Do People with Autism Struggle to Read Facial Expressions?


By: Inas Essa

Recognizing other people’s facial expressions, which reveal what they want to say or how they feel, is essential to communication. Because Autism Spectrum Disorder (ASD) involves many symptoms linked to difficulty interpreting others’ messages correctly, autistic people struggle to read others’ facial expressions and to interact with them effectively. Researchers at Tohoku University have uncovered new clues about why people with ASD have this difficulty, using a Recurrent Neural Network (RNN) model that reproduces brain processes on a computer to study the neural connections involved.


Predictive Processing Theory

“Humans recognize different emotions, such as sadness and anger, by looking at facial expressions. Yet little is known about how we come to recognize different emotions based on the visual information of facial expressions,” said paper co-author Yuta Takahashi. “It is also not clear what changes occur in this process that lead people with autism spectrum disorder to struggle to read facial expressions,” Takahashi added.

The new study proposed a system-level explanation of the facial emotion recognition process, and of how it is altered in ASD, from the perspective of predictive processing theory. According to this theory, the brain constantly generates and updates a mental model of the environment to predict incoming sensory input, adapting whenever its predictions turn out to be wrong. Sensory information, such as facial expressions, is thus used to update the model and reduce prediction error.
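
To make the mechanism concrete, here is a minimal toy sketch of the precision-weighted prediction-error update at the heart of predictive processing. It is an illustration, not the study’s model; the names `belief` and `precision` and the learning rate are assumptions made for the example.

```python
import numpy as np

# Toy predictive-processing loop: an internal estimate ("belief") of a hidden
# sensory cause is nudged toward each noisy observation, with the step size
# scaled by the estimated precision (inverse noisiness) of the input.

def update_belief(belief, observation, precision, learning_rate=0.1):
    prediction_error = observation - belief
    return belief + learning_rate * precision * prediction_error

rng = np.random.default_rng(0)
true_signal = 1.0   # the hidden cause behind the noisy observations
belief = 0.0        # the model's initial prediction

for _ in range(50):
    observation = true_signal + rng.normal(scale=0.3)  # noisy sensory input
    belief = update_belief(belief, observation, precision=1.0)

print(f"final belief: {belief:.2f}")  # settles near the true signal
```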


The Experiment

To investigate how this complex mental process works, the researchers used RNNs that incorporated predictive processing theory. The networks were trained to predict the dynamic changes in facial expression videos of basic emotions, without explicit emotion labels, as a developmental learning process. They were then evaluated on how well they recognized unseen facial expressions in the test phase.
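
As a rough illustration of this training setup, the sketch below trains a small recurrent network to predict the next frame of a sequence from the frames so far, with no emotion labels anywhere in the loss. It assumes PyTorch and flattened frame vectors; the GRU architecture and the sizes are placeholders, not the paper’s network.

```python
import torch
import torch.nn as nn

# Hypothetical next-frame predictor: a GRU reads a sequence of (flattened)
# video frames and is trained to output the following frame at each step.

class NextFramePredictor(nn.Module):
    def __init__(self, frame_dim=128, hidden_dim=64):
        super().__init__()
        self.rnn = nn.GRU(frame_dim, hidden_dim, batch_first=True)
        self.readout = nn.Linear(hidden_dim, frame_dim)

    def forward(self, frames):              # frames: (batch, time, frame_dim)
        hidden, _ = self.rnn(frames)
        return self.readout(hidden)         # predicted next frame at each step

model = NextFramePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
frames = torch.randn(8, 20, 128)            # stand-in for expression videos

for _ in range(100):
    predicted = model(frames[:, :-1])        # predict frame t+1 from frames up to t
    loss = nn.functional.mse_loss(predicted, frames[:, 1:])  # prediction error
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```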

After this process, groups of emotions self-organized in the model’s higher-level neuron space, without the model knowing which emotion each facial expression video corresponded to. The model could successfully recognize unseen facial expressions by reproducing facial part movements and minimizing prediction error. The researchers then induced abnormalities in the neurons’ activities to investigate how they affect learning development and cognitive function.
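
This kind of self-organization can be pictured as follows: if the trained model’s higher-level hidden states for same-emotion videos end up near each other, unsupervised clustering of those states recovers the emotion groups without any labels. The sketch below uses scikit-learn’s KMeans on synthetic stand-in vectors; the data, dimensions, and cluster count are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in for higher-level hidden states: 3 emotions, 30 videos
# each, in a 64-dimensional latent space, with same-emotion states nearby.
rng = np.random.default_rng(1)
hidden_states = np.concatenate(
    [rng.normal(loc=center, scale=0.5, size=(30, 64)) for center in (-2.0, 0.0, 2.0)]
)

# Unsupervised clustering recovers the emotion groups without any labels.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(hidden_states)
print(labels[:30])    # the first emotion's videos share one cluster id
print(labels[30:60])  # the second emotion's videos share another
```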

In the model where excessive precision was assigned to noisy sensory details, generalization ability decreased: the formation of emotion clusters in higher-level neurons was restrained, and the model failed to identify the emotions of unseen facial expressions. This mirrors a symptom of ASD, and the results support the idea that impaired facial emotion recognition in ASD can be explained by altered predictive processing.
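
Reusing the toy update from earlier, overweighting the precision of noisy input makes the belief chase the noise rather than the underlying signal, a loose analogue of the reduced generalization described here. The numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
true_signal, noise_scale = 1.0, 0.3

def steady_state_error(precision, steps=200, learning_rate=0.1):
    """Average squared error of the belief once tracking has settled."""
    belief, errors = 0.0, []
    for _ in range(steps):
        observation = true_signal + rng.normal(scale=noise_scale)
        belief += learning_rate * precision * (observation - belief)
        errors.append((belief - true_signal) ** 2)
    return float(np.mean(errors[-50:]))

print(f"normal precision:    {steady_state_error(precision=1.0):.4f}")
print(f"excessive precision: {steady_state_error(precision=5.0):.4f}")  # typically larger
```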

“The study will help advance the development of appropriate intervention methods for people who find it difficult to identify emotions,” Takahashi concluded.
