Translating knowledge from primate neuroscience and artificial intelligence to develop non-invasive behavioral modulation strategies during emotion recognition in autistic adults
Project Overview
Recognizing emotions and intentions from facial expressions is crucial for human communication and social interaction. While most adults can do this with ease, individuals with Autism Spectrum Disorder (ASD) often face challenges in this area. Previous research suggests that differences in how the brain processes sensory information may contribute to these behavioral differences in autistic individuals. To better understand this phenomenon, we propose three innovative research directions. First, we will study the macaque monkey’s brain in detail to understand how it recognizes facial expressions (Aim 1). Next, we will use this information to improve computer models that mimic primate vision, making them more similar to the brains of both neurotypical and autistic individuals (Aim 2). Finally, we will use the best of these computer models to create images that reduce the behavioral differences between neurotypical and autistic adults during facial expression recognition (Aim 3). In summary, this research aims to deepen our understanding of ASD and the neural mechanisms underlying differences in facial emotion recognition. By combining neuroscience knowledge with computational models and artificial neural networks, we hope to uncover new insights that could improve the lives of those living with ASD.
Principal Investigator
Kohitij Kar, York University
Partners and Donors
Azrieli Foundation