Jasmine Lopez, L&S Social Sciences
Guiding Visual Attention during Contextual Emotion Tracking in Autism
This project aims to enhance emotion perception training for individuals with autistic traits by developing a guided-gaze assistance program built on eye-tracking data from the top 10% of performers in the Inferential Emotion Tracking (IET) task (Chen & Whitney, 2019). The IET task requires observers to infer emotions from contextual cues alone, and it has been shown to reveal deficits in individuals with Autism Spectrum Disorder (ASD) (Ortega, Chen, & Whitney, 2022). Existing autism interventions for emotion perception often focus solely on facial expressions, neglecting the complexity of real-life social interactions. By highlighting the regions of a scene where the top 10% of performers looked during the IET task, the guided-gaze program will help participants attend to contextual social cues beyond facial expressions, such as body language. Finally, we will measure how much performance improvement with the guided-gaze program differs between participants with low and high Autism Quotient scores. These findings will contribute valuable insights into the role of context in emotion perception and inform the development of future technology-based interventions for individuals with ASD.