Angela Lee
L&S Social Sciences

EQVision: Affective Tracking of Multiple Characters in Context

This research aims to develop a computer vision model that continuously tracks the emotions of multiple characters in video, taking into account facial expressions, body language, and contextual information in real time. To narrow the gap between current models and human emotion perception, the project will explore image segmentation techniques, such as superpixel feature pooling, which groups image regions with similar features (Ren et al., 2024), so the model can learn to identify and differentiate between characters. The model will also incorporate environmental cues, since research shows that context plays a significant role in emotion perception and can support accurate inferences even when the central character is no longer in frame (Chen and Whitney, 2019). To track emotions continuously, a time encoder will enable the model to learn temporal connections between frames. By developing a model for continuous, context-integrated emotion analysis, my goal is to bring computer vision one step closer to capturing the complexities of human behavior and emotion perception.
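To make the proposed pipeline concrete, the sketch below shows one possible way to combine per-character features, scene-context features, and a time encoder in PyTorch. It is an illustration only, not the project's implementation: the ResNet backbone (standing in for segmentation-derived character features), the GRU time encoder, and the valence/arousal output head are all assumptions I have chosen for the example.

```python
# Minimal sketch (illustrative only, not the author's model) of a
# context-integrated, per-character emotion tracker.
import torch
import torch.nn as nn
import torchvision.models as models


class EmotionTracker(nn.Module):
    def __init__(self, feat_dim=512, hidden_dim=256, out_dim=2):
        super().__init__()
        # Shared CNN backbone for both character crops and the full scene.
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()          # expose 512-d features
        self.backbone = backbone
        # Fuse character and context features for each frame.
        self.fuse = nn.Linear(2 * feat_dim, hidden_dim)
        # Time encoder: a GRU that links fused features across frames.
        self.time_encoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Continuous emotion output, e.g. valence and arousal per frame.
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, char_frames, context_frames):
        # char_frames, context_frames: (batch, time, 3, H, W)
        b, t = char_frames.shape[:2]
        chars = self.backbone(char_frames.flatten(0, 1)).view(b, t, -1)
        ctx = self.backbone(context_frames.flatten(0, 1)).view(b, t, -1)
        fused = torch.relu(self.fuse(torch.cat([chars, ctx], dim=-1)))
        temporal, _ = self.time_encoder(fused)
        return self.head(temporal)           # (batch, time, out_dim)


if __name__ == "__main__":
    model = EmotionTracker()
    char = torch.randn(1, 8, 3, 224, 224)    # one character over 8 frames
    ctx = torch.randn(1, 8, 3, 224, 224)     # matching scene context
    print(model(char, ctx).shape)            # torch.Size([1, 8, 2])
```

Because the context stream is encoded separately from the character crop, the time encoder can still carry an emotion estimate forward from scene information alone when the character leaves the frame, which mirrors the context effect the project draws on.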

Message To Sponsor

Thank you so much for your support and for making SURF possible. I am so excited to be working on my own independent research project, and I am incredibly grateful for the opportunity to pursue research under the guidance of faculty. Thank you for making this experience a reality and for helping me pursue my goal of working in academia, doing research that helps others and makes a positive impact.
Major: Psychology, Computer Science
Mentor: David Whitney
Sponsor: Leadership