Can AI figure out your emotion just by looking at your walk?
Friday, December 6, 2019, 11:00 am-12:00 pm


Emotions play a large role in our lives, defining our experiences and shaping how we view the world and interact with other humans. Perceiving the emotions of social partners helps us understand their behaviors and decide our actions towards them. Human emotion recognition using intelligent systems is an important socio-behavioral task that arises in various applications, including behavior prediction, surveillance, robotics, and affective computing. Current research in perceiving human emotion predominantly uses facial cues, speech, or physiological signals such as heart rate and respiration rate. Research in psychology has shown that body expressions are also crucial in emotion expression and perception, and that faces alone are not a diagnostic predictor of valence.

In this talk, I will present a perceptual emotion identification approach for videos of walking individuals. We classify walking individuals from videos into happy, sad, angry, and neutral emotion categories. These categories represent emotional states that persist for an extended period and are more abundant during walking.
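As a rough illustration of the kind of pipeline the abstract describes (the actual model, features, and values below are not from the talk; the feature names, centroids, and numbers are invented for this sketch), one could imagine classifying a gait sequence into the four emotion categories by extracting simple posture/speed features and matching them to per-emotion prototypes:

```python
# Hypothetical sketch only: nearest-centroid emotion classification
# from two hand-crafted gait features. Not the speaker's method.

# Invented feature prototypes: (stride_speed, head_tilt) per emotion.
CENTROIDS = {
    "happy":   (1.4, 0.1),
    "sad":     (0.8, 0.6),
    "angry":   (1.6, 0.3),
    "neutral": (1.1, 0.2),
}

def extract_features(poses):
    """poses: time-ordered (forward_position, head_tilt) samples
    extracted from a walking video by a pose tracker (assumed)."""
    xs = [p[0] for p in poses]
    # Average forward speed per frame.
    speed = (xs[-1] - xs[0]) / max(len(xs) - 1, 1)
    # Mean downward head tilt over the sequence.
    tilt = sum(p[1] for p in poses) / len(poses)
    return speed, tilt

def classify(poses):
    """Return the emotion whose prototype is closest in feature space."""
    s, t = extract_features(poses)
    return min(
        CENTROIDS,
        key=lambda e: (CENTROIDS[e][0] - s) ** 2 + (CENTROIDS[e][1] - t) ** 2,
    )
```

A slow, head-down walk would land nearest the "sad" prototype, while a brisk, upright walk would land nearest "happy"; a real system would of course learn such features and decision boundaries from labeled gait data rather than hard-code them.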

Media: https://www.fastcompany.com/90375885/a-new-ai-can-tell-how-you-feel-just-by-watching-you-walk-down-the-street


Aniket Bera is an Assistant Research Professor in the Department of Computer Science and UMIACS. His core research interests are in computer graphics, AI, social robotics, data-driven crowd simulation, and cognitive modeling (knowledge, reasoning, and planning for intelligent characters). Bera works with the Geometric Algorithms for Modeling, Motion, and Animation (GAMMA) group.

Bera’s current research focuses on the social perception of intelligent agents and robots. It combines methods from social robotics, physically based simulation, and machine learning to develop real-time computational models that classify social behaviors and to validate their performance.

Bera comes to UMD from the University of North Carolina at Chapel Hill (UNC-Chapel Hill), where he was a Research Assistant Professor. 

This talk is organized by Ramani Duraiswami.