
AI identifies human emotion based on walking style


Midnightsun


Researchers are working on a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles

Image: person walking in the woods
“Forest” by Simon Lehmann (Pixabay)

Researchers at the University of North Carolina at Chapel Hill and the University of Maryland at College Park are working on a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles.

Exploiting gait features to classify emotional state

From RGB videos of an individual walking, the team extracted the person's walking gait as a series of 3D poses. The aim was to exploit these gait features to classify the person's emotional state into one of four emotions: happy, sad, angry, or neutral. The researchers' perceived emotion recognition approach uses deep features learned via a long short-term memory (LSTM) network trained on labeled emotion datasets.
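To make this step concrete, here is a minimal sketch (not the authors' released code) of an LSTM that maps a sequence of 3D poses to a learned deep feature and a four-way emotion prediction. The joint count, sequence length, and hidden size below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: an LSTM over gait sequences, where each time step is a
# flattened 3D pose (x, y, z per joint) and the targets are the four emotion
# classes (happy, sad, angry, neutral).
import torch
import torch.nn as nn

class GaitEmotionLSTM(nn.Module):
    def __init__(self, num_joints=16, hidden_size=128, num_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(input_size=num_joints * 3,
                            hidden_size=hidden_size,
                            batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, pose_sequence):
        # pose_sequence: (batch, frames, num_joints * 3)
        _, (last_hidden, _) = self.lstm(pose_sequence)
        # The final hidden state serves as the learned "deep feature" of the gait.
        deep_feature = last_hidden[-1]           # (batch, hidden_size)
        return self.classifier(deep_feature), deep_feature

# Example: a batch of 8 gaits, 75 frames each, 16 joints per frame.
model = GaitEmotionLSTM()
logits, deep_features = model(torch.randn(8, 75, 16 * 3))
```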

A representation of the novel algorithm to identify the perceived emotions of individuals based on their walking styles. Given an RGB video of an individual walking (top), the researchers extracted the person's walking gait as a series of 3D poses (bottom). They used a combination of deep features learned via an LSTM and affective features computed from posture and movement cues, classified with a Random Forest Classifier into basic emotions (e.g., happy, sad). Credit: Randhavane et al., Fair Use.

Moreover, the team combined these deep features with affective features computed from the gaits using posture and movement cues. The combined features are classified with a Random Forest Classifier (an ensemble of decision trees). The team showed that its mapping between this combined feature space and the perceived emotional state identifies the perceived emotions with 80.07% accuracy. In addition to classifying discrete emotion categories, the algorithm also predicts the values of perceived valence and arousal from gaits.
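The combination step could look roughly like the sketch below, assuming the LSTM deep features from above plus a set of hand-computed posture and movement (affective) features. The feature dimensions, and the use of a Random Forest regressor for valence and arousal, are assumptions made for illustration rather than the paper's exact setup.

```python
# Hedged sketch of the combined-feature stage; all array shapes and feature
# counts here are illustrative placeholders, not the authors' definitions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)
n_gaits = 248
deep_features = rng.normal(size=(n_gaits, 128))      # learned via the LSTM
affective_features = rng.normal(size=(n_gaits, 29))   # posture/movement cues
labels = rng.integers(0, 4, size=n_gaits)             # happy/sad/angry/neutral
valence_arousal = rng.uniform(-1, 1, size=(n_gaits, 2))

# Concatenate both feature sets into one combined feature space.
combined = np.hstack([deep_features, affective_features])

# Discrete emotion categories via a Random Forest Classifier.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(combined, labels)

# Continuous perceived valence and arousal via a Random Forest regressor
# (one plausible way to realize the prediction described in the article).
reg = RandomForestRegressor(n_estimators=100, random_state=0)
reg.fit(combined, valence_arousal)
```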

A visualization of the motion-captured gaits of four individuals with their classified emotion labels. Videos of 248 motion-captured gaits were shown to participants in a web-based user study to generate the labels, which the researchers used for training and validation. Credit: Randhavane et al., Fair Use.