The way people walk says a lot about their mood. Researchers at the University of North Carolina at Chapel Hill and the University of Maryland have taken this a step further, developing a machine learning method that can recognize a person's emotions from their walking style and posture.
Different walking styles and postures were gathered from videos, and a 3D pose estimation technique was then used to extract poses and identify effective features.
The features considered include shoulder posture, the distance between consecutive steps, and the distance between the hands and the neck. The angle of head tilt was used to separate sad from happy emotions, while body expansion and other overall body postures helped distinguish negative from positive emotions.
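To make these posture cues concrete, here is a minimal sketch of how such features might be computed from a single frame of estimated 3D joints. The joint names, axis convention, and exact feature definitions are illustrative assumptions, not the authors' published formulas.

```python
import numpy as np

def posture_features(joints):
    """Compute simple posture features from one frame of 3D joint positions.

    `joints` maps joint names (assumed names, for illustration) to 3D
    coordinates as numpy arrays, with the y-axis pointing up.
    """
    # Shoulder posture: how far the mid-shoulder point sits ahead of the
    # root in the horizontal (x-z) plane, a rough proxy for slouching.
    mid_shoulder = (joints["l_shoulder"] + joints["r_shoulder"]) / 2.0
    slouch = float(np.linalg.norm(mid_shoulder[[0, 2]] - joints["root"][[0, 2]]))

    # Average distance between the hands and the neck (closed vs. open posture).
    hand_neck = float(
        np.linalg.norm(joints["l_hand"] - joints["neck"])
        + np.linalg.norm(joints["r_hand"] - joints["neck"])
    ) / 2.0

    # Head-tilt angle: angle between the neck-to-head vector and vertical;
    # a downward tilt is associated with sadness, an upright head with happiness.
    head_vec = joints["head"] - joints["neck"]
    cos_tilt = head_vec[1] / (np.linalg.norm(head_vec) + 1e-9)
    tilt_deg = float(np.degrees(np.arccos(np.clip(cos_tilt, -1.0, 1.0))))

    return {"slouch": slouch, "hand_neck": hand_neck, "head_tilt_deg": tilt_deg}

# Example frame with made-up coordinates (meters).
frame = {
    "root": np.array([0.0, 1.0, 0.0]),
    "neck": np.array([0.0, 1.5, 0.0]),
    "head": np.array([0.0, 1.7, 0.05]),
    "l_shoulder": np.array([-0.2, 1.45, 0.02]),
    "r_shoulder": np.array([0.2, 1.45, 0.02]),
    "l_hand": np.array([-0.3, 0.9, 0.1]),
    "r_hand": np.array([0.3, 0.9, 0.1]),
}
feats = posture_features(frame)
print(feats)
```

A classifier would consume a vector of such features per gait rather than raw joint coordinates, which is what makes them "effective features" in the sense described above.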
Scientists associate arousal with increased movement, and the machine learning method follows the same principle: the model analyzes the magnitudes of velocity and acceleration of the hand, foot, and head movements. The resulting dataset, Emotion Walk (EWalk), consists of 1,384 gaits extracted from videos of 24 subjects recorded inside and outside the university campus.
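The movement-based (arousal) cues can be sketched as per-joint speed and acceleration magnitudes obtained by finite differences over the joint trajectories. The frame rate and joint layout below are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def movement_magnitudes(trajectory, fps=30.0):
    """Mean speed and acceleration magnitude per joint.

    `trajectory` has shape (frames, joints, 3): 3D joint positions over time.
    """
    dt = 1.0 / fps
    vel = np.diff(trajectory, axis=0) / dt   # (frames-1, joints, 3)
    acc = np.diff(vel, axis=0) / dt          # (frames-2, joints, 3)
    speed = np.linalg.norm(vel, axis=-1)     # per-frame, per-joint speed
    acc_mag = np.linalg.norm(acc, axis=-1)
    # Higher average magnitudes suggest higher arousal.
    return speed.mean(axis=0), acc_mag.mean(axis=0)

# Synthetic example: two joints, one moving at constant speed, one still.
T = 60
traj = np.zeros((T, 2, 3))
traj[:, 0, 0] = np.linspace(0.0, 2.0, T)  # joint 0 travels 2 m over 2 s
mean_speed, mean_acc = movement_magnitudes(traj, fps=30.0)
print(mean_speed, mean_acc)
```

Since the first joint moves at constant speed, its mean speed is about 1 m/s while its acceleration stays near zero; a still joint scores zero on both, which is the contrast the arousal cue relies on.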
This does not mean the method is perfectly accurate, as it depends heavily on the quality of the pose estimation and on how the postures are perceived. The team believes this study will advance future research on emotion recognition algorithms.