Chinese scientists are developing a methodology to identify and report a person’s emotional state based on the way they are walking, with the intention of incorporating a practical application of the research into a mobile app and a smart bracelet.

Identifying Emotion from Natural Walking [PDF] outlines a study in which 59 young people from the University of Chinese Academy of Sciences (UCAS) in Beijing were each fitted with two specially outfitted Samsung Galaxy S2 smartphones, one attached to the wrist and the other to the ankle, each recording accelerometer data at a 5 Hz sample frequency, with a Samsung Tab functioning as the Android-based recording platform.
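As a minimal sketch of what the analysis side of that pipeline might look like, the snippet below downsamples a raw accelerometer log to the paper's 5 Hz rate. The file names and the timestamp/x/y/z column layout are illustrative assumptions, not details taken from the paper:

```python
import pandas as pd

def load_and_resample(path: str, rate_hz: int = 5) -> pd.DataFrame:
    # Assumed columns: timestamp_ms, x, y, z (hypothetical log format).
    df = pd.read_csv(path)
    df["t"] = pd.to_datetime(df["timestamp_ms"], unit="ms")
    df = df.set_index("t")[["x", "y", "z"]]
    # Average all raw readings falling inside each 200 ms (5 Hz) bin.
    period_ms = int(1000 / rate_hz)
    return df.resample(f"{period_ms}ms").mean().dropna()

wrist = load_and_resample("wrist_accel.csv")   # hypothetical file names
ankle = load_and_resample("ankle_accel.csv")
```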

To establish baseline data the subjects were required to walk naturally in a rectangular area for two minutes, after which they were shown an ‘infuriating’ video for ‘emotion priming’ and then required to walk the space again for one minute.

The subjects reported a notable change in their own emotional state after viewing the videos, and examination of the accelerometer data confirmed a corresponding shift in the gait readings being recorded by the platform.
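The paper describes its own feature-extraction process; as a hedged illustration of the general idea, the sketch below reduces a walking segment to simple per-axis statistics and measures how far a ‘primed’ walk drifts from the neutral baseline. The particular statistics and the distance measure are placeholder choices, not the authors’ method:

```python
import numpy as np

def gait_features(window: np.ndarray) -> np.ndarray:
    """window: (n_samples, 3) array of x/y/z accelerometer readings."""
    stats = [window.mean(axis=0), window.std(axis=0),
             window.min(axis=0), window.max(axis=0)]
    return np.concatenate(stats)  # 12-dimensional feature vector

def drift(baseline: np.ndarray, primed: np.ndarray) -> float:
    # Euclidean distance between feature vectors: a crude proxy for
    # "how much did the gait change after emotion priming?"
    return float(np.linalg.norm(gait_features(primed) - gait_features(baseline)))
```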

Classification accuracy varied across the machine learning models tested, including Support Vector Machine (SVM), Decision Tree, Random Forest (RF) and Multilayer Perceptron (MLP), with SVM proving the most accurate.

Results from wrist and ankle accelerometer inputs, as shown in Identifying Emotion from Natural Walking, Liqing Cui, Shun Li, Wan Zhang, Zhan Zhang, Tingshao Zhu, http://arxiv.org/pdf/1508.00413v2.pdf
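To make that comparison concrete, here is a hedged sketch of a four-way bake-off using scikit-learn stand-ins for the classifiers named above; the feature matrix and labels are randomly generated placeholders rather than the study’s data:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))                # placeholder gait-feature matrix
y = rng.choice(["angry", "happy"], size=120)  # placeholder emotion labels

models = {
    "SVM": SVC(kernel="rbf"),
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "MLP": MLPClassifier(max_iter=1000),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```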

In a second-round experiment the subjects repeated the routine, but this time were shown an amusing video, providing the researchers with initial data for the experiment’s ‘anger’ and ‘happy’ classifications of the subjects’ emotional state.

The scientists concluded that the ankle is most revelatory of emotional state, since the wrist is involved in more complicated and (for the purposes of the experiment) irrelevant movements during walking.

The paper concludes that the logical extension of this work has a wide range of possible applications, such as the wireless transmission and analysis of gait-derived emotional state to health apps and other APIs (presumably including the likes of competing ‘virtual personal assistants’ Cortana, Siri and Google Now, all of which seek to anticipate the needs of end-users based on sensor and data input).
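As a purely speculative sketch of that ‘wireless transmission’ idea, the snippet below posts a classified emotion label to a health-app endpoint; the URL, the payload schema and the absence of authentication are all invented for illustration:

```python
import json
import urllib.request

def report_emotion(label: str, confidence: float) -> None:
    # Serialize the classifier's output as JSON for a hypothetical endpoint.
    payload = json.dumps({"emotion": label, "confidence": confidence}).encode()
    req = urllib.request.Request(
        "https://example-health-app.invalid/api/v1/emotion",  # invented URL
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # a real integration would need auth, retries

# report_emotion("angry", 0.87)  # would fail: the endpoint above does not exist
```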

As with similar projects that seek to classify human emotion, such as the Fraunhofer Institute’s Google Glass app, which seeks to individuate and classify facial expressions, baseline data and scientific consensus on the validity of the methodology are essential if the work is to move forward. In the field of psychiatric treatment such a definite determination of emotional state would be valuable in short- and long-term studies. Transferred to the fields of insurance, medical cover criteria and employment assessment – all of which would doubtless like to know the general state of your spirits, both in the long term and in a particular situation – it looks like another Big Data pursuit that would need careful moral and legal oversight if it were ever to leave the test-beds for the real world.