Emotion Recognition Through Body Language Using RGB-D Sensor

Lilita Kiforenko, Dirk Kraft

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


This paper presents results on automatic non-acted human emotion recognition using full standing-body movements and postures. The focus of this paper is to show that it is possible to classify emotions using a consumer depth sensor in an everyday scenario. The features for classification are body joint rotation angles and meta-features, which are fed into a Support Vector Machine classifier. The work of Gaber-Barron and Si (2012) serves as inspiration, and many of their proposed meta-features are reimplemented or modified. In this work we try to identify "basic" human emotions that are triggered by various visual stimuli. We present an emotion dataset recorded using the Microsoft Kinect for Windows sensor, with body joint rotation angles extracted using the Microsoft Kinect Software Development Kit 1.6. The classified emotions are curiosity, confusion, joy, boredom and disgust. We show that real human emotions can be classified using body movements and postures with a classification accuracy of 55.62%.
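The pipeline the abstract describes (joint rotation angles plus meta-features fed into an SVM) could be sketched as follows. This is a minimal illustration using scikit-learn, not the authors' implementation: the feature matrix, labels, and hyperparameters here are hypothetical placeholders standing in for the Kinect-derived joint angles and the recorded emotion dataset.

```python
# Hypothetical sketch: SVM classification of emotions from body-joint
# rotation-angle features, in the spirit of the paper's pipeline.
# Data below is random placeholder data, NOT the paper's dataset.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

EMOTIONS = ["curiosity", "confusion", "joy", "boredom", "disgust"]

rng = np.random.default_rng(0)
# Placeholder: 200 samples, each a vector of 20 joint rotation angles (radians).
X = rng.uniform(-np.pi, np.pi, size=(200, 20))
y = rng.integers(0, len(EMOTIONS), size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Standardize features, then fit an RBF-kernel Support Vector Machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)

acc = clf.score(X_te, y_te)
print(f"test accuracy: {acc:.2%}")
```

With real (non-random) labels and the paper's meta-features, classification accuracy would be measured the same way; the 55.62% reported above refers to the authors' actual dataset and feature set.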
Original language: English
Title of host publication: Proceedings of the 11th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications
Editors: Nadia Magnenat-Thalmann, Paul Richard, Lars Linsen, Alexandru Telea, Sebastiano Battiato, Francisco Imai, José Braz
Publisher: SCITEPRESS Digital Library
Publication date: 2016
ISBN (Electronic): 978-989-758-175-5
Publication status: Published - 2016
Event: 11th International Conference on Computer Vision Theory and Applications - Rome, Italy
Duration: 27 Feb 2016 – 29 Feb 2016


Conference: 11th International Conference on Computer Vision Theory and Applications
