HRI-Gestures: Gesture Recognition for Human-Robot Interaction

Avgi Kollakidou, Frederik Haarslev, Cagatay Odabasi, Leon Bodenhagen, Norbert Krüger

Publication: Chapter in book/report/conference proceeding › Conference contribution in proceedings › Research › peer-reviewed


Abstract

Most human communication happens through body language and gestures. Gesture recognition in human-robot interaction is an unsolved problem that limits the communication possible between humans and robots in today's applications. Gesture recognition can be treated as the same problem as action recognition, which is largely solved by deep learning; however, current publicly available datasets contain few classes relevant to human-robot interaction. A human-robot interaction gesture dataset is therefore required to address this problem. In this paper, we introduce HRI-Gestures, which includes 13,600 instances of RGB and depth image sequences and joint position files. A state-of-the-art action recognition network trained on relevant subsets of the dataset achieves upwards of 96.9% accuracy. However, as the network is designed for the large-scale NTU RGB+D dataset, it achieves subpar performance on the full HRI-Gestures dataset. Further improvements in gesture recognition are possible through tailored algorithms or an extension of the dataset.
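The abstract does not name the recognition network used. As a rough illustration of what training a classifier on the dataset's joint position files could look like, below is a minimal, hypothetical PyTorch sketch. The joint count (25, following the NTU RGB+D convention), the class count, the sequence length, and the LSTM classifier itself are all illustrative assumptions, not the authors' architecture.

# Hypothetical sketch only: the skeleton layout, class count and model
# are assumptions for illustration, not the network used in the paper.
import torch
import torch.nn as nn

NUM_JOINTS = 25    # assumed skeleton size (NTU RGB+D-style convention)
NUM_CLASSES = 20   # placeholder; the abstract does not state the class count
SEQ_LEN = 60       # assumed fixed clip length after padding/cropping

class GestureLSTM(nn.Module):
    # Minimal sequence classifier over flattened 3-D joint positions.
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=NUM_JOINTS * 3, hidden_size=128,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(128, NUM_CLASSES)

    def forward(self, x):          # x: (batch, SEQ_LEN, NUM_JOINTS * 3)
        _, (h, _) = self.lstm(x)   # h: (num_layers, batch, hidden)
        return self.head(h[-1])    # logits from the last layer's state

model = GestureLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Random tensors stand in for sequences loaded from the joint position files.
x = torch.randn(8, SEQ_LEN, NUM_JOINTS * 3)
y = torch.randint(0, NUM_CLASSES, (8,))

optimizer.zero_grad()
loss = criterion(model(x), y)  # one training step on a dummy batch
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.3f}")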
Original language: English
Title: Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 5: VISAPP
Publication date: 2022
ISBN (electronic): 978-989-758-555-5
DOI
Status: Published - 2022
