Pauline Trung
Head and shoulders: Automatic error detection in human-robot interaction
Trung, Pauline; Giuliani, Manuel; Miksch, Michael; Stollnberger, Gerald; Stadler, Susanne; Mirnig, Nicole; Tscheligi, Manfred
Authors
Manuel Giuliani Manuel.Giuliani@uwe.ac.uk
Co-Director, Bristol Robotics Laboratory
Michael Miksch
Gerald Stollnberger
Susanne Stadler
Nicole Mirnig
Manfred Tscheligi
Abstract
We describe a novel method for the automatic detection of errors in human-robot interactions. Our approach is to detect errors based on the classification of head and shoulder movements of humans who are interacting with erroneous robots. We conducted a user study in which participants interacted with a robot that we programmed to make two types of errors: social norm violations and technical failures. During the interaction, we recorded the behavior of the participants with a Kinect v1 RGB-D camera. Overall, we recorded a data corpus of 237,998 frames at 25 frames per second; 83.48% of the frames showed no error situation and 16.52% showed an error situation. Furthermore, we computed six different feature sets to represent the movements of the participants and temporal aspects of their movements. Using this data, we trained a rule learner, a Naive Bayes classifier, and a k-nearest neighbor classifier, and evaluated the classifiers with 10-fold cross validation and leave-one-out cross validation. The results of this evaluation suggest the following: (1) The detection of an error situation works well when the robot has seen the human before; (2) The rule learner and the k-nearest neighbor classifier work well for automated error detection when the robot is interacting with a known human; (3) For unknown humans, the Naive Bayes classifier performed best; (4) The classification of social norm violations performs the worst; (5) There was no large performance difference between using the original data and normalized feature sets that represent the relative position of the participants.
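To make the evaluation protocol in the abstract concrete, the sketch below illustrates how per-frame head/shoulder feature vectors with a binary error label could be classified and scored with 10-fold and leave-one-participant-out cross validation. This is a minimal, assumption-laden illustration using scikit-learn, not the authors' implementation: the feature values, participant grouping, class balance, and classifier settings are placeholders, and the rule learner used in the paper has no direct scikit-learn equivalent, so only the Naive Bayes and k-nearest neighbor classifiers are shown.

```python
# Hypothetical sketch of the evaluation protocol described in the abstract.
# Data, feature names, and participant ids are placeholders, not the study corpus.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import StratifiedKFold, LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

# Placeholder corpus: 2,000 frames, 6 features per frame (e.g. head and
# shoulder joint coordinates), a binary error label, and a participant id.
X = rng.normal(size=(2000, 6))
y = rng.choice([0, 1], size=2000, p=[0.83, 0.17])   # roughly 17% error frames, as in the corpus
groups = rng.integers(0, 10, size=2000)             # hypothetical participant ids

classifiers = {
    "naive_bayes": GaussianNB(),
    "knn": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    # "Known human" setting: frames from the same participant can occur in
    # both training and test folds of the 10-fold cross validation.
    kfold_acc = cross_val_score(
        clf, X, y, cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    )
    # "Unknown human" setting: hold out all frames of one participant at a time.
    logo_acc = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
    print(f"{name}: 10-fold acc = {kfold_acc.mean():.3f}, "
          f"leave-one-participant-out acc = {logo_acc.mean():.3f}")
```

In a sketch like this, the leave-one-group-out split is what models the "unknown human" condition reported in the abstract, since the classifier never sees frames from the held-out participant during training.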
Field | Value |
---|---|
Presentation Conference Type | Conference Paper (unpublished) |
Conference Name | 19th ACM International Conference on Multimodal Interaction |
Start Date | Nov 13, 2017 |
End Date | Nov 17, 2017 |
Acceptance Date | Nov 14, 2017 |
Publication Date | Nov 14, 2017 |
Deposit Date | Nov 8, 2017 |
Journal | Proceedings of International Conference on Multimodal Interaction |
Peer Reviewed | Peer Reviewed |
ISBN | 9781450355438 |
Keywords | human-robot interaction, error detection, error situation, human activity recognition, RGB-D camera, faulty robot |
Public URL | https://uwe-repository.worktribe.com/output/878511 |
Related Public URLs | https://icmi.acm.org/2017/ |
Additional Information | Title of Conference or Conference Proceedings: 19th ACM International Conference on Multimodal Interaction |
Contract Date | Nov 8, 2017 |
You might also like
An RGB-D based social behavior interpretation system for a humanoid social robot
(2014)
Presentation / Conference Contribution
Ghost-in-the-Machine reveals human social signals for human-robot interaction
(2015)
Journal Article
Multi-modality gesture detection and recognition with un-supervision, randomization and discrimination
(2015)
Presentation / Conference Contribution
Action recognition using ensemble weighted multi-instance learning
(2014)
Presentation / Conference Contribution
Designing and evaluating a social gaze-control system for a humanoid robot
(2014)
Journal Article