
Emotive response to a hybrid-face robot and translation to consumer social robots

Wairagkar, Maitreyee; Lima, Maria R.; Bazo, Daniel; Craig, Richard; Weissbart, Hugo; Etoundi, Appolinaire C.; Reichenbach, Tobias; Iyengar, Prashant; Vaswani, Sneh; James, Christopher; Barnaghi, Payam; Melhuish, Chris; Vaidyanathan, Ravi



Authors

Maitreyee Wairagkar

Maria R. Lima

Daniel Bazo

Richard Craig

Hugo Weissbart

Tobias Reichenbach

Prashant Iyengar

Sneh Vaswani

Christopher James

Payam Barnaghi

Chris Melhuish Chris.Melhuish@uwe.ac.uk
Professor of Robotics & Autonomous Systems

Ravi Vaidyanathan



Abstract

We present the conceptual formulation, design, fabrication, control, and commercial translation of an Internet of Things (IoT)-enabled social robot, validated through measurement of human emotional response to its affective interactions. The robot design centers on a humanoid hybrid face that integrates a rigid faceplate with a digital display, simplifying the conveyance of complex facial movements while providing the impression of 3-D depth. We map the robot's emotions to specific facial feature parameters, characterize the recognizability of archetypical facial expressions, and introduce pupil dilation as an additional degree of freedom for emotion conveyance. Human interaction experiments demonstrate that emotion can be effectively conveyed from the hybrid robot face to humans. Conveyance is quantified by studying neurophysiological electroencephalography (EEG) responses to the perceived emotional information as well as through qualitative interviews. The results demonstrate that core hybrid-face robotic expressions can be discriminated by humans (80%+ recognition) and evoke face-sensitive neurophysiological event-related potentials in the EEG, such as the N170 and vertex positive potentials. The hybrid-face robot concept has been modified, implemented, and released in the commercial IoT robotic platform Miko ("My Companion"), an affective robot currently in use for human–robot interaction with children. We demonstrate that human EEG responses to Miko's emotions are comparable to those elicited by the hybrid-face robot, validating the design modifications implemented for large-scale distribution. Finally, interviews show expression recognition rates above 90% for our commercial robot. We conclude that a simplified hybrid-face abstraction conveys emotions effectively and enhances human–robot interaction.
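The EEG validation summarized above rests on event-related potentials such as the N170, which are recovered by averaging many stimulus-locked EEG epochs so that trial-to-trial noise cancels while the time-locked response remains. Below is a minimal, self-contained sketch of that averaging step on synthetic data; the function name, sampling rate, and simulated signal are illustrative assumptions and do not reproduce the authors' actual analysis pipeline.

```python
import numpy as np

def erp_average(eeg, onsets, sfreq, tmin=-0.1, tmax=0.4):
    """Average stimulus-locked epochs of a single-channel EEG signal.

    eeg    -- 1-D array of samples for one channel
    onsets -- sample indices of stimulus onsets
    sfreq  -- sampling frequency in Hz
    Returns the baseline-corrected average epoch (the ERP estimate).
    """
    pre = int(-tmin * sfreq)   # samples before stimulus onset
    post = int(tmax * sfreq)   # samples after stimulus onset
    epochs = []
    for s in onsets:
        if s - pre < 0 or s + post > len(eeg):
            continue  # skip epochs that run off the recording
        epoch = eeg[s - pre:s + post].astype(float)
        epoch -= epoch[:pre].mean()  # baseline-correct on pre-stimulus window
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

# Synthetic demo: a negative deflection ~170 ms after each stimulus,
# buried in noise, is recovered by averaging across trials.
rng = np.random.default_rng(0)
sfreq = 250
eeg = rng.normal(0.0, 1.0, 60 * sfreq)          # 60 s of noise
onsets = np.arange(2 * sfreq, 58 * sfreq, sfreq)  # one stimulus per second
lat = int(0.17 * sfreq)
for s in onsets:
    eeg[s + lat:s + lat + 10] -= 3.0            # simulated N170-like dip
erp = erp_average(eeg, onsets, sfreq)
peak_ms = (np.argmin(erp) - int(0.1 * sfreq)) / sfreq * 1000
```

In practice a full ERP analysis also involves filtering, artifact rejection, and averaging across participants, but the core of the face-sensitive N170/VPP evidence reported in the abstract is this kind of time-locked average.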

Journal Article Type Article
Acceptance Date Jul 15, 2021
Online Publication Date Jul 15, 2021
Publication Date Mar 1, 2022
Deposit Date Feb 7, 2025
Publicly Available Date Feb 12, 2025
Journal IEEE Internet of Things Journal
Print ISSN 2327-4662
Publisher Institute of Electrical and Electronics Engineers
Peer Reviewed Peer Reviewed
Volume 9
Issue 5
Pages 3174-3188
DOI https://doi.org/10.1109/jiot.2021.3097592
Public URL https://uwe-repository.worktribe.com/output/13734626
