Article

Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots

Journal

IEEE INTERNET OF THINGS JOURNAL
Volume 9, Issue 5, Pages 3174-3188

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/JIOT.2021.3097592

Keywords

Robots; Robot sensing systems; Face recognition; Faces; Human-robot interaction; Electroencephalography; Internet of Things; Affective robot; brain-robot interface; emotional response; event-related potential (ERP); facial expression

Funding

  1. European Commission [CHRIS FP7-215805]
  2. U.K. Dementia Research Institute Care Research and Technology Centre (DRI-CRT)
  3. U.K. EPSRC [EP/F01869X/1]
  4. U.K. Research and Innovation Global Challenges Research Fund
  5. RN Chidakashi Technologies Pvt Ltd.


This study presents the design and validation of an Internet of Things-enabled social robot that can effectively convey emotions through its hybrid-face expression. The results demonstrate the recognition of robotic expressions by humans and the neurophysiological response to these expressions. The concept of the hybrid-face robot has been implemented and released in a commercial IoT robotic platform, showing comparable results to the original design. The study concludes that simplified hybrid-face abstraction enhances human-robot interaction by effectively conveying emotions.
We present the conceptual formulation, design, fabrication, control, and commercial translation of an Internet of Things (IoT)-enabled social robot, as mapped through validation of human emotional response to its affective interactions. The robot design centers on a humanoid hybrid face that integrates a rigid faceplate with a digital display to simplify conveyance of complex facial movements while providing the impression of 3-D depth. We map the emotions of the robot to specific facial feature parameters, characterize the recognisability of archetypical facial expressions, and introduce pupil dilation as an additional degree of freedom for emotion conveyance. Human interaction experiments demonstrate the ability to effectively convey emotion from the hybrid-robot face to humans. Conveyance is quantified by studying the neurophysiological electroencephalography (EEG) response to perceived emotional information as well as through qualitative interviews. The results demonstrate that core hybrid-face robotic expressions can be discriminated by humans (>80% recognition) and evoke face-sensitive neurophysiological event-related potentials, such as the N170 and vertex positive potentials, in EEG. The hybrid-face robot concept has been modified, implemented, and released in the commercial IoT robotic platform Miko (My Companion), an affective robot currently in use for human-robot interaction with children. We demonstrate that human EEG responses to Miko emotions are comparable to those elicited by the hybrid-face robot, validating the design modifications implemented for large-scale distribution. Finally, interviews show expression recognition rates above 90% for our commercial robot. We conclude that simplified hybrid-face abstraction conveys emotions effectively and enhances human-robot interaction.

