Article

Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces

Journal

INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS
Volume 6, Issue 3, Pages 367-381

Publisher

SPRINGER
DOI: 10.1007/s12369-014-0237-z

Keywords

Human-robot interaction; Facial expression; Emotion; Minimalist design; Robot design

Funding

  1. Indiana University's School of Informatics and Computing

This study explores deriving the minimal features a robotic face needs to convey information (via facial expressions) that people can perceive and understand. Recent research in computer vision has shown that a small number of moving points/lines can capture the majority (95%) of the information in human facial expressions. Here, we apply these findings to a minimalist robot face design, which was evaluated in a series of experiments with human subjects (n = 75) exploring the effect of various factors, including added neck motion and degree of expression. Facial expression identification rates were comparable to those of more complex robots. In addition, added neck motion significantly improved identification rates, reaching 100% for all expressions except Fear. The Negative Attitudes towards Robots Scale (NARS) and the Godspeed questionnaire were also administered to examine user perceptions, e.g., perceived animacy and intelligence. The project aims to answer a number of fundamental questions about robotic face design, as well as to develop inexpensive and replicable robotic faces for experimental purposes.
