4.6 Article

Cooperation with autonomous machines through culture and emotion

Journal

PLOS ONE
Volume 14, Issue 11, Article e0224758

Publisher

Public Library of Science
DOI: 10.1371/journal.pone.0224758


Funding

  1. JSPS KAKENHI [JP16KK0004]
  2. US Army

Abstract

As machines that act autonomously on behalf of others (e.g., robots) become integral to society, it is critical that we understand their impact on human decision-making. Here we show that people readily engage in social categorization distinguishing humans (us) from machines (them), which leads to reduced cooperation with machines. However, we show that a simple cultural cue, the ethnicity of the machine's virtual face, mitigated this bias for participants from two distinct cultures (Japan and the United States). We further show that situational cues of affiliative intent, namely expressions of emotion, overrode expectations of coalition alliances based on social categories: when machines from a different culture expressed competitive emotion (e.g., joy following exploitation), participants showed the usual bias; in contrast, participants cooperated just as much with machines that expressed cooperative emotion (e.g., joy following cooperation) as with humans. These findings reveal a path for increasing cooperation in society through autonomous machines.

