Article

People are averse to machines making moral decisions

Journal

COGNITION
Volume 181, Issue -, Pages 21-34

Publisher

ELSEVIER
DOI: 10.1016/j.cognition.2018.08.003

Keywords

Mind perception; Morality; Moral agency; Autonomous machines; Skynet; Robots

Funding

  1. National Science Foundation SBE Postdoctoral Research Fellowship [1714298]
  2. Charles Koch Foundation
  3. Directorate for Social, Behavioral & Economic Sciences
  4. SBE Office of Multidisciplinary Activities [1714298] Funding Source: National Science Foundation

Abstract

Do people want autonomous machines making moral decisions? Nine studies suggest that the answer is 'no', in part because machines lack a complete mind. Studies 1-6 find that people are averse to machines making morally relevant driving, legal, medical, and military decisions, and that this aversion is mediated by the perception that machines can neither fully think nor feel. Studies 5-6 find that this aversion exists even when moral decisions have positive outcomes. Studies 7-9 briefly investigate three potential routes to increasing the acceptability of machine moral decision-making: limiting the machine to an advisory role (Study 7), increasing machines' perceived experience (Study 8), and increasing machines' perceived expertise (Study 9). Although some of these routes show promise, the aversion to machine moral decision-making is difficult to eliminate. This aversion may prove challenging for the integration of autonomous technology in moral domains including medicine, the law, the military, and self-driving vehicles.
