Article

Deliberately prejudiced self-driving vehicles elicit the most outrage

Journal

Cognition
Volume 208

Publisher

Elsevier
DOI: 10.1016/j.cognition.2020.104555

Keywords

Moral judgment; Autonomous vehicles; Driverless policy; Moral outrage

When it comes to programming and policy-making for autonomous vehicles (AVs), consideration should be given to public moral outrage and the public's tendency toward egalitarianism, avoiding deliberately biased choices.
Should self-driving vehicles be prejudiced, e.g., deliberately harm the elderly over young children? When people make such forced choices on the vehicle's behalf, they exhibit systematic preferences (e.g., favoring young children), yet when their options are unconstrained they favor egalitarianism. So, which of these response patterns should guide AV programming and policy? We argue that this debate is missing the public reaction most likely to threaten the industry's life-saving potential: moral outrage. We find that people are more outraged by AVs that kill discriminately than indiscriminately. Crucially, they are even more outraged by an AV that deliberately kills a less preferred group (e.g., an elderly person over a child) than by one that indiscriminately kills a more preferred group (e.g., a child). Thus, at least insofar as the public is concerned, there may be more reason to depict and program AVs as egalitarian.
