Article

Epistemic injustice and data science technologies

Journal

SYNTHESE
Volume 200, Issue 2

Publisher

SPRINGER
DOI: 10.1007/s11229-022-03631-z

Keywords

Data science; Epistemic injustice; Epistemic opacity; Artificial intelligence; Big data

Funding

  1. NSA Science of Security initiative contract [H98230-18-D-0009]

Abstract

Technologies that deploy data science methods are liable to result in epistemic harms involving the diminution of individuals with respect to their standing as knowers or their credibility as sources of testimony. Not all harms of this kind are unjust, but when they are, we ought to try to prevent or correct them. Epistemically unjust harms will typically intersect with other, more familiar and well-studied kinds of harm that result from the design, development, and use of data science technologies. However, we argue that epistemic injustices can be distinguished conceptually from these more familiar kinds of harm, and that epistemic harms are morally relevant even in cases where those who suffer them are unharmed in other ways. Via a series of examples from the criminal justice system, workplace hierarchies, and educational contexts, we explain the kinds of epistemic injustice that can result from common uses of data science technologies.
