3.8 Proceedings Paper

Did You Do Your Homework? Raising Awareness on Software Fairness and Discrimination

Publisher

IEEE
DOI: 10.1109/ASE51524.2021.9678568

Keywords

software fairness; discrimination; classification

Funding

  1. European Research Council (ERC) [741278]

Abstract

Machine Learning is a vital part of much modern-day decision-making software. At the same time, it has been shown to exhibit bias, which can cause unjust treatment of individuals and population groups. One method to achieve fairness in machine learning software is to provide individuals with the same degree of benefit, regardless of sensitive attributes (e.g., students receive the same grade, independent of their sex or race). However, there can be other attributes that one might want to discriminate against (e.g., students who did their homework should receive higher grades). We call such attributes anti-protected attributes. When reducing the bias of machine learning software, one risks losing the discriminatory behaviour on anti-protected attributes. To combat this, we use grid search to show that machine learning software can be debiased (e.g., reducing gender bias) while also improving its ability to discriminate against anti-protected attributes.
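The abstract describes the approach only at a high level. As a minimal, hypothetical sketch of the idea (not the authors' code), the snippet below uses Fairlearn's GridSearch reduction with a DemographicParity constraint on synthetic data; the attribute names ("sex" as the protected attribute, "homework" as the anti-protected one) and all data are illustrative assumptions.

```python
# Illustrative sketch (not the paper's implementation): debias a classifier
# with a grid search over fairness/accuracy trade-offs, then check that an
# anti-protected attribute still drives predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from fairlearn.reductions import GridSearch, DemographicParity
from fairlearn.metrics import demographic_parity_difference

rng = np.random.default_rng(0)
n = 2000

# Synthetic data: "sex" is the protected attribute, "homework" is the
# anti-protected attribute that *should* influence the outcome.
sex = rng.integers(0, 2, n)
homework = rng.integers(0, 2, n)
noise = rng.normal(0.0, 1.0, n)
# Biased labels: depend on homework (legitimately) and on sex (bias).
y = ((1.5 * homework + 0.8 * sex + noise) > 1.0).astype(int)
X = np.column_stack([sex, homework])

# Sweep a grid of models trading off accuracy against demographic parity.
sweep = GridSearch(
    LogisticRegression(),
    constraints=DemographicParity(),
    grid_size=20,
)
sweep.fit(X, y, sensitive_features=sex)
y_pred = sweep.predict(X)

# Bias w.r.t. the protected attribute should shrink ...
print("demographic parity gap (sex):",
      demographic_parity_difference(y, y_pred, sensitive_features=sex))
# ... while the anti-protected attribute should still separate outcomes.
print("P(pred=1 | homework=1) - P(pred=1 | homework=0):",
      y_pred[homework == 1].mean() - y_pred[homework == 0].mean())
```

On this synthetic setup one would expect the demographic-parity gap on sex to shrink after the sweep while the homework signal continues to separate predicted outcomes; whether discrimination on the anti-protected attribute also improves, as the paper reports, depends on the data and the chosen grid.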
