Journal
JOURNAL OF COMBINATORIAL OPTIMIZATION
Volume 45, Issue 3
Publisher
SPRINGER
DOI: 10.1007/s10878-023-01019-4
Keywords
Approximation algorithm; Group fairness; Submodular optimization
This article focuses on the problem of algorithmic bias in data summarization applications, where the goal is to select a representative and diverse set of data items. The proposed approach introduces fairness-aware algorithms to ensure the proportional representation of different groups. The article also presents constant-factor approximation algorithms for the problem and extends the model to include a global size constraint.
Abstract
Maximizing a submodular function has a wide range of applications in machine learning and data mining. One such application is data summarization, whose goal is to select a small set of representative and diverse data items from a large dataset. However, data items may have sensitive attributes such as race or gender. In this setting, it is important to design fairness-aware algorithms to mitigate potential algorithmic bias that may cause over- or under-representation of particular groups. Motivated by this, we propose and study the classic non-monotone submodular maximization problem subject to novel group fairness constraints. Our goal is to select a set of items that maximizes a non-monotone submodular function, while ensuring that the number of selected items from each group is proportionate to its size, to the extent specified by the decision maker. We develop the first constant-factor approximation algorithms for this problem. We also extend the basic model to incorporate an additional global size constraint on the total number of selected items.
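To make the constraint concrete: each group g receives lower and upper selection bounds proportional to its share of the dataset, and any feasible summary must keep its per-group counts inside those intervals. The sketch below illustrates this on a toy instance with a simple greedy heuristic. It is not the paper's constant-factor algorithm (plain greedy carries no guarantee for non-monotone objectives); the dataset, the coverage-style objective, and the proportionality factors are all hypothetical choices made for illustration.

```python
import random

# Hypothetical toy instance -- not drawn from the paper's experiments.
items = list(range(10))
group = {i: ("A" if i < 6 else "B") for i in items}  # group A: 6 items, B: 4
sizes = {"A": 6, "B": 4}

# Each item covers a few random elements of a ground set of size 20.
coverage = {i: set(random.sample(range(20), 5)) for i in items}

def f(S):
    """A simple non-monotone submodular function: coverage minus a size
    penalty, so adding too many items can decrease the value."""
    covered = set().union(*(coverage[i] for i in S)) if S else set()
    return len(covered) - 0.5 * len(S)

# Group fairness: the count selected from group g must lie in
# [lower[g], upper[g]], set proportionally to the group's size.
# The 0.5 / 1.0 proportionality factors stand in for the decision
# maker's choice described in the abstract.
k = 5
lower = {g: int(0.5 * k * sizes[g] / len(items)) for g in sizes}
upper = {g: int(1.0 * k * sizes[g] / len(items)) + 1 for g in sizes}

def fair_greedy(k):
    """Greedy heuristic respecting the per-group bounds: first serve
    groups still below their lower bound, then any group with slack
    under its upper bound."""
    S, count = [], {g: 0 for g in sizes}
    while len(S) < k:
        needy = {g for g in sizes if count[g] < lower[g]}
        cands = [i for i in items if i not in S and
                 (group[i] in needy if needy
                  else count[group[i]] < upper[group[i]])]
        if not cands:
            break
        best = max(cands, key=lambda i: f(S + [i]) - f(S))
        S.append(best)
        count[group[best]] += 1
    return S

S = fair_greedy(k)
```

With k = 5 over 6 + 4 items, the bounds work out to at least 1 item from each group and at most 4 from A and 3 from B, so every run returns a summary whose group composition roughly tracks the dataset's.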