Article

Fair patient model: Mitigating bias in the patient representation learned from the electronic health records

Journal

JOURNAL OF BIOMEDICAL INFORMATICS
Volume 148

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jbi.2023.104544

This study proposes a novel method to pre-train fair and unbiased patient representations from EHR data using a weighted loss function. The experimental results show that the method outperforms the baseline models on fairness metrics while achieving comparable predictive performance. The study also reveals that the method captures more information from clinical features than the baselines.
Objective: To pre-train fair and unbiased patient representations from Electronic Health Records (EHRs) using a novel weighted loss function that reduces bias and improves fairness in deep representation learning models.

Methods: We defined a new weighted loss function within the deep representation learning model to balance the importance of different patient groups and features. We applied the proposed model, called the Fair Patient Model (FPM), to a sample of 34,739 patients from the MIMIC-III dataset and learned patient representations for four clinical outcome prediction tasks.

Results: FPM outperformed the baseline models on three fairness metrics: demographic parity, equality of opportunity difference, and equalized odds ratio. FPM also achieved predictive performance comparable to the baselines, with an average accuracy of 0.7912. Feature analysis revealed that FPM captured more information from clinical features than the baselines.

Conclusion: FPM is a novel method for pre-training fair and unbiased patient representations from EHR data using a weighted loss function. The learned representations can be used for various downstream tasks in healthcare and can be extended to other domains where fairness is important.
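The abstract does not spell out the exact form of the weighted loss, so the sketch below is only one plausible reading: an autoencoder-style patient representation model whose per-patient reconstruction loss is reweighted by the inverse frequency of each patient's demographic group, so that under-represented groups contribute proportionally more to training. All names and architectural choices (PatientAutoencoder, group_weights, the hidden size, and so on) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (PyTorch) of a group-reweighted reconstruction loss for
# pre-training patient representations. The exact architecture and loss of
# the Fair Patient Model are not given in this abstract; everything below
# is an illustrative assumption.
import torch
import torch.nn as nn


class PatientAutoencoder(nn.Module):
    """Toy encoder/decoder over a fixed-length EHR feature vector."""

    def __init__(self, n_features: int, n_hidden: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, n_hidden), nn.ReLU())
        self.decoder = nn.Linear(n_hidden, n_features)

    def forward(self, x: torch.Tensor):
        z = self.encoder(x)           # learned patient representation
        return z, self.decoder(z)     # representation + reconstruction


def group_weights(group_ids: torch.Tensor) -> torch.Tensor:
    """Inverse-frequency weight per patient: rarer demographic groups get
    larger weights so the loss is not dominated by the majority group."""
    _, inverse, counts = torch.unique(
        group_ids, return_inverse=True, return_counts=True
    )
    w = (1.0 / counts.float())[inverse]
    return w * (len(group_ids) / w.sum())  # normalize to mean weight 1


def weighted_reconstruction_loss(x, x_hat, group_ids):
    per_patient = ((x - x_hat) ** 2).mean(dim=1)      # MSE per patient
    return (group_weights(group_ids) * per_patient).mean()


# Usage on random data standing in for a MIMIC-III feature matrix.
x = torch.randn(64, 200)                 # 64 patients, 200 EHR features
groups = torch.randint(0, 4, (64,))      # e.g., 4 demographic groups
model = PatientAutoencoder(n_features=200)
_, x_hat = model(x)
loss = weighted_reconstruction_loss(x, x_hat, groups)
loss.backward()
```

The three fairness metrics named in the Results have standard definitions. As a point of reference (again, not code from the paper): demographic parity compares positive-prediction rates across groups, equality of opportunity compares true-positive rates, and equalized odds additionally compares false-positive rates.

```python
import numpy as np


def demographic_parity_difference(y_pred, group):
    """Largest gap in positive-prediction rate between any two groups."""
    rates = [y_pred[group == g].mean() for g in np.unique(group)]
    return max(rates) - min(rates)


def equal_opportunity_difference(y_true, y_pred, group):
    """Largest gap in true-positive rate (recall) between any two groups.
    Equalized odds additionally requires the analogous false-positive-rate
    gap to be small."""
    tprs = [y_pred[(group == g) & (y_true == 1)].mean() for g in np.unique(group)]
    return max(tprs) - min(tprs)
```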
