arxiv:2605.08198

FairHealth: An Open-Source Python Library for Trustworthy Healthcare AI in Low-Resource Settings

Published on May 5
Authors:

Abstract

FairHealth is an open-source Python library that offers a modular framework for trustworthy machine learning in healthcare, addressing fairness, privacy, explainability, and global health data needs.

AI-generated summary

We present FairHealth, an open-source Python library that provides a unified, modular framework for trustworthy machine learning in healthcare applications, with a particular focus on low-resource, low- and middle-income country (LMIC) settings such as Bangladesh. FairHealth addresses four critical gaps in existing healthcare AI toolkits: (1) the absence of integrated fairness auditing for biosignals and clinical tabular data; (2) the lack of privacy-preserving federated learning tools compatible with standard ML workflows; (3) missing explainability tools tailored for low-bandwidth clinical decision support; and (4) the absence of coverage for Global South healthcare datasets. Built from five peer-reviewed research contributions, FairHealth provides six modules covering federated learning with homomorphic encryption (fairhealth.federated), intersectional fairness metrics (fairhealth.fairness), hybrid fuzzy-SHAP explainability (fairhealth.explain), multilingual dengue triage (fairhealth.lowresource), equitable disaster aid allocation (fairhealth.equity), and public dataset loaders (fairhealth.datasets). All datasets used are publicly available without institutional data use agreements. FairHealth is installable via pip install fairhealth (PyPI: pypi.org/project/fairhealth/) and available at https://github.com/Farjana-Yesmin/fairhealth.
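The abstract does not give API details for the fairhealth.fairness module, so as a rough illustration of the kind of intersectional fairness metric such a module targets, here is a minimal, self-contained sketch in plain Python. The function name and the toy data are invented for illustration and are not the library's actual API; the metric itself is the standard demographic-parity gap, taken over intersections of protected attributes (e.g. sex × age band) rather than single attributes.

```python
def intersectional_parity_gap(preds, groups):
    """Largest difference in positive-prediction rate across
    intersectional subgroups (a demographic-parity-style gap).

    preds  : list of 0/1 model predictions
    groups : list of tuples of protected attributes, one per
             prediction, e.g. ("F", "young")
    """
    rates = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        rates[g] = sum(preds[i] for i in idx) / len(idx)
    # 0.0 means all subgroups receive positive predictions at the
    # same rate; 1.0 means one subgroup always does and another never.
    return max(rates.values()) - min(rates.values())

# Toy example: four sex x age-band subgroups, two samples each.
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
groups = [("F", "young"), ("F", "young"), ("F", "old"), ("F", "old"),
          ("M", "young"), ("M", "young"), ("M", "old"), ("M", "old")]
gap = intersectional_parity_gap(preds, groups)  # ("F","old") rate 1.0 vs ("M","young") rate 0.0 -> 1.0
```

Auditing at the intersection level matters because a model can look fair on each attribute separately (sex alone, age alone) while still disadvantaging a specific intersection; in the toy data above the overall positive rate is identical for F and M, yet the subgroup gap is maximal.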
