Inclusive, Differentially Private Federated Learning for Clinical Data
Santhosh Parampottupadam and 9 other authors
Abstract: Federated Learning (FL) offers a promising approach for training clinical AI models without centralizing sensitive patient data. However, its real-world adoption is hindered by challenges related to privacy, resource constraints, and compliance. Existing Differential Privacy (DP) approaches often apply uniform noise, which disproportionately degrades model performance, even for institutions with strong compliance. In this work, we propose a novel compliance-aware FL framework that enhances DP by adaptively adjusting noise based on quantifiable client compliance scores. Additionally, we introduce a compliance scoring tool based on key healthcare and security standards to promote secure, inclusive, and equitable participation across diverse clinical settings. Extensive experiments on public datasets demonstrate that integrating under-resourced, less-compliant clinics with highly regulated institutions yields accuracy improvements of up to 15% over traditional FL. This work advances FL by balancing privacy, compliance, and performance, making it a viable solution for real-world clinical workflows in global healthcare.
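To make the core idea concrete, the sketch below illustrates one possible instantiation of compliance-aware noise scaling: each client's update is clipped and then perturbed with Gaussian noise whose scale decreases as the client's compliance score increases. This is a minimal illustrative example, not the authors' implementation; the function names, the [0, 1] compliance-score range, and the linear interpolation rule are assumptions.

```python
# Minimal sketch (not the paper's code): compliance-aware Gaussian noise for DP-FL.
# Assumption: each client reports a compliance score in [0, 1]; higher scores
# receive less noise. The scaling rule and all names are illustrative.
import numpy as np


def clip_update(update, clip_norm):
    """Clip a client's model update to a fixed L2 norm (standard DP step)."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))


def compliance_scaled_sigma(base_sigma, compliance, min_scale=0.5, max_scale=1.5):
    """Interpolate the noise multiplier: low-compliance clients get more noise,
    high-compliance clients get less (one way to realize 'adaptive noise')."""
    compliance = float(np.clip(compliance, 0.0, 1.0))
    return base_sigma * (max_scale - (max_scale - min_scale) * compliance)


def privatize_update(update, compliance, clip_norm=1.0, base_sigma=1.0, rng=None):
    """Clip the update, then add Gaussian noise scaled by the compliance score."""
    rng = rng or np.random.default_rng()
    clipped = clip_update(update, clip_norm)
    sigma = compliance_scaled_sigma(base_sigma, compliance)
    return clipped + rng.normal(0.0, sigma * clip_norm, size=clipped.shape)


# Example: a highly regulated institution (0.9) vs. an under-resourced clinic (0.3).
raw = np.random.default_rng(0).normal(size=10)
print(privatize_update(raw, compliance=0.9))  # less noise added
print(privatize_update(raw, compliance=0.3))  # more noise added
```

In a full framework, the noise scale would also need to be tied to a per-client privacy accounting scheme so that weaker noise for high-compliance clients still satisfies their stated privacy budget; the sketch omits that bookkeeping.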
Submission history
From: Santhosh Parampottupadam
[v1] Wed, 28 May 2025 08:36:21 UTC (774 KB)
[v2] Thu, 5 Jun 2025 09:01:10 UTC (774 KB)