COVID-19 Imaging Data Privacy by Federated Learning Design: A Theoretical Framework

13 Oct 2020 · Anwaar Ulhaq, Oliver Burmeister

To address COVID-19 healthcare challenges, we need frequent sharing of health data, knowledge and resources on a global scale. In this digital age, however, data privacy is a major concern, requiring privacy assurance to be securely embedded into the design of every technological solution that uses health data. In this paper, we introduce the differential privacy by design (dPbD) framework and discuss its embedding into federated machine learning systems. To limit the scope of the paper, we focus on the problem of COVID-19 imaging data privacy for disease diagnosis with computer vision and deep learning approaches. We discuss the evaluation of the proposed federated learning system design and show how the dPbD framework can enhance data privacy in federated learning systems with scalability and robustness. We argue that scalable, differentially private federated learning design is a promising solution for building the secure, private and collaborative machine learning models required to combat the COVID-19 challenge.
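The abstract describes embedding differential privacy into federated learning, but gives no implementation details. The following is a minimal illustrative sketch, not the paper's method: it assumes hospitals send model-weight updates to a central server, which clips each update and adds Gaussian noise before averaging (a DP-SGD-style mechanism). The function names, the `noise_multiplier` and `clip_norm` parameters, and the use of raw NumPy arrays in place of a real imaging model are all hypothetical simplifications.

```python
import numpy as np


def clip_update(update, clip_norm):
    """Clip a client's update to bound its L2 sensitivity (hypothetical helper)."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))


def dp_federated_round(global_weights, client_updates,
                       clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One round of federated averaging with clipped, Gaussian-noised updates.

    This is a sketch of a generic differentially private aggregation step,
    not the dPbD framework from the paper.
    """
    rng = rng or np.random.default_rng()
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    mean_update = np.mean(clipped, axis=0)
    # Noise scale calibrated to the clipping norm and the number of clients.
    sigma = noise_multiplier * clip_norm / len(client_updates)
    noise = rng.normal(0.0, sigma, size=mean_update.shape)
    return global_weights + mean_update + noise


# Toy usage: three hypothetical hospital clients contribute weight deltas
# for a 5-parameter model; real use would involve deep image classifiers.
global_w = np.zeros(5)
updates = [np.random.randn(5) * 0.1 for _ in range(3)]
global_w = dp_federated_round(global_w, updates, clip_norm=0.5, noise_multiplier=1.1)
print(global_w)
```

In a full system, the clipping and noise parameters would be chosen to meet a target (epsilon, delta) privacy budget tracked across rounds; this sketch only shows where the mechanism sits in the aggregation loop.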
