TY - JOUR
AU - Truhn, Daniel
AU - Tayebi Arasteh, Soroosh
AU - Saldanha, Oliver Lester
AU - Müller-Franzes, Gustav
AU - Khader, Firas
AU - Quirke, Philip
AU - West, Nicholas P
AU - Gray, Richard
AU - Hutchins, Gordon G A
AU - James, Jacqueline A
AU - Loughrey, Maurice B
AU - Salto-Tellez, Manuel
AU - Brenner, Hermann
AU - Brobeil, Alexander
AU - Yuan, Tanwei
AU - Chang-Claude, Jenny
AU - Hoffmeister, Michael
AU - Foersch, Sebastian
AU - Han, Tianyu
AU - Keil, Sebastian
AU - Schulze-Hagen, Maximilian
AU - Isfort, Peter
AU - Bruners, Philipp
AU - Kaissis, Georgios
AU - Kuhl, Christiane
AU - Nebelung, Sven
AU - Kather, Jakob Nikolas
TI - Encrypted federated learning for secure decentralized collaboration in cancer image analysis.
JO - Medical Image Analysis
VL - 92
SN - 1361-8415
CY - Amsterdam [et al.]
PB - Elsevier Science
M1 - DKFZ-2023-02741
SP - 103059
PY - 2023
AB - Artificial intelligence (AI) has a multitude of applications in cancer research and oncology. However, the training of AI systems is impeded by the limited availability of large datasets due to data protection requirements and other regulatory obstacles. Federated and swarm learning represent possible solutions to this problem by collaboratively training AI models while avoiding data transfer. However, in these decentralized methods, weight updates are still transferred to the aggregation server for merging the models. This leaves open the possibility of a breach of data privacy, for example through model inversion or membership inference attacks by untrusted servers. Somewhat-homomorphically-encrypted federated learning (SHEFL) is a solution to this problem because only encrypted weights are transferred, and model updates are performed in the encrypted space. Here, we demonstrate the first successful implementation of SHEFL in a range of clinically relevant tasks in cancer image analysis on multicentric datasets in radiology and histopathology. We show that SHEFL enables the training of AI models that outperform locally trained models and perform on par with centrally trained models. In the future, SHEFL can enable multiple institutions to co-train AI models without forsaking data governance and without ever transmitting any decryptable data to untrusted servers.
KW - Artificial intelligence
KW - Federated learning
KW - Histopathology
KW - Homomorphic encryption
KW - Privacy-preserving deep learning
KW - Radiology
LB - PUB:(DE-HGF)16
C6 - pmid:38104402
DO - 10.1016/j.media.2023.103059
UR - https://inrepo02.dkfz.de/record/286250
ER -