Federated Learning in Multi-Center Critical Care Research: A Systematic Case Study using the eICU Database (2204.09328v1)

Published 20 Apr 2022 in cs.LG and stat.ML

Abstract: Federated learning (FL) has been proposed as a method to train a model on different units without exchanging data. This offers great opportunities in the healthcare sector, where large datasets are available but cannot be shared to ensure patient privacy. We systematically investigate the effectiveness of FL on the publicly available eICU dataset for predicting the survival of each ICU stay. We employ Federated Averaging as the main practical algorithm for FL and show how its performance changes by altering three key hyper-parameters, taking into account that clients can significantly vary in size. We find that in many settings, a large number of local training epochs improves the performance while at the same time reducing communication costs. Furthermore, we outline in which settings it is possible to have only a small number of hospitals participating in each federated update round. When many hospitals with low patient counts are involved, the effect of overfitting can be avoided by decreasing the batch size. This study thus contributes toward identifying suitable settings for running distributed algorithms such as FL on clinical datasets.
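
The abstract centers on three Federated Averaging hyper-parameters: the number of local training epochs, the number of hospitals sampled per update round, and the local batch size. The sketch below illustrates how these interact in a size-weighted FedAvg loop. It is a minimal illustration under stated assumptions (a logistic-regression model, synthetic "hospital" datasets, and made-up client names), not the paper's implementation and not the eICU data.

```python
# Minimal FedAvg sketch with the three hyper-parameters discussed above:
# local epochs E, hospitals sampled per round C, and local batch size B.
# The model, data, and client names are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, epochs, batch_size, lr=0.1):
    """Run E epochs of mini-batch SGD on one client's data (logistic regression)."""
    w = weights.copy()
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            preds = 1.0 / (1.0 + np.exp(-Xb @ w))      # sigmoid predictions
            grad = Xb.T @ (preds - yb) / len(yb)        # logistic-loss gradient
            w -= lr * grad
    return w

def fedavg(clients, rounds, epochs, clients_per_round, batch_size, dim):
    """Server loop: sample hospitals, train locally, average weighted by patient count."""
    global_w = np.zeros(dim)
    names = list(clients)
    for _ in range(rounds):
        sampled = rng.choice(names, size=clients_per_round, replace=False)
        sizes = np.array([len(clients[c][1]) for c in sampled], dtype=float)
        updates = [local_update(global_w, *clients[c], epochs, batch_size) for c in sampled]
        global_w = np.average(updates, axis=0, weights=sizes)  # size-weighted average
    return global_w

# Toy example: three synthetic "hospitals" with very different patient counts.
dim = 5
true_w = rng.normal(size=dim)
def make_client(n):
    X = rng.normal(size=(n, dim))
    y = (X @ true_w + 0.1 * rng.normal(size=n) > 0).astype(float)
    return X, y

clients = {"hospital_A": make_client(2000),
           "hospital_B": make_client(300),
           "hospital_C": make_client(50)}
w = fedavg(clients, rounds=20, epochs=5, clients_per_round=2, batch_size=32, dim=dim)
print("learned weights:", np.round(w, 2))
```

In this toy setup, raising `epochs` trades communication rounds for local computation, while shrinking `batch_size` adds gradient noise that can counter overfitting on small clients, mirroring the trade-offs the abstract describes.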

Authors (6)
  1. Arash Mehrjou (39 papers)
  2. Ashkan Soleymani (13 papers)
  3. Annika Buchholz (3 papers)
  4. Jürgen Hetzel (3 papers)
  5. Patrick Schwab (27 papers)
  6. Stefan Bauer (102 papers)
Citations (1)
