FedCM: Federated Learning with Client-level Momentum (2106.10874v1)

Published 21 Jun 2021 in cs.LG

Abstract: Federated Learning is a distributed machine learning approach that enables model training without data sharing. In this paper, we propose a new federated learning algorithm, Federated Averaging with Client-level Momentum (FedCM), to tackle the problems of partial participation and client heterogeneity in real-world federated learning applications. FedCM aggregates global gradient information from previous communication rounds and modifies client gradient descent with a momentum-like term, which can effectively correct the bias and improve the stability of local SGD. We provide theoretical analysis to highlight the benefits of FedCM. We also perform extensive empirical studies and demonstrate that FedCM achieves superior performance on various tasks and is robust to different numbers of clients, participation rates, and levels of client heterogeneity.
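
The abstract describes the core FedCM mechanism: each sampled client takes local SGD steps along a convex mixture of its own gradient and the server-side momentum (the averaged update direction from the previous round), which counteracts client drift under heterogeneity and partial participation. Below is a minimal, hypothetical NumPy sketch of one such communication round; the mixing weight alpha, learning rates eta_l and eta_g, and the toy least-squares clients are illustrative assumptions, not values or code from the paper.

```python
# Hypothetical sketch of one FedCM communication round (NumPy only).
# Hyperparameters and the toy problem are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def local_grad(w, data):
    """Least-squares gradient on one client's data: grad of 0.5*||Xw - y||^2 / n."""
    X, y = data
    return X.T @ (X @ w - y) / len(y)

def fedcm_round(w, delta, clients, alpha=0.1, eta_l=0.05, eta_g=1.0, local_steps=5):
    """One round of client-level momentum.

    Each client descends along alpha * (own gradient) + (1 - alpha) * delta,
    where delta is the server momentum: the averaged, step-normalized update
    direction produced by the previous round.
    """
    updates = []
    for data in clients:
        y_i = w.copy()
        for _ in range(local_steps):
            d = alpha * local_grad(y_i, data) + (1.0 - alpha) * delta
            y_i -= eta_l * d
        updates.append((w - y_i) / (eta_l * local_steps))  # normalized update
    new_delta = np.mean(updates, axis=0)   # server momentum for the next round
    new_w = w - eta_g * eta_l * local_steps * new_delta
    return new_w, new_delta

# Toy heterogeneous clients: each holds a slightly different linear problem.
dim, n = 5, 40
w_true = rng.normal(size=dim)
clients = []
for _ in range(4):
    X = rng.normal(size=(n, dim))
    y = X @ (w_true + 0.3 * rng.normal(size=dim))  # client-specific optimum
    clients.append((X, y))

w, delta = np.zeros(dim), np.zeros(dim)
for t in range(50):
    w, delta = fedcm_round(w, delta, clients)
print("distance to shared optimum:", np.linalg.norm(w - w_true))
```

With alpha = 1 the inner update reduces to plain local SGD (FedAvg); smaller alpha pulls every client's trajectory toward the shared global direction, which is the bias-correction effect the abstract refers to.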

Authors (4)
  1. Jing Xu (244 papers)
  2. Sen Wang (164 papers)
  3. Liwei Wang (239 papers)
  4. Andrew Chi-Chih Yao (16 papers)
Citations (67)
