Quantized Decentralized Stochastic Learning over Directed Graphs (2002.09964v6)

Published 23 Feb 2020 in cs.DC, cs.LG, cs.MA, cs.SY, eess.SP, and eess.SY

Abstract: We consider a decentralized stochastic learning problem where data points are distributed among computing nodes communicating over a directed graph. As the model size grows, decentralized learning faces a major bottleneck: the heavy communication load incurred when each node transmits large messages (model updates) to its neighbors. To tackle this bottleneck, we propose a quantized decentralized stochastic learning algorithm over directed graphs based on the push-sum algorithm from decentralized consensus optimization. More importantly, we prove that our algorithm achieves the same convergence rates as the exact-communication decentralized stochastic learning algorithm for both convex and non-convex losses. Numerical evaluations corroborate our main theoretical results and illustrate significant speed-ups compared to exact-communication methods.
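
To make the idea concrete, here is a minimal sketch of quantized stochastic gradient push over a directed graph: each node takes a local gradient step, quantizes its message, and mixes with neighbors using column-stochastic push-sum weights. The quantizer, mixing matrix, losses, and step size below are illustrative assumptions, not the paper's exact construction.

```python
# Illustrative sketch of quantized decentralized SGD with push-sum over a
# directed graph. Quantizer, mixing weights, losses, and step size are
# assumptions for demonstration only, not the authors' exact algorithm.
import numpy as np

def quantize(v, levels=16):
    """Uniform stochastic quantization of a vector (hypothetical choice)."""
    norm = np.linalg.norm(v)
    if norm == 0:
        return v
    scaled = np.abs(v) / norm * levels
    lower = np.floor(scaled)
    prob = scaled - lower                      # round up with this probability
    q = lower + (np.random.rand(*v.shape) < prob)
    return np.sign(v) * norm * q / levels

def quantized_push_sum_sgd(grads, A, dim, steps=200, lr=0.05):
    """
    grads: list of per-node stochastic-gradient callables grad_i(x)
    A:     column-stochastic mixing matrix for the directed graph
           (A[i, j] > 0 iff node j sends to node i)
    """
    n = len(grads)
    x = np.zeros((n, dim))                     # push-sum numerators
    w = np.ones(n)                             # push-sum weights
    for _ in range(steps):
        z = x / w[:, None]                     # de-biased local models
        # local stochastic gradient step on the numerator
        x = x - lr * np.stack([grads[i](z[i]) for i in range(n)])
        # exchange quantized numerators and weights, then mix
        qx = np.stack([quantize(x[i]) for i in range(n)])
        x = A @ qx
        w = A @ w
    return x / w[:, None]

# toy usage: 3 nodes on a directed ring, quadratic losses with gradient noise
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = [np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.0, 0.5])]
    grads = [lambda x, t=t: (x - t) + 0.01 * rng.normal(size=x.shape)
             for t in targets]
    A = np.array([[0.5, 0.0, 0.5],
                  [0.5, 0.5, 0.0],
                  [0.0, 0.5, 0.5]])            # column-stochastic for the ring
    z = quantized_push_sum_sgd(grads, A, dim=2)
    print(z)  # each row should approach the average of the targets
```

The push-sum weights `w` correct for the directedness of the graph: a row-stochastic averaging step is generally unavailable on directed graphs, so each node tracks how much "mass" it has received and divides it out to recover an unbiased model estimate.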

Authors (4)
  1. Hossein Taheri (22 papers)
  2. Aryan Mokhtari (95 papers)
  3. Hamed Hassani (120 papers)
  4. Ramtin Pedarsani (82 papers)
Citations (52)
