
Fedstellar: A Platform for Decentralized Federated Learning (2306.09750v4)

Published 16 Jun 2023 in cs.LG, cs.AI, cs.DC, and cs.NI

Abstract: In 2016, Google proposed Federated Learning (FL) as a novel paradigm to train Machine Learning (ML) models across the participants of a federation while preserving data privacy. Since its birth, Centralized FL (CFL) has been the most used approach, where a central entity aggregates participants' models to create a global one. However, CFL presents limitations such as communication bottlenecks, single point of failure, and reliance on a central server. Decentralized Federated Learning (DFL) addresses these issues by enabling decentralized model aggregation and minimizing dependency on a central entity. Despite these advances, current platforms training DFL models struggle with key issues such as managing heterogeneous federation network topologies. To overcome these challenges, this paper presents Fedstellar, a platform extended from p2pfl library and designed to train FL models in a decentralized, semi-decentralized, and centralized fashion across diverse federations of physical or virtualized devices. The Fedstellar implementation encompasses a web application with an interactive graphical interface, a controller for deploying federations of nodes using physical or virtual devices, and a core deployed on each device which provides the logic needed to train, aggregate, and communicate in the network. The effectiveness of the platform has been demonstrated in two scenarios: a physical deployment involving single-board devices such as Raspberry Pis for detecting cyberattacks, and a virtualized deployment comparing various FL approaches in a controlled environment using MNIST and CIFAR-10 datasets. In both scenarios, Fedstellar demonstrated consistent performance and adaptability, achieving F1 scores of 91%, 98%, and 91.2% using DFL for detecting cyberattacks and classifying MNIST and CIFAR-10, respectively, reducing training time by 32% compared to centralized approaches.
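The abstract contrasts centralized aggregation (a server averages all participants' models) with decentralized aggregation, where each node combines its model with those of its direct neighbors in the federation topology. A minimal sketch of that idea, assuming a FedAvg-style parameter average over neighbors — the function and variable names here are illustrative, not Fedstellar's actual API:

```python
# Sketch of decentralized model aggregation (DFL): each node averages its
# parameters with those of its topology neighbors, with no central server.
# `aggregate` and `gossip_round` are hypothetical names for illustration.

def aggregate(own_params, neighbor_params):
    """Element-wise average of a node's parameters with its neighbors' parameters."""
    models = [own_params] + neighbor_params
    n = len(models)
    return [sum(vals) / n for vals in zip(*models)]

def gossip_round(topology, params):
    """One synchronous round: every node averages with its direct neighbors."""
    return {
        node: aggregate(params[node], [params[nb] for nb in neighbors])
        for node, neighbors in topology.items()
    }

# Three fully connected nodes, each holding a toy 2-parameter "model".
topology = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
params = {"a": [0.0, 0.0], "b": [3.0, 3.0], "c": [6.0, 6.0]}
params = gossip_round(topology, params)
print(params["a"])  # each node moves to the global mean: [3.0, 3.0]
```

With a fully connected topology one round already reaches consensus; on sparser topologies (rings, stars, random graphs, as supported by the platform) nodes converge toward the federation-wide average over repeated rounds instead.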

