Abstract

Federated learning (FL) is a distributed learning framework designed for applications with privacy-sensitive data. Without sharing raw data, FL trains local models on individual devices and constructs a global model on the server by aggregating the local models. However, to reduce communication cost, the participants in each training round are typically selected at random, which significantly degrades training efficiency under data and device heterogeneity. To address this issue, we introduce a novel approach that considers the data distribution and computational resources of devices when selecting the clients for each training round. Our proposed method performs client selection based on Grey Relational Analysis (GRA), jointly considering each client's available computational resources, training loss, and weight divergence. To examine the practicality of our proposed method, we implement it on Amazon Web Services (AWS) using the TensorFlow library in Python. We evaluate the algorithm's performance in different setups by varying the learning rate, network size, number of selected clients, and client selection round. The evaluation results show that our proposed algorithm significantly improves test accuracy and average client waiting time compared to state-of-the-art methods, Federated Averaging (FedAvg) and Pow-d.
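The abstract describes ranking clients by a Grey Relational Analysis score over three criteria (computational resources, training loss, weight divergence). The paper's exact normalization and criterion weighting are not given in the abstract, so the following is only a minimal sketch of standard GRA-based top-m selection: it assumes min-max normalization, equal criterion weights, all criteria treated as larger-is-better, and the conventional distinguishing coefficient rho = 0.5. The function names `grey_relational_grade` and `select_clients` are illustrative, not from the paper.

```python
import numpy as np

def grey_relational_grade(X, rho=0.5):
    """X: (n_clients, n_criteria), each column in [0, 1], larger-is-better.
    The reference sequence is the ideal client (all ones)."""
    ref = np.ones(X.shape[1])
    delta = np.abs(ref - X)                 # deviation from the ideal
    dmin, dmax = delta.min(), delta.max()   # global min/max deviation
    # Grey relational coefficient for every (client, criterion) pair.
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)               # equal-weight grade per client

def select_clients(resources, loss, divergence, m, rho=0.5):
    """Return the indices of the top-m clients by grey relational grade."""
    def norm(v):  # min-max normalize a criterion to larger-is-better
        v = np.asarray(v, dtype=float)
        span = v.max() - v.min()
        return (v - v.min()) / span if span > 0 else np.ones_like(v)
    X = np.column_stack([norm(resources), norm(loss), norm(divergence)])
    grades = grey_relational_grade(X, rho)
    return np.argsort(grades)[::-1][:m]
```

A client that scores well on all three normalized criteria sits closest to the ideal reference sequence and receives the highest grade, so the server would pick it first; clients far from the reference on any criterion are penalized smoothly rather than excluded outright.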
