
Abstract

Recent advances in satellite technology have led to a large number of small satellites being launched into low Earth orbit (LEO) to collect massive amounts of data, such as Earth-observation imagery. The traditional approach of downloading such data to a ground station (GS) to train a machine learning (ML) model is undesirable because of the limited bandwidth and intermittent connectivity between LEO satellites and the GS. Satellite edge computing (SEC), in contrast, allows each satellite to train an ML model onboard and upload only the model to the GS, which makes it a promising alternative. This paper proposes FedLEO, a novel federated learning (FL) framework that realizes the concept of SEC and overcomes the limitation (slow convergence) of existing FL-based solutions. FedLEO (1) augments the conventional FL star topology with "horizontal" intra-plane communication pathways along which model propagation among satellites takes place; and (2) optimally schedules communication between "sink" satellites and the GS by exploiting the predictability of satellite orbiting patterns. We evaluate FedLEO extensively and benchmark it against the state of the art. Our results show that FedLEO drastically expedites FL convergence without sacrificing model accuracy; in fact, it considerably increases it.
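
The sketch below is a minimal, illustrative rendering of the two ideas the abstract describes, not the paper's actual algorithm: satellites in one orbital plane pass their locally trained models along a "horizontal" ring so that a designated sink holds the plane-level average, and the GS schedules each sink's upload at its next predicted pass. All names, the linear-model training, and the toy visibility formula are assumptions made for illustration.

```python
import numpy as np

class Satellite:
    """One LEO satellite holding local data and training an ML model onboard."""
    def __init__(self, sat_id, local_data):
        self.sat_id = sat_id
        self.local_data = local_data                      # (features, labels)
        self.model = np.zeros(local_data[0].shape[1])

    def local_train(self, global_model, lr=0.1, epochs=5):
        """A few full-batch gradient steps on a linear model (stand-in for onboard training)."""
        w = global_model.copy()
        X, y = self.local_data
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)
            w -= lr * grad
        self.model = w
        return w

def intra_plane_aggregate(plane):
    """Ring-style "horizontal" pass within one orbital plane: each satellite adds its
    update, and the plane-level average ends up at the sink (last satellite in the ring)."""
    running_sum = np.zeros_like(plane[0].model)
    for sat in plane:                                     # model propagates satellite to satellite
        running_sum += sat.model
    return running_sum / len(plane)

def next_visibility(plane_idx, t_now, period=95.0, pass_offset=10.0):
    """Toy stand-in for orbital predictability: each plane's sink is assumed to pass
    over the GS once per `period` minutes, offset by its plane index."""
    phase = (t_now - pass_offset * plane_idx) % period
    return t_now + (period - phase)

# --- toy federated run over 3 orbital planes of 4 satellites each ---
rng = np.random.default_rng(0)
true_w = rng.normal(size=5)
planes = []
for p in range(3):
    plane = []
    for s in range(4):
        X = rng.normal(size=(50, 5))
        y = X @ true_w + 0.1 * rng.normal(size=50)
        plane.append(Satellite(f"p{p}s{s}", (X, y)))
    planes.append(plane)

global_model = np.zeros(5)
t = 0.0
for rnd in range(10):
    plane_models, upload_times = [], []
    for p, plane in enumerate(planes):
        for sat in plane:
            sat.local_train(global_model)
        plane_models.append(intra_plane_aggregate(plane))  # sink now holds the plane average
        upload_times.append(next_visibility(p, t))         # GS schedules the sink's upload
    t = max(upload_times)                                  # round closes after the last sink pass
    global_model = np.mean(plane_models, axis=0)           # GS aggregates plane averages

print("estimation error:", np.linalg.norm(global_model - true_w))
```

The point of the sketch is the communication pattern: only one sink per plane ever talks to the GS, and because the pass times are deterministic the GS knows when each round can close, which is what lets FedLEO-style scheduling cut convergence time relative to a plain star topology.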
