
A Privacy-Preserving Finite-Time Push-Sum based Gradient Method for Distributed Optimization over Digraphs (2305.15202v2)

Published 24 May 2023 in math.OC, cs.SY, and eess.SY

Abstract: This paper addresses the problem of distributed optimization, where a network of agents represented as a directed graph (digraph) aims to collaboratively minimize the sum of their individual cost functions. Existing approaches for distributed optimization over digraphs, such as Push-Pull, require agents to exchange explicit state values with their neighbors in order to reach an optimal solution. However, this can result in the disclosure of sensitive and private information. To overcome this issue, we propose a state-decomposition-based privacy-preserving finite-time push-sum (PrFTPS) algorithm that requires no global information, such as the network size or graph diameter. Then, based on PrFTPS, we design a gradient descent algorithm (PrFTPS-GD) to solve the distributed optimization problem. We prove that PrFTPS-GD preserves the privacy of each agent and achieves a linear convergence rate with respect to the number of optimization iterations. Finally, numerical simulations are provided to illustrate the effectiveness of the proposed approach.
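For readers unfamiliar with the push-sum mechanism the paper builds on, the sketch below implements the classical subgradient-push baseline (Nedic and Olshevsky style) over a directed ring. It is not the paper's PrFTPS-GD: the state-decomposition privacy layer and the finite-time consensus step are omitted, and the network, step size, and quadratic costs f_i(x) = 0.5 (x - b_i)^2 are illustrative assumptions. The key digraph-specific ingredient it does show is the column-stochastic mixing matrix and the weight variable y used to de-bias the states.

```python
# Minimal sketch of subgradient-push over a digraph (a hypothetical setup,
# NOT the paper's PrFTPS-GD; privacy preservation and finite-time consensus
# are omitted). Each agent i minimizes f_i(x) = 0.5 * (x - b_i)**2, so the
# global minimizer of the sum is mean(b).
import numpy as np

n = 4                                   # number of agents
b = np.array([1.0, 2.0, 3.0, 4.0])      # f_i(x) = 0.5 * (x - b_i)**2
grad = lambda i, x: x - b[i]            # gradient of agent i's local cost

# Column-stochastic mixing matrix for a directed ring: agent j splits its
# mass between itself and its out-neighbor (j + 1) mod n, so every column
# sums to 1 (the digraph requirement for push-sum).
A = np.zeros((n, n))
for j in range(n):
    A[j, j] = 0.5                       # self-loop share
    A[(j + 1) % n, j] = 0.5             # share pushed to the out-neighbor

x = np.zeros(n)                         # push-sum numerator states
y = np.ones(n)                          # push-sum weights
alpha = 0.05                            # constant step size (assumption)

for t in range(2000):
    w = A @ x                           # push numerators along out-edges
    y = A @ y                           # push weights along out-edges
    z = w / y                           # de-biased local estimates
    x = w - alpha * np.array([grad(i, z[i]) for i in range(n)])

print(z)  # each entry should approach the minimizer mean(b) = 2.5
```

With a constant step size this baseline converges only to a neighborhood of the optimum; the de-biasing quotient z = w / y is what corrects for the imbalance that a directed (non-doubly-stochastic) communication pattern introduces.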

