Abstract

In this paper we consider a distributed stochastic optimization problem in which gradient/subgradient information for the local objective functions is unavailable and the estimates are subject to local convex constraints. The objective functions may be non-smooth and are observed with stochastic noise, and the communication network is time-varying. By adding stochastic dithers to the local objective functions and constructing randomized differences motivated by the Kiefer-Wolfowitz algorithm, we propose a distributed subgradient-free algorithm that finds the global minimizer using only local observations. Moreover, we prove that consensus of the estimates and convergence to the global minimizer are achieved with probability one over the time-varying network, and we also derive the convergence rate of the averaged estimates. Finally, we give a numerical example to illustrate the effectiveness of the proposed algorithm.
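
To make the style of update concrete, here is a minimal Python sketch of the three ingredients the abstract names: consensus mixing over a time-varying network, a Kiefer-Wolfowitz-style randomized difference built from noisy function observations, and projection onto a local convex constraint set. All names, step-size choices, and the two-agent setup are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def kw_randomized_difference(f_noisy, x, c_k, rng):
    # Kiefer-Wolfowitz-style two-point randomized difference.
    # With Rademacher (+/-1) perturbations, 1/delta == delta elementwise,
    # so the estimate reduces to a simple product with delta.
    delta = rng.choice([-1.0, 1.0], size=x.shape)
    return (f_noisy(x + c_k * delta) - f_noisy(x - c_k * delta)) / (2.0 * c_k) * delta

def project_box(x, lo=-1.0, hi=1.0):
    # Projection onto a local convex constraint set (here: a box, for simplicity).
    return np.clip(x, lo, hi)

# Hypothetical setup: two agents with local objectives f_i(x) = ||x - t_i||^2,
# each observed only through noisy function values (no subgradients).
targets = [np.array([0.5, -0.2]), np.array([-0.3, 0.4])]
def make_noisy(t):
    return lambda x: np.sum((x - t) ** 2) + 0.01 * rng.standard_normal()

fs = [make_noisy(t) for t in targets]
xs = [np.zeros(2), np.ones(2)]

for k in range(1, 5001):
    a_k, c_k = 1.0 / k, 1.0 / k ** 0.25  # diminishing step and difference sizes
    # Time-varying network: alternate between a connected averaging matrix
    # and the identity (no communication); both are doubly stochastic.
    W = np.array([[0.5, 0.5], [0.5, 0.5]]) if k % 2 else np.eye(2)
    mixed = [sum(W[i, j] * xs[j] for j in range(2)) for i in range(2)]
    xs = [project_box(mixed[i] - a_k * kw_randomized_difference(fs[i], mixed[i], c_k, rng))
          for i in range(2)]

# Both estimates approach the global minimizer (the average of the targets).
print(xs)
```

In this toy run, both agents' estimates reach consensus and drift toward the global minimizer of the sum of the local objectives, which is the mean of the two targets. The step sizes satisfy the usual stochastic-approximation conditions (the steps sum to infinity while the squared perturbed steps are summable); the paper's own conditions and convergence-rate analysis should be consulted for the precise requirements.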
