Subgradient-Free Stochastic Optimization Algorithm for Non-smooth Convex Functions over Time-Varying Networks (1806.08537v1)
Abstract: In this paper we consider a distributed stochastic optimization problem in which gradient/subgradient information for the local objective functions is unavailable, subject to local convex constraints. The objective functions may be non-smooth and are observed with stochastic noise, and the network for the distributed design is time-varying. By adding stochastic dithers to the local objective functions and constructing randomized differences motivated by the Kiefer-Wolfowitz algorithm, we propose a distributed subgradient-free algorithm that finds the global minimizer using only local observations. Moreover, we prove that consensus of the estimates and global minimization are achieved with probability one over the time-varying network, and we obtain the convergence rate of the averaged estimates as well. Finally, we give a numerical example to illustrate the effectiveness of the proposed algorithm.
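To make the mechanism concrete, here is a minimal sketch of the kind of update the abstract describes: each agent mixes its estimate with neighbors over a time-varying graph, replaces the unavailable subgradient with a Kiefer-Wolfowitz-style two-point randomized difference of noisy function values, and projects onto its local constraint set. The network topology (an alternating ring), the box constraints, the step sizes, and the local objectives below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Sketch of a distributed subgradient-free update, assuming: box constraints,
# a ring network whose link orientation alternates over time, and non-smooth
# local objectives f_i observed with additive noise. Step sizes a_k and
# difference gains c_k are illustrative choices, not taken from the paper.

rng = np.random.default_rng(0)
n_agents, dim, T = 5, 2, 20000

# Hypothetical noisy local objectives: f_i(x) = ||x - t_i||_1 (non-smooth).
targets = rng.normal(size=(n_agents, dim))

def noisy_obj(i, x):
    return np.sum(np.abs(x - targets[i])) + 0.1 * rng.normal()

def project_box(x, lo=-5.0, hi=5.0):
    # Projection onto the (assumed) local convex constraint set.
    return np.clip(x, lo, hi)

x = rng.normal(size=(n_agents, dim))  # agents' current estimates
for k in range(1, T + 1):
    a_k = 1.0 / k          # step size: sum a_k diverges
    c_k = 1.0 / k**0.25    # difference gain: c_k -> 0, sum (a_k/c_k)^2 < inf
    # Time-varying network: alternate ring orientation so the union of
    # graphs over time remains connected; mixing weights are doubly stochastic.
    shift = 1 if k % 2 == 0 else -1
    mixed = 0.5 * (x + np.roll(x, shift, axis=0))
    x_new = np.empty_like(x)
    for i in range(n_agents):
        delta = rng.choice([-1.0, 1.0], size=dim)  # random perturbation
        # Two-point randomized difference in place of a (sub)gradient.
        g = (noisy_obj(i, x[i] + c_k * delta)
             - noisy_obj(i, x[i] - c_k * delta)) / (2 * c_k) * delta
        x_new[i] = project_box(mixed[i] - a_k * g)
    x = x_new

print("consensus spread:", np.ptp(x, axis=0))
# For sum_i ||x - t_i||_1 the minimizer is the coordinate-wise median.
print("mean estimate:", x.mean(axis=0), "vs optimum ~", np.median(targets, axis=0))
```

Under these assumptions the spread across agents shrinks (consensus) while the common estimate drifts toward the minimizer of the sum of local objectives, mirroring the two convergence claims in the abstract.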