Semi-Stochastic Coordinate Descent

(1412.6293)
Published Dec 19, 2014 in cs.NA and math.OC

Abstract

We propose a novel stochastic gradient method, semi-stochastic coordinate descent (S2CD), for the problem of minimizing a strongly convex function represented as the average of a large number of smooth convex functions: $f(x)=\tfrac{1}{n}\sum_i f_i(x)$. Our method first performs a deterministic step (computation of the gradient of $f$ at the starting point), followed by a large number of stochastic steps. The process is repeated a few times, with the last stochastic iterate becoming the new starting point where the deterministic step is taken. The novelty of our method is in how the stochastic steps are performed. In each such step, we pick a random function $f_i$ and a random coordinate $j$, both using nonuniform distributions, and update a single coordinate of the decision vector only, based on the computation of the $j^{\mathrm{th}}$ partial derivative of $f_i$ at two different points. Each random step of the method constitutes an unbiased estimate of the gradient of $f$; moreover, the squared norm of the steps goes to zero in expectation, meaning that the stochastic estimate of the gradient progressively improves. The complexity of the method is the sum of two terms: $O(n\log(1/\epsilon))$ evaluations of gradients $\nabla f_i$ and $O(\hat{\kappa}\log(1/\epsilon))$ evaluations of partial derivatives $\nabla_j f_i$, where $\hat{\kappa}$ is a novel condition number.
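
To make the abstract's outer/inner loop concrete, here is a minimal Python sketch of an S2CD-style method for a least-squares instance $f_i(x)=\tfrac{1}{2}(a_i^\top x - b_i)^2$. The sampling weights, step size `h`, and inner-loop length `m` below are illustrative assumptions chosen for readability, not the tuned choices analyzed in the paper.

```python
import numpy as np

def s2cd(A, b, h=1e-3, m=None, epochs=10, rng=None):
    """Sketch of semi-stochastic coordinate descent for
    f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2."""
    n, d = A.shape
    m = m or 2 * n                       # inner-loop length (assumption)
    rng = rng or np.random.default_rng(0)

    # Nonuniform coordinate probabilities p_j, here proportional to squared
    # column norms of A (an assumed stand-in for the paper's weights).
    p = np.linalg.norm(A, axis=0) ** 2
    p /= p.sum()

    x = np.zeros(d)
    for _ in range(epochs):
        y = x.copy()                     # last stochastic iterate becomes the new starting point
        grad_y = A.T @ (A @ y - b) / n   # deterministic step: full gradient at y
        for _ in range(m):
            j = rng.choice(d, p=p)       # random coordinate, nonuniform
            # Conditional probability q_{ij} of picking function i given j,
            # here proportional to a_{ij}^2 (assumption).
            q = A[:, j] ** 2
            q /= q.sum()
            i = rng.choice(n, p=q)
            # j-th partial derivative of f_i at the two points x and y.
            dji_x = A[i, j] * (A[i] @ x - b[i])
            dji_y = A[i, j] * (A[i] @ y - b[i])
            # g_j * e_j is an unbiased estimate of grad f(x):
            # taking expectation over (j, i) recovers the full gradient.
            g_j = (grad_y[j] + (dji_x - dji_y) / (n * q[i])) / p[j]
            x[j] -= h * g_j              # update a single coordinate only
    return x

# Tiny usage example on synthetic data.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
b = rng.standard_normal(200)
x_hat = s2cd(A, b, h=1e-3, epochs=20)
print(np.linalg.norm(A.T @ (A @ x_hat - b) / 200))  # gradient norm; should shrink with more epochs
```

The design point visible in the sketch is that each inner step costs only partial-derivative evaluations of a single $f_i$ at the two points $x$ and $y$, yet remains an unbiased estimate of $\nabla f(x)$ whose variance vanishes as $x \to y$. In a more careful implementation one would also cache the epoch residuals `A @ y - b` once per outer loop rather than recomputing `A[i] @ y` at every step.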
