
Semi-Stochastic Coordinate Descent (1412.6293v1)

Published 19 Dec 2014 in cs.NA and math.OC

Abstract: We propose a novel stochastic gradient method, semi-stochastic coordinate descent (S2CD), for the problem of minimizing a strongly convex function represented as the average of a large number of smooth convex functions: $f(x)=\tfrac{1}{n}\sum_i f_i(x)$. Our method first performs a deterministic step (computation of the gradient of $f$ at the starting point), followed by a large number of stochastic steps. The process is repeated a few times, with the last stochastic iterate becoming the new starting point where the deterministic step is taken. The novelty of our method is in how the stochastic steps are performed. In each such step, we pick a random function $f_i$ and a random coordinate $j$, both using nonuniform distributions, and update a single coordinate of the decision vector only, based on the computation of the $j^{\text{th}}$ partial derivative of $f_i$ at two different points. Each random step of the method constitutes an unbiased estimate of the gradient of $f$; moreover, the squared norm of the steps goes to zero in expectation, meaning that the stochastic estimate of the gradient progressively improves. The complexity of the method is the sum of two terms: $O(n\log(1/\epsilon))$ evaluations of gradients $\nabla f_i$ and $O(\hat{\kappa}\log(1/\epsilon))$ evaluations of partial derivatives $\nabla_j f_i$, where $\hat{\kappa}$ is a novel condition number.
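
The abstract specifies the structure of the method: an outer loop that computes the exact gradient of $f$ at a snapshot point, and an inner loop of single-coordinate updates built from an unbiased, variance-reduced gradient estimate. The sketch below (Python/NumPy) illustrates that structure; it is not the paper's implementation. The ridge-regression objective, the uniform sampling of the pair $(i, j)$, and the step-size and loop-length parameters are all simplifying assumptions made here for readability, whereas the paper's complexity result relies on specific nonuniform distributions for $i$ and $j$.

import numpy as np

def s2cd_sketch(A, b, lam=0.1, h=0.01, outer_iters=20, inner_iters=2000, seed=0):
    # Minimize f(x) = (1/n) sum_i f_i(x) with the illustrative choice
    # f_i(x) = 0.5*(a_i @ x - b_i)**2 + (lam/2)*||x||^2 (strongly convex).
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)

    def grad_fi(i, z):  # gradient of a single f_i
        return A[i] * (A[i] @ z - b[i]) + lam * z

    def partial_fi(i, j, z):  # j-th partial derivative of f_i
        return A[i, j] * (A[i] @ z - b[i]) + lam * z[j]

    for _ in range(outer_iters):
        y = x.copy()
        # Deterministic step: full gradient of f at the snapshot y
        # (n evaluations of grad f_i).
        mu = np.mean([grad_fi(i, y) for i in range(n)], axis=0)
        for _ in range(inner_iters):
            i = rng.integers(n)  # random function (uniform here; nonuniform in the paper)
            j = rng.integers(d)  # random coordinate (uniform here; nonuniform in the paper)
            # Snapshot gradient plus a correction from two partial-derivative
            # evaluations; over the random (i, j), d * g_j * e_j is an
            # unbiased estimate of the full gradient of f at x.
            g_j = mu[j] + partial_fi(i, j, x) - partial_fi(i, j, y)
            x[j] -= h * d * g_j  # update a single coordinate only
        # The last stochastic iterate x becomes the next starting point.
    return x

As a smoke test under these assumptions, the iterate should approach the ridge-regression solution on synthetic data:

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
b = A @ rng.standard_normal(10)
x_hat = s2cd_sketch(A, b)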

Citations (82)
