Computing the Bias of Constant-step Stochastic Approximation with Markovian Noise

(2405.14285)
Published May 23, 2024 in stat.ML, cs.LG, and math.OC

Abstract

We study stochastic approximation algorithms with Markovian noise and constant step-size $\alpha$. We develop a method based on infinitesimal generator comparisons to study the bias of the algorithm, which is the expected difference between $\theta_n$ (the value at iteration $n$) and $\theta^*$ (the unique equilibrium of the corresponding ODE). We show that, under some smoothness conditions, this bias is of order $O(\alpha)$. Furthermore, we show that the time-averaged bias is equal to $\alpha V + O(\alpha^2)$, where $V$ is a constant characterized by a Lyapunov equation, showing that $\mathbb{E}[\bar{\theta}_n] \approx \theta^* + V\alpha + O(\alpha^2)$, where $\bar{\theta}_n = (1/n)\sum_{k=1}^{n}\theta_k$ is the Polyak-Ruppert average. We also show that $\bar{\theta}_n$ converges with high probability around $\theta^* + \alpha V$. We illustrate how to combine this with Richardson-Romberg extrapolation to derive an iterative scheme with a bias of order $O(\alpha^2)$.
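
To make the last two points concrete, here is a minimal numerical sketch (a toy construction, not the paper's code): a one-dimensional constant-step iteration driven by a two-state Markov chain, where a nonlinear drift gives the Polyak-Ruppert average a bias of order $\alpha$. Running it at step sizes $\alpha$ and $2\alpha$ and forming $2\bar{\theta}^{(\alpha)}_n - \bar{\theta}^{(2\alpha)}_n$ follows the Richardson-Romberg idea mentioned in the abstract: since the two biases are approximately $\alpha V$ and $2\alpha V$, the first-order term cancels. The drift, the chain, and all constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative assumptions, not taken from the paper):
#   theta_{n+1} = theta_n + alpha * h(theta_n, X_n),
#   h(theta, x) = -(exp(theta - b) - 1) + m(x),
# where X_n is a two-state Markov chain and m has zero mean under its
# stationary distribution pi = (2/3, 1/3).  The mean drift vanishes only
# at theta* = b, but its curvature makes the stationary mean of theta_n
# sit below theta* by an amount of order alpha.
b = 1.0
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])       # transition matrix of the noise chain
m_vals = np.array([0.5, -1.0])   # E_pi[m] = (2/3)*0.5 + (1/3)*(-1.0) = 0

def polyak_ruppert_average(alpha, n_iters, n_burnin):
    """Run constant-step SA and return the Polyak-Ruppert average after burn-in."""
    theta, x = b, 0
    avg, count = 0.0, 0
    for n in range(n_iters):
        x = rng.choice(2, p=P[x])                                  # Markovian noise
        theta += alpha * (-(np.exp(theta - b) - 1.0) + m_vals[x])  # SA update
        if n >= n_burnin:
            count += 1
            avg += (theta - avg) / count                           # running average
    return avg

alpha, n_iters, n_burnin = 0.05, 400_000, 40_000

theta_bar_a  = polyak_ruppert_average(alpha, n_iters, n_burnin)      # ~ theta* + alpha*V
theta_bar_2a = polyak_ruppert_average(2 * alpha, n_iters, n_burnin)  # ~ theta* + 2*alpha*V

# Richardson-Romberg extrapolation: the first-order bias term cancels,
# leaving an error of order O(alpha^2) plus Monte Carlo noise.
theta_rr = 2 * theta_bar_a - theta_bar_2a

print(f"theta*                  : {b:.4f}")
print(f"PR average, step alpha  : {theta_bar_a:.4f}")
print(f"PR average, step 2*alpha: {theta_bar_2a:.4f}")
print(f"RR extrapolation        : {theta_rr:.4f}")
```

Note that the extrapolated estimate typically has a larger variance than either individual average; the reduced bias comes at the cost of a somewhat larger stochastic error.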
