Computing the Bias of Constant-step Stochastic Approximation with Markovian Noise (2405.14285v2)

Published 23 May 2024 in stat.ML, cs.LG, and math.OC

Abstract: We study stochastic approximation algorithms with Markovian noise and constant step-size $\alpha$. We develop a method based on infinitesimal generator comparisons to study the bias of the algorithm, which is the expected difference between $\theta_n$ -- the value at iteration $n$ -- and $\theta^*$ -- the unique equilibrium of the corresponding ODE. We show that, under some smoothness conditions, this bias is of order $O(\alpha)$. Furthermore, we show that the time-averaged bias is equal to $\alpha V + O(\alpha^2)$, where $V$ is a constant characterized by a Lyapunov equation, so that $\mathbb{E}[\bar{\theta}_n] \approx \theta^* + \alpha V + O(\alpha^2)$, where $\bar{\theta}_n = (1/n)\sum_{k=1}^{n}\theta_k$ is the Polyak-Ruppert average. We also show that $\bar{\theta}_n$ concentrates with high probability around $\theta^* + \alpha V$. We illustrate how to combine this with Richardson-Romberg extrapolation to derive an iterative scheme with a bias of order $O(\alpha^2)$.
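
The abstract's last claim suggests a simple numerical recipe: run the algorithm at two step sizes, say $\alpha$ and $2\alpha$, and combine the Polyak-Ruppert averages as $2\bar{\theta}(\alpha) - \bar{\theta}(2\alpha)$, which cancels the first-order bias term $\alpha V$. Below is a minimal Python sketch of this idea; the two-state Markov chain, the scalar linear update, and all constants are assumptions chosen for illustration, not details taken from the paper.

```python
import numpy as np

# Minimal sketch of Richardson-Romberg extrapolation for constant-step
# stochastic approximation with Markovian noise. The toy problem below
# (a scalar linear SA driven by a two-state Markov chain) is an
# illustrative assumption, not taken from the paper.

rng = np.random.default_rng(0)

# Two-state Markov chain (the Markovian noise) with transition matrix P.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
# Stationary distribution of P is pi = (2/3, 1/3).
a = np.array([0.5, 2.0])   # state-dependent gain A(x)
b = np.array([1.0, 4.0])   # state-dependent target b(x)
# ODE equilibrium: theta* = E_pi[b(X)] / E_pi[A(X)] = 2.0 / 1.0 = 2.0

def polyak_ruppert_avg(alpha, n_iters, theta0=0.0):
    """Run theta_{k+1} = theta_k + alpha * (b(X_k) - A(X_k) * theta_k)
    and return the Polyak-Ruppert average over the second half of the
    run (the first half is discarded as burn-in)."""
    theta, x = theta0, 0
    tail = []
    for k in range(n_iters):
        theta += alpha * (b[x] - a[x] * theta)
        x = rng.choice(2, p=P[x])          # next state of the Markov chain
        if k >= n_iters // 2:
            tail.append(theta)
    return float(np.mean(tail))

alpha, n = 0.05, 200_000
avg_a  = polyak_ruppert_avg(alpha, n)      # ~ theta* +   alpha * V + O(alpha^2)
avg_2a = polyak_ruppert_avg(2 * alpha, n)  # ~ theta* + 2 * alpha * V + O(alpha^2)

# The combination below cancels the first-order bias term alpha * V,
# leaving an estimate with O(alpha^2) bias.
theta_rr = 2 * avg_a - avg_2a
print(f"avg(alpha)={avg_a:.4f}  avg(2*alpha)={avg_2a:.4f}  RR={theta_rr:.4f}")
```

Any pair of distinct step sizes with a known ratio would work here, since the cancellation only relies on the expansion $\mathbb{E}[\bar{\theta}_n] \approx \theta^* + \alpha V$ being linear in $\alpha$; the gain $A(x)$ is made state-dependent on purpose, since a noise-independent gain would make the first-order bias vanish in this toy example.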

Citations (3)
