
On stochastic gradient Langevin dynamics with dependent data streams: the fully non-convex case (1905.13142v4)

Published 30 May 2019 in math.ST, math.PR, stat.ML, and stat.TH

Abstract: We consider the problem of sampling from a target distribution, which is \emph{not necessarily logconcave}, in the context of empirical risk minimization and stochastic optimization as presented in Raginsky et al. (2017). Non-asymptotic analysis results are established in the $L^1$-Wasserstein distance for the behaviour of Stochastic Gradient Langevin Dynamics (SGLD) algorithms. We allow the estimation of gradients to be performed even in the presence of \emph{dependent} data streams. Our convergence estimates are sharper and \emph{uniform} in the number of iterations, in contrast to those in previous studies.
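The SGLD algorithm analysed in the abstract iterates a noisy gradient step: a stochastic gradient of the potential plus Gaussian noise scaled by the square root of twice the step size. A minimal sketch of one such iteration, applied here to a hypothetical non-convex double-well potential $U(x) = (x^2 - 1)^2$ (the potential and step size are illustrative choices, not from the paper):

```python
import numpy as np

def sgld_step(theta, grad_estimate, step_size, rng):
    """One SGLD update: theta <- theta - h * grad(theta) + sqrt(2h) * xi,
    with xi standard Gaussian noise."""
    noise = rng.standard_normal(theta.shape)
    return theta - step_size * grad_estimate(theta) + np.sqrt(2.0 * step_size) * noise

# Illustrative non-convex target: double-well potential U(x) = (x^2 - 1)^2,
# whose gradient is 4x(x^2 - 1). The stationary density exp(-U) has modes at +/-1.
def grad_U(theta):
    return 4.0 * theta * (theta**2 - 1.0)

rng = np.random.default_rng(0)
theta = np.array([3.0])  # start far from the modes
samples = []
for _ in range(20_000):
    theta = sgld_step(theta, grad_U, 0.01, rng)
    samples.append(theta[0])
samples = np.array(samples[5_000:])  # discard burn-in
```

In the setting of the paper, `grad_estimate` would be a stochastic gradient computed from a (possibly dependent) data stream rather than the exact gradient used in this sketch; the non-asymptotic $L^1$-Wasserstein bounds quantify how close the law of the iterates stays to the target.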

Authors (5)
  1. Ngoc Huy Chau (2 papers)
  2. Sotirios Sabanis (37 papers)
  3. Ying Zhang (389 papers)
  4. Éric Moulines (21 papers)
  5. Miklós Rásonyi (1 paper)
Citations (43)
