Marvels and Pitfalls of the Langevin Algorithm in Noisy High-dimensional Inference (1812.09066v4)

Published 21 Dec 2018 in cs.LG, cond-mat.dis-nn, math.ST, stat.ML, and stat.TH

Abstract: Gradient-descent-based algorithms and their stochastic versions have widespread applications in machine learning and statistical inference. In this work we perform an analytic study of the performance of one of them, the Langevin algorithm, in the context of noisy high-dimensional inference. We employ the Langevin algorithm to sample the posterior probability measure for the spiked matrix-tensor model. The typical behaviour of this algorithm is described by a system of integro-differential equations that we call the Langevin state evolution, whose solution is compared with that of the state evolution of approximate message passing (AMP). Our results show that, remarkably, the algorithmic threshold of the Langevin algorithm is sub-optimal with respect to that of AMP. We conjecture this phenomenon to be due to the residual glassiness present in that region of parameters. Finally, we show how a landscape-annealing protocol, which uses the Langevin algorithm but violates the Bayes-optimality condition, can approach the performance of AMP.
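The Langevin algorithm discussed in the abstract follows the gradient of the log-posterior while injecting Gaussian noise at each step. The sketch below is a minimal, generic illustration of that update rule on a standard Gaussian target; it does not implement the paper's spiked matrix-tensor posterior, and the target density, step size `eta`, and iteration counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_density(x):
    # Hypothetical target: log p(x) = -0.5 * x.T @ x (standard Gaussian),
    # chosen only to keep the example self-contained.
    return -x

def langevin(x0, eta=0.01, n_steps=5000):
    """Unadjusted Langevin dynamics: drift along grad log p plus noise."""
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        # Discretized Langevin update: gradient drift of size eta and
        # Gaussian noise of variance 2 * eta, so the chain's stationary
        # distribution approximates the target as eta -> 0.
        x = x + eta * grad_log_density(x) + np.sqrt(2 * eta) * noise
        samples.append(x.copy())
    return np.array(samples)

samples = langevin([3.0, -3.0])
# Discard burn-in before estimating moments of the target.
mean = samples[1000:].mean(axis=0)
var = samples[1000:].var(axis=0)
```

In the paper's setting the gradient is that of the spiked matrix-tensor log-posterior in high dimension, and the quantity of interest is the overlap of the iterate with the planted signal rather than these toy moments.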

Authors (6)
  1. Stefano Sarao Mannelli (21 papers)
  2. Giulio Biroli (131 papers)
  3. Chiara Cammarota (30 papers)
  4. Florent Krzakala (179 papers)
  5. Pierfrancesco Urbani (55 papers)
  6. Lenka Zdeborová (182 papers)
Citations (42)
