Fast parallel sampling under isoperimetry (2401.09016v1)
Abstract: We show how to sample in parallel from a distribution $\pi$ over $\mathbb{R}^d$ that satisfies a log-Sobolev inequality and has a smooth log-density, by parallelizing the Langevin (resp. underdamped Langevin) algorithm. Our algorithm outputs samples from a distribution $\hat\pi$ that is close to $\pi$ in Kullback--Leibler (KL) divergence (resp. total variation (TV) distance), while using only $\log(d)^{O(1)}$ parallel rounds and $\widetilde{O}(d)$ (resp. $\widetilde{O}(\sqrt d)$) gradient evaluations in total. These are the first parallel sampling algorithms with TV distance guarantees. As our main application, we combine the TV distance guarantees of our algorithms with prior works to obtain RNC sampling-to-counting reductions for families of discrete distributions on the hypercube $\{\pm 1\}^n$ that are closed under exponential tilts and have bounded covariance. Consequently, we obtain an RNC sampler for directed Eulerian tours and asymmetric determinantal point processes, resolving open questions raised in prior works.
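For readers unfamiliar with the sequential baseline being parallelized, the following is a minimal sketch of the unadjusted Langevin algorithm (ULA): the Euler--Maruyama discretization of the Langevin diffusion $\mathrm{d}X_t = -\nabla V(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t$, whose stationary distribution is $\pi \propto e^{-V}$. This is a generic illustration, not the paper's parallel algorithm; the function names and parameters (`grad_V`, `step`, `n_steps`) are our own.

```python
import numpy as np

def ula_sample(grad_V, x0, step, n_steps, rng):
    """Unadjusted Langevin algorithm: one chain of the Euler-Maruyama
    discretization x_{k+1} = x_k - h * grad V(x_k) + sqrt(2h) * xi_k."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_V(x) + np.sqrt(2.0 * step) * noise
    return x

# Illustrative target: standard Gaussian pi(x) ∝ exp(-||x||^2 / 2),
# so V(x) = ||x||^2 / 2 and grad V(x) = x.
rng = np.random.default_rng(0)
d = 4
samples = np.array([
    ula_sample(lambda x: x, np.zeros(d), step=0.05, n_steps=500, rng=rng)
    for _ in range(200)
])
```

The paper's contribution, loosely, is to replace the inherently sequential `for` loop over `n_steps` with $\log(d)^{O(1)}$ parallel rounds, evaluating many gradients per round, while keeping KL (resp. TV, for the underdamped variant) guarantees.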