Diffusion Bridge Mixture Transports, Schrödinger Bridge Problems and Generative Modeling (2304.00917v2)

Published 3 Apr 2023 in stat.ML and cs.LG

Abstract: The dynamic Schr\"odinger bridge problem seeks a stochastic process that defines a transport between two target probability measures, while optimally satisfying the criteria of being closest, in terms of Kullback-Leibler divergence, to a reference process. We propose a novel sampling-based iterative algorithm, the iterated diffusion bridge mixture (IDBM) procedure, aimed at solving the dynamic Schr\"odinger bridge problem. The IDBM procedure exhibits the attractive property of realizing a valid transport between the target probability measures at each iteration. We perform an initial theoretical investigation of the IDBM procedure, establishing its convergence properties. The theoretical findings are complemented by numerical experiments illustrating the competitive performance of the IDBM procedure. Recent advancements in generative modeling employ the time-reversal of a diffusion process to define a generative process that approximately transports a simple distribution to the data distribution. As an alternative, we propose utilizing the first iteration of the IDBM procedure as an approximation-free method for realizing this transport. This approach offers greater flexibility in selecting the generative process dynamics and exhibits accelerated training and superior sample quality over larger discretization intervals. In terms of implementation, the necessary modifications are minimally intrusive, being limited to the training loss definition.
