Accelerated forward-backward and Douglas-Rachford splitting dynamics (2407.20620v2)

Published 30 Jul 2024 in math.OC, cs.LG, cs.SY, and eess.SY

Abstract: We examine convergence properties of continuous-time variants of accelerated Forward-Backward (FB) and Douglas-Rachford (DR) splitting algorithms for nonsmooth composite optimization problems. When the objective function is given by the sum of a quadratic and a nonsmooth term, we establish accelerated sublinear and exponential convergence rates for convex and strongly convex problems, respectively. Moreover, for FB splitting dynamics, we demonstrate that the accelerated exponential convergence rate carries over to general strongly convex problems. In our Lyapunov-based analysis, we exploit the variable-metric gradient interpretations of FB and DR splittings to obtain smooth Lyapunov functions that allow us to establish accelerated convergence rates. We provide computational experiments to demonstrate the merits and effectiveness of our analysis.
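To make the setting concrete, the sketch below simulates inertial forward-backward dynamics on a lasso-type composite problem with f(x) = 0.5*||Ax - b||^2 and g(x) = lam*||x||_1. It assumes a Su-Boyd-Candes-style system x'' + (alpha/t) x' + G_mu(x) = 0, where G_mu is the forward-backward residual (the variable-metric gradient map the abstract alludes to). The exact ODE, the vanishing damping alpha/t, the step size mu = 1/||A||^2, and the helper names (soft_threshold, fb_residual, accelerated_fb_flow) are illustrative assumptions rather than the paper's precise formulation.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fb_residual(x, A, b, lam, mu):
    """Forward-backward residual G_mu(x) = (x - prox_{mu*g}(x - mu*grad f(x))) / mu
    for f(x) = 0.5*||Ax - b||^2 and g(x) = lam*||x||_1."""
    grad_f = A.T @ (A @ x - b)
    return (x - soft_threshold(x - mu * grad_f, mu * lam)) / mu

def accelerated_fb_flow(A, b, lam, mu=None, T=50.0, dt=1e-3, alpha=3.0):
    """Forward-Euler trace of the (assumed) inertial FB dynamics
        x'' + (alpha/t) x' + G_mu(x) = 0,
    rewritten as a first-order system in (x, v) with v = x'."""
    m, n = A.shape
    if mu is None:
        mu = 1.0 / np.linalg.norm(A, 2) ** 2  # step size 1/L with L = ||A||_2^2
    x, v = np.zeros(n), np.zeros(n)
    t = dt  # start slightly after 0 so the alpha/t damping is finite
    objective = []
    while t < T:
        g = fb_residual(x, A, b, lam, mu)
        x, v = x + dt * v, v + dt * (-(alpha / t) * v - g)
        t += dt
        objective.append(0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
    return x, np.array(objective)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat, obj = accelerated_fb_flow(A, b, lam=0.1)
    print(f"final objective: {obj[-1]:.6f}")
```

The plain forward-Euler step here only traces the trajectory; the paper's rates are statements about the continuous-time flow, so dt is kept small for fidelity rather than tuned as an algorithmic parameter.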

