Convergence Rate Bounds for the Mirror Descent Method: IQCs and the Bregman Divergence (2204.00502v2)

Published 1 Apr 2022 in math.OC, cs.SY, and eess.SY

Abstract: This paper is concerned with convergence analysis for the mirror descent (MD) method, a well-known algorithm in convex optimization. An analysis framework via integral quadratic constraints (IQCs) is constructed to analyze the convergence rate of the MD method with strongly convex objective functions, in both continuous and discrete time. We formulate the problem of finding convergence rates of the MD algorithms as feasibility problems of linear matrix inequalities (LMIs) in both schemes. In particular, in continuous time, we show that the Bregman divergence function, which is commonly used as a Lyapunov function for this algorithm, is a special case of the class of Lyapunov functions associated with the Popov criterion, when the latter is applied to an appropriate reformulation of the problem. Thus, applying the Popov criterion, and combining it with other IQCs, can lead to convergence rate bounds with reduced conservatism. We also illustrate via examples that the derived convergence rate bounds can be tight.
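To make the object of the analysis concrete, the following is a minimal sketch of the mirror descent iteration itself, using the negative-entropy mirror map (whose Bregman divergence is the KL divergence) on the probability simplex. This is only an illustrative instance of the MD method; the paper's IQC/LMI analysis, step sizes, and objective are not reproduced here, and the quadratic objective below is an assumed example.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step, n_iters):
    """Mirror descent with the negative-entropy mirror map
    (exponentiated gradient): the Bregman divergence induced by
    this map is the KL divergence, and the normalisation step is
    the Bregman projection onto the probability simplex."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x * np.exp(-step * grad(x))  # gradient step in the dual (mirror) space
        x /= x.sum()                     # project back onto the simplex
    return x

# Assumed example: minimise the strongly convex f(x) = 0.5 * ||x - c||^2
# over the simplex; since c lies in the simplex, the minimiser is c.
c = np.array([0.8, 0.15, 0.05])
grad = lambda x: x - c
x_star = mirror_descent_simplex(grad, np.ones(3) / 3, step=0.5, n_iters=500)
```

For a strongly convex objective such as this one, the iterates converge geometrically toward the minimiser, which is the regime whose rate the paper's LMI conditions certify.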

Citations (2)
