Optimization Algorithm Synthesis based on Integral Quadratic Constraints: A Tutorial

(2306.00565)
Published Jun 1, 2023 in math.OC, cs.SY, and eess.SY

Abstract

We expose in a tutorial fashion the mechanisms which underlie the synthesis of optimization algorithms based on dynamic integral quadratic constraints. We reveal how these tools from robust control allow one to design accelerated gradient descent algorithms with optimal guaranteed convergence rates by solving small-sized convex semi-definite programs. It is shown that this extends to the design of extremum controllers, with the goal of regulating the output of a general linear closed-loop system to the minimum of an objective function. Numerical experiments illustrate that we can not only recover gradient descent and the triple momentum variant of Nesterov's accelerated first-order algorithm, but also automatically synthesize optimal algorithms even if the gradient information is passed through non-trivial dynamics, such as time delays.
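
To make the semi-definite programming machinery mentioned in the abstract concrete, the following is a minimal sketch (not taken from the paper) of the standard IQC-based rate certification for plain gradient descent on an m-strongly convex function with L-Lipschitz gradient, in the spirit of the analysis framework of Lessard, Recht, and Packard (2016). The algorithm is written as a linear system in feedback with the gradient, the gradient is replaced by a static sector constraint, and a small linear matrix inequality, bisected over the rate rho, certifies exponential convergence. The parameter values, the step size alpha = 2/(L+m), and the use of cvxpy are illustrative assumptions; the paper's synthesis procedure additionally works with dynamic multipliers and designs the algorithm itself rather than analyzing a fixed one.

```python
import numpy as np
import cvxpy as cp

# Assumed problem class: f is m-strongly convex with an L-Lipschitz gradient.
m, L = 1.0, 10.0
alpha = 2.0 / (L + m)          # classical gradient-descent step size (assumption)

# Gradient descent as a linear system in feedback with u_k = grad f(y_k):
#   x_{k+1} = A x_k + B u_k,   y_k = C x_k
A, B, C = 1.0, -alpha, 1.0

# Static (sector) quadratic constraint satisfied by the gradient of f
# after shifting the optimizer to the origin:  [y; u]^T M [y; u] >= 0
M = np.array([[-2.0 * m * L, L + m],
              [L + m,        -2.0]])

def rate_certified(rho):
    """Check feasibility of the scalar Lyapunov/IQC LMI for a given rate rho."""
    P = cp.Variable(nonneg=True)     # Lyapunov certificate
    lam = cp.Variable(nonneg=True)   # multiplier for the sector constraint
    off = A * P * B                  # shared off-diagonal term keeps the LMI symmetric
    lmi = cp.bmat([[A * A * P - rho**2 * P, off],
                   [off,                    B * B * P]]) + lam * M
    cons = [P >= 1.0, lmi << 0]      # P >= 1 rules out the trivial solution P = 0
    prob = cp.Problem(cp.Minimize(0), cons)
    prob.solve()
    return prob.status in ("optimal", "optimal_inaccurate")

# Bisection over the convergence rate rho in (0, 1)
lo, hi = 0.0, 1.0
for _ in range(30):
    mid = 0.5 * (lo + hi)
    lo, hi = (lo, mid) if rate_certified(mid) else (mid, hi)

print(f"certified rate: {hi:.4f}  (theory: {(L - m) / (L + m):.4f})")
```

For this step size the bisection should recover the known rate (L - m)/(L + m), here roughly 0.818; the same LMI template, enlarged with dynamic multipliers and extra decision variables for the algorithm parameters, is the kind of object the tutorial's synthesis procedure optimizes.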

