
Error bounds for maxout neural network approximations of model predictive control (2304.08779v2)

Published 18 Apr 2023 in eess.SY, cs.SY, and math.OC

Abstract: Neural network (NN) approximations of model predictive control (MPC) are a versatile approach if the online solution of the underlying optimal control problem (OCP) is too demanding and if an exact computation of the explicit MPC law is intractable. The drawback of such approximations is that they typically do not preserve stability and performance guarantees of the original MPC. However, such guarantees can be recovered if the maximum error with respect to the optimal control law and the Lipschitz constant of that error are known. We show in this work how to compute both values exactly when the control law is approximated by a maxout NN. We build upon related results for ReLU NN approximations and derive mixed-integer (MI) linear constraints that allow a computation of the output and the local gain of a maxout NN by solving an MI feasibility problem. Furthermore, we show theoretically and experimentally that maxout NN exist for which the maximum error is zero.
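The abstract centers on maxout networks, where each output unit takes the maximum over several affine pieces of the input. As context for readers unfamiliar with the architecture, here is a minimal sketch of a maxout layer in NumPy; the shapes and names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def maxout_layer(x, W, b):
    """Maxout layer: each output unit is the max over k affine pieces.

    x: (n_in,) input vector
    W: (k, n_out, n_in) weights for the k affine pieces
    b: (k, n_out) biases for the k affine pieces
    returns: (n_out,) element-wise max over the k affine maps
    """
    z = np.einsum('koi,i->ko', W, x) + b  # (k, n_out) pre-activations
    return z.max(axis=0)

# Sanity check: a maxout unit with the two pieces [x, 0] reduces to ReLU,
# which is why results for ReLU NN approximations carry over as a special case.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
W = np.stack([np.eye(3), np.zeros((3, 3))])  # k = 2 affine pieces
b = np.zeros((2, 3))
y = maxout_layer(x, W, b)
assert np.allclose(y, np.maximum(x, 0.0))
```

Because a maxout network is piecewise affine, its output and local gain on each affine region can be encoded with mixed-integer linear constraints, which is the mechanism the paper exploits.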

