Hermite-type modifications of BOBYQA for optimization with some partial derivatives (2204.05022v1)

Published 11 Apr 2022 in cs.CE

Abstract: In this work we propose two Hermite-type optimization methods, Hermite least squares and Hermite BOBYQA, specialized for the case that some partial derivatives of the objective function are available while others are not. The main objective is to reduce the number of objective function calls while maintaining the convergence properties. Both methods are modifications of Powell's derivative-free BOBYQA algorithm. Instead of (underdetermined) interpolation for building the quadratic subproblem in each iteration, the training data is enriched with first-order and, where possible, second-order derivatives, and (weighted) least squares regression is used. Proofs of global convergence are discussed and numerical results are presented. Furthermore, the applicability is verified for a realistic test case in the context of yield optimization. Numerical tests show that the Hermite least squares approach outperforms classic BOBYQA if half or more of the partial derivatives are available. In addition, the Hermite-type approaches are more robust and thus perform better in the case of noisy objective functions.
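The core idea described in the abstract — fitting the quadratic subproblem by least squares regression over function values enriched with whatever partial derivatives happen to be available, rather than by pure interpolation — can be illustrated with a minimal sketch. This is not the authors' implementation; it is a toy Hermite-type least squares fit in 2D, where the model m(x) = c + gᵀx + ½ xᵀHx is fitted from values plus only the first partial derivative (the second is assumed unavailable). All function names and the sample point set are illustrative assumptions.

```python
import numpy as np

def quad_features(x):
    # Feature row for m(x) = c + g.x + 0.5 x^T H x with x in R^2;
    # unknowns ordered as [c, g1, g2, H11, H12, H22] (H symmetric).
    x1, x2 = x
    return np.array([1.0, x1, x2, 0.5 * x1 * x1, x1 * x2, 0.5 * x2 * x2])

def grad_features(x):
    # Rows are d/dx1 and d/dx2 of the feature row above.
    x1, x2 = x
    return np.array([
        [0.0, 1.0, 0.0, x1, x2, 0.0],
        [0.0, 0.0, 1.0, 0.0, x1, x2],
    ])

def hermite_least_squares(points, values, partials):
    # Build one equation per function value and one per *available*
    # partial derivative, then solve the (over)determined system in
    # the least squares sense. `partials` is a list of dicts mapping
    # coordinate index -> derivative value (hypothetical interface).
    rows, rhs = [], []
    for x, fx, dfx in zip(points, values, partials):
        rows.append(quad_features(x))
        rhs.append(fx)
        G = grad_features(x)
        for i, dval in dfx.items():
            rows.append(G[i])
            rhs.append(dval)
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return coef

# Toy objective: f(x) = (x1 - 1)^2 + 2 x2^2; d/dx1 is available,
# d/dx2 is treated as unavailable.
f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * x[1] ** 2
d1 = lambda x: 2.0 * (x[0] - 1.0)

pts = [np.array(p, float) for p in
       [(0, 0), (1, 0), (0, 1), (1, 1), (2, 0), (0, 2), (-1, 1)]]
vals = [f(p) for p in pts]
parts = [{0: d1(p)} for p in pts]   # only partial index 0 known

coef = hermite_least_squares(pts, vals, parts)
# f is exactly quadratic, so the fit recovers c=1, g=(-2,0), H=diag(2,4):
# coef ≈ [1, -2, 0, 2, 0, 4]
```

In the paper's setting this regression replaces BOBYQA's underdetermined interpolation step, so the same sample set yields more equations per objective call whenever derivatives are known; the weighting mentioned in the abstract (not shown here) would scale the derivative rows relative to the value rows.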
