
Abstract

In many machine learning applications, one wants to learn the unknown objective and constraint functions of an optimization problem from available data and then apply some technique to attain a local optimizer of the learned model. This work considers Gaussian processes (GPs) as global surrogate models and utilizes them in conjunction with derivative-free trust-region methods. It is well known that derivative-free trust-region methods converge globally provided the surrogate model is probabilistically fully linear. We prove that GPs are indeed probabilistically fully linear, thus resulting in fast convergence (compared with linear or quadratic local surrogate models) as well as global convergence. We draw upon the optimization of a chemical reactor to demonstrate the efficiency of GP-based trust-region methods.
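
For concreteness, below is a minimal sketch of the kind of method the abstract describes: a derivative-free trust-region loop in which a GP surrogate, fit to samples inside the current trust region, stands in for the unknown objective. This is not the authors' implementation; the toy objective, the sampling scheme, and the update constants (eta, radius factors) are illustrative assumptions, and scikit-learn's GaussianProcessRegressor is used only as a convenient GP model.

```python
# Minimal sketch of a GP-based derivative-free trust-region loop.
# Assumptions: toy objective, random space-filling samples in the
# trust region, and standard accept/reject radius updates.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel


def objective(x):
    # Placeholder black-box objective (e.g. a reactor performance metric).
    return np.sum((x - 1.0) ** 2) + 0.1 * np.sin(5 * x).sum()


def gp_trust_region(x0, radius=0.5, max_iter=30, eta=0.1):
    x = np.asarray(x0, dtype=float)
    dim = x.size
    for _ in range(max_iter):
        # Sample the current iterate plus 2*dim points in the trust region
        # and fit a GP surrogate of the objective to them.
        X = np.vstack([x, x + radius * (2 * np.random.rand(2 * dim, dim) - 1)])
        y = np.array([objective(p) for p in X])
        gp = GaussianProcessRegressor(
            kernel=ConstantKernel() * RBF(length_scale=radius),
            normalize_y=True,
        ).fit(X, y)

        # Trust-region subproblem: minimize the GP mean within the region.
        res = minimize(
            lambda p: gp.predict(p.reshape(1, -1))[0],
            x,
            bounds=[(xi - radius, xi + radius) for xi in x],
        )
        x_new = res.x

        # Agreement ratio between actual and model-predicted reduction
        # decides acceptance and the next trust-region radius.
        pred_red = gp.predict(x.reshape(1, -1))[0] - res.fun
        actual_red = objective(x) - objective(x_new)
        rho = actual_red / pred_red if abs(pred_red) > 1e-12 else 0.0
        if rho > eta:
            x = x_new
            radius *= 2.0 if rho > 0.75 else 1.0
        else:
            radius *= 0.5
    return x


print(gp_trust_region(np.zeros(2)))
```

The accept/reject step and radius update follow the standard trust-region template; the paper's contribution is the argument that GP surrogates built from such local samples are probabilistically fully linear, which is the property the global convergence theory of derivative-free trust-region methods requires.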
