An iterative multi-fidelity approach for model order reduction of multi-dimensional input parametric PDE systems

(2301.09483)
Published Jan 23, 2023 in math.NA, cs.LG, cs.NA, and math.AP

Abstract

We propose a parametric sampling strategy for the reduction of large-scale PDE systems with multidimensional input parametric spaces by leveraging models of different fidelity. The methodology allows a user to adaptively sample points from a discrete training set without requiring a priori error estimators. Low-fidelity models are exploited throughout the parametric space to select points via an efficient sampling strategy, and high-fidelity models are evaluated only at the sampled parametric points to recover the reduced basis functions. The low-fidelity models are then adapted with the reduced-order models (ROMs) built by projection onto the subspace spanned by the recovered basis functions. The process continues until the low-fidelity model represents the high-fidelity model adequately for all parameters in the parametric space. Since the proposed methodology leverages low-fidelity models to assemble the solution database, it significantly reduces the computational cost of the offline stage. The main contributions of this article are the construction of the initial low-fidelity model and a sampling strategy based on the discrete empirical interpolation method (DEIM). We test the approach on a 2D steady-state heat conduction problem with two input parameters and compare it qualitatively with the classical greedy reduced basis method (RBM); we further test it on a 9-dimensional parametric non-coercive elliptic problem and analyze the computational performance for different tunings of the greedy point selection.
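
The abstract describes two computational ingredients: a DEIM-based strategy for picking parameter points from low-fidelity snapshots, and an outer loop that evaluates the high-fidelity model only at those points, enriches a reduced basis, and adapts the low-fidelity information until it represents the high-fidelity model adequately. The following is a minimal NumPy sketch of that workflow under strong simplifying assumptions, not the authors' implementation: the toy 1D "high-" and "low-fidelity" functions, the training set, the tolerance, the number of points picked per iteration, and the use of orthogonal projection in place of an actual Galerkin ROM solve are all invented for illustration.

```python
import numpy as np

def deim_indices(U, k):
    """Greedy DEIM-style selection of k row indices from a basis matrix U (n x m)."""
    idx = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, k):
        # interpolate the j-th column at the already-selected rows ...
        c = np.linalg.solve(U[idx][:, :j], U[idx, j])
        r = U[:, j] - U[:, :j] @ c            # ... and pick the largest residual entry
        idx.append(int(np.argmax(np.abs(r))))
    return idx

# --- toy stand-ins for the high- and low-fidelity models (assumptions) ---------
x = np.linspace(0.0, 1.0, 200)                # spatial grid of a toy 1D problem
def hi_fi(mu):                                # "expensive" model
    return np.sin(np.pi * mu[0] * x) * np.exp(-mu[1] * x)
def lo_fi(mu):                                # "cheap" model (crude approximation)
    return np.sin(np.pi * mu[0] * x) * (1.0 - mu[1] * x)

rng = np.random.default_rng(0)
train = rng.uniform([0.5, 0.1], [2.0, 1.0], size=(100, 2))   # discrete training set

S_lo = np.column_stack([lo_fi(mu) for mu in train])   # low-fidelity snapshot matrix
basis = np.empty((x.size, 0))                         # reduced basis (columns)
tol = 1e-4                                            # illustrative tolerance

for it in range(8):
    # DEIM applied to the right singular vectors: rows correspond to parameters,
    # so the selected indices are parameter points in the training set.
    _, _, Vt = np.linalg.svd(S_lo, full_matrices=False)
    picked = deim_indices(Vt.T, k=3)

    errs = []
    for i in picked:
        u_hf = hi_fi(train[i])                         # high-fidelity solve (only here)
        u_rb = basis @ (basis.T @ u_hf)                # projection onto current basis
        errs.append(np.linalg.norm(u_hf - u_rb) / np.linalg.norm(u_hf))
        if errs[-1] > tol:                             # enrich only where needed
            basis, _ = np.linalg.qr(np.column_stack([basis, u_hf]))
    print(f"iter {it}: basis size {basis.shape[1]}, worst sampled error {max(errs):.2e}")
    if max(errs) < tol:
        break
    # crude stand-in for "adapting the low-fidelity model with the ROM":
    # replace the low-fidelity data by its projection onto the recovered basis.
    S_lo = basis @ (basis.T @ S_lo)
```

Here DEIM is applied to the right singular vectors of the snapshot matrix so that the selected row indices correspond to training parameters, in the spirit of the paper's DEIM-based sampling strategy. In the paper the adaptation step replaces the low-fidelity model itself with a projection-based ROM; the last line above only caricatures that at the level of snapshot data.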
