Adaptive sparse interpolation for accelerating nonlinear stochastic reduced-order modeling with time-dependent bases (2207.10656v2)

Published 14 Jul 2022 in math.NA, cs.NA, and physics.flu-dyn

Abstract: Stochastic reduced-order modeling based on time-dependent bases (TDBs) has proven successful for extracting and exploiting low-dimensional manifolds from stochastic partial differential equations (SPDEs). The nominal computational cost of solving a rank-$r$ reduced-order model (ROM) based on a time-dependent basis, a.k.a. TDB-ROM, is roughly equal to that of solving the full-order model for $r$ random samples. As of now, this nominal performance can only be achieved for linear or quadratic SPDEs -- at the expense of a highly intrusive process. On the other hand, for problems with non-polynomial nonlinearity, the computational cost of solving the TDB evolution equations is the same as that of solving the full-order model. In this work, we present an adaptive sparse interpolation algorithm that enables stochastic TDB-ROMs to achieve nominal computational cost for generic nonlinear SPDEs. Our algorithm constructs a low-rank approximation of the right-hand side of the SPDE using the discrete empirical interpolation method (DEIM). The presented algorithm does not require any offline computation, and as a result the low-rank approximation can adapt to any transient changes of the dynamics on the fly. We also propose a rank-adaptive strategy to control the error of the sparse interpolation. Our algorithm achieves computational speedup by adaptive sampling of the state and random spaces. We illustrate the efficiency of our approach for two test cases: (1) the one-dimensional stochastic Burgers' equation, and (2) the two-dimensional compressible Navier-Stokes equations subject to one-hundred-dimensional random perturbations. In both cases, the presented algorithm results in orders-of-magnitude reduction in the computational cost.
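For readers unfamiliar with DEIM, the sketch below illustrates the standard (offline, non-adaptive) DEIM procedure that the paper builds on: greedily select $r$ interpolation points from a basis of the nonlinear term, then reconstruct that term everywhere from only its values at those points. This is a minimal NumPy illustration under assumed toy data; it is not the paper's adaptive, rank-adaptive online variant, and all function names and the snapshot example are hypothetical.

```python
import numpy as np

def deim_indices(U):
    """Greedy DEIM point selection for a basis U of shape (n, r).

    Returns r row indices p so that a nonlinear term f can be
    approximated from only its entries f[p].
    """
    n, r = U.shape
    p = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, r):
        # Coefficients that interpolate the j-th basis vector at the points chosen so far
        c = np.linalg.solve(U[p, :j], U[p, j])
        # The largest interpolation residual determines the next point
        res = U[:, j] - U[:, :j] @ c
        p.append(int(np.argmax(np.abs(res))))
    return np.array(p)

def deim_approx(f_at_points, U, p):
    """Reconstruct the full nonlinear term from its values at the DEIM points."""
    return U @ np.linalg.solve(U[p, :], f_at_points)

# Toy usage (illustrative only): approximate f(x) = exp(sin(x + s)) from r sampled entries
n, r = 200, 8
x = np.linspace(0.0, 2.0 * np.pi, n)
snapshots = np.array([np.exp(np.sin(x + s)) for s in np.linspace(0.0, 1.0, 50)]).T
U_full, _, _ = np.linalg.svd(snapshots, full_matrices=False)
U = U_full[:, :r]
p = deim_indices(U)
f = np.exp(np.sin(x + 0.37))
f_deim = deim_approx(f[p], U, p)   # evaluates f at only r points instead of n
```

In the classical setting the basis U is computed offline from stored snapshots; the algorithm in this paper instead adapts the interpolation basis, the points, and the rank on the fly, which is what removes the offline stage and lets the TDB-ROM reach nominal cost for non-polynomial nonlinearities.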

Citations (15)