Multilevel Scalable Solvers for Stochastic Linear and Nonlinear Problems (2310.14649v1)

Published 23 Oct 2023 in cs.CE

Abstract: This article discusses uncertainty quantification (UQ) for time-independent linear and nonlinear partial differential equation (PDE)-based systems with random model parameters, carried out using the sampling-free intrusive stochastic Galerkin method and leveraging multilevel scalable solvers that combine a two-grid Schwarz method with algebraic multigrid (AMG). High-resolution spatial meshes together with a large number of stochastic expansion terms increase the system size, leading to significant memory consumption and computational cost. To this end, domain decomposition (DD)-based parallel scalable solvers are developed for linear and nonlinear stochastic PDEs. A generalized minimal residual (GMRES) iterative solver equipped with a multilevel preconditioner, consisting of restricted additive Schwarz (RAS) on the fine grid and AMG on the coarse grid, is constructed to improve scalability. Numerical experiments illustrate the scalability of the proposed solver for stochastic linear and nonlinear Poisson problems.
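As a rough illustration of the solver structure the abstract describes, the sketch below applies GMRES with a two-level additive preconditioner (overlapping RAS subdomain solves on the fine level plus a coarse-grid correction) to a small deterministic 1D Poisson system. This is not the paper's implementation: a direct solve on an aggregation-based coarse operator stands in for the AMG cycle, and the problem size, subdomain count, and overlap are illustrative choices.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 1D Poisson stand-in for the PDE system (hypothetical sizes, not from the paper)
n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Overlapping subdomains for restricted additive Schwarz (RAS):
# each local solve uses the overlapped block, but only "owned" dofs are updated.
nsub, overlap = 4, 5
size = n // nsub
subdomains = []
for i in range(nsub):
    lo, hi = max(0, i * size - overlap), min(n, (i + 1) * size + overlap)
    idx = np.arange(lo, hi)
    owned = np.arange(i * size, min(n, (i + 1) * size))
    subdomains.append((idx, np.isin(idx, owned)))

# Coarse level via simple aggregation; a direct solve on the Galerkin
# coarse operator P^T A P stands in for the AMG cycle used in the paper.
nc = 20
agg = np.arange(n) * nc // n
P = sp.csr_matrix((np.ones(n), (np.arange(n), agg)), shape=(n, nc))
Ac = (P.T @ A @ P).tocsc()

def apply_preconditioner(r):
    z = np.zeros_like(r)
    for idx, mask in subdomains:              # fine level: RAS subdomain solves
        zi = np.linalg.solve(A[idx, :][:, idx].toarray(), r[idx])
        z[idx[mask]] += zi[mask]              # restricted update (owned dofs only)
    z += P @ spla.spsolve(Ac, P.T @ r)        # additive coarse-grid correction
    return z

M = spla.LinearOperator((n, n), matvec=apply_preconditioner, dtype=float)
x, info = spla.gmres(A, b, M=M)               # info == 0 signals convergence
```

In the paper's setting the fine-level blocks come from the spatial domain decomposition of the stochastic Galerkin system and the coarse level is handled by AMG; the additive fine-plus-coarse structure shown here is the same idea at toy scale.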
