
Efficient Uncertainty Quantification for Dynamic Subsurface Flow with Surrogate by Theory-guided Neural Network (2004.13560v1)

Published 25 Apr 2020 in eess.SP, cs.LG, physics.comp-ph, and stat.ML

Abstract: Subsurface flow problems usually involve some degree of uncertainty. Consequently, uncertainty quantification is commonly necessary for subsurface flow prediction. In this work, we propose a methodology for efficient uncertainty quantification for dynamic subsurface flow with a surrogate constructed by the Theory-guided Neural Network (TgNN). The TgNN here is specially designed for problems with stochastic parameters. In the TgNN, stochastic parameters, time and location comprise the input of the neural network, while the quantity of interest is the output. The neural network is trained with available simulation data, while being simultaneously guided by theory (e.g., the governing equation, boundary conditions, initial conditions, etc.) of the underlying problem. The trained neural network can predict solutions of subsurface flow problems with new stochastic parameters. With the TgNN surrogate, the Monte Carlo (MC) method can be efficiently implemented for uncertainty quantification. The proposed methodology is evaluated with two-dimensional dynamic saturated flow problems in porous media. Numerical results show that the TgNN-based surrogate can significantly improve the efficiency of uncertainty quantification tasks compared with simulation-based implementation. Further investigations regarding stochastic fields with smaller correlation length, larger variance, changing boundary values, and out-of-distribution variances are performed, and satisfactory results are obtained.
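
The abstract describes the workflow at a high level; the sketch below is a minimal, hypothetical PyTorch rendering of the idea, not the authors' implementation. The network `TgNNSurrogate`, the simplified residual of the 2-D saturated flow equation Ss ∂h/∂t = ∇·(K ∇h), the weighting `lam`, and the helper `mc_statistics` are all assumptions made for illustration; boundary- and initial-condition terms in the theory-guided loss are omitted for brevity.

```python
# Minimal sketch (assumed, not the paper's code): a theory-guided surrogate that maps
# (stochastic parameters xi, time t, location x, y) -> hydraulic head h, trained with a
# data-misfit term plus a PDE-residual penalty, then used for Monte Carlo UQ.
import torch
import torch.nn as nn


class TgNNSurrogate(nn.Module):
    def __init__(self, n_stoch, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_stoch + 3, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, xi, t, x, y):
        # xi: (N, n_stoch) stochastic parameters; t, x, y: (N, 1) space-time coordinates
        return self.net(torch.cat([xi, t, x, y], dim=-1))


def theory_guided_loss(model, xi, t, x, y, h_obs, K, Ss=1.0, lam=1.0):
    """Data misfit plus PDE-residual penalty (boundary/initial terms omitted here)."""
    t = t.clone().requires_grad_(True)
    x = x.clone().requires_grad_(True)
    y = y.clone().requires_grad_(True)
    h = model(xi, t, x, y)

    data_loss = torch.mean((h - h_obs) ** 2)

    # Residual of the simplified saturated flow equation via automatic differentiation.
    # K is supplied as pointwise data, so its own spatial gradient is not captured
    # by autograd here (a simplification of the heterogeneous case).
    h_t, h_x, h_y = torch.autograd.grad(h.sum(), (t, x, y), create_graph=True)
    div_x = torch.autograd.grad((K * h_x).sum(), x, create_graph=True)[0]
    div_y = torch.autograd.grad((K * h_y).sum(), y, create_graph=True)[0]
    residual = Ss * h_t - (div_x + div_y)
    pde_loss = torch.mean(residual ** 2)

    return data_loss + lam * pde_loss


def mc_statistics(model, xi_samples, t, x, y):
    """Monte Carlo mean/std of the surrogate prediction over sampled stochastic parameters."""
    with torch.no_grad():
        preds = torch.stack([model(xi, t, x, y) for xi in xi_samples])
    return preds.mean(dim=0), preds.std(dim=0)
```

In use, one would train the surrogate on simulation snapshots generated for sampled realizations of the conductivity field and then call `mc_statistics` with a large set of new realizations, which replaces repeated forward simulations with cheap network evaluations.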

Citations (55)
