Progressive reduced order modeling: empowering data-driven modeling with selective knowledge transfer (2310.03770v1)
Abstract: Data-driven modeling suffers from a constant demand for training data; because such data are costly and scarce in engineering applications, accuracy degrades and the approach becomes impractical. To address this challenge, we propose a progressive reduced order modeling framework that minimizes data requirements and makes data-driven modeling more practical. Our approach selectively transfers knowledge from previously trained models through gates, much as humans draw on valuable prior knowledge while ignoring what is irrelevant. By filtering relevant information from previous models, we build a surrogate model with minimal turnaround time and a smaller training set while still achieving high accuracy. We have tested the framework on several cases, including transport in porous media, gravity-driven flow, and finite deformation of hyperelastic materials. Our results show that retaining information from previous models, and using only the valuable portion of that knowledge, significantly improves the accuracy of the current model. We demonstrate the importance of progressive knowledge transfer and its impact on model accuracy when training samples are reduced: for instance, our framework with four parent models outperforms its no-parent counterpart trained on nine times more data. By mitigating data scarcity, this work unlocks the potential of data-driven modeling for practical engineering applications and is a step toward more efficient and cost-effective surrogate modeling across various fields.
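To make the gating idea concrete, the sketch below shows one way a child surrogate layer could blend its own features with features from frozen, previously trained parent models through learned gates. This is only a minimal illustration under assumed conventions, not the authors' implementation: the class name GatedTransferLayer, the scalar per-parent gates, and the assumption that parent and child hidden layers share the same width are all hypothetical choices made for clarity.

```python
import torch
import torch.nn as nn

class GatedTransferLayer(nn.Module):
    """Illustrative child-model layer that selectively reuses knowledge from
    frozen parent models via learned gates (a sketch, not the paper's code)."""

    def __init__(self, in_dim, out_dim, num_parents):
        super().__init__()
        self.own = nn.Linear(in_dim, out_dim)  # child's own transformation
        # One lateral adapter per parent; assumes parent hidden width == out_dim
        self.laterals = nn.ModuleList(
            [nn.Linear(out_dim, out_dim) for _ in range(num_parents)]
        )
        # One scalar gate per parent, squashed to (0, 1); a near-zero gate
        # effectively ignores that parent's contribution
        self.gate_logits = nn.Parameter(torch.zeros(num_parents))

    def forward(self, x, parent_feats):
        # parent_feats: list of detached hidden activations from frozen parents
        h = self.own(x)
        gates = torch.sigmoid(self.gate_logits)
        for g, lateral, pf in zip(gates, self.laterals, parent_feats):
            h = h + g * lateral(pf)  # keep only the useful portion of parent knowledge
        return torch.relu(h)

# Usage sketch with four hypothetical parent models and random activations
layer = GatedTransferLayer(in_dim=16, out_dim=64, num_parents=4)
x = torch.randn(8, 16)
parent_feats = [torch.randn(8, 64) for _ in range(4)]
y = layer(x, parent_feats)  # shape: (8, 64)
```

Because the gates are trained jointly with the child model while the parents stay frozen, irrelevant parents can be suppressed without discarding their stored knowledge, which is the selective-transfer behavior the abstract describes.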