Variational Bayesian Optimal Experimental Design with Normalizing Flows (2404.13056v2)

Published 8 Apr 2024 in cs.LG, cs.CE, stat.CO, stat.ME, and stat.ML

Abstract: Bayesian optimal experimental design (OED) seeks experiments that maximize the expected information gain (EIG) in model parameters. Directly estimating the EIG using nested Monte Carlo is computationally expensive and requires an explicit likelihood. Variational OED (vOED), in contrast, estimates a lower bound of the EIG without likelihood evaluations by approximating the posterior distributions with variational forms, and then tightens the bound by optimizing its variational parameters. We introduce the use of normalizing flows (NFs) for representing variational distributions in vOED; we call this approach vOED-NFs. Specifically, we adopt NFs with a conditional invertible neural network architecture built from compositions of coupling layers, and enhanced with a summary network for data dimension reduction. We present Monte Carlo estimators of the lower bound along with gradient expressions to enable gradient-based simultaneous optimization of the variational parameters and the design variables. The vOED-NFs algorithm is then validated on two benchmark problems, and demonstrated on a partial differential equation-governed application of cathodic electrophoretic deposition and an implicit-likelihood case with stochastic modeling of an aphid population. The findings suggest that a composition of 4-5 coupling layers is able to achieve lower EIG estimation bias, under a fixed budget of forward model runs, compared to previous approaches. The resulting NFs produce approximate posteriors that agree well with the true posteriors, and are able to capture non-Gaussian and multi-modal features effectively.
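
For readers skimming the abstract, the quantity being optimized can be written out explicitly. In standard vOED notation (ours, not quoted from the paper), the EIG of a design d and its variational lower bound are

U(d) \;=\; \mathbb{E}_{p(\theta)\,p(y\mid\theta,d)}\!\left[\log p(\theta\mid y,d) - \log p(\theta)\right]
\;\ge\; \mathbb{E}_{p(\theta)\,p(y\mid\theta,d)}\!\left[\log q_{\lambda}(\theta\mid y,d) - \log p(\theta)\right],

with equality when q_lambda equals the exact posterior. Evaluating the bound only requires sampling from the prior and the forward model, not evaluating the likelihood density, which is why a normalizing flow, whose conditional density log q_lambda(theta | y, d) is tractable, is a natural choice for q_lambda.

The sketch below is an illustrative reconstruction of this recipe, not the authors' implementation: a conditional affine coupling flow with a small summary network for (y, d), a Monte Carlo estimator of the lower bound, and joint gradient ascent over the flow parameters and the design variable. The toy simulator, dimensions, and hyperparameters are assumptions made only to keep the example self-contained (PyTorch).

import torch
import torch.nn as nn

torch.manual_seed(0)
theta_dim, y_dim, d_dim = 2, 3, 1  # illustrative dimensions; even theta_dim assumed by the coupling split

# Prior over model parameters (illustrative choice).
prior = torch.distributions.MultivariateNormal(torch.zeros(theta_dim), torch.eye(theta_dim))

def simulator(theta, d):
    # Hypothetical differentiable forward model with additive Gaussian noise;
    # gradients flow through d so the design can be optimized jointly.
    mean = torch.sin(theta.sum(-1, keepdim=True) * d).repeat(1, y_dim)
    return mean + 0.1 * torch.randn(theta.shape[0], y_dim)

class ConditionalCoupling(nn.Module):
    # One affine coupling layer: transforms half of theta, conditioned on the other half and a context vector.
    def __init__(self, dim, context_dim, flip):
        super().__init__()
        self.flip = flip
        self.d_a = dim // 2
        self.net = nn.Sequential(nn.Linear(self.d_a + context_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * (dim - self.d_a)))

    def forward(self, theta, context):
        # Normalizing direction theta -> z, returning the log|det Jacobian| of the map.
        if self.flip:
            theta = torch.flip(theta, [-1])  # alternate which half gets transformed
        a, b = theta[:, :self.d_a], theta[:, self.d_a:]
        s, t = self.net(torch.cat([a, context], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)  # keep scales well behaved
        z = torch.cat([a, (b - t) * torch.exp(-s)], dim=-1)
        if self.flip:
            z = torch.flip(z, [-1])
        return z, -s.sum(-1)

class ConditionalFlow(nn.Module):
    # Composition of coupling layers plus a summary network that embeds (y, d) into the conditioning context.
    def __init__(self, dim, y_dim, d_dim, n_layers=4, summary_dim=8):
        super().__init__()
        self.summary = nn.Sequential(nn.Linear(y_dim + d_dim, 32), nn.ReLU(),
                                     nn.Linear(32, summary_dim))
        self.layers = nn.ModuleList(
            [ConditionalCoupling(dim, summary_dim, flip=(i % 2 == 1)) for i in range(n_layers)])
        self.base = torch.distributions.MultivariateNormal(torch.zeros(dim), torch.eye(dim))

    def log_prob(self, theta, y, d):
        context = self.summary(torch.cat([y, d], dim=-1))
        z, log_det = theta, torch.zeros(theta.shape[0])
        for layer in self.layers:
            z, ld = layer(z, context)
            log_det = log_det + ld
        return self.base.log_prob(z) + log_det

flow = ConditionalFlow(theta_dim, y_dim, d_dim)
design = nn.Parameter(torch.zeros(1, d_dim))  # the design variable, optimized jointly with the flow
opt = torch.optim.Adam(list(flow.parameters()) + [design], lr=1e-3)

for step in range(2000):
    theta = prior.sample((256,))
    d_batch = design.expand(256, -1)
    y = simulator(theta, d_batch)
    # Monte Carlo estimate of the lower bound E[log q(theta | y, d) - log p(theta)];
    # the prior term is constant in (lambda, d) but keeps the value interpretable as an EIG bound.
    bound = (flow.log_prob(theta, y, d_batch) - prior.log_prob(theta)).mean()
    loss = -bound  # ascend the bound by descending its negative
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(f"step {step}: EIG lower bound estimate {bound.item():.3f}")

In this sketch the flow and the design share one optimizer, mirroring the simultaneous optimization described in the abstract; in practice one would also vary the number of coupling layers (the paper reports 4-5 working well) and the summary network size.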
