IDEAS home Printed from https://ideas.repec.org/a/eee/reensy/v96y2011i9p1220-1231.html

The dangers of sparse sampling for the quantification of margin and uncertainty

Author

Listed:
  • Hemez, François M.
  • Atamturktur, Sezer

Abstract

Activities such as global sensitivity analysis, statistical effect screening, uncertainty propagation, or model calibration have become integral to the Verification and Validation (V&V) of numerical models and computer simulations. One of the goals of V&V is to assess prediction accuracy and uncertainty, which feeds directly into reliability analysis or the Quantification of Margin and Uncertainty (QMU) of engineered systems. Because these analyses involve multiple runs of a computer code, they can rapidly become computationally expensive. An alternative to Monte Carlo-like sampling is to combine a design of computer experiments with meta-modeling, replacing the potentially expensive computer simulation with a fast-running emulator. The surrogate can then be used to estimate sensitivities, propagate uncertainty, and calibrate model parameters at a fraction of the cost of wrapping a sampling algorithm or optimization solver around the physics-based code. Doing so, however, carries the risk of developing an incorrect emulator that erroneously approximates the “true-but-unknown” sensitivities of the physics-based code. We demonstrate the extent to which this occurs when Gaussian Process Modeling (GPM) emulators are trained in high-dimensional spaces using too-sparsely populated designs of experiments. Our illustration analyzes a variant of the Rosenbrock function in which several effects are made statistically insignificant while others are strongly coupled, thereby mimicking a situation that is often encountered in practice. In this example, the combination of GPM emulator and design of experiments leads to an incorrect approximation of the function. A mathematical proof of the origin of the problem is proposed. The adverse effects that too-sparsely populated designs may produce are discussed for the coverage of the design space, the estimation of sensitivities, and the calibration of parameters.
This work attempts to raise awareness of the potential dangers of not allocating enough resources when exploring a design space to develop fast-running emulators.
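The failure mode the abstract describes can be illustrated with a minimal, hypothetical sketch (not the paper's actual experiment): a plain Gaussian-process emulator with an isotropic RBF kernel, trained on a sparse versus a denser random design over the standard (unmodified) four-dimensional Rosenbrock function. The choice of function, dimension, design sizes, length-scale, and nugget are all illustrative assumptions.

```python
import numpy as np

def rosenbrock(X):
    # Standard d-dimensional Rosenbrock function (coupled, non-convex).
    # The paper studies a modified variant, which is not reproduced here.
    return np.sum(100.0 * (X[:, 1:] - X[:, :-1]**2)**2
                  + (1.0 - X[:, :-1])**2, axis=1)

def gp_fit_predict(Xtr, ytr, Xte, ls=1.0, noise=1e-6):
    """Zero-mean GP regression on standardized outputs, isotropic RBF kernel."""
    mu, sd = ytr.mean(), ytr.std()
    z = (ytr - mu) / sd
    def k(A, B):  # squared-exponential kernel matrix
        d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
        return np.exp(-0.5 * d2 / ls**2)
    K = k(Xtr, Xtr) + noise * np.eye(len(Xtr))  # nugget for conditioning
    alpha = np.linalg.solve(K, z)
    return k(Xte, Xtr) @ alpha * sd + mu

rng = np.random.default_rng(0)
d = 4
Xte = rng.uniform(-2, 2, (500, d))   # hold-out points to probe the emulator
yte = rosenbrock(Xte)

rmses = {}
for n in (15, 300):                  # sparse vs. denser training designs
    Xtr = rng.uniform(-2, 2, (n, d))
    pred = gp_fit_predict(Xtr, rosenbrock(Xtr), Xte)
    rmses[n] = float(np.sqrt(np.mean((pred - yte)**2)))
    print(f"n={n:4d}  RMSE={rmses[n]:.1f}")
```

With only 15 training runs in four dimensions, the emulator reverts toward the prior mean over most of the domain and its hold-out error is on the order of the output standard deviation; the 300-run design yields a markedly smaller error. This mirrors, in miniature, the paper's point that a sparse design can produce an emulator whose apparent fit says little about the true function.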

Suggested Citation

  • Hemez, François M. & Atamturktur, Sezer, 2011. "The dangers of sparse sampling for the quantification of margin and uncertainty," Reliability Engineering and System Safety, Elsevier, vol. 96(9), pages 1220-1231.
  • Handle: RePEc:eee:reensy:v:96:y:2011:i:9:p:1220-1231
    DOI: 10.1016/j.ress.2011.02.015

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0951832011000731
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.ress.2011.02.015?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jeremy E. Oakley & Anthony O'Hagan, 2004. "Probabilistic sensitivity analysis of complex models: a Bayesian approach," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 66(3), pages 751-769, August.
    2. Higdon, Dave & Gattiker, James & Williams, Brian & Rightley, Maria, 2008. "Computer Model Calibration Using High-Dimensional Output," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 570-583, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Liu, Xing & Ferrario, Elisa & Zio, Enrico, 2019. "Identifying resilient-important elements in interdependent critical infrastructures by sensitivity analysis," Reliability Engineering and System Safety, Elsevier, vol. 189(C), pages 423-434.
    2. Helton, Jon C. & Brooks, Dusty M. & Sallaberry, Cédric J., 2020. "Property values associated with the failure of individual links in a system with multiple weak and strong links," Reliability Engineering and System Safety, Elsevier, vol. 195(C).
    3. Helton, Jon C. & Brooks, Dusty M. & Sallaberry, Cédric J., 2020. "Margins associated with loss of assured safety for systems with multiple weak links and strong links," Reliability Engineering and System Safety, Elsevier, vol. 195(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ioannis Andrianakis & Ian R Vernon & Nicky McCreesh & Trevelyan J McKinley & Jeremy E Oakley & Rebecca N Nsubuga & Michael Goldstein & Richard G White, 2015. "Bayesian History Matching of Complex Infectious Disease Models Using Emulation: A Tutorial and a Case Study on HIV in Uganda," PLOS Computational Biology, Public Library of Science, vol. 11(1), pages 1-18, January.
    2. Manfren, Massimiliano & Aste, Niccolò & Moshksar, Reza, 2013. "Calibration and uncertainty analysis for computer models – A meta-model based approach for integrated building energy simulation," Applied Energy, Elsevier, vol. 103(C), pages 627-641.
    3. Geoffrey Fairchild & Kyle S. Hickmann & Susan M. Mniszewski & Sara Y. Del Valle & James M. Hyman, 2014. "Optimizing human activity patterns using global sensitivity analysis," Computational and Mathematical Organization Theory, Springer, vol. 20(4), pages 394-416, December.
    4. Daniel W. Gladish & Daniel E. Pagendam & Luk J. M. Peeters & Petra M. Kuhnert & Jai Vaze, 2018. "Emulation Engines: Choice and Quantification of Uncertainty for Complex Hydrological Models," Journal of Agricultural, Biological and Environmental Statistics, Springer;The International Biometric Society;American Statistical Association, vol. 23(1), pages 39-62, March.
    5. K. Sham Bhat & David S. Mebane & Priyadarshi Mahapatra & Curtis B. Storlie, 2017. "Upscaling Uncertainty with Dynamic Discrepancy for a Multi-Scale Carbon Capture System," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(520), pages 1453-1467, October.
    6. Daniel W. Gladish & Ross Darnell & Peter J. Thorburn & Bhakti Haldankar, 2019. "Emulated Multivariate Global Sensitivity Analysis for Complex Computer Models Applied to Agricultural Simulators," Journal of Agricultural, Biological and Environmental Statistics, Springer;The International Biometric Society;American Statistical Association, vol. 24(1), pages 130-153, March.
    7. Curtis B. Storlie & William A. Lane & Emily M. Ryan & James R. Gattiker & David M. Higdon, 2015. "Calibration of Computational Models With Categorical Parameters and Correlated Outputs via Bayesian Smoothing Spline ANOVA," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(509), pages 68-82, March.
    8. S. Cucurachi & E. Borgonovo & R. Heijungs, 2016. "A Protocol for the Global Sensitivity Analysis of Impact Assessment Models in Life Cycle Assessment," Risk Analysis, John Wiley & Sons, vol. 36(2), pages 357-377, February.
    9. Jakub Bijak & Jason D. Hilton & Eric Silverman & Viet Dung Cao, 2013. "Reforging the Wedding Ring," Demographic Research, Max Planck Institute for Demographic Research, Rostock, Germany, vol. 29(27), pages 729-766.
    10. Acharki, Naoufal & Bertoncello, Antoine & Garnier, Josselin, 2023. "Robust prediction interval estimation for Gaussian processes by cross-validation method," Computational Statistics & Data Analysis, Elsevier, vol. 178(C).
    11. Xueping Chen & Yujie Gai & Xiaodi Wang, 2023. "A-optimal designs for non-parametric symmetrical global sensitivity analysis," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 86(2), pages 219-237, February.
    12. Matieyendou Lamboni, 2020. "Uncertainty quantification: a minimum variance unbiased (joint) estimator of the non-normalized Sobol’ indices," Statistical Papers, Springer, vol. 61(5), pages 1939-1970, October.
    13. Isaac Corro Ramos & Maureen P. M. H. Rutten-van Mölken & Maiwenn J. Al, 2013. "The Role of Value-of-Information Analysis in a Health Care Research Priority Setting," Medical Decision Making, , vol. 33(4), pages 472-489, May.
    14. Veiga, Sébastien Da & Marrel, Amandine, 2020. "Gaussian process regression with linear inequality constraints," Reliability Engineering and System Safety, Elsevier, vol. 195(C).
    15. Petropoulos, G. & Wooster, M.J. & Carlson, T.N. & Kennedy, M.C. & Scholze, M., 2009. "A global Bayesian sensitivity analysis of the 1d SimSphere soil–vegetation–atmospheric transfer (SVAT) model using Gaussian model emulation," Ecological Modelling, Elsevier, vol. 220(19), pages 2427-2440.
    16. Lu, Xuefei & Borgonovo, Emanuele, 2023. "Global sensitivity analysis in epidemiological modeling," European Journal of Operational Research, Elsevier, vol. 304(1), pages 9-24.
    17. Tianyang Wang & James S. Dyer & Warren J. Hahn, 2017. "Sensitivity analysis of decision making under dependent uncertainties using copulas," EURO Journal on Decision Processes, Springer;EURO - The Association of European Operational Research Societies, vol. 5(1), pages 117-139, November.
    18. Drignei, Dorin, 2011. "A general statistical model for computer experiments with time series output," Reliability Engineering and System Safety, Elsevier, vol. 96(4), pages 460-467.
    19. Hwang, Youngdeok & Kim, Hang J. & Chang, Won & Yeo, Kyongmin & Kim, Yongku, 2019. "Bayesian pollution source identification via an inverse physics model," Computational Statistics & Data Analysis, Elsevier, vol. 134(C), pages 76-92.
    20. Al Ali, Hannah & Daneshkhah, Alireza & Boutayeb, Abdesslam & Malunguza, Noble Jahalamajaha & Mukandavire, Zindoga, 2022. "Exploring dynamical properties of a Type 1 diabetes model using sensitivity approaches," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 201(C), pages 324-342.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:reensy:v:96:y:2011:i:9:p:1220-1231. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: https://www.journals.elsevier.com/reliability-engineering-and-system-safety .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.