
Designing combined physical and computer experiments to maximize prediction accuracy

Author

Listed:
  • Leatherman, Erin R.
  • Dean, Angela M.
  • Santner, Thomas J.

Abstract

Combined designs for experiments involving a physical system and a simulator of the physical system are evaluated in terms of their accuracy of predicting the mean of the physical system. Comparisons are made among designs that are (1) locally optimal under the minimum integrated mean squared prediction error criterion for the combined physical system and simulator experiments, (2) locally optimal for the physical or simulator experiments, with a fixed design for the component not being optimized, (3) maximin augmented nested Latin hypercube, and (4) I-optimal for the physical system experiment and maximin Latin hypercube for the simulator experiment. Computational methods are proposed for constructing the designs of interest. For a large test bed of examples, the empirical mean squared prediction errors are compared at a grid of inputs for each test surface using a statistically calibrated Bayesian predictor based on the data from each design. The prediction errors are also studied for a test bed that varies only the calibration parameter of the test surface. Design recommendations are given.
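The following is a minimal sketch, not the authors' implementation, of two ingredients mentioned in the abstract: a maximin Latin hypercube design (used for the simulator experiment in design classes (3) and (4)) and an empirical mean squared prediction error evaluated over a grid of inputs. The test surface, the nearest-neighbour stand-in predictor, and all run sizes below are illustrative assumptions; the paper itself uses a statistically calibrated Bayesian (Gaussian-process) predictor.

```python
# Sketch: maximin Latin hypercube design + empirical MSPE on a prediction grid.
# All numerical settings and the test surface are illustrative assumptions,
# not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d, rng):
    """Random Latin hypercube design with n runs in [0, 1]^d."""
    # One point per stratum in each dimension, jittered within its cell.
    cells = np.array([rng.permutation(n) for _ in range(d)]).T
    return (cells + rng.uniform(size=(n, d))) / n

def min_pairwise_distance(X):
    """Smallest Euclidean distance between any two design points."""
    diff = X[:, None, :] - X[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    return dist[np.triu_indices(len(X), k=1)].min()

def maximin_lhd(n, d, rng, n_restarts=200):
    """Crude random-restart search: keep the LHD with the largest minimum distance."""
    best, best_score = None, -np.inf
    for _ in range(n_restarts):
        X = latin_hypercube(n, d, rng)
        score = min_pairwise_distance(X)
        if score > best_score:
            best, best_score = X, score
    return best

# Illustrative test surface (an assumption, not one of the paper's test functions).
def test_surface(x):
    return np.sin(2 * np.pi * x[:, 0]) + 0.5 * x[:, 1] ** 2

design = maximin_lhd(n=20, d=2, rng=rng)
grid = np.stack(np.meshgrid(np.linspace(0, 1, 25),
                            np.linspace(0, 1, 25)), -1).reshape(-1, 2)

# Stand-in predictor: nearest-neighbour interpolation of the design responses.
# The paper uses a calibrated Bayesian predictor of the physical-system mean instead.
y_design = test_surface(design)
nearest = np.argmin(((grid[:, None, :] - design[None, :, :]) ** 2).sum(-1), axis=1)
emspe = np.mean((test_surface(grid) - y_design[nearest]) ** 2)
print(f"min interpoint distance: {min_pairwise_distance(design):.3f}, EMSPE: {emspe:.3f}")
```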

Suggested Citation

  • Leatherman, Erin R. & Dean, Angela M. & Santner, Thomas J., 2017. "Designing combined physical and computer experiments to maximize prediction accuracy," Computational Statistics & Data Analysis, Elsevier, vol. 113(C), pages 346-362.
  • Handle: RePEc:eee:csdana:v:113:y:2017:i:c:p:346-362
    DOI: 10.1016/j.csda.2016.07.013

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947316301773
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2016.07.013?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Williams, Brian J. & Loeppky, Jason L. & Moore, Leslie M. & Macklem, Mason S., 2011. "Batch sequential design to achieve predictive maturity with calibrated computer models," Reliability Engineering and System Safety, Elsevier, vol. 96(9), pages 1208-1219.
    2. Higdon, Dave & Gattiker, James & Williams, Brian & Rightley, Maria, 2008. "Computer Model Calibration Using High-Dimensional Output," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 570-583, June.
    3. Marc C. Kennedy & Anthony O'Hagan, 2001. "Bayesian calibration of computer models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 63(3), pages 425-464.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Huaimin Diao & Yan Wang & Dianpeng Wang, 2022. "A D-Optimal Sequential Calibration Design for Computer Models," Mathematics, MDPI, vol. 10(9), pages 1-15, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Drignei, Dorin, 2011. "A general statistical model for computer experiments with time series output," Reliability Engineering and System Safety, Elsevier, vol. 96(4), pages 460-467.
    2. Hwang, Youngdeok & Kim, Hang J. & Chang, Won & Yeo, Kyongmin & Kim, Yongku, 2019. "Bayesian pollution source identification via an inverse physics model," Computational Statistics & Data Analysis, Elsevier, vol. 134(C), pages 76-92.
    3. Yuan, Jun & Ng, Szu Hui, 2013. "A sequential approach for stochastic computer model calibration and prediction," Reliability Engineering and System Safety, Elsevier, vol. 111(C), pages 273-286.
    4. Ioannis Andrianakis & Ian R Vernon & Nicky McCreesh & Trevelyan J McKinley & Jeremy E Oakley & Rebecca N Nsubuga & Michael Goldstein & Richard G White, 2015. "Bayesian History Matching of Complex Infectious Disease Models Using Emulation: A Tutorial and a Case Study on HIV in Uganda," PLOS Computational Biology, Public Library of Science, vol. 11(1), pages 1-18, January.
    5. Samantha M. Roth & Ben Seiyon Lee & Sanjib Sharma & Iman Hosseini‐Shakib & Klaus Keller & Murali Haran, 2023. "Flood hazard model calibration using multiresolution model output," Environmetrics, John Wiley & Sons, Ltd., vol. 34(2), March.
    6. Perrin, G., 2020. "Adaptive calibration of a computer code with time-series output," Reliability Engineering and System Safety, Elsevier, vol. 196(C).
    7. White, Staci A. & Herbei, Radu, 2015. "A Monte Carlo approach to quantifying model error in Bayesian parameter estimation," Computational Statistics & Data Analysis, Elsevier, vol. 83(C), pages 168-181.
    8. Mevin Hooten & Christopher Wikle & Michael Schwob, 2020. "Statistical Implementations of Agent‐Based Demographic Models," International Statistical Review, International Statistical Institute, vol. 88(2), pages 441-461, August.
    9. Wu, Xu & Kozlowski, Tomasz & Meidani, Hadi, 2018. "Kriging-based inverse uncertainty quantification of nuclear fuel performance code BISON fission gas release model using time series measurement data," Reliability Engineering and System Safety, Elsevier, vol. 169(C), pages 422-436.
    10. Nott, David J. & Marshall, Lucy & Fielding, Mark & Liong, Shie-Yui, 2014. "Mixtures of experts for understanding model discrepancy in dynamic computer models," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 491-505.
    11. Paulo, Rui & García-Donato, Gonzalo & Palomo, Jesús, 2012. "Calibration of computer models with multivariate output," Computational Statistics & Data Analysis, Elsevier, vol. 56(12), pages 3959-3974.
    12. Jackson, Samuel E. & Vernon, Ian & Liu, Junli & Lindsey, Keith, 2020. "Understanding hormonal crosstalk in Arabidopsis root development via emulation and history matching," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 19(2), pages 1-33, April.
    13. Guillaume Perrin & Christian Soize, 2020. "Adaptive method for indirect identification of the statistical properties of random fields in a Bayesian framework," Computational Statistics, Springer, vol. 35(1), pages 111-133, March.
    14. Stripling, H.F. & Adams, M.L. & McClarren, R.G. & Mallick, B.K., 2011. "The Method of Manufactured Universes for validating uncertainty quantification methods," Reliability Engineering and System Safety, Elsevier, vol. 96(9), pages 1242-1256.
    15. Matthew Plumlee, 2014. "Fast Prediction of Deterministic Functions Using Sparse Grid Experimental Designs," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(508), pages 1581-1591, December.
    16. SungKu Kang & Ran Jin & Xinwei Deng & Ron S. Kenett, 2023. "Challenges of modeling and analysis in cybermanufacturing: a review from a machine learning and computation perspective," Journal of Intelligent Manufacturing, Springer, vol. 34(2), pages 415-428, February.
    17. Manfren, Massimiliano & Aste, Niccolò & Moshksar, Reza, 2013. "Calibration and uncertainty analysis for computer models – A meta-model based approach for integrated building energy simulation," Applied Energy, Elsevier, vol. 103(C), pages 627-641.
    18. Daniel W. Gladish & Daniel E. Pagendam & Luk J. M. Peeters & Petra M. Kuhnert & Jai Vaze, 2018. "Emulation Engines: Choice and Quantification of Uncertainty for Complex Hydrological Models," Journal of Agricultural, Biological and Environmental Statistics, Springer;The International Biometric Society;American Statistical Association, vol. 23(1), pages 39-62, March.
    19. Chen, Yewen & Chang, Xiaohui & Luo, Fangzhi & Huang, Hui, 2023. "Additive dynamic models for correcting numerical model outputs," Computational Statistics & Data Analysis, Elsevier, vol. 187(C).
    20. Kim, Wongon & Yoon, Heonjun & Lee, Guesuk & Kim, Taejin & Youn, Byeng D., 2020. "A new calibration metric that considers statistical correlation: Marginal Probability and Correlation Residuals," Reliability Engineering and System Safety, Elsevier, vol. 195(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:113:y:2017:i:c:p:346-362. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.