Printed from https://ideas.repec.org/a/plo/pcbi00/1005227.html

The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems

Author

Listed:
  • Andrew White
  • Malachi Tolman
  • Howard D Thames
  • Hubert Rodney Withers
  • Kathy A Mason
  • Mark K Transtrum

Abstract

We explore the relationship among experimental design, parameter estimation, and systematic error in sloppy models. We show that the approximate nature of mathematical models poses challenges for experimental design in sloppy models. In many models of complex biological processes it is unknown which physical mechanisms must be included to explain system behaviors. As a consequence, models are often overly complex, with many practically unidentifiable parameters. Furthermore, which mechanisms are relevant or irrelevant varies among experiments. By selecting complementary experiments, experimental design may inadvertently make details that were omitted from the model become relevant. When this occurs, the model will have a large systematic error and fail to give a good fit to the data. We use a simple hyper-model of model error to quantify a model's discrepancy and apply it to two models of complex biological processes (EGFR signaling and DNA repair) with optimally selected experiments. We find that although parameters may be accurately estimated, the discrepancy in the model renders it less predictive than it was in the sloppy regime where systematic error is small. We introduce the concept of a sloppy system: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. We explore the limits of accurate parameter estimation in sloppy systems and argue that identifying the underlying mechanisms controlling system behavior is better approached by considering a hierarchy of models of varying detail rather than by focusing on parameter estimation in a single model.

Author Summary

Sloppy models are often unidentifiable, i.e., characterized by many parameters that are poorly constrained by experimental data. Many models of complex biological systems are sloppy, which has prompted considerable debate about the identifiability of parameters and methods of selecting optimal experiments to infer parameter values. We explore how the approximate nature of models affects the prospects for accurate parameter estimation and model predictivity in sloppy models when using optimal experimental design. We find that sloppy models may no longer give a good fit to data generated from "optimal" experiments. In this case, the model has much less predictive power than it did before optimal experimental selection. We use a simple hyper-model of model error to quantify the model's discrepancy from the physical system and discuss the potential limits of accurate parameter estimation in sloppy systems.
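The "sloppiness" discussed in the abstract can be illustrated with a toy example (not from the paper; the sum-of-exponentials model, rate values, and sample times below are invented for illustration): the eigenvalues of the Fisher information matrix span orders of magnitude, so some parameter combinations are far more poorly constrained than others.

```python
# Toy illustration of a sloppy model: a sum of decaying exponentials.
# The Fisher information matrix (FIM) eigenvalues span orders of
# magnitude, so some parameter directions are barely constrained by data.
import numpy as np

def model(theta, t):
    # Sum of decaying exponentials, a classic example of a sloppy model.
    return sum(np.exp(-k * t) for k in theta)

def fim(theta, t, eps=1e-6):
    # FIM = J^T J, where J is the Jacobian of model outputs with respect
    # to parameters, approximated here by central finite differences.
    J = np.empty((t.size, len(theta)))
    for i in range(len(theta)):
        d = np.zeros(len(theta))
        d[i] = eps
        J[:, i] = (model(theta + d, t) - model(theta - d, t)) / (2 * eps)
    return J.T @ J

t = np.linspace(0.1, 5.0, 50)            # hypothetical observation times
theta = np.array([0.5, 1.0, 2.0])        # hypothetical decay rates
eigvals = np.linalg.eigvalsh(fim(theta, t))  # ascending order
print(eigvals)  # smallest and largest eigenvalues differ by orders of magnitude
```

Even with only three parameters, the eigenvalue spectrum is already strongly anisotropic; in realistic systems-biology models with dozens of parameters, the spread typically grows to many more decades, which is what makes the parameters practically unidentifiable.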

Suggested Citation

  • Andrew White & Malachi Tolman & Howard D Thames & Hubert Rodney Withers & Kathy A Mason & Mark K Transtrum, 2016. "The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems," PLOS Computational Biology, Public Library of Science, vol. 12(12), pages 1-26, December.
  • Handle: RePEc:plo:pcbi00:1005227
    DOI: 10.1371/journal.pcbi.1005227

    Download full text from publisher

    File URL: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1005227
    Download Restriction: no

    File URL: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1005227&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pcbi.1005227?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Juliane Liepe & Sarah Filippi & Michał Komorowski & Michael P H Stumpf, 2013. "Maximizing the Information Content of Experiments in Systems Biology," PLOS Computational Biology, Public Library of Science, vol. 9(1), pages 1-13, January.
    2. Joshua F Apgar & Jared E Toettcher & Drew Endy & Forest M White & Bruce Tidor, 2008. "Stimulus Design for Model Selection and Validation in Cell Signaling," PLOS Computational Biology, Public Library of Science, vol. 4(2), pages 1-10, February.
    3. Ryan N Gutenkunst & Joshua J Waterfall & Fergal P Casey & Kevin S Brown & Christopher R Myers & James P Sethna, 2007. "Universally Sloppy Parameter Sensitivities in Systems Biology Models," PLOS Computational Biology, Public Library of Science, vol. 3(10), pages 1-8, October.
    4. Marc C. Kennedy & Anthony O'Hagan, 2001. "Bayesian calibration of computer models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 63(3), pages 425-464.
    5. Thembi Mdluli & Gregery T Buzzard & Ann E Rundell, 2015. "Efficient Optimization of Stimuli for Model-Based Design of Experiments to Resolve Dynamical Uncertainty," PLOS Computational Biology, Public Library of Science, vol. 11(9), pages 1-23, September.
    6. Mark K Transtrum & Peng Qiu, 2016. "Bridging Mechanistic and Phenomenological Models of Complex Biological Systems," PLOS Computational Biology, Public Library of Science, vol. 12(5), pages 1-34, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Lam, Nicholas N. & Docherty, Paul D. & Murray, Rua, 2022. "Practical identifiability of parametrised models: A review of benefits and limitations of various approaches," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 199(C), pages 202-216.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Samuel Bandara & Johannes P Schlöder & Roland Eils & Hans Georg Bock & Tobias Meyer, 2009. "Optimal Experimental Design for Parameter Estimation of a Cell Signaling Model," PLOS Computational Biology, Public Library of Science, vol. 5(11), pages 1-12, November.
    2. Agus Hartoyo & Peter J Cadusch & David T J Liley & Damien G Hicks, 2019. "Parameter estimation and identifiability in a neural population model for electro-cortical activity," PLOS Computational Biology, Public Library of Science, vol. 15(5), pages 1-27, May.
    3. Juliane Liepe & Sarah Filippi & Michał Komorowski & Michael P H Stumpf, 2013. "Maximizing the Information Content of Experiments in Systems Biology," PLOS Computational Biology, Public Library of Science, vol. 9(1), pages 1-13, January.
    4. Thembi Mdluli & Gregery T Buzzard & Ann E Rundell, 2015. "Efficient Optimization of Stimuli for Model-Based Design of Experiments to Resolve Dynamical Uncertainty," PLOS Computational Biology, Public Library of Science, vol. 11(9), pages 1-23, September.
    5. Mark K Transtrum & Peng Qiu, 2016. "Bridging Mechanistic and Phenomenological Models of Complex Biological Systems," PLOS Computational Biology, Public Library of Science, vol. 12(5), pages 1-34, May.
    6. Vanslette, Kevin & Tohme, Tony & Youcef-Toumi, Kamal, 2020. "A general model validation and testing tool," Reliability Engineering and System Safety, Elsevier, vol. 195(C).
    7. Federico Sevlever & Juan Pablo Di Bella & Alejandra C Ventura, 2020. "Discriminating between negative cooperativity and ligand binding to independent sites using pre-equilibrium properties of binding curves," PLOS Computational Biology, Public Library of Science, vol. 16(6), pages 1-21, June.
    8. Matthias Katzfuss & Joseph Guinness & Wenlong Gong & Daniel Zilber, 2020. "Vecchia Approximations of Gaussian-Process Predictions," Journal of Agricultural, Biological and Environmental Statistics, Springer;The International Biometric Society;American Statistical Association, vol. 25(3), pages 383-414, September.
    9. Jakub Bijak & Jason D. Hilton & Eric Silverman & Viet Dung Cao, 2013. "Reforging the Wedding Ring," Demographic Research, Max Planck Institute for Demographic Research, Rostock, Germany, vol. 29(27), pages 729-766.
    10. Hao Wu & Michael Browne, 2015. "Random Model Discrepancy: Interpretations and Technicalities (A Rejoinder)," Psychometrika, Springer;The Psychometric Society, vol. 80(3), pages 619-624, September.
    11. Villez, Kris & Del Giudice, Dario & Neumann, Marc B. & Rieckermann, Jörg, 2020. "Accounting for erroneous model structures in biokinetic process models," Reliability Engineering and System Safety, Elsevier, vol. 203(C).
    12. Xiaoyu Xiong & Benjamin D. Youngman & Theodoros Economou, 2021. "Data fusion with Gaussian processes for estimation of environmental hazard events," Environmetrics, John Wiley & Sons, Ltd., vol. 32(3), May.
    13. Petropoulos, G. & Wooster, M.J. & Carlson, T.N. & Kennedy, M.C. & Scholze, M., 2009. "A global Bayesian sensitivity analysis of the 1d SimSphere soil–vegetation–atmospheric transfer (SVAT) model using Gaussian model emulation," Ecological Modelling, Elsevier, vol. 220(19), pages 2427-2440.
    14. David Breitenmoser & Francesco Cerutti & Gernot Butterweck & Malgorzata Magdalena Kasprzak & Sabine Mayer, 2023. "Emulator-based Bayesian inference on non-proportional scintillation models by compton-edge probing," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    15. Drignei, Dorin, 2011. "A general statistical model for computer experiments with time series output," Reliability Engineering and System Safety, Elsevier, vol. 96(4), pages 460-467.
    16. Yuan, Jun & Nian, Victor & Su, Bin & Meng, Qun, 2017. "A simultaneous calibration and parameter ranking method for building energy models," Applied Energy, Elsevier, vol. 206(C), pages 657-666.
    17. Gross, Eitan, 2015. "Effect of environmental stress on regulation of gene expression in the yeast," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 430(C), pages 224-235.
    18. Hwang, Youngdeok & Kim, Hang J. & Chang, Won & Yeo, Kyongmin & Kim, Yongku, 2019. "Bayesian pollution source identification via an inverse physics model," Computational Statistics & Data Analysis, Elsevier, vol. 134(C), pages 76-92.
    19. Choi, Wonjun & Menberg, Kathrin & Kikumoto, Hideki & Heo, Yeonsook & Choudhary, Ruchi & Ooka, Ryozo, 2018. "Bayesian inference of structural error in inverse models of thermal response tests," Applied Energy, Elsevier, vol. 228(C), pages 1473-1485.
    20. Yuan, Jun & Ng, Szu Hui, 2013. "A sequential approach for stochastic computer model calibration and prediction," Reliability Engineering and System Safety, Elsevier, vol. 111(C), pages 273-286.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pcbi00:1005227. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ploscompbiol (email available below). General contact details of provider: https://journals.plos.org/ploscompbiol/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.