Printed from https://ideas.repec.org/a/plo/pbio00/2003693.html

Reproducibility of preclinical animal research improves with heterogeneity of study samples

Author

Listed:
  • Bernhard Voelkl
  • Lucile Vogt
  • Emily S Sena
  • Hanno Würbel

Abstract

Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources on inconclusive research.

Author summary

Preclinical animal research is mostly based on studies conducted in a single laboratory under highly standardized conditions. This entails the risk that the study results may only be valid under the specific conditions of the test laboratory, which may explain the poor reproducibility of preclinical animal research. To test this hypothesis, we used simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer and compared the reproducibility of results between single-laboratory and multi-laboratory studies. To simulate multi-laboratory studies, we combined data from multiple studies, as if several collaborating laboratories had conducted them in parallel. We found that single-laboratory studies produced large variation between study results. By contrast, multi-laboratory studies including as few as 2 to 4 laboratories produced much more consistent results, thereby increasing reproducibility without a need for larger sample sizes. Our findings demonstrate that excessive standardization is a source of poor reproducibility because it ignores biologically meaningful variation. We conclude that multi-laboratory studies, and potentially other ways of creating more heterogeneous study samples, provide an effective means of improving the reproducibility of study results, which is crucial to prevent wasting animals and resources on inconclusive research.
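The simulation logic summarized in the abstract, where animals within one laboratory share a lab-specific systematic offset, can be illustrated with a minimal coverage-probability sketch. This is an illustrative toy model, not the authors' actual code; the effect size and the between-laboratory and within-laboratory standard deviations used here are hypothetical assumptions.

```python
import random
import statistics


def simulate_coverage(n_labs, n_per_lab, true_effect=1.0,
                      between_lab_sd=0.5, within_sd=1.0,
                      n_sims=2000, seed=42):
    """Estimate how often a study's 95% CI covers the true effect.

    Each laboratory draws a random lab-specific offset (between-lab
    variation); all animals in that lab share the offset. A single-lab
    study (n_labs=1) cannot distinguish the offset from the treatment
    effect, so its confidence interval centers on a biased estimate,
    while multi-lab designs sample the between-lab variation directly.
    """
    rng = random.Random(seed)
    n_total = n_labs * n_per_lab
    covered = 0
    for _ in range(n_sims):
        data = []
        for _ in range(n_labs):
            lab_offset = rng.gauss(0.0, between_lab_sd)
            data.extend(true_effect + lab_offset + rng.gauss(0.0, within_sd)
                        for _ in range(n_per_lab))
        mean = statistics.fmean(data)
        sem = statistics.stdev(data) / n_total ** 0.5
        # Does the naive 95% CI around the study mean cover the truth?
        if mean - 1.96 * sem <= true_effect <= mean + 1.96 * sem:
            covered += 1
    return covered / n_sims
```

Holding the total sample size fixed at 16 animals, `simulate_coverage(1, 16)` yields coverage well below the nominal 95%, while `simulate_coverage(4, 4)` comes much closer to it, mirroring the paper's finding that heterogeneous, multi-laboratory samples improve coverage probability without larger sample sizes.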

Suggested Citation

  • Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
  • Handle: RePEc:plo:pbio00:2003693
    DOI: 10.1371/journal.pbio.2003693

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2003693
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.2003693&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.2003693?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Emily S Sena & H Bart van der Worp & Philip M W Bath & David W Howells & Malcolm R Macleod, 2010. "Publication Bias in Reports of Animal Stroke Studies Leads to Major Overstatement of Efficacy," PLOS Biology, Public Library of Science, vol. 8(3), pages 1-8, March.
    3. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    4. Viechtbauer, Wolfgang, 2010. "Conducting Meta-Analyses in R with the metafor Package," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 36(i03).
    5. Leonard P Freedman & Iain M Cockburn & Timothy S Simcoe, 2015. "The Economics of Reproducibility in Preclinical Research," PLOS Biology, Public Library of Science, vol. 13(6), pages 1-9, June.
    6. Marcus R. Munafò & Brian A. Nosek & Dorothy V. M. Bishop & Katherine S. Button & Christopher D. Chambers & Nathalie Percie du Sert & Uri Simonsohn & Eric-Jan Wagenmakers & Jennifer J. Ware & John P. A. Ioannidis, 2017. "A manifesto for reproducible science," Nature Human Behaviour, Nature, vol. 1(1), pages 1-9, January.
    7. Natasha A Karp & Anneliese O Speak & Jacqueline K White & David J Adams & Martin Hrabé de Angelis & Yann Hérault & Richard F Mott, 2014. "Impact of Temporal Variation on Design and Analysis of Mouse Knockout Phenotyping Studies," PLOS ONE, Public Library of Science, vol. 9(10), pages 1-10, October.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    2. Takuji Usui & Malcolm R Macleod & Sarah K McCann & Alistair M Senior & Shinichi Nakagawa, 2021. "Meta-analysis of variation suggests that embracing variability improves both replicability and generalizability in preclinical research," PLOS Biology, Public Library of Science, vol. 19(5), pages 1-20, May.
    3. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    4. Sadri, Arash, 2022. "The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics," MetaArXiv yxba5, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    3. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    4. Jeff Miller & Rolf Ulrich, 2019. "The quest for an optimal alpha," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-13, January.
    5. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    6. van Aert, Robbie Cornelis Maria, 2018. "Dissertation R.C.M. van Aert," MetaArXiv eqhjd, Center for Open Science.
    7. Piers Steel & Sjoerd Beugelsdijk & Herman Aguinis, 2021. "The anatomy of an award-winning meta-analysis: Recommendations for authors, reviewers, and readers of meta-analytic reviews," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 52(1), pages 23-44, February.
    8. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    9. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesia," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    10. Hajko, Vladimír, 2017. "The failure of Energy-Economy Nexus: A meta-analysis of 104 studies," Energy, Elsevier, vol. 125(C), pages 771-787.
    11. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    12. Adam Altmejd & Anna Dreber & Eskil Forsell & Juergen Huber & Taisuke Imai & Magnus Johannesson & Michael Kirchler & Gideon Nave & Colin Camerer, 2019. "Predicting the replicability of social science lab experiments," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-18, December.
    13. Marlo M Vernon & E Andrew Balas & Shaher Momani, 2018. "Are university rankings useful to improve research? A systematic review," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-15, March.
    14. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    15. Constance Holman & Sophie K Piper & Ulrike Grittner & Andreas Antonios Diamantaras & Jonathan Kimmelman & Bob Siegerink & Ulrich Dirnagl, 2016. "Where Have All the Rodents Gone? The Effects of Attrition in Experimental Research on Cancer and Stroke," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-12, January.
    16. Malika Ihle & Isabel S. Winney & Anna Krystalli & Michael Croucher, 2017. "Striving for transparent and credible research: practical guidelines for behavioral ecologists," Behavioral Ecology, International Society for Behavioral Ecology, vol. 28(2), pages 348-354.
    17. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    18. Matthias Steinfath & Silvia Vogl & Norman Violet & Franziska Schwarz & Hans Mielke & Thomas Selhorst & Matthias Greiner & Gilbert Schönfelder, 2018. "Simple changes of individual studies can improve the reproducibility of the biomedical scientific process as a whole," PLOS ONE, Public Library of Science, vol. 13(9), pages 1-20, September.
    19. Fecher, Benedikt & Fräßdorf, Mathis & Hebing, Marcel & Wagner, Gert G., 2017. "Replikationen, Reputation und gute wissenschaftliche Praxis," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 68(2-3), pages 154-158.
    20. Koessler, Ann-Kathrin & Page, Lionel & Dulleck, Uwe, 2015. "Promoting pro-social behavior with public statements of good intent," MPRA Paper 80072, University Library of Munich, Germany, revised 24 May 2017.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:2003693. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.