
Systematic assessment of the replicability and generalizability of preclinical findings: Impact of protocol harmonization across laboratory sites

Author

Listed:
  • María Arroyo-Araujo
  • Bernhard Voelkl
  • Clément Laloux
  • Janja Novak
  • Bastijn Koopmans
  • Ann-Marie Waldron
  • Isabel Seiffert
  • Helen Stirling
  • Katharina Aulehner
  • Sanna K Janhunen
  • Sylvie Ramboz
  • Heidrun Potschka
  • Johanna Holappa
  • Tania Fine
  • Maarten Loos
  • Bruno Boulanger
  • Hanno Würbel
  • Martien J Kas

Abstract

The influence of protocol standardization between laboratories on the replicability of preclinical results has not been addressed in a systematic way. While standardization is considered good research practice as a means to control for undesired external noise (i.e., highly variable results), some reports suggest that standardized protocols may lead to idiosyncratic results, thus undermining replicability. Through the EQIPD consortium, a multi-lab collaboration between academic and industry partners, we aimed to elucidate parameters that impact the replicability of preclinical animal studies. To this end, 3 experimental protocols were implemented across 7 laboratories. The replicability of results was determined using the distance travelled in an open field after administration of pharmacological compounds known to modulate locomotor activity (MK-801, diazepam, and clozapine) in C57BL/6 mice as a worked example. The goal was to determine whether harmonization of study protocols across laboratories improves the replicability of results and whether replicability can be further improved by systematic variation (heterogenization) of 2 environmental factors (time of testing and light intensity during testing) within laboratories. Protocols were tested in 3 consecutive stages that differed in the extent of harmonization across laboratories and standardization within laboratories: stage 1, minimally aligned across sites (local protocol); stage 2, fully aligned across sites (harmonized protocol) with and without systematic variation (standardized and heterogenized cohorts); and stage 3, fully aligned across sites (standardized protocol) with a different compound. All protocols resulted in consistent treatment effects across laboratories, which were also replicated within laboratories across the different stages. Harmonization of protocols across laboratories substantially reduced between-lab variability compared to each lab using its own local protocol. In contrast, the environmental factors chosen to introduce systematic variation within laboratories did not affect the behavioral outcome; consequently, heterogenization did not reduce between-lab variability beyond what harmonization of the standardized protocol achieved. Altogether, these findings demonstrate that subtle variations between lab-specific study protocols may introduce variation across independent replicate studies even after protocol harmonization, and that systematic heterogenization of environmental factors may not be sufficient to account for such between-lab variation. Differences in the replicability of results within and between laboratories highlight the ubiquity of study-specific variation due to between-lab variability, the importance of transparent and fine-grained reporting of methodologies and research protocols, and the importance of independent study replication.
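The central quantitative comparison in the abstract is between-lab versus within-lab variability in open-field distance travelled under different treatments. As a minimal illustration of how such a comparison can be set up (this is not the authors' analysis code; the column names, lab offsets, group sizes, and effect sizes below are all simulated assumptions), a random-intercept mixed model separates the two variance components:

```python
# Sketch only: simulated data standing in for the 7-lab open-field dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical layout: 7 labs, 12 mice per treatment arm per lab.
n_labs, n_mice = 7, 12
rows = []
for lab in (f"lab{i + 1}" for i in range(n_labs)):
    lab_offset = rng.normal(0, 150)                          # assumed between-lab shift (cm)
    for treatment in ("vehicle", "MK-801"):
        drug_effect = 800.0 if treatment == "MK-801" else 0.0  # assumed hyperlocomotion effect
        for _ in range(n_mice):
            rows.append({
                "lab": lab,
                "treatment": treatment,
                "distance_cm": 2000.0 + lab_offset + drug_effect + rng.normal(0, 300),
            })
df = pd.DataFrame(rows)

# Random intercept per lab, fixed effect of treatment.
fit = smf.mixedlm("distance_cm ~ treatment", data=df, groups=df["lab"]).fit()
print(fit.summary())

# Variance components: between-lab (random intercept) vs. within-lab (residual).
print("between-lab variance:", float(fit.cov_re.iloc[0, 0]))
print("within-lab variance:", float(fit.scale))
```

In this framing, a harmonized protocol that "reduces between-lab variability" would show up as a smaller random-intercept variance relative to the residual variance; the treatment coefficient captures the drug effect that was consistent across sites.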

Suggested Citation

  • María Arroyo-Araujo & Bernhard Voelkl & Clément Laloux & Janja Novak & Bastijn Koopmans & Ann-Marie Waldron & Isabel Seiffert & Helen Stirling & Katharina Aulehner & Sanna K Janhunen & Sylvie Ramboz &, 2022. "Systematic assessment of the replicability and generalizability of preclinical findings: Impact of protocol harmonization across laboratory sites," PLOS Biology, Public Library of Science, vol. 20(11), pages 1-19, November.
  • Handle: RePEc:plo:pbio00:3001886
    DOI: 10.1371/journal.pbio.3001886

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3001886
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.3001886&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.3001886?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription
    ---><---

    References listed on IDEAS

    1. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    2. Dorothy Bishop, 2019. "Rein in the four horsemen of irreproducibility," Nature, Nature, vol. 568(7753), pages 435-435, April.
    3. Takuji Usui & Malcolm R Macleod & Sarah K McCann & Alistair M Senior & Shinichi Nakagawa, 2021. "Meta-analysis of variation suggests that embracing variability improves both replicability and generalizability in preclinical research," PLOS Biology, Public Library of Science, vol. 19(5), pages 1-20, May.
    4. Jim Giles, 2006. "Animal experiments under fire for poor design," Nature, Nature, vol. 444(7122), pages 981-981, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sadri, Arash, 2022. "The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics," MetaArXiv yxba5, Center for Open Science.
    2. Dario Krpan & Jonathan E. Booth & Andreea Damien, 2023. "The positive–negative–competence (PNC) model of psychological responses to representations of robots," Nature Human Behaviour, Nature, vol. 7(11), pages 1933-1954, November.
    3. Dette, Holger & Pepelyshev, Andrey & Wong, Weng Kee, 2008. "Optimal designs for dose finding experiments in toxicity studies," Technical Reports 2008,09, Technische Universität Dortmund, Sonderforschungsbereich 475: Komplexitätsreduktion in multivariaten Datenstrukturen.
    4. Christoph Huber & Anna Dreber & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Utz Weitzel & Miguel Abellán & Xeniya Adayeva & Fehime Ceren Ay & Kai Barron & Zachariah Berry & Werner Bönte, 2023. "Competition and moral behavior: A meta-analysis of forty-five crowd-sourced experimental designs," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 120(23), pages 2215572120-, June.
    5. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    6. Laura Bowering Mullen, 2024. "Open Access, Scholarly Communication, and Open Science in Psychology: An Overview for Researchers," SAGE Open, , vol. 14(1_suppl), pages 21582440231, April.
    7. Takuji Usui & Malcolm R Macleod & Sarah K McCann & Alistair M Senior & Shinichi Nakagawa, 2021. "Meta-analysis of variation suggests that embracing variability improves both replicability and generalizability in preclinical research," PLOS Biology, Public Library of Science, vol. 19(5), pages 1-20, May.
    8. Vanessa Tabea von Kortzfleisch & Oliver Ambrée & Natasha A Karp & Neele Meyer & Janja Novak & Rupert Palme & Marianna Rosso & Chadi Touma & Hanno Würbel & Sylvia Kaiser & Norbert Sachser & S Helene Ri, 2022. "Do multiple experimenters improve the reproducibility of animal studies?," PLOS Biology, Public Library of Science, vol. 20(5), pages 1-21, May.
    9. Laura A. B. Wilson & Susanne R. K. Zajitschek & Malgorzata Lagisz & Jeremy Mason & Hamed Haselimashhadi & Shinichi Nakagawa, 2022. "Sex differences in allometry for phenotypic traits in mice indicate that females are not scaled males," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    10. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    11. Kathryn Oliver & Annette Boaz, 2019. "Transforming evidence for policy and practice: creating space for new conversations," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-10, December.
    12. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    13. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication Success under Questionable Research Practices - A Simulation Study," I4R Discussion Paper Series 2, The Institute for Replication (I4R).
    14. Craig, Russell & Cox, Adam & Tourish, Dennis & Thorpe, Alistair, 2020. "Using retracted journal articles in psychology to understand research misconduct in the social sciences: What is to be done?," Research Policy, Elsevier, vol. 49(4).
    15. Briony Swire-Thompson & David Lazer, 2022. "Reducing Health Misinformation in Science: A Call to Arms," The ANNALS of the American Academy of Political and Social Science, , vol. 700(1), pages 124-135, March.
    16. Katharina Paulick & Simon Seidel & Christoph Lange & Annina Kemmer & Mariano Nicolas Cruz-Bournazou & André Baier & Daniel Haehn, 2022. "Promoting Sustainability through Next-Generation Biologics Drug Development," Sustainability, MDPI, vol. 14(8), pages 1-31, April.
    17. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication success under questionable research practices – a simulation study," MetaArXiv s4b65, Center for Open Science.
