Printed from https://ideas.repec.org/a/plo/pbio00/3002082.html

A multi-lab experimental assessment reveals that replicability can be improved by using empirical estimates of genotype-by-lab interaction

Author

Listed:
  • Iman Jaljuli
  • Neri Kafkafi
  • Eliezer Giladi
  • Ilan Golani
  • Illana Gozes
  • Elissa J Chesler
  • Molly A Bogue
  • Yoav Benjamini

Abstract

The utility of mouse and rat studies critically depends on their replicability in other laboratories. A widely advocated approach to improving replicability is the rigorous control of predefined animal and experimental conditions, known as standardization. However, this approach limits the generalizability of the findings to only the standardized conditions and is a potential cause of, rather than a solution to, what has been called the replicability crisis. Alternative strategies include estimating the heterogeneity of effects across laboratories, either through designs that vary testing conditions or by direct statistical analysis of laboratory variation. We previously evaluated our statistical approach for estimating the interlaboratory replicability of a single-laboratory discovery. Those results, however, were from a well-coordinated, multi-lab phenotyping study and did not extend to the more realistic setting in which laboratories operate independently of each other. Here, we tested our statistical approach in a realistic prospective experiment, in mice, using 152 results from 5 independent published studies deposited in the Mouse Phenome Database (MPD). In independent replication experiments at 3 laboratories, we found that 53 of the results were replicable, so the other 99 were considered non-replicable. Of the 99 non-replicable results, 59 were statistically significant (at 0.05) in their original single-lab analysis, putting the probability that a non-replicable result was nonetheless a single-lab statistical discovery at 59.6%. We then introduced the dimensionless “Genotype-by-Laboratory” (GxL) factor—the ratio between the standard deviation of the GxL interaction and the standard deviation within groups. Using the GxL factor reduced the number of single-lab statistical discoveries and correspondingly reduced the probability of a non-replicable result being discovered in a single lab to 12.1%.
Such a reduction naturally entails reduced power to make replicable discoveries, but this loss was small (from 87% to 66%), indicating the modest price paid for the large improvement in replicability. The tools and data needed for the above GxL adjustment are publicly available at the MPD and will become increasingly useful as the range of assays and testing conditions in this resource increases.

The random laboratory model allows investigators to estimate whether an experimental result is likely to replicate, based on the typical noise that exists among laboratories using similar endpoints. This study uses new experimental data to show that such a model, when applied to archived data, improves prediction of replicability for experiments.
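The GxL adjustment described in the abstract can be sketched numerically. Below is a minimal illustration, assuming the adjustment inflates the standard error of a two-genotype comparison using the dimensionless GxL factor (the ratio of the GxL interaction standard deviation to the within-group standard deviation). The function names, the exact inflation term (2·γ²), and the normal-approximation p-value are illustrative assumptions for this sketch, not the paper's exact estimator:

```python
import math

def gxl_adjusted_t(mean1, mean2, sd_within, n1, n2, gxl_factor):
    """Two-group t statistic whose standard error is inflated by the
    dimensionless GxL factor (sigma_GxL / sigma_within).
    gxl_factor=0 recovers the ordinary two-sample statistic."""
    se = sd_within * math.sqrt(1.0 / n1 + 1.0 / n2 + 2.0 * gxl_factor ** 2)
    return (mean1 - mean2) / se

def normal_p_two_sided(t):
    """Two-sided p-value under a normal approximation to the null."""
    return math.erfc(abs(t) / math.sqrt(2.0))

# Two genotype groups of 10 mice each, means 11.0 vs 10.0, within-group SD 1.0.
t_plain = gxl_adjusted_t(11.0, 10.0, 1.0, 10, 10, 0.0)  # unadjusted, ~2.24
t_gxl = gxl_adjusted_t(11.0, 10.0, 1.0, 10, 10, 0.3)    # GxL-adjusted, ~1.62
print(normal_p_two_sided(t_plain) < 0.05 < normal_p_two_sided(t_gxl))
```

With a GxL factor of 0.3, the same mean difference that was "significant" in the single-lab analysis no longer crosses the 0.05 threshold, mirroring how the adjustment trades some power for a much lower rate of non-replicable single-lab discoveries.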

Suggested Citation

  • Iman Jaljuli & Neri Kafkafi & Eliezer Giladi & Ilan Golani & Illana Gozes & Elissa J Chesler & Molly A Bogue & Yoav Benjamini, 2023. "A multi-lab experimental assessment reveals that replicability can be improved by using empirical estimates of genotype-by-lab interaction," PLOS Biology, Public Library of Science, vol. 21(5), pages 1-28, May.
  • Handle: RePEc:plo:pbio00:3002082
    DOI: 10.1371/journal.pbio.3002082

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002082
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.3002082&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.3002082?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. repec:plo:pone00:0215221 is not listed on IDEAS
    2. Yan Li & Xiang Zhou & Rui Chen & Xianyang Zhang & Hongyuan Cao, 2024. "STAREG: Statistical replicability analysis of high throughput experiments with applications to spatial transcriptomic studies," PLOS Genetics, Public Library of Science, vol. 20(10), pages 1-19, October.
    3. Watzinger, Martin & Schnitzer, Monika, 2019. "Standing on the Shoulders of Science," Rationality and Competition Discussion Paper Series 215, CRC TRR 190 Rationality and Competition.
    4. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    5. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesi," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    6. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    7. Malika Ihle & Isabel S. Winney & Anna Krystalli & Michael Croucher, 2017. "Striving for transparent and credible research: practical guidelines for behavioral ecologists," Behavioral Ecology, International Society for Behavioral Ecology, vol. 28(2), pages 348-354.
    8. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    9. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    10. Martin Backfisch, 2018. "The Development of Firm Size and Innovativeness in the Pharmaceutical industry between 1989 and 2010," MAGKS Papers on Economics 201813, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    11. Matthias Steinfath & Silvia Vogl & Norman Violet & Franziska Schwarz & Hans Mielke & Thomas Selhorst & Matthias Greiner & Gilbert Schönfelder, 2018. "Simple changes of individual studies can improve the reproducibility of the biomedical scientific process as a whole," PLOS ONE, Public Library of Science, vol. 13(9), pages 1-20, September.
    12. Christopher Allen & David M A Mehler, 2019. "Open science challenges, benefits and tips in early career and beyond," PLOS Biology, Public Library of Science, vol. 17(5), pages 1-14, May.
    13. Stavroula Kousta & Christine Ferguson & Emma Ganley, 2016. "Meta-Research: Broadening the Scope of PLOS Biology," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-2, January.
    14. Adriano Koshiyama & Nick Firoozye, 2019. "Avoiding Backtesting Overfitting by Covariance-Penalties: an empirical investigation of the ordinary and total least squares cases," Papers 1905.05023, arXiv.org.
    15. repec:plo:pone00:0241496 is not listed on IDEAS
    16. repec:osf:metaar:yxba5_v1 is not listed on IDEAS
    17. Hannah Fraser & Tim Parker & Shinichi Nakagawa & Ashley Barnett & Fiona Fidler, 2018. "Questionable research practices in ecology and evolution," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-16, July.
    18. Salandra, Rossella, 2018. "Knowledge dissemination in clinical trials: Exploring influences of institutional support and type of innovation on selective reporting," Research Policy, Elsevier, vol. 47(7), pages 1215-1228.
    19. Estelle Dumas-Mallet & Andy Smith & Thomas Boraud & François Gonon, 2017. "Poor replication validity of biomedical association studies reported by newspapers," PLOS ONE, Public Library of Science, vol. 12(2), pages 1-15, February.
    20. repec:plo:pone00:0213266 is not listed on IDEAS
    21. Markus Lehmkuhl & Nikolai Promies, 2020. "Frequency distribution of journalistic attention for scientific studies and scientific sources: An input–output analysis," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-20, November.
    22. Michaël Bikard, 2018. "Made in Academia: The Effect of Institutional Origin on Inventors’ Attention to Science," Organization Science, INFORMS, vol. 29(5), pages 818-836, October.
    23. Tim P Ahuis & Magdalena K Smyk & Clément Laloux & Katharina Aulehner & Jack Bray & Ann-Marie Waldron & Nina Miljanovic & Isabel Seiffert & Dekun Song & Bruno Boulanger & Mathias Jucker & Heidrun Potsc, 2024. "Evaluation of variation in preclinical electroencephalographic (EEG) spectral power across multiple laboratories and experiments: An EQIPD study," PLOS ONE, Public Library of Science, vol. 19(10), pages 1-35, October.
    24. Solveig Runge & Silvia Zedtwitz & Alexander M. Maucher & Philipp Bruno & Lisa Osbelt & Bei Zhao & Anne M. Gernand & Till R. Lesker & Katja Gräwe & Manuel Rogg & Christoph Schell & Melanie Boerries & T, 2025. "Laboratory mice engrafted with natural gut microbiota possess a wildling-like phenotype," Nature Communications, Nature, vol. 16(1), pages 1-14, December.

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:3002082. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item, and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.