Printed from https://ideas.repec.org/a/plo/pbio00/3002082.html

A multi-lab experimental assessment reveals that replicability can be improved by using empirical estimates of genotype-by-lab interaction

Author

Listed:
  • Iman Jaljuli
  • Neri Kafkafi
  • Eliezer Giladi
  • Ilan Golani
  • Illana Gozes
  • Elissa J Chesler
  • Molly A Bogue
  • Yoav Benjamini

Abstract

The utility of mouse and rat studies critically depends on their replicability in other laboratories. A widely advocated approach to improving replicability is the rigorous control of predefined animal or experimental conditions, known as standardization. However, this approach limits the generalizability of the findings to the standardized conditions and is a potential cause of, rather than a solution to, what has been called a replicability crisis. Alternative strategies include estimating the heterogeneity of effects across laboratories, either through designs that vary testing conditions or by direct statistical analysis of laboratory variation. We previously evaluated our statistical approach for estimating the interlaboratory replicability of a single-laboratory discovery. Those results, however, were from a well-coordinated, multi-lab phenotyping study and did not extend to the more realistic setting in which laboratories operate independently of each other. Here, we sought to test our statistical approach as a realistic prospective experiment, in mice, using 152 results from 5 independent published studies deposited in the Mouse Phenome Database (MPD). In independent replication experiments at 3 laboratories, we found that 53 of the results were replicable, so the other 99 were considered non-replicable. Of the 99 non-replicable results, 59 were statistically significant (at 0.05) in their original single-lab analysis, putting the probability that a single-lab statistical discovery is non-replicable at 59.6%. We then introduced the dimensionless “Genotype-by-Laboratory” (GxL) factor—the ratio between the standard deviation of the GxL interaction and the standard deviation within groups. Using the GxL factor reduced the number of single-lab statistical discoveries and, in parallel, reduced the probability of a non-replicable result being discovered in the single lab to 12.1%.
Such reduction naturally leads to reduced power to make replicable discoveries, but this reduction was small (from 87% to 66%), indicating the small price paid for the large improvement in replicability. Tools and data needed for the above GxL adjustment are publicly available at the MPD and will become increasingly useful as the range of assays and testing conditions in this resource increases.

The random laboratory model allows investigators to estimate whether an experimental result is likely to replicate, based on the typical noise that exists among laboratories that use similar endpoints. This study uses new experimental data to show that such a model, when applied to archived data, improves prediction of replicability for experiments.
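The core idea of the GxL adjustment — inflating the test's noise term by the empirically estimated genotype-by-lab interaction — can be sketched in a few lines. The sketch below is illustrative only: the function names are invented, and it uses a normal approximation rather than the authors' mixed-model test with adjusted degrees of freedom. The dimensionless GxL factor (gamma) adds one interaction-variance term per group to the usual two-group standard error, so borderline single-lab discoveries lose significance.

```python
import math
from statistics import NormalDist

def gxl_adjusted_z(mean1, mean2, sd_within, n1, n2, gamma):
    """Two-group z-statistic with a GxL-inflated standard error.

    gamma is the dimensionless GxL factor: the ratio of the standard
    deviation of the genotype-by-lab interaction to the standard
    deviation within groups. gamma = 0 recovers the ordinary
    single-lab test.
    """
    sigma2 = sd_within ** 2
    # Usual squared standard error of the difference between two group
    # means, plus one GxL interaction variance term per group.
    se2 = sigma2 * (1.0 / n1 + 1.0 / n2) + 2.0 * (gamma ** 2) * sigma2
    return (mean1 - mean2) / math.sqrt(se2)

def two_sided_p(z):
    """Two-sided p-value under a normal approximation."""
    return 2.0 * (1.0 - NormalDist().cdf(abs(z)))

# A borderline single-lab "discovery" (p < 0.05 at gamma = 0) stops
# being a discovery once typical lab-to-lab noise is priced in.
p_naive = two_sided_p(gxl_adjusted_z(11.5, 10.0, 1.5, 10, 10, 0.0))
p_gxl = two_sided_p(gxl_adjusted_z(11.5, 10.0, 1.5, 10, 10, 0.3))
```

The toy numbers show the trade-off the abstract quantifies: with gamma > 0 fewer results cross the significance threshold, which cuts the chance of "discovering" a non-replicable effect at the cost of some power for replicable ones.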

Suggested Citation

  • Iman Jaljuli & Neri Kafkafi & Eliezer Giladi & Ilan Golani & Illana Gozes & Elissa J Chesler & Molly A Bogue & Yoav Benjamini, 2023. "A multi-lab experimental assessment reveals that replicability can be improved by using empirical estimates of genotype-by-lab interaction," PLOS Biology, Public Library of Science, vol. 21(5), pages 1-28, May.
  • Handle: RePEc:plo:pbio00:3002082
    DOI: 10.1371/journal.pbio.3002082

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002082
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.3002082&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.3002082?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. repec:plo:pone00:0147140 is not listed on IDEAS
    2. repec:plo:pone00:0215221 is not listed on IDEAS
    3. Yan Li & Xiang Zhou & Rui Chen & Xianyang Zhang & Hongyuan Cao, 2024. "STAREG: Statistical replicability analysis of high throughput experiments with applications to spatial transcriptomic studies," PLOS Genetics, Public Library of Science, vol. 20(10), pages 1-19, October.
    4. Seibold, Heidi & Charlton, Alethea & Boulesteix, Anne-Laure & Hoffmann, Sabine, 2020. "Statisticians roll up your sleeves! There’s a crisis to be solved," MetaArXiv frta7, Center for Open Science.
    5. Watzinger, Martin & Schnitzer, Monika, 2019. "Standing on the Shoulders of Science," Rationality and Competition Discussion Paper Series 215, CRC TRR 190 Rationality and Competition.
    6. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    7. repec:plo:pbio00:3000763 is not listed on IDEAS
    8. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesi," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    9. repec:osf:metaar:wvdjf_v1 is not listed on IDEAS
    10. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    11. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    12. Julia Koehler Leman & Sergey Lyskov & Steven M. Lewis & Jared Adolf-Bryfogle & Rebecca F. Alford & Kyle Barlow & Ziv Ben-Aharon & Daniel Farrell & Jason Fell & William A. Hansen & Ameya Harmalkar & Je, 2021. "Ensuring scientific reproducibility in bio-macromolecular modeling via extensive, automated benchmarks," Nature Communications, Nature, vol. 12(1), pages 1-15, December.
    13. Malika Ihle & Isabel S. Winney & Anna Krystalli & Michael Croucher, 2017. "Striving for transparent and credible research: practical guidelines for behavioral ecologists," Behavioral Ecology, International Society for Behavioral Ecology, vol. 28(2), pages 348-354.
    14. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    15. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    16. Wang, Xuefeng & Zhang, Shuo & Liu, Yuqin & Du, Jian & Huang, Heng, 2021. "How pharmaceutical innovation evolves: The path from science to technological development to marketable drugs," Technological Forecasting and Social Change, Elsevier, vol. 167(C).
    17. Martin Backfisch, 2018. "The Development of Firm Size and Innovativeness in the Pharmaceutical industry between 1989 and 2010," MAGKS Papers on Economics 201813, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    18. Matthias Steinfath & Silvia Vogl & Norman Violet & Franziska Schwarz & Hans Mielke & Thomas Selhorst & Matthias Greiner & Gilbert Schönfelder, 2018. "Simple changes of individual studies can improve the reproducibility of the biomedical scientific process as a whole," PLOS ONE, Public Library of Science, vol. 13(9), pages 1-20, September.
    19. Christopher Allen & David M A Mehler, 2019. "Open science challenges, benefits and tips in early career and beyond," PLOS Biology, Public Library of Science, vol. 17(5), pages 1-14, May.
    20. Jeff Miller & Rolf Ulrich, 2019. "The quest for an optimal alpha," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-13, January.
    21. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    22. Stavroula Kousta & Christine Ferguson & Emma Ganley, 2016. "Meta-Research: Broadening the Scope of PLOS Biology," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-2, January.
    23. Adriano Koshiyama & Nick Firoozye, 2019. "Avoiding Backtesting Overfitting by Covariance-Penalties: an empirical investigation of the ordinary and total least squares cases," Papers 1905.05023, arXiv.org.
    24. repec:osf:socarx:4hmb6_v1 is not listed on IDEAS
    25. repec:plo:pone00:0241496 is not listed on IDEAS
    26. Katharina Paulick & Simon Seidel & Christoph Lange & Annina Kemmer & Mariano Nicolas Cruz-Bournazou & André Baier & Daniel Haehn, 2022. "Promoting Sustainability through Next-Generation Biologics Drug Development," Sustainability, MDPI, vol. 14(8), pages 1-31, April.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:3002082. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.