
Replication Validity of Initial Association Studies: A Comparison between Psychiatry, Neurology and Four Somatic Diseases

Author

Listed:
  • Estelle Dumas-Mallet
  • Katherine Button
  • Thomas Boraud
  • Marcus Munafò
  • François Gonon

Abstract

Context: There are growing concerns about effect size inflation and the replication validity of association studies, but few observational investigations have explored the extent of these problems.

Objective: To use meta-analyses to measure the reliability of initial studies and to explore whether this reliability varies across biomedical domains and study types (cognitive/behavioral, brain imaging, genetic and “others”).

Methods: We analyzed 663 meta-analyses describing associations between markers or risk factors and 12 pathologies within three biomedical domains (psychiatry, neurology and four somatic diseases). We collected the effect size, sample size, publication year and Impact Factor of the initial studies, of the largest studies (i.e., those with the largest sample size) and of the corresponding meta-analyses. Initial studies were considered replicated if they were in nominal agreement with the meta-analyses and if their effect size inflation was below 100%.

Results: Nominal agreement between initial studies and meta-analyses regarding the presence of a significant effect was no better than chance in psychiatry, whereas it was somewhat better in neurology and the somatic diseases. While the effect sizes reported by the largest studies and the meta-analyses were similar, most of those reported by the initial studies were inflated. Among the 256 initial studies reporting a significant effect (p < 0.05) …
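The replication criterion stated in the Methods can be made concrete with a small sketch. The snippet below is an illustrative interpretation, not the authors' code: the p < 0.05 threshold, the inflation formula (relative excess of the initial effect size over the meta-analytic estimate) and all names used here (Study, is_replicated, etc.) are assumptions introduced for clarity.

    # Illustrative sketch (assumed interpretation, not the authors' procedure):
    # an initial study counts as "replicated" if (a) it agrees nominally with the
    # meta-analysis on whether a significant association exists and (b) its effect
    # size is inflated by less than 100% relative to the meta-analytic estimate.
    from dataclasses import dataclass

    @dataclass
    class Study:
        effect_size: float   # e.g. a log odds ratio (hypothetical scale)
        p_value: float

    def nominal_agreement(initial: Study, meta: Study, alpha: float = 0.05) -> bool:
        """Both significant or both non-significant at the chosen alpha (assumed 0.05)."""
        return (initial.p_value < alpha) == (meta.p_value < alpha)

    def inflation(initial: Study, meta: Study) -> float:
        """Relative excess of the initial effect size over the meta-analytic estimate."""
        return (abs(initial.effect_size) - abs(meta.effect_size)) / abs(meta.effect_size)

    def is_replicated(initial: Study, meta: Study) -> bool:
        return nominal_agreement(initial, meta) and inflation(initial, meta) < 1.0

    if __name__ == "__main__":
        initial = Study(effect_size=0.9, p_value=0.01)
        meta = Study(effect_size=0.4, p_value=0.001)
        print(is_replicated(initial, meta))  # False: the effect is inflated by 125%

Under this reading, an initial study can fail to replicate either by disagreeing with the meta-analysis on significance or by overstating the effect, matching the two-part criterion described in the abstract.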

Suggested Citation

  • Estelle Dumas-Mallet & Katherine Button & Thomas Boraud & Marcus Munafò & François Gonon, 2016. "Replication Validity of Initial Association Studies: A Comparison between Psychiatry, Neurology and Four Somatic Diseases," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-20, June.
  • Handle: RePEc:plo:pone00:0158064
    DOI: 10.1371/journal.pone.0158064

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0158064
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0158064&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Daniel Sarewitz, 2012. "Beware the creeping cracks of bias," Nature, Nature, vol. 485(7397), pages 149-149, May.
    3. Shareen A Iqbal & Joshua D Wallach & Muin J Khoury & Sheri D Schully & John P A Ioannidis, 2016. "Reproducible Research Practices and Transparency across the Biomedical Literature," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-13, January.
    4. Sean P David & Jennifer J Ware & Isabella M Chu & Pooja D Loftus & Paolo Fusar-Poli & Joaquim Radua & Marcus R Munafò & John P A Ioannidis, 2013. "Potential Reporting Bias in fMRI Studies of the Brain," PLOS ONE, Public Library of Science, vol. 8(7), pages 1-9, July.
    5. Leonard P Freedman & Iain M Cockburn & Timothy S Simcoe, 2015. "The Economics of Reproducibility in Preclinical Research," PLOS Biology, Public Library of Science, vol. 13(6), pages 1-9, June.
    6. François Gonon & Jan-Pieter Konsman & David Cohen & Thomas Boraud, 2012. "Why Most Biomedical Findings Echoed by Newspapers Turn Out to be False: The Case of Attention Deficit Hyperactivity Disorder," PLOS ONE, Public Library of Science, vol. 7(9), pages 1-11, September.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Andrew D Higginson & Marcus R Munafò, 2016. "Current Incentives for Scientists Lead to Underpowered Studies with Erroneous Conclusions," PLOS Biology, Public Library of Science, vol. 14(11), pages 1-14, November.
    2. Mattia Prosperi & Jiang Bian & Iain E. Buchan & James S. Koopman & Matthew Sperrin & Mo Wang, 2019. "Raiders of the lost HARK: a reproducible inference framework for big data science," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.
    3. Estelle Dumas-Mallet & Andy Smith & Thomas Boraud & François Gonon, 2017. "Poor replication validity of biomedical association studies reported by newspapers," PLOS ONE, Public Library of Science, vol. 12(2), pages 1-15, February.
    4. Owen, P. Dorian, 2018. "Replication to assess statistical adequacy," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 12, pages 1-16.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Estelle Dumas-Mallet & Andy Smith & Thomas Boraud & François Gonon, 2017. "Poor replication validity of biomedical association studies reported by newspapers," PLOS ONE, Public Library of Science, vol. 12(2), pages 1-15, February.
    2. Stavroula Kousta & Christine Ferguson & Emma Ganley, 2016. "Meta-Research: Broadening the Scope of PLOS Biology," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-2, January.
    3. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    4. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesia," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    5. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    6. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    7. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    8. Louis Anthony (Tony) Cox, 2015. "Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks," Risk Analysis, John Wiley & Sons, vol. 35(10), pages 1892-1910, October.
    9. Christopher Allen & David M A Mehler, 2019. "Open science challenges, benefits and tips in early career and beyond," PLOS Biology, Public Library of Science, vol. 17(5), pages 1-14, May.
    10. Jeff Miller & Rolf Ulrich, 2019. "The quest for an optimal alpha," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-13, January.
    11. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    12. Adriano Koshiyama & Nick Firoozye, 2019. "Avoiding Backtesting Overfitting by Covariance-Penalties: an empirical investigation of the ordinary and total least squares cases," Papers 1905.05023, arXiv.org.
    13. Louis Anthony (Tony) Cox, 2013. "Improving Causal Inferences in Risk Analysis," Risk Analysis, John Wiley & Sons, vol. 33(10), pages 1762-1771, October.
    14. Hannah Fraser & Tim Parker & Shinichi Nakagawa & Ashley Barnett & Fiona Fidler, 2018. "Questionable research practices in ecology and evolution," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-16, July.
    15. Salandra, Rossella, 2018. "Knowledge dissemination in clinical trials: Exploring influences of institutional support and type of innovation on selective reporting," Research Policy, Elsevier, vol. 47(7), pages 1215-1228.
    16. David Chavalarias, 2017. "What’s wrong with Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 481-503, January.
    17. Sadri, Arash, 2022. "The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics," MetaArXiv yxba5, Center for Open Science.
    18. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    19. Dean A Fergusson & Marc T Avey & Carly C Barron & Mathew Bocock & Kristen E Biefer & Sylvain Boet & Stephane L Bourque & Isidora Conic & Kai Chen & Yuan Yi Dong & Grace M Fox & Ronald B George & Neil , 2019. "Reporting preclinical anesthesia study (REPEAT): Evaluating the quality of reporting in the preclinical anesthesiology literature," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-15, May.
    20. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0158064. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.