Printed from https://ideas.repec.org/a/plo/pone00/0277935.html

Are most published research findings false in a continuous universe?

Author

Listed:
  • Kleber Neves
  • Pedro B Tan
  • Olavo B Amaral

Abstract

Diagnostic screening models for the interpretation of null hypothesis significance test (NHST) results have been influential in highlighting the effect of selective publication on the reproducibility of the published literature, leading to John Ioannidis’ much-cited claim that most published research findings are false. These models, however, typically assume that hypotheses are dichotomously true or false, without considering that effect sizes differ across hypotheses. To address this limitation, we develop a simulation model that represents effect sizes explicitly using different continuous distributions, while retaining other aspects of previous models such as publication bias and the pursuit of statistical significance. Our results show that the combination of selective publication, bias, low statistical power and unlikely hypotheses consistently leads to high proportions of false positives, irrespective of the effect size distribution assumed. Using continuous effect sizes also allows us to evaluate the degree of effect size overestimation and the prevalence of estimates with the wrong sign in the literature, showing that the same factors that drive false-positive results also lead to errors in estimating effect size direction and magnitude. Nevertheless, the relative influence of these factors on different metrics varies depending on the distribution assumed for effect sizes. The model is made available as an R ShinyApp interface, allowing one to explore features of the literature in various scenarios.
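The screening-model logic the abstract describes (continuous true effects, noisy estimates, and publication conditioned on statistical significance) can be sketched in a few lines. This is a minimal illustration under assumed parameters, not the authors' actual model: the zero-centred normal effect distribution, the per-group sample size and the two-sided z-test are all illustrative choices, and the paper's model additionally includes researcher bias and several alternative effect size distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_hypotheses = 20000   # studies simulated
n_per_group = 20       # sample size per group (a low-power scenario)

# A continuous "universe" of true standardized effects, centred on zero so
# that most tested effects are small or negligible (illustrative scale).
true_effects = rng.normal(loc=0.0, scale=0.2, size=n_hypotheses)

# Standard error of a two-group standardized mean difference.
se = np.sqrt(2.0 / n_per_group)
estimates = true_effects + rng.normal(0.0, se, size=n_hypotheses)

# Selective publication: only results significant at alpha = 0.05
# (two-sided z-test) make it into the "literature".
published = np.abs(estimates / se) > 1.96

# Among published results: how often does the estimate have the wrong sign,
# and by how much is its magnitude exaggerated relative to the true effect?
sign_error = np.mean(np.sign(estimates[published]) != np.sign(true_effects[published]))
exaggeration = np.median(np.abs(estimates[published]) / np.abs(true_effects[published]))

print(f"share of studies published: {published.mean():.1%}")
print(f"sign errors among published: {sign_error:.1%}")
print(f"median exaggeration ratio: {exaggeration:.1f}")
```

Under these assumptions the published subset is both systematically exaggerated in magnitude and occasionally wrong in sign, which is exactly the pair of continuous error metrics the abstract adds to the classic true/false dichotomy.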

Suggested Citation

  • Kleber Neves & Pedro B Tan & Olavo B Amaral, 2022. "Are most published research findings false in a continuous universe?," PLOS ONE, Public Library of Science, vol. 17(12), pages 1-18, December.
  • Handle: RePEc:plo:pone00:0277935
    DOI: 10.1371/journal.pone.0277935

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0277935
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0277935&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0277935?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Blakeley B. McShane & David Gal, 2017. "Rejoinder: Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 904-908, July.
    2. repec:plo:pbio00:1002273 is not listed on IDEAS
    3. Denes Szucs & John P A Ioannidis, 2017. "Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-18, March.
    4. Blakeley B. McShane & David Gal & Andrew Gelman & Christian Robert & Jennifer L. Tackett, 2019. "Abandon Statistical Significance," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 235-245, March.
    5. Clarissa F D Carneiro & Thiago C Moulin & Malcolm R Macleod & Olavo B Amaral, 2018. "Effect size and statistical power in the rodent fear conditioning literature – A systematic review," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-27, April.
    6. Matthias Steinfath & Silvia Vogl & Norman Violet & Franziska Schwarz & Hans Mielke & Thomas Selhorst & Matthias Greiner & Gilbert Schönfelder, 2018. "Simple changes of individual studies can improve the reproducibility of the biomedical scientific process as a whole," PLOS ONE, Public Library of Science, vol. 13(9), pages 1-20, September.
    7. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.
    8. Blakeley B. McShane & David Gal, 2017. "Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 885-895, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. David J. Hand, 2022. "Trustworthiness of statistical inference," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 185(1), pages 329-347, January.
    2. repec:osf:metaar:tp45u_v1 is not listed on IDEAS
    3. Rinne, Sonja, 2024. "Estimating the merit-order effect using coarsened exact matching: Reconciling theory with the empirical results to improve policy implications," Energy Policy, Elsevier, vol. 185(C).
    4. Bertoldi, Paolo & Mosconi, Rocco, 2020. "Do energy efficiency policies save energy? A new approach based on energy policy indicators (in the EU Member States)," Energy Policy, Elsevier, vol. 139(C).
    5. Maier, Maximilian & VanderWeele, Tyler & Mathur, Maya B, 2021. "Using Selection Models to Assess Sensitivity to Publication Bias: A Tutorial and Call for More Routine Use," MetaArXiv tp45u, Center for Open Science.
    6. Anderson, Brian S. & Wennberg, Karl & McMullen, Jeffery S., 2019. "Editorial: Enhancing quantitative theory-testing entrepreneurship research," Journal of Business Venturing, Elsevier, vol. 34(5), pages 1-1.
7. Wennberg, Karl & Anderson, Brian S. & McMullen, Jeffrey, 2019. "Editorial: Enhancing Quantitative Theory-Testing Entrepreneurship Research," Ratio Working Papers 323, The Ratio Institute.
    8. Maya B. Mathur & Tyler J. VanderWeele, 2020. "Sensitivity analysis for publication bias in meta‐analyses," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 69(5), pages 1091-1119, November.
    9. Anderson, Brian S., 2022. "What executives get wrong about statistics: Moving from statistical significance to effect sizes and practical impact," Business Horizons, Elsevier, vol. 65(3), pages 379-388.
    10. J. M. Bauer & L. A. Reisch, 2019. "Behavioural Insights and (Un)healthy Dietary Choices: a Review of Current Evidence," Journal of Consumer Policy, Springer, vol. 42(1), pages 3-45, March.
    11. Jeffrey A. Mills & Gary Cornwall & Beau A. Sauley & Jeffrey R. Strawn, 2018. "Improving the Analysis of Randomized Controlled Trials: a Posterior Simulation Approach," BEA Working Papers 0157, Bureau of Economic Analysis.
    12. Han Wang & Sieglinde S Snapp & Monica Fisher & Frederi Viens, 2019. "A Bayesian analysis of longitudinal farm surveys in Central Malawi reveals yield determinants and site-specific management strategies," PLOS ONE, Public Library of Science, vol. 14(8), pages 1-17, August.
13. Tom Engsted, 2024. "What Is the False Discovery Rate in Empirical Research?," Econ Journal Watch, Econ Journal Watch, vol. 21(1), pages 92-112, March.
    14. repec:osf:metaar:yxba5_v1 is not listed on IDEAS
    15. Luigi Pace & Alessandra Salvan, 2020. "Likelihood, Replicability and Robbins' Confidence Sequences," International Statistical Review, International Statistical Institute, vol. 88(3), pages 599-615, December.
    16. Glenn Shafer, 2021. "Testing by betting: A strategy for statistical and scientific communication," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 407-431, April.
    17. Maximilian Maier & Tyler J. VanderWeele & Maya B. Mathur, 2022. "Using selection models to assess sensitivity to publication bias: A tutorial and call for more routine use," Campbell Systematic Reviews, John Wiley & Sons, vol. 18(3), September.
    18. Hirschauer Norbert & Grüner Sven & Mußhoff Oliver & Becker Claudia, 2019. "Twenty Steps Towards an Adequate Inferential Interpretation of p-Values in Econometrics," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 239(4), pages 703-721, August.
    19. Sadri, Arash, 2022. "The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics," MetaArXiv yxba5, Center for Open Science.
    20. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    21. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    22. Laurent Busca & Charlotte Massa, 2019. "L'Etude Exploratoire, Uniquement Qualitative ? Vers La Reconnaissance D'Une Approche Quantitative Exploratoire," Post-Print hal-04791500, HAL.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0277935. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.