
Preprint of Too good to be false: Nonsignificant results revisited

Authors

  • Hartgerink, Chris Hubertus Joseph (Tilburg University)

  • Wicherts, Jelte M. (Tilburg University)

  • van Assen, Marcel A. L. M.

Abstract

Due to its probabilistic nature, Null Hypothesis Significance Testing (NHST) is subject to decision errors. In recent debates in psychology, the concern for false positives has overshadowed the concern for false negatives. This is unwarranted, since reported statistically nonsignificant findings may just be 'too good to be false'. We examined evidence for false negatives in nonsignificant results in three different ways. We adapted the Fisher method to detect the presence of at least one false negative in a set of statistically nonsignificant results. Simulations show that the adapted Fisher method is generally a powerful method for detecting false negatives. We then applied the adapted Fisher method to the psychology literature in three ways. These applications indicate that (i) the observed effect size distribution of nonsignificant effects exceeds the distribution expected under a null effect, and approximately two out of three (66.7%) psychology articles reporting nonsignificant results contain evidence for at least one false negative; (ii) nonsignificant results on gender effects contain evidence of true nonzero effects; and (iii) the statistically nonsignificant replications from the Reproducibility Project: Psychology (RPP) do not warrant conclusions about the absence or presence of true zero effects underlying these nonsignificant results. We conclude that false negatives deserve more attention in the current debate on statistical practices in psychology. Neglecting effects due to a lack of statistical power can lead to a waste of research resources and stifle the scientific discovery process.
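
The abstract invokes the adapted Fisher method without spelling it out. As a minimal sketch, assuming the adaptation rescales each nonsignificant p-value to the unit interval before applying Fisher's classic combination test (the function name, the rescaling step, and the example values below are illustrative, not the authors' own implementation):

    import math

    from scipy import stats

    def adapted_fisher(p_values, alpha=0.05):
        """Test for evidence of at least one false negative in a set of
        statistically nonsignificant p-values.

        Assumed adaptation: each nonsignificant p-value is rescaled to
        p* = (p - alpha) / (1 - alpha), which is uniform on (0, 1) under
        the null given nonsignificance; the p* are then combined with
        Fisher's method, chi2 = -2 * sum(ln p*), with df = 2k.
        """
        nonsig = [p for p in p_values if p > alpha]
        if not nonsig:
            raise ValueError("no nonsignificant p-values to combine")
        rescaled = [(p - alpha) / (1 - alpha) for p in nonsig]
        chi2 = -2.0 * sum(math.log(p) for p in rescaled)
        df = 2 * len(rescaled)
        return chi2, df, stats.chi2.sf(chi2, df)  # right-tailed combined p

    # Three nonsignificant p-values clustering just above .05 are jointly
    # improbable under a true null, so the combined test flags a likely
    # false negative.
    chi2, df, p_comb = adapted_fisher([0.06, 0.08, 0.11])
    print(f"chi2({df}) = {chi2:.2f}, combined p = {p_comb:.4f}")  # p ~ .0015

The design point that makes this work: conditional on nonsignificance, p-values are uniform on (alpha, 1) under the null, so the rescaled p* are uniform on (0, 1) and Fisher's chi-square reference distribution applies; true nonzero effects push nonsignificant p-values toward alpha, inflating the chi-square statistic.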

Suggested Citation

  • Hartgerink, Chris Hubertus Joseph & Wicherts, Jelte M. & van Assen, Marcel A. L. M., 2016. "Preprint of Too good to be false: Nonsignificant results revisited," OSF Preprints rkumy_v1, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:rkumy_v1
    DOI: 10.31219/osf.io/rkumy_v1

    Download full text from publisher

    File URL: https://osf.io/download/580db1d5b83f6901dcc935b9/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/rkumy_v1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription
    ---><---

    References listed on IDEAS

    1. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    2. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    3. Mathur, Maya B & VanderWeele, Tyler, 2018. "Statistical methods for evidence synthesis," Thesis Commons kd6ja, Center for Open Science.
    4. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    5. Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak, Center for Open Science.
    6. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    7. Larry V. Hedges & Jacob M. Schauer, 2021. "The design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 868-886, July.
    8. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    9. Alexandru Marcoci & David P. Wilkinson & Ans Vercammen & Bonnie C. Wintle & Anna Lou Abatayo & Ernest Baskin & Henk Berkman & Erin M. Buchanan & Sara Capitán & Tabaré Capitán & Ginny Chan & Kent Jason, 2025. "Predicting the replicability of social and behavioural science claims in COVID-19 preprints," Nature Human Behaviour, Nature, vol. 9(2), pages 287-304, February.
    10. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    11. Tom Coupé & W. Robert Reed, 2021. "Do Negative Replications Affect Citations?," Working Papers in Economics 21/14, University of Canterbury, Department of Economics and Finance.
    12. Mueller-Langer, Frank & Andreoli-Versbach, Patrick, 2018. "Open access to research data: Strategic delay and the ambiguous welfare effects of mandatory data disclosure," Information Economics and Policy, Elsevier, vol. 42(C), pages 20-34.
    13. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    14. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    15. Spyros Galanis & Christos A Ioannou & Stelios Kotronis, 2024. "Information Aggregation Under Ambiguity: Theory and Experimental Evidence," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 91(6), pages 3423-3467.
    16. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    17. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2020. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics (RM/19/029-revised-)," Research Memorandum 014, Maastricht University, Graduate School of Business and Economics (GSBE).
    18. Doucouliagos, Hristos & Paldam, Martin & Stanley, T.D., 2018. "Skating on thin evidence: Implications for public policy," European Journal of Political Economy, Elsevier, vol. 54(C), pages 16-25.
    19. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    20. Kai Ruggeri & Amma Panin & Milica Vdovic & Bojana Većkalov & Nazeer Abdul-Salaam & Jascha Achterberg & Carla Akil & Jolly Amatya & Kanchan Amatya & Thomas Lind Andersen & Sibele D. Aquino & Arjoon Aru, 2022. "The globalizability of temporal discounting," Nature Human Behaviour, Nature, vol. 6(10), pages 1386-1397, October.

