
The Null Result Penalty

Author

Listed:
  • Felix Chopra

    (University of Bonn)

  • Ingar Haaland

    (University of Bergen)

  • Christopher Roth

    (University of Cologne, ECONtribute)

  • Andreas Stegmann

    (University of Warwick)

Abstract

In experiments with economists, we measure how the evaluation of research studies depends on whether the study yielded a null result. Studies with null results are perceived to be less publishable, of lower quality, less important, and less precisely estimated than studies with statistically significant results, even when holding constant all other study features, including the precision of estimates. The penalty for null results is of similar magnitude for various subgroups of researchers, from PhD students to editors. The null result penalty is larger when experts predict a non-null result and when statistical uncertainty is communicated in terms of p-values rather than standard errors. Our findings have implications for understanding mechanisms underlying publication bias and the communication of research findings.
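
The p-value versus standard-error framing mentioned in the abstract can be made concrete: for a given point estimate and standard error, the implied two-sided p-value follows from a normal approximation, so the two formats carry the same statistical information presented differently. Below is a minimal sketch in Python with illustrative numbers that are not taken from the paper.

    from scipy.stats import norm

    def two_sided_p_value(estimate, std_error):
        # Two-sided p-value for H0: effect = 0, under a normal approximation.
        z = estimate / std_error
        return 2 * norm.sf(abs(z))

    # Hypothetical numbers for illustration only: the same result can be reported
    # either as "estimate 0.10 with standard error 0.06" or as "p = 0.096".
    print(round(two_sided_p_value(0.10, 0.06), 3))  # 0.096, a null result at the 5% level

Whether a reader sees the standard error (0.06) or the p-value (0.096) is purely a matter of presentation, which is the kind of framing difference the experiment varies.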

Suggested Citation

  • Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2022. "The Null Result Penalty," ECONtribute Discussion Papers Series 169, University of Bonn and University of Cologne, Germany.
  • Handle: RePEc:ajk:ajkdps:169

    Download full text from publisher

    File URL: https://www.econtribute.de/RePEc/ajk/ajkdps/ECONtribute_169_2022.pdf
    File Function: First version, 2022
    Download Restriction: no


    References listed on IDEAS

    1. Peter Andre & Carlo Pizzinelli & Christopher Roth & Johannes Wohlfart, 2022. "Subjective Models of the Macroeconomy: Evidence From Experts and Representative Samples," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 89(6), pages 2958-2991.
    2. Jonas Hjort & Diana Moreira & Gautam Rao & Juan Francisco Santini, 2021. "How Research Affects Policy: Experimental Evidence from 2,150 Brazilian Municipalities," American Economic Review, American Economic Association, vol. 111(5), pages 1442-1480, May.
    3. Jonathan de Quidt & Johannes Haushofer & Christopher Roth, 2018. "Measuring and Bounding Experimenter Demand," American Economic Review, American Economic Association, vol. 108(11), pages 3266-3302, November.
    4. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    5. Stefano DellaVigna & Devin Pope, 2018. "Predicting Experimental Results: Who Knows What?," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2410-2456.
    6. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    7. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    8. David Card & Stefano DellaVigna & Patricia Funk & Nagore Iriberri, 2020. "Are Referees and Editors in Economics Gender Neutral?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 135(1), pages 269-327.
    9. Peter Andre & Armin Falk, 2021. "What’s Worth Knowing? Economists’ Opinions about Economics," ECONtribute Discussion Papers Series 102, University of Bonn and University of Cologne, Germany.
    10. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    11. Peter Andre & Ingar Haaland & Christopher Roth & Johannes Wohlfart, 2021. "Narratives about the Macroeconomy," CEBI working paper series 21-18, University of Copenhagen. Department of Economics. The Center for Economic Behavior and Inequality (CEBI).
    12. David Card & Stefano DellaVigna, 2013. "Nine Facts about Top Journals in Economics," Journal of Economic Literature, American Economic Association, vol. 51(1), pages 144-161, March.
    13. Kasy, Maximilian, 2019. "Selective publication of findings: Why does it matter, and what should we do about it?," MetaArXiv xwngs, Center for Open Science.
    14. Daniel J. Benjamin & Sebastian A. Brown & Jesse M. Shapiro, 2013. "Who Is ‘Behavioral’? Cognitive Ability And Anomalous Preferences," Journal of the European Economic Association, European Economic Association, vol. 11(6), pages 1231-1255, December.
    15. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    16. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    17. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    18. Bogdanoski, Aleksandar & Foster, Andrew & Karlan, Dean & Miguel, Edward, 2020. "Pre-results Review at the Journal of Development Economics: Lessons learned," MetaArXiv 5yacr, Center for Open Science.
    19. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, et al., 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    20. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    21. Petrenko, Anna, 2016. "Labeling of finished products as a component of the information support of marketing activities of enterprises in the vegetable production subcomplex," Agricultural and Resource Economics: International Scientific E-Journal, vol. 2(1), March.
    22. Berinsky, Adam J. & Druckman, James N. & Yamamoto, Teppei, 2021. "Publication Biases in Replication Studies," Political Analysis, Cambridge University Press, vol. 29(3), pages 370-384, July.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Alexander L. Brown & Taisuke Imai & Ferdinand M. Vieider & Colin F. Camerer, 2024. "Meta-analysis of Empirical Estimates of Loss Aversion," Journal of Economic Literature, American Economic Association, vol. 62(2), pages 485-516, June.
    2. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    3. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," I4R Discussion Paper Series 8, The Institute for Replication (I4R).
    4. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    5. Fang, Ximeng & Innocenti, Stefania, 2023. "Increasing the acceptability of carbon taxation: The role of social norms and economic reasoning," INET Oxford Working Papers 2023-25, Institute for New Economic Thinking at the Oxford Martin School, University of Oxford.
    6. Garg, Prashant & Fetzer, Thiemo, 2024. "Causal Claims in Economics," I4R Discussion Paper Series 183, The Institute for Replication (I4R).
    7. Burro, Giovanni & Castagnetti, Alessandro, 2022. "Will I tell you that you are smart (dumb)? Deceiving Others about their IQ or about a Random Draw," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    8. Patrick Dylong & Paul Setzepfand & Silke Uebelmesser, 2023. "Priming Attitudes Towards Immigrants: Implications for Migration Research and Survey Design," CESifo Working Paper Series 10306, CESifo.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    3. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    4. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    5. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    6. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    7. Garg, Prashant & Fetzer, Thiemo, 2024. "Causal Claims in Economics," I4R Discussion Paper Series 183, The Institute for Replication (I4R).
    8. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    9. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    10. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    11. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    12. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," GLO Discussion Paper Series 1147, Global Labor Organization (GLO).
    13. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    14. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    15. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    16. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
    17. Patrick Vu, 2022. "Can the Replication Rate Tell Us About Publication Bias?," Papers 2206.15023, arXiv.org, revised Jul 2022.
    18. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    19. Antinyan, Armenak & Asatryan, Zareh, 2019. "Nudging for tax compliance: A meta-analysis," ZEW Discussion Papers 19-055, ZEW - Leibniz Centre for European Economic Research.
    20. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ajk:ajkdps:169. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ECONtribute Office (email available below). General contact details of provider: https://www.econtribute.de .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.