
Unpacking P-Hacking and Publication Bias

Author

Listed:
  • Brodeur, Abel
  • Carrell, Scott
  • Figlio, David
  • Lusher, Lester

Abstract

We use unique data from journal submissions to identify and unpack publication bias and p-hacking. We find that initial submissions already display significant bunching, suggesting that the distribution of test statistics among published papers cannot be fully attributed to publication bias in peer review. Desk-rejected manuscripts display greater heaping than those sent for review; that is, marginally significant results are more likely to be desk rejected. Reviewer recommendations, in contrast, are positively associated with statistical significance. Overall, the peer-review process has little effect on the distribution of test statistics. Lastly, we track rejected papers and present evidence that publication bias is perhaps not as prevalent as feared.
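The paper's diagnostic object is the distribution of test statistics around conventional significance thresholds. As a purely illustrative aid (not the authors' estimator), the Python sketch below shows one simple caliper-style way to quantify the "bunching" and "heaping" the abstract refers to: compare the mass of z-statistics just above and just below the two-sided 5% critical value of 1.96. The variable names, caliper width, and placeholder data are assumptions made for the example.

    # Illustrative sketch only: a caliper-style check for bunching of
    # z-statistics at the 5% significance threshold. This is not the
    # paper's method; it only shows what "bunching" means operationally.
    import numpy as np
    from scipy import stats

    # Placeholder data (assumption): absolute z-statistics that would, in
    # practice, be extracted from journal submissions.
    rng = np.random.default_rng(42)
    z_stats = np.abs(rng.standard_normal(5_000))

    THRESHOLD = 1.96  # two-sided 5% critical value
    CALIPER = 0.20    # window width on each side of the threshold

    just_below = np.sum((z_stats > THRESHOLD - CALIPER) & (z_stats <= THRESHOLD))
    just_above = np.sum((z_stats > THRESHOLD) & (z_stats <= THRESHOLD + CALIPER))

    # Under a smooth, unmanipulated density the two narrow windows should
    # hold roughly equal mass; a marked excess just above the threshold is
    # the kind of heaping associated with p-hacking or selective reporting.
    result = stats.binomtest(just_above, just_above + just_below,
                             p=0.5, alternative="greater")
    print(f"just below: {just_below}, just above: {just_above}, "
          f"p-value for excess mass above 1.96: {result.pvalue:.3f}")

A more refined analysis would test for a discontinuity in the density itself, for example with the local polynomial density estimator of Cattaneo, Jansson and Ma (2020), which appears among the references below.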

Suggested Citation

  • Brodeur, Abel & Carrell, Scott & Figlio, David & Lusher, Lester, 2023. "Unpacking P-Hacking and Publication Bias," I4R Discussion Paper Series 52, The Institute for Replication (I4R).
  • Handle: RePEc:zbw:i4rdps:52

    Download full text from publisher

    File URL: https://www.econstor.eu/bitstream/10419/274141/1/I4R-DP052.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Scott Carrell & David Figlio & Lester Lusher, 2024. "Clubs and Networks in Economics Reviewing," Journal of Political Economy, University of Chicago Press, vol. 132(9), pages 2999-3024.
    2. David Card & Stefano DellaVigna & Patricia Funk & Nagore Iriberri, 2020. "Are Referees and Editors in Economics Gender Neutral?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 135(1), pages 269-327.
    3. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    4. Sebastian Kranz & Peter Pütz, 2022. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics: Comment," American Economic Review, American Economic Association, vol. 112(9), pages 3124-3136, September.
    5. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    6. T. D. Stanley, 2005. "Beyond Publication Bias," Journal of Economic Surveys, Wiley Blackwell, vol. 19(3), pages 309-345, July.
    7. Travis J. Lybbert & Steven T. Buccola, 2021. "The evolving ethics of analysis, publication, and transparency in applied economics," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1330-1351, December.
    8. De Long, J Bradford & Lang, Kevin, 1992. "Are All Economic Hypotheses False?," Journal of Political Economy, University of Chicago Press, vol. 100(6), pages 1257-1272, December.
    9. Tomáš Havránek, 2015. "Measuring Intertemporal Substitution: The Importance Of Method Choices And Selective Reporting," Journal of the European Economic Association, European Economic Association, vol. 13(6), pages 1180-1204, December.
    10. Tomas Havranek & Anna Sokolova, 2020. "Do Consumers Really Follow a Rule of Thumb? Three Thousand Estimates from 144 Studies Say 'Probably Not'," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 35, pages 97-122, January.
    11. T. D. Stanley, 2008. "Meta‐Regression Methods for Detecting and Estimating Empirical Effects in the Presence of Publication Selection," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 70(1), pages 103-127, February.
    12. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    13. Ashenfelter, Orley & Harmon, Colm & Oosterbeek, Hessel, 1999. "A review of estimates of the schooling/earnings relationship, with tests for publication bias," Labour Economics, Elsevier, vol. 6(4), pages 453-470, November.
    14. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    15. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," GLO Discussion Paper Series 1147, Global Labor Organization (GLO).
    16. Matias D. Cattaneo & Michael Jansson & Xinwei Ma, 2020. "Simple Local Polynomial Density Estimators," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(531), pages 1449-1455, July.
    17. Orley Ashenfelter & Colm Harmon & Hessel Oosterbeek, 1999. "A Review of Estimates of the Schooling/Earnings Relationship, with Tests for Publication Bias," Working Papers 804, Princeton University, Department of Economics, Industrial Relations Section.
    18. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    19. repec:fth:prinin:425 is not listed on IDEAS
    20. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    21. Ofosu, George K. & Posner, Daniel N., 2020. "Do pre-analysis plans hamper publication?," LSE Research Online Documents on Economics 112748, London School of Economics and Political Science, LSE Library.
    22. David Card & Stefano DellaVigna, 2020. "What Do Editors Maximize? Evidence from Four Economics Journals," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 195-217, March.
    23. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    24. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    25. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    26. McCloskey, Donald N, 1985. "The Loss Function Has Been Mislaid: The Rhetoric of Significance Tests," American Economic Review, American Economic Association, vol. 75(2), pages 201-205, May.
    27. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    28. Miguel, E & Camerer, C & Casey, K & Cohen, J & Esterling, KM & Gerber, A & Glennerster, R & Green, DP & Humphreys, M & Imbens, G & Laitin, D & Madon, T & Nelson, L & Nosek, BA & Petersen, M & Sedlmayr, 2014. "Promoting Transparency in Social Science Research," Department of Economics, Working Paper Series qt0wt4q2q8, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    29. Eva Vivalt, 2019. "Specification Searching and Significance Inflation Across Time, Methods and Disciplines," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 81(4), pages 797-816, August.
    30. Tomáš Havránek & T. D. Stanley & Hristos Doucouliagos & Pedro Bom & Jerome Geyer‐Klingeberg & Ichiro Iwasaki & W. Robert Reed & Katja Rost & R. C. M. van Aert, 2020. "Reporting Guidelines For Meta‐Analysis In Economics," Journal of Economic Surveys, Wiley Blackwell, vol. 34(3), pages 469-475, July.
    31. George K. Ofosu & Daniel N. Posner, 2020. "Do Pre-analysis Plans Hamper Publication?," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 70-74, May.
    32. Paul J. Ferraro & Pallavi Shukla, 2020. "Feature—Is a Replicability Crisis on the Horizon for Environmental and Resource Economics?," Review of Environmental Economics and Policy, University of Chicago Press, vol. 14(2), pages 339-351.
    33. Chris Doucouliagos & T.D. Stanley, 2013. "Are All Economic Facts Greatly Exaggerated? Theory Competition And Selectivity," Journal of Economic Surveys, Wiley Blackwell, vol. 27(2), pages 316-339, April.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Irsova, Zuzana & Bom, Pedro R. D. & Havranek, Tomas & Rachinger, Heiko, 2023. "Spurious Precision in Meta-Analysis," EconStor Preprints 268683, ZBW - Leibniz Information Centre for Economics.
    2. Rose, Julian & Neubauer, Florian & Ankel-Peters, Jörg, 2024. "Long-term effects of the targeting the ultra-poor program: A reproducibility and replicability assessment of Banerjee et al. (2021)," Ruhr Economic Papers 1107, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    3. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," I4R Discussion Paper Series 38, The Institute for Replication (I4R).
    4. Jordan C. Stanley & Evan S. Totty, 2024. "Synthetic Data and Social Science Research: Accuracy Assessments and Practical Considerations from the SIPP Synthetic Beta," NBER Chapters, in: Data Privacy Protection and the Conduct of Applied Research: Methods, Approaches and their Consequences, National Bureau of Economic Research, Inc.
    5. Danielle V. Handel & Eric A. Hanushek, 2024. "Contexts of Convenience: Generalizing from Published Evaluations of School Finance Policies," Evaluation Review, vol. 48(3), pages 461-494, June.
    6. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    7. Irsova, Zuzana & Doucouliagos, Hristos & Havranek, Tomas & Stanley, T. D., 2023. "Meta-Analysis of Social Science Research: A Practitioner’s Guide," EconStor Preprints 273719, ZBW - Leibniz Information Centre for Economics.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    2. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    3. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan; International Monetary Fund, vol. 71(2), pages 538-573, June.
    4. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    5. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    6. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    7. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    8. Ali Elminejad & Tomas Havranek & Roman Horvath & Zuzana Irsova, 2023. "Intertemporal Substitution in Labor Supply: A Meta-Analysis," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 51, pages 1095-1113, December.
    9. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    10. Zigraiova, Diana & Havranek, Tomas & Irsova, Zuzana & Novak, Jiri, 2021. "How puzzling is the forward premium puzzle? A meta-analysis," European Economic Review, Elsevier, vol. 134(C).
    11. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    12. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    13. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    14. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    15. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    16. Kroupova, Katerina & Havranek, Tomas & Irsova, Zuzana, 2024. "Student Employment and Education: A Meta-Analysis," Economics of Education Review, Elsevier, vol. 100(C).
    17. Cazachevici, Alina & Havranek, Tomas & Horvath, Roman, 2020. "Remittances and economic growth: A meta-analysis," World Development, Elsevier, vol. 134(C).
    18. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer; Economic Science Association, vol. 25(1), pages 318-358, February.
    19. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    20. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.

    More about this item

    Keywords

    publication bias; p-hacking; selective reporting

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • C13 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Estimation: General
    • C40 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:zbw:i4rdps:52. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ZBW - Leibniz Information Centre for Economics (email available below). General contact details of provider: https://www.i4replication.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.