
P-Hacking, Data Type and Data-Sharing Policy

Author

Listed:
  • Abel Brodeur

    (University of Ottawa and IZA)

  • Nikolai Cook

    (Wilfrid Laurier University)

  • Carina Neisser

    (University of Cologne and IZA)

Abstract

In this paper, we examine the relationship between p-hacking and data-sharing policies for published articles. We collect 38,876 test statistics from 1,106 articles published in leading economics journals between 2002 and 2020. While a data-sharing policy increases the provision of research data to the community, we find a well-estimated null effect: requiring authors to share their data at the time of publication does not alter the presence of p-hacking. Similarly, articles that use hard-to-access administrative data or third-party surveys do not differ in their extent of p-hacking from articles that use easier-to-access (e.g., own-collected) data. Voluntary provision of data by authors on their homepages likewise offers no evidence of reduced p-hacking.
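
Tests for p-hacking in this literature typically look for excess mass of reported test statistics just above conventional significance thresholds (e.g., |z| = 1.96). The sketch below is a minimal Python illustration of such a caliper test; the synthetic z-statistics, the 0.20 bandwidth, and the simple binomial comparison are assumptions made for illustration, not the authors' exact specification.

    # Minimal caliper test for bunching of |z|-statistics just above 1.96.
    # Illustrative only: the z-statistics are synthetic and the bandwidth is
    # an arbitrary choice, not the specification used in the paper.
    import numpy as np
    from scipy.stats import binomtest

    rng = np.random.default_rng(0)
    # Stand-in for the test statistics collected from published articles.
    z = np.abs(rng.normal(loc=1.5, scale=1.0, size=5000))

    threshold, bandwidth = 1.96, 0.20
    below = int(np.sum((z >= threshold - bandwidth) & (z < threshold)))
    above = int(np.sum((z >= threshold) & (z < threshold + bandwidth)))

    # With no p-hacking and a locally smooth density, statistics in this narrow
    # window should fall on either side of the threshold with roughly equal
    # probability, so the share above is tested against p = 0.5.
    result = binomtest(above, above + below, p=0.5, alternative="greater")
    print(f"below: {below}  above: {above}  one-sided p-value: {result.pvalue:.3f}")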

Suggested Citation

  • Abel Brodeur & Nikolai Cook & Carina Neisser, 2022. "P-Hacking, Data Type and Data-Sharing Policy," ECONtribute Discussion Papers Series 200, University of Bonn and University of Cologne, Germany.
  • Handle: RePEc:ajk:ajkdps:200

    Download full text from publisher

    File URL: https://www.econtribute.de/RePEc/ajk/ajkdps/ECONtribute_200_2022.pdf
    File Function: First version, 2022
    Download Restriction: no


    References listed on IDEAS

    1. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    2. B.D. McCullough & Kerry Anne McGeary & Teresa D. Harrison, 2008. "Do economics journal archives promote replicable research?," Canadian Journal of Economics, Canadian Economics Association, vol. 41(4), pages 1406-1420, November.
    3. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    4. Arie Kapteyn & Jelmer Y. Ypma, 2007. "Measurement Error and Misclassification: A Comparison of Survey and Administrative Data," Journal of Labor Economics, University of Chicago Press, vol. 25(3), pages 513-551.
    5. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    6. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    7. Tomas Havranek & Anna Sokolova, 2020. "Do Consumers Really Follow a Rule of Thumb? Three Thousand Estimates from 144 Studies Say 'Probably Not'," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 35, pages 97-122, January.
    8. Jérôme Adda & Christian Decker & Marco Ottaviani, 2020. "P-hacking in clinical trials and how incentives shape the distribution of results across phases," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 117(24), pages 13386-13392, June.
    9. David Card & Stefano DellaVigna & Patricia Funk & Nagore Iriberri, 2020. "Are Referees and Editors in Economics Gender Neutral?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 135(1), pages 269-327.
    10. Garret Christensen & Allan Dafoe & Edward Miguel & Don A Moore & Andrew K Rose, 2019. "A study of the impact of data sharing on article citations using journal policies as a natural experiment," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-13, December.
    11. Zacharias Maniadis & Fabio Tufano & John A. List, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Economic Journal, Royal Economic Society, vol. 127(605), pages 209-235, October.
    12. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    13. Gary S. Becker, 1974. "Crime and Punishment: An Economic Approach," NBER Chapters, in: Essays in the Economics of Crime and Punishment, pages 1-54, National Bureau of Economic Research, Inc.
    14. Nicholas Swanson & Garret Christensen & Rebecca Littman & David Birke & Edward Miguel & Elizabeth Levy Paluck & Zenan Wang, 2020. "Research Transparency Is on the Rise in Economics," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 61-65, May.
    15. T. D. Stanley, 2008. "Meta‐Regression Methods for Detecting and Estimating Empirical Effects in the Presence of Publication Selection," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 70(1), pages 103-127, February.
    16. Matias D. Cattaneo & Michael Jansson & Xinwei Ma, 2020. "Simple Local Polynomial Density Estimators," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(531), pages 1449-1455, July.
    17. Tomáš Havránek & T. D. Stanley & Hristos Doucouliagos & Pedro Bom & Jerome Geyer‐Klingeberg & Ichiro Iwasaki & W. Robert Reed & Katja Rost & R. C. M. van Aert, 2020. "Reporting Guidelines For Meta‐Analysis In Economics," Journal of Economic Surveys, Wiley Blackwell, vol. 34(3), pages 469-475, July.
    18. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    19. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    20. Andrew C. Chang & Phillip Li, 2022. "Is Economics Research Replicable? Sixty Published Papers From Thirteen Journals Say “Often Not”," Critical Finance Review, now publishers, vol. 11(1), pages 185-206, February.
    21. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    22. David Card & Stefano DellaVigna, 2020. "What Do Editors Maximize? Evidence from Four Economics Journals," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 195-217, March.
    23. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    24. David Card & Stefano DellaVigna, 2013. "Nine Facts about Top Journals in Economics," Journal of Economic Literature, American Economic Association, vol. 51(1), pages 144-161, March.
    25. Feige, Edgar L, 1975. "The Consequences of Journal Editorial Policies and a Suggestion for Revision," Journal of Political Economy, University of Chicago Press, vol. 83(6), pages 1291-1295, December.
    26. Cristina Blanco-Perez & Abel Brodeur, 2019. "Transparency in empirical economic research," IZA World of Labor, Institute of Labor Economics (IZA), pages 467-467, November.
    27. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    28. Daniel S. Hamermesh, 2017. "Replication in Labor Economics: Evidence from Data, and What It Suggests," American Economic Review, American Economic Association, vol. 107(5), pages 37-40, May.
    29. Eva Vivalt, 2019. "Specification Searching and Significance Inflation Across Time, Methods and Disciplines," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 81(4), pages 797-816, August.
    30. Tomáš Havránek, 2015. "Measuring Intertemporal Substitution: The Importance Of Method Choices And Selective Reporting," Journal of the European Economic Association, European Economic Association, vol. 13(6), pages 1180-1204, December.
    31. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    32. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    33. Chris Doucouliagos & T.D. Stanley, 2013. "Are All Economic Facts Greatly Exaggerated? Theory Competition And Selectivity," Journal of Economic Surveys, Wiley Blackwell, vol. 27(2), pages 316-339, April.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a minimal sketch of this overlap idea follows the list below.
    1. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    2. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    3. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    4. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    5. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    6. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    7. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    8. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    9. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    10. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    11. Zigraiova, Diana & Havranek, Tomas & Irsova, Zuzana & Novak, Jiri, 2021. "How puzzling is the forward premium puzzle? A meta-analysis," European Economic Review, Elsevier, vol. 134(C).
    12. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    13. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    14. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    15. Sebastian Gechert & Tomas Havranek & Zuzana Irsova & Dominika Kolcunova, 2022. "Measuring Capital-Labor Substitution: The Importance of Method Choices and Publication Bias," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 45, pages 55-82, July.
    16. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    17. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    18. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    19. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    20. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
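
    As noted above the list, relatedness here is driven by overlap between the works this paper cites and the works that cite it. The sketch below illustrates one simple way to score such overlap, using the Jaccard similarity of reference sets; the handles and reference sets are hypothetical, and this is not RePEc's actual matching algorithm.

        # Illustrative ranking of "related items" by overlap of reference lists.
        # The handles and reference sets are hypothetical; RePEc's actual
        # relatedness computation may differ, so treat this only as a sketch.
        def jaccard(a: set, b: set) -> float:
            """Jaccard similarity of two sets of cited handles."""
            return len(a & b) / len(a | b) if (a or b) else 0.0

        references = {
            "this:item":   {"w1", "w2", "w3", "w4"},
            "candidate:a": {"w2", "w3", "w4", "w9"},
            "candidate:b": {"w7", "w8"},
        }

        target = references["this:item"]
        ranked = sorted(
            ((handle, jaccard(target, refs))
             for handle, refs in references.items() if handle != "this:item"),
            key=lambda pair: pair[1],
            reverse=True,
        )
        for handle, score in ranked:
            print(f"{handle}: {score:.2f}")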

    More about this item

    Keywords

    p-Hacking; Publication Bias; Data and Code Availability; Data Sharing Policy; Administrative Data; Survey Data;

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C13 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Estimation: General
    • C40 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics - - - General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ajk:ajkdps:200. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ECONtribute Office (email available below). General contact details of provider: https://www.econtribute.de .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.