
Detecting p‐Hacking

Author

Listed:
  • Graham Elliott
  • Nikolay Kudrin
  • Kaspar Wüthrich

Abstract

We theoretically analyze the problem of testing for p‐hacking based on distributions of p‐values across multiple studies. We provide general results for when such distributions have testable restrictions (are non‐increasing) under the null of no p‐hacking. We find novel additional testable restrictions for p‐values based on t‐tests. Specifically, the shape of the power functions results in both complete monotonicity and bounds on the distribution of p‐values. These testable restrictions result in more powerful tests for the null hypothesis of no p‐hacking. When there is also publication bias, our tests are joint tests for p‐hacking and publication bias. A reanalysis of two prominent data sets shows the usefulness of our new tests.
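As a rough illustration of the restriction described above: under the null of no p‐hacking the p‐value density is non‐increasing, so the histogram bin just below a salient threshold such as 0.05 should contain no more mass than the bin immediately to its left. The sketch below checks this with a one‐sided binomial test on two adjacent bin counts. It is only a minimal sketch, not the authors' test; the function name, bin width, and threshold are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of checking the
# "non-increasing p-value density" restriction near a salient threshold.
# Requires numpy and scipy; bin width and threshold are illustrative choices.

import numpy as np
from scipy.stats import binomtest

def adjacent_bin_binomial_test(pvals, threshold=0.05, width=0.01):
    """One-sided binomial check of non-increasingness just below `threshold`.

    Counts p-values in (threshold - 2*width, threshold - width] ("left" bin)
    and (threshold - width, threshold] ("right" bin). Under a non-increasing
    density the right bin should not receive more than half of these
    observations; a small p-value suggests a hump just below the threshold.
    """
    pvals = np.asarray(pvals, dtype=float)
    left = np.sum((pvals > threshold - 2 * width) & (pvals <= threshold - width))
    right = np.sum((pvals > threshold - width) & (pvals <= threshold))
    n = int(left + right)
    if n == 0:
        raise ValueError("No p-values fall in the two bins below the threshold.")
    # Test whether the share in the right (higher-p) bin exceeds one half.
    return binomtest(int(right), n, p=0.5, alternative="greater").pvalue

# Simulated example: uniform p-values (no p-hacking) vs. a sample with
# extra mass bunched just below 0.05.
rng = np.random.default_rng(0)
clean = rng.uniform(size=5000)
hacked = np.concatenate([clean, rng.uniform(0.04, 0.05, size=200)])
print(adjacent_bin_binomial_test(clean))   # typically large: no violation detected
print(adjacent_bin_binomial_test(hacked))  # typically very small: restriction violated
```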

Suggested Citation

  • Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
  • Handle: RePEc:wly:emetrp:v:90:y:2022:i:2:p:887-906
    DOI: 10.3982/ECTA18583

    Download full text from publisher

    File URL: https://doi.org/10.3982/ECTA18583
    Download Restriction: no


    References listed on IDEAS

    1. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    2. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    3. Cattaneo, Matias D & Jansson, Michael & Ma, Xinwei, 2020. "Simple Local Polynomial Density Estimators," University of California at San Diego, Economics Working Paper Series qt9vt997qn, Department of Economics, UC San Diego.
    4. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    5. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    6. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    7. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    8. Eva Vivalt, 2019. "Specification Searching and Significance Inflation Across Time, Methods and Disciplines," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 81(4), pages 797-816, August.
    9. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    10. Snyder, Christopher & Zhuo, Ran, 2018. "Sniff Tests in Economics: Aggregate Distribution of Their Probability Values and Implications for Publication Bias," MetaArXiv 8vdrh, Center for Open Science.
    11. Cattaneo, Matias D & Jansson, Michael & Ma, Xinwei, 2020. "Simple Local Polynomial Density Estimators," Department of Economics, Working Paper Series qt9vt997qn, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    12. Matias D. Cattaneo & Michael Jansson & Xinwei Ma, 2020. "Simple Local Polynomial Density Estimators," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(531), pages 1449-1455, July.
    13. Romano, Joseph P. & Wolf, Michael, 2013. "Testing for monotonicity in expected asset returns," Journal of Empirical Finance, Elsevier, vol. 23(C), pages 93-116.
    14. Christopher A. Carolan & Joshua M. Tebbs, 2005. "Nonparametric tests for and against likelihood ratio ordering in the two-sample problem," Biometrika, Biometrika Trust, vol. 92(1), pages 159-171, March.
    15. Beare, Brendan K. & Moon, Jong-Myun, 2015. "Nonparametric Tests Of Density Ratio Ordering," Econometric Theory, Cambridge University Press, vol. 31(3), pages 471-492, June.
    16. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    17. P. M. Hartigan, 1985. "Computation of the Dip Statistic to Test for Unimodality," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 34(3), pages 320-325, November.
    18. McCrary, Justin, 2008. "Manipulation of the running variable in the regression discontinuity design: A density test," Journal of Econometrics, Elsevier, vol. 142(2), pages 698-714, February.
    19. Joseph P. Romano & Michael Wolf, 2011. "Testing for monotonicity in expected asset returns," ECON - Working Papers 017, Department of Economics - University of Zurich, revised Jan 2013.
    20. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Enzinger, Matthias & Gechert, Sebastian & Heimberger, Philipp & Prante, Franz & Romero, Daniel F., 2025. "The overstated effects of conventional monetary policy on output and prices," I4R Discussion Paper Series 264, The Institute for Replication (I4R).
    2. Guido W. Imbens, 2021. "Statistical Significance, p-Values, and the Reporting of Uncertainty," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 157-174, Summer.
    3. Vu, Patrick, 2024. "Why are replication rates so low?," Journal of Econometrics, Elsevier, vol. 245(1).
    4. Josef Bajzik & Jan Janku & Simona Malovana & Klara Moravcova & Ngoc Anh Ngo, 2023. "Monetary Policy Has a Long-Lasting Impact on Credit: Evidence from 91 VAR Studies," Working Papers 2023/19, Czech National Bank, Research and Statistics Department.
    5. Simona Malovaná & Martin Hodula & Zuzana Gric & Josef Bajzík, 2025. "Borrower‐based macroprudential measures and credit growth: How biased is the existing literature?," Journal of Economic Surveys, Wiley Blackwell, vol. 39(1), pages 66-102, February.
    6. Ali Elminejad & Tomas Havranek & Roman Horvath & Zuzana Irsova, 2023. "Intertemporal Substitution in Labor Supply: A Meta-Analysis," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 51, pages 1095-1113, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting $p$-Hacking," Papers 2205.07950, arXiv.org, revised Aug 2025.
    2. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    3. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    4. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    5. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    6. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    7. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," IZA Discussion Papers 15478, Institute of Labor Economics (IZA).
    8. Brodeur, Abel & Cook, Nikolai M. & Heyes, Anthony & Wright, Taylor, 2025. "Media Stars: Statistical Significance and Research Impact," I4R Discussion Paper Series 254, The Institute for Replication (I4R).
    9. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    10. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    11. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    12. Matteo Picchio & Michele Ubaldi, 2024. "Unemployment and health: A meta‐analysis," Journal of Economic Surveys, Wiley Blackwell, vol. 38(4), pages 1437-1472, September.
    13. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    14. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    15. Tomas Havranek & Zuzana Irsova & Lubica Laslopova & Olesia Zeynalova, 2020. "Skilled and Unskilled Labor Are Less Substitutable than Commonly Thought," Working Papers IES 2020/29, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    16. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    17. Cristina Blanco-Perez & Abel Brodeur, 2019. "Transparency in empirical economic research," IZA World of Labor, Institute of Labor Economics (IZA), pages 467-467, November.
    18. Ali Elminejad & Tomas Havranek & Roman Horvath & Zuzana Irsova, 2023. "Intertemporal Substitution in Labor Supply: A Meta-Analysis," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 51, pages 1095-1113, December.
    19. Vu, Patrick, 2024. "Why are replication rates so low?," Journal of Econometrics, Elsevier, vol. 245(1).
    20. Yuta Okamoto & Yuuki Ozaki, 2024. "On Extrapolation of Treatment Effects in Multiple-Cutoff Regression Discontinuity Designs," Papers 2412.04265, arXiv.org, revised Sep 2025.

    More about this item

    Lists

    This item is featured on the following reading lists, Wikipedia pages, or ReplicationWiki pages:
    1. Meta-Research in Economics

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:emetrp:v:90:y:2022:i:2:p:887-906. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/essssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.