
Star Wars: The Empirics Strike Back

Author

Listed:
  • Abel Brodeur

    (University of Ottawa)

  • Mathias Lé

    (PSE - Paris School of Economics, Université Paris 1 Panthéon-Sorbonne; PSE - Paris-Jourdan Sciences Économiques)

  • Marc Sangnier

    (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille, Aix-Marseille Université)

  • Yanos Zylberberg

    (School of Economics, Finance and Management, University of Bristol)

Abstract

Journals favor rejection of the null hypothesis. This selection upon tests may distort the behavior of researchers. Using 50,000 tests published between 2005 and 2011 in the AER, JPE, and QJE, we identify a residual in the distribution of tests that cannot be explained by selection. The distribution of p-values exhibits a two-humped camel shape with abundant p-values above 0.25, a valley between 0.25 and 0.10, and a bump slightly below 0.05. The missing tests (with p-values between 0.25 and 0.10) can be retrieved just after the 0.05 threshold and represent 10% to 20% of marginally rejected tests. Our interpretation is that researchers might be tempted to inflate the value of those just-rejected tests by choosing a "significant" specification. We propose a method to measure this residual and describe how it varies by article and author characteristics.
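
To make the mechanism concrete, the following is a minimal simulation sketch in Python. It is not the authors' code: the mixture weights, effect size, and inflation probability are all hypothetical. It illustrates how re-specifying marginally insignificant tests until they cross the 0.05 threshold hollows out the 0.10 to 0.25 range and piles mass just below 0.05, producing the two-humped shape described in the abstract.

# Illustrative sketch only (not the authors' method); all parameter
# values below are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_tests = 50_000

# Mix of true nulls (z ~ N(0, 1)) and a stylized alternative (z ~ N(2, 1)).
is_null = rng.random(n_tests) < 0.5
z = np.where(is_null,
             rng.normal(0.0, 1.0, n_tests),
             rng.normal(2.0, 1.0, n_tests))
p = 2 * (1 - stats.norm.cdf(np.abs(z)))  # two-sided p-values

# Stylized inflation: with probability 0.3 (hypothetical), a test that
# lands between 0.10 and 0.25 is re-specified until it falls just below
# the 0.05 significance threshold.
marginal = (p > 0.10) & (p < 0.25)
hacked = marginal & (rng.random(n_tests) < 0.3)
p[hacked] = rng.uniform(0.030, 0.049, hacked.sum())

# Coarse text histogram: abundant mass above 0.25, a valley between
# 0.10 and 0.25, and a bump just below 0.05.
counts, edges = np.histogram(p, bins=np.linspace(0.0, 1.0, 21))
for lo, c in zip(edges[:-1], counts):
    print(f"{lo:.2f}-{lo + 0.05:.2f} {'#' * int(c // 200)}")

In this stylized setting, the excess mass just below 0.05 equals exactly the mass removed from the 0.10 to 0.25 valley, which is the kind of residual the paper's method is designed to measure.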

Suggested Citation

  • Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2015. "Star Wars: The Empirics Strike Back," Working Papers halshs-01158500, HAL.
  • Handle: RePEc:hal:wpaper:halshs-01158500
    Note: View the original document on HAL open archive server: https://shs.hal.science/halshs-01158500

    Download full text from publisher

    File URL: https://shs.hal.science/halshs-01158500/document
    Download Restriction: no

    Other versions of this item:

    • Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.

    References listed on IDEAS

    1. Emeric Henry, 2009. "Strategic Disclosure of Research Results: The Cost of Proving Your Honesty," Economic Journal, Royal Economic Society, vol. 119(539), pages 1036-1064, July.
    2. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    3. Mario Forni & Luca Gambetti & Marco Lippi & Luca Sala, 2017. "Noise Bubbles," Economic Journal, Royal Economic Society, vol. 127(604), pages 1940-1976, September.
    4. Card, David & Krueger, Alan B, 1995. "Time-Series Minimum-Wage Studies: A Meta-analysis," American Economic Review, American Economic Association, vol. 85(2), pages 238-243, May.
    5. Orley Ashenfelter & Michael Greenstone, 2004. "Estimating the Value of a Statistical Life: The Importance of Omitted Variables and Publication Bias," American Economic Review, American Economic Association, vol. 94(2), pages 454-460, May.
    6. Denton, Frank T, 1985. "Data Mining as an Industry," The Review of Economics and Statistics, MIT Press, vol. 67(1), pages 124-127, February.
    7. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    8. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    9. B.D. McCullough & Kerry Anne McGeary & Teresa D. Harrison, 2008. "Do economics journal archives promote replicable research?," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 41(4), pages 1406-1420, November.
    10. Orley Ashenfelter & Colm Harmon & Hessel Oosterbeek, 1999. "A Review of Estimates of the Schooling/Earnings Relationship, with Tests for Publication Bias," Working Papers 804, Princeton University, Department of Economics, Industrial Relations Section.
    11. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    12. David F. Hendry & Hans‐Martin Krolzig, 2004. "We Ran One Regression," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 66(5), pages 799-810, December.
    13. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    2. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    3. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    4. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    5. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    6. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    7. Havranek, Tomas & Horvath, Roman & Irsova, Zuzana & Rusnak, Marek, 2015. "Cross-country heterogeneity in intertemporal substitution," Journal of International Economics, Elsevier, vol. 96(1), pages 100-118.
    8. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    9. Stephan B. Bruns, 2013. "Identifying Genuine Effects in Observational Research by Means of Meta-Regressions," Jena Economics Research Papers 2013-040, Friedrich-Schiller-University Jena.
    10. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    11. Tomas Havranek, 2013. "Publication Bias in Measuring Intertemporal Substitution," Working Papers IES 2013/15, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Oct 2013.
    12. Tomas Havranek & Anna Sokolova, 2016. "Do Consumers Really Follow a Rule of Thumb? Three Thousand Estimates from 130 Studies Say "Probably Not"," Working Papers 2016/08, Czech National Bank.
    13. Ali Elminejad & Tomas Havranek & Roman Horvath & Zuzana Irsova, 2023. "Intertemporal Substitution in Labor Supply: A Meta-Analysis," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 51, pages 1095-1113, December.
    14. Tomas Havranek & Anna Sokolova, 2020. "Do Consumers Really Follow a Rule of Thumb? Three Thousand Estimates from 144 Studies Say 'Probably Not'," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 35, pages 97-122, January.
    15. Sebastian Gechert & Tomas Havranek & Zuzana Irsova & Dominika Kolcunova, 2022. "Measuring Capital-Labor Substitution: The Importance of Method Choices and Publication Bias," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 45, pages 55-82, July.
    16. Chris Doucouliagos & T.D. Stanley, 2013. "Are All Economic Facts Greatly Exaggerated? Theory Competition And Selectivity," Journal of Economic Surveys, Wiley Blackwell, vol. 27(2), pages 316-339, April.
    17. Stephan B. Bruns, 2016. "The Fragility of Meta-Regression Models in Observational Research," MAGKS Papers on Economics 201603, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    18. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    19. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    20. Polák, Petr, 2017. "The productivity paradox: A meta-analysis," Information Economics and Policy, Elsevier, vol. 38(C), pages 38-54.

    More about this item

    Keywords

    hypothesis testing; distorting incentives; selection bias; research in economics;

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C13 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Estimation: General
    • C44 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics - - - Operations Research; Statistical Decision Theory


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hal:wpaper:halshs-01158500. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: CCSD (email available below). General contact details of provider: https://hal.archives-ouvertes.fr/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.