Printed from https://ideas.repec.org/a/jns/jbstat/v239y2019i4p703-721n8.html

Twenty Steps Towards an Adequate Inferential Interpretation of p-Values in Econometrics

Author

Listed:
  • Hirschauer Norbert
  • Grüner Sven

    (Faculty of Natural Sciences III, Institute of Agricultural and Nutritional Sciences, Agribusiness Management, Martin Luther University Halle-Wittenberg, Karl-Freiherr-von-Fritsch-Str. 4, D-06120 Halle (Saale), Germany)

  • Mußhoff Oliver

    (Department for Agricultural Economics and Rural Development, Farm Management, Georg August University Göttingen, Platz der Göttinger Sieben 5, D-37073 Göttingen, Germany)

  • Becker Claudia

    (Faculty of Law and Economics, Institute of Business Studies, Chair of Statistics, Martin Luther University Halle-Wittenberg, Große Steinstraße 73, D-06099 Halle (Saale), Germany)

Abstract

We suggest twenty immediately actionable steps to reduce widespread inferential errors related to “statistical significance testing.” Our propositions address the theoretical preconditions for using p-values. They furthermore include wording guidelines as well as structural and operative advice on how to present results, especially in research based on multiple regression analysis, the workhorse of empirical economists. Our propositions aim at fostering the logical consistency of inferential arguments by avoiding false categorical reasoning. They are not aimed at dispensing with p-values or at completely replacing frequentist approaches with Bayesian statistics.
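The "false categorical reasoning" the abstract warns against can be illustrated with a minimal sketch of the dichotomization problem (cf. the Gelman & Stern reference below): two regression coefficients may fall on opposite sides of the 5% threshold even though the difference between them is nowhere near significant. The coefficients and standard errors below are invented for illustration; p-values are computed under a normal approximation.

```python
import math

def p_value(estimate, se):
    """Two-sided p-value for H0: coefficient = 0, under a normal approximation."""
    z = abs(estimate / se)
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Two hypothetical regression coefficients with identical standard errors.
b1, se1 = 0.25, 0.10   # p ≈ 0.012 -> "significant" at the 5% level
b2, se2 = 0.15, 0.10   # p ≈ 0.134 -> "not significant"

p1 = p_value(b1, se1)
p2 = p_value(b2, se2)

# The *difference* between the two estimates, however, is far from significant:
se_diff = math.sqrt(se1**2 + se2**2)
p_diff = p_value(b1 - b2, se_diff)   # p ≈ 0.48
```

Labeling the first estimate a "finding" and the second a "null result" is exactly the categorical reasoning the paper argues against: the evidence in the two studies is mutually compatible.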

Suggested Citation

  • Hirschauer Norbert & Grüner Sven & Mußhoff Oliver & Becker Claudia, 2019. "Twenty Steps Towards an Adequate Inferential Interpretation of p-Values in Econometrics," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 239(4), pages 703-721, August.
  • Handle: RePEc:jns:jbstat:v:239:y:2019:i:4:p:703-721:n:8
    DOI: 10.1515/jbnst-2018-0069

    Download full text from publisher

    File URL: https://doi.org/10.1515/jbnst-2018-0069
    Download Restriction: no

    File URL: https://libkey.io/10.1515/jbnst-2018-0069?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Deirdre N. McCloskey & Stephen T. Ziliak, 1996. "The Standard Error of Regressions," Journal of Economic Literature, American Economic Association, vol. 34(1), pages 97-114, March.
    2. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    3. Blakeley B. McShane & David Gal, 2017. "Rejoinder: Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 904-908, July.
    4. Maren Duvendack & Richard Palmer-Jones & W. Robert Reed, 2017. "What Is Meant by "Replication" and Why Does It Encounter Resistance in Economics?," American Economic Review, American Economic Association, vol. 107(5), pages 46-51, May.
    5. Sellke T. & Bayarri M. J. & Berger J. O., 2001. "Calibration of p Values for Testing Precise Null Hypotheses," The American Statistician, American Statistical Association, vol. 55, pages 62-71, February.
    6. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    7. Walter Krämer, 2011. "The Cult of Statistical Significance – What Economists Should and Should Not Do to Make their Data Talk," Schmollers Jahrbuch : Journal of Applied Social Science Studies / Zeitschrift für Wirtschafts- und Sozialwissenschaften, Duncker & Humblot, Berlin, vol. 131(3), pages 455-468.
    8. Maren Duvendack & Richard W. Palmer-Jones & W. Robert Reed, 2015. "Replications in Economics: A Progress Report," Econ Journal Watch, Econ Journal Watch, vol. 12(2), pages 164-191, May.
    9. Andrew Gelman & John Carlin, 2017. "Some Natural Solutions to the p-Value Communication Problem—and Why They Won’t Work," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 899-901, July.
    10. Donald Berry, 2017. "A p-Value to Die For," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 895-897, July.
    11. Gelman, Andrew & Stern, Hal, 2006. "The Difference Between “Significant” and “Not Significant” is not Itself Statistically Significant," The American Statistician, American Statistical Association, vol. 60, pages 328-331, November.
    12. Stephen T. Ziliak, 2016. "Statistical significance and scientific misconduct: improving the style of the published research paper," Review of Social Economy, Taylor & Francis Journals, vol. 74(1), pages 83-97, March.
    13. Hirschauer Norbert & Grüner Sven & Mußhoff Oliver & Frey Ulrich & Theesfeld Insa & Wagner Peter, 2016. "Die Interpretation des p-Wertes – Grundsätzliche Missverständnisse" [The interpretation of the p-value – fundamental misunderstandings], Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 236(5), pages 557-575, October.
    14. John Ioannidis & Chris Doucouliagos, 2013. "What's to Know about the Credibility of Empirical Economics?," Journal of Economic Surveys, Wiley Blackwell, vol. 27(5), pages 997-1004, December.
    15. Blakeley B. McShane & David Gal, 2017. "Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 885-895, July.
    16. Danilov, Dmitry & Magnus, Jan R., 2004. "On the harm that ignoring pretesting can cause," Journal of Econometrics, Elsevier, vol. 122(1), pages 27-46, September.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Hirschauer, Norbert & Grüner, Sven & Mußhoff, Oliver & Becker, Claudia & Jantsch, Antje, 2020. "Can p-values be meaningfully interpreted without random sampling?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 14, pages 71-91.
    2. Grüner Sven, 2020. "Sample Size Calculation in Economic Experiments," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 240(6), pages 791-823, December.
    3. Herzfeld, Thomas & Akhmadiyeva, Zarema, 2021. "Agricultural labour in transition: An update," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 22(3), pages 144-160.
    4. David J. Hand, 2022. "Trustworthiness of statistical inference," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 185(1), pages 329-347, January.
    5. Hirschauer, Norbert & Grüner, Sven & Mußhoff, Oliver & Becker, Claudia, 2020. "Inference in economic experiments," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 14, pages 1-14.
    6. Hirschauer, Norbert & Gruener, Sven & Mußhoff, Oliver & Becker, Claudia, 2020. "A primer on p-value thresholds and α-levels – two different kettles of fish," SocArXiv d46m2, Center for Open Science.
    7. Elisa Giampietri & Giuseppe Bugin & Samuele Trestini, 2021. "On the association between risk attitude and fruit and vegetable consumption: insights from university students in Italy," Agricultural and Food Economics, Springer;Italian Society of Agricultural Economics (SIDEA), vol. 9(1), pages 1-16, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jens Rommel & Meike Weltin, 2021. "Is There a Cult of Statistical Significance in Agricultural Economics?," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(3), pages 1176-1191, September.
    2. Hirschauer Norbert & Grüner Sven & Mußhoff Oliver & Frey Ulrich & Theesfeld Insa & Wagner Peter, 2016. "Die Interpretation des p-Wertes – Grundsätzliche Missverständnisse" [The interpretation of the p-value – fundamental misunderstandings], Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 236(5), pages 557-575, October.
    3. Hirschauer, Norbert & Grüner, Sven & Mußhoff, Oliver & Becker, Claudia & Jantsch, Antje, 2020. "Can p-values be meaningfully interpreted without random sampling?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 14, pages 71-91.
    4. Heckelei, Thomas & Huettel, Silke & Odening, Martin & Rommel, Jens, 2021. "The replicability crisis and the p-value debate – what are the consequences for the agricultural and food economics community?," Discussion Papers 316369, University of Bonn, Institute for Food and Resource Economics.
    5. Stephan B. Bruns & David I. Stern, 2019. "Lag length selection and p-hacking in Granger causality testing: prevalence and performance of meta-regression models," Empirical Economics, Springer, vol. 56(3), pages 797-830, March.
    6. Kim, Jae H. & Ji, Philip Inyeob, 2015. "Significance testing in empirical finance: A critical review and assessment," Journal of Empirical Finance, Elsevier, vol. 34(C), pages 1-14.
    7. Blakeley B. McShane & David Gal, 2016. "Blinding Us to the Obvious? The Effect of Statistical Training on the Evaluation of Evidence," Management Science, INFORMS, vol. 62(6), pages 1707-1718, June.
    8. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    9. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    10. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working and Discussion Papers WP 5/2020, Research Department, National Bank of Slovakia.
    11. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    12. Jae H. Kim & Kamran Ahmed & Philip Inyeob Ji, 2018. "Significance Testing in Accounting Research: A Critical Evaluation Based on Evidence," Abacus, Accounting Foundation, University of Sydney, vol. 54(4), pages 524-546, December.
    13. Anderson, Brian S. & Wennberg, Karl & McMullen, Jeffery S., 2019. "Editorial: Enhancing quantitative theory-testing entrepreneurship research," Journal of Business Venturing, Elsevier, vol. 34(5), pages 1-1.
    14. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    15. Wennberg, Karl & Anderson, Brian S. & McMullen, Jeffrey, 2019. "Editorial: Enhancing Quantitative Theory-Testing Entrepreneurship Research," Ratio Working Papers 323, The Ratio Institute.
    16. David J. Hand, 2022. "Trustworthiness of statistical inference," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 185(1), pages 329-347, January.
    17. J. M. Bauer & L. A. Reisch, 2019. "Behavioural Insights and (Un)healthy Dietary Choices: a Review of Current Evidence," Journal of Consumer Policy, Springer, vol. 42(1), pages 3-45, March.
    18. Jeffrey A. Mills & Gary Cornwall & Beau A. Sauley & Jeffrey R. Strawn, 2018. "Improving the Analysis of Randomized Controlled Trials: a Posterior Simulation Approach," BEA Working Papers 0157, Bureau of Economic Analysis.
    19. Snyder, Christopher & Zhuo, Ran, 2018. "Sniff Tests in Economics: Aggregate Distribution of Their Probability Values and Implications for Publication Bias," MetaArXiv 8vdrh, Center for Open Science.
    20. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:jns:jbstat:v:239:y:2019:i:4:p:703-721:n:8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Peter Golla (email available below). General contact details of provider: https://www.degruyter.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.