
Is there a cult of statistical significance in Agricultural Economics?


  • Rommel, Jens
  • Weltin, Meike


In an analysis of articles published in ten years of the American Economic Review, Deirdre McCloskey and Stephen Ziliak have shown that economists often fail to adequately distinguish between economic and statistical significance. In this paper, we briefly review their arguments and develop a ten-item questionnaire on statistical practice in the Agricultural Economics community. We apply our questionnaire to the 2015 volumes of the American Journal of Agricultural Economics, the European Review of Agricultural Economics, the Journal of Agricultural Economics, and the American Economic Review. We focus specifically on the “sizeless stare” and the neglect of economic significance. Our initial results indicate that there is room for improvement in statistical practice: empirical papers rarely consider the power of statistical tests or run simulations, and the economic consequences of estimation results are often not adequately addressed. We discuss the implications of our findings for the publication process and for teaching in Agricultural Economics.
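The distinction the abstract draws between statistical and economic significance can be illustrated with a small simulation. The sketch below is hypothetical and not taken from the paper; the effect size, sample size, and variable names are illustrative assumptions. With a sufficiently large sample, an effect that is economically negligible (here, 0.05 standard deviations) can still be highly statistically significant, which is why interpretation should rest on effect sizes and not on p-values alone.

```python
import math
import random
import statistics

random.seed(42)

# A deliberately tiny "treatment effect" of 0.05 standard deviations,
# small enough to be economically negligible in most applications.
effect = 0.05
n = 20_000  # observations per group; large samples shrink standard errors

control = [random.gauss(0.0, 1.0) for _ in range(n)]
treated = [random.gauss(effect, 1.0) for _ in range(n)]

# Two-sample z-test on the difference in means
diff = statistics.fmean(treated) - statistics.fmean(control)
se = math.sqrt(statistics.variance(control) / n + statistics.variance(treated) / n)
z = diff / se

# Two-sided p-value from the standard normal CDF
p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

print(f"estimated effect: {diff:.4f} sd, z = {z:.2f}, p = {p:.2e}")
```

Despite the trivial magnitude of the effect, the test comfortably rejects the null at conventional levels; a "sizeless" reading of the p-value alone would miss that the effect is too small to matter economically.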

Suggested Citation

  • Rommel, Jens & Weltin, Meike, 2017. "Is there a cult of statistical significance in Agricultural Economics?," Paper 261998, 57th Annual Conference, Weihenstephan, Germany, September 13-15, 2017, German Association of Agricultural Economists (GEWISOLA).
  • Handle: RePEc:ags:gewi17:261998
    DOI: 10.22004/ag.econ.261998


    References listed on IDEAS

    1. Deirdre N. McCloskey & Stephen T. Ziliak, 1996. "The Standard Error of Regressions," Journal of Economic Literature, American Economic Association, vol. 34(1), pages 97-114, March.
    2. Stephen T. Ziliak & Deirdre N. McCloskey, 2004. "Size Matters: The Standard Error of Regressions in the American Economic Review," Econ Journal Watch, Econ Journal Watch, vol. 1(2), pages 331-358, August.
    3. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    4. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    5. Martin B. Hackmann & Jonathan T. Kolstad & Amanda E. Kowalski, 2015. "Adverse Selection and an Individual Mandate: When Theory Meets Practice," American Economic Review, American Economic Association, vol. 105(3), pages 1030-1066, March.
    6. Marcella Alsan, 2015. "The Effect of the TseTse Fly on African Development," American Economic Review, American Economic Association, vol. 105(1), pages 382-410, January.
    7. Altman, Morris, 2004. "Statistical significance, path dependency, and the culture of journal publication," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 33(5), pages 651-663, November.
    8. Ronald L. Wasserstein & Nicole A. Lazar, 2016. "The ASA's Statement on p-Values: Context, Process, and Purpose," The American Statistician, Taylor & Francis Journals, vol. 70(2), pages 129-133, May.
    9. Steven Tadelis & Florian Zettelmeyer, 2015. "Information Disclosure as a Matching Mechanism: Theory and Evidence from a Field Experiment," American Economic Review, American Economic Association, vol. 105(2), pages 886-905, February.
    10. Alessandro Bonanno & Rui Huang & Yizao Liu, 2015. "Simulating welfare effects of the European nutrition and health claims’ regulation: the Italian yogurt market," European Review of Agricultural Economics, Foundation for the European Review of Agricultural Economics, vol. 42(3), pages 499-533.
    11. Johannes Sauer & Uwe Latacz-Lohmann, 2015. "Investment, technical change and efficiency: empirical evidence from German dairy production," European Review of Agricultural Economics, Foundation for the European Review of Agricultural Economics, vol. 42(1), pages 151-175.
    12. Stefano Castriota & Marco Delmastro, 2015. "The Economics of Collective Reputation: Evidence from the Wine Industry," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 97(2), pages 469-489.
    13. Alfonso Flores-Lagunes & Troy Timko, 2015. "Does Participation in 4-H Improve Schooling Outcomes? Evidence from Florida," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 97(2), pages 414-434.
    14. Norbert Hirschauer & Sven Grüner & Oliver Mußhoff & Ulrich Frey & Insa Theesfeld & Peter Wagner, 2016. "Die Interpretation des p-Wertes – Grundsätzliche Missverständnisse" [The Interpretation of the p-Value – Fundamental Misunderstandings], Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 236(5), pages 557-575, October.
    15. Roland Herrmann & Ernst Berg & Stephan Dabbert & Siegfried Pöchtrager & Klaus Salhofer, 2011. "Going Beyond Impact Factors: A Survey‐based Journal Ranking by Agricultural Economists," Journal of Agricultural Economics, Wiley Blackwell, vol. 62(3), pages 710-732, September.


    Citations are extracted by the CitEc Project.

    Cited by:

    1. Janzen, Sarah & Michler, Jeffrey D., 2020. "Ulysses' Pact or Ulysses' Raft: Using Pre-Analysis Plans in Experimental and Non-Experimental Research," MetaArXiv wkmht, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    3. Stephan B. Bruns & David I. Stern, 2019. "Lag length selection and p-hacking in Granger causality testing: prevalence and performance of meta-regression models," Empirical Economics, Springer, vol. 56(3), pages 797-830, March.
    4. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    5. Zhimin Zhou & Xinyue Ye & Xiangyu Ge, 2017. "The Impacts of Technical Progress on Sulfur Dioxide Kuznets Curve in China: A Spatial Panel Data Approach," Sustainability, MDPI, Open Access Journal, vol. 9(4), pages 1-27, April.
    6. Alexander Libman & Joachim Zweynert, 2014. "Ceremonial Science: The State of Russian Economics Seen Through the Lens of the Work of ‘Doctor of Science’ Candidates," Working Papers 337, Leibniz Institut für Ost- und Südosteuropaforschung (Institute for East and Southeast European Studies).
    7. Gunter, Ulrich & Önder, Irem & Smeral, Egon, 2019. "Scientific value of econometric tourism demand studies," Annals of Tourism Research, Elsevier, vol. 78(C), pages 1-1.
    8. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    9. Uwe Hassler & Marc-Oliver Pohle, 2020. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," Papers 2009.02198, arXiv.org.
    10. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    11. Libman, Alexander & Zweynert, Joachim, 2014. "Ceremonial science: The state of Russian economics seen through the lens of the work of ‘Doctor of Science’ candidates," Economic Systems, Elsevier, vol. 38(3), pages 360-378.
    12. Christensen, Garret & Miguel, Edward & Sturdy, Jennifer, 2017. "Transparency, Reproducibility, and the Credibility of Economics Research," MetaArXiv 9a3rw, Center for Open Science.
    13. Klaus E Meyer & Arjen Witteloostuijn & Sjoerd Beugelsdijk, 2017. "What’s in a p? Reassessing best practices for conducting and reporting hypothesis-testing research," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 48(5), pages 535-551, July.
    14. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    15. John D. Levendis, 2018. "Time Series Econometrics," Springer Texts in Business and Economics, Springer, number 978-3-319-98282-3, April.
    16. Altman, Morris, 2020. "A more scientific approach to applied economics: Reconstructing statistical, analytical significance, and correlation analysis," Economic Analysis and Policy, Elsevier, vol. 66(C), pages 315-324.
    17. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    18. Kim, Jae H. & Ji, Philip Inyeob, 2015. "Significance testing in empirical finance: A critical review and assessment," Journal of Empirical Finance, Elsevier, vol. 34(C), pages 1-14.
    19. Bruns, Stephan B. & Kalthaus, Martin, 2020. "Flexibility in the selection of patent counts: Implications for p-hacking and evidence-based policymaking," Research Policy, Elsevier, vol. 49(1).
    20. Blakeley B. McShane & David Gal, 2016. "Blinding Us to the Obvious? The Effect of Statistical Training on the Evaluation of Evidence," Management Science, INFORMS, vol. 62(6), pages 1707-1718, June.

    More about this item


    Keywords: Research Methods/Statistical Methods




    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.