
The power of experiments: How big is your n?

Author

Listed:
  • Igor Asanov

    (University of Kassel)

  • Christoph Buehren

    (Clausthal University of Technology)

  • Panagiota Zacharodimou

    (European Parliament)

Abstract

The replicability and credibility crisis in psychology and economics has sparked a debate about underpowered experiments, publication bias, and p-hacking. Analyzing the number of independent observations in experiments published in Experimental Economics, Games and Economic Behavior, and the Journal of Economic Behavior and Organization, we observe that we have not learned much from this debate: the median experiment in our sample has too few independent observations and is therefore underpowered. Moreover, we find indications of bias in the reporting of highly significant results. We investigate which kinds of papers and experiments are more likely to show reporting biases, and we suggest remedies that could help overcome the replicability crisis.
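The paper's title question invites a quick back-of-the-envelope check. As a minimal sketch (not the authors' own procedure), the Python snippet below computes the number of independent observations per group needed to detect a standardized effect in a two-sided, two-sample comparison of means, using the standard normal approximation; the conventional 5% significance level, 80% power, and the illustrative effect size of d = 0.5 are assumptions, not figures taken from the paper.

```python
# Sketch: required sample size per group for a two-sided, two-sample
# comparison of means, via the standard normal approximation.
# Illustrative assumptions only; nothing here is taken from the paper.
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Observations per group needed to detect a standardized effect
    (Cohen's d) at significance level alpha with the given power."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided critical value
    z_beta = norm.ppf(power)           # quantile for the desired power
    return 2 * ((z_alpha + z_beta) / effect_size) ** 2

# A "medium" effect of d = 0.5 already requires about 63 independent
# observations per group.
print(round(n_per_group(0.5)))  # -> 63
```

Even a medium-sized effect thus calls for roughly 63 independent observations per group, which makes it easy to see how a median experiment with far fewer independent observations ends up underpowered.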

Suggested Citation

  • Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
  • Handle: RePEc:mar:magkse:202032

    Download full text from publisher

    File URL: https://www.uni-marburg.de/en/fb02/research-groups/economics/macroeconomics/research/magks-joint-discussion-papers-in-economics/papers/2020-papers/32-2020_asanov.pdf
    File Function: First version, 2020
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Brodeur, Abel & Cook, Nikolai & Neisser, Carina, 2022. "P-Hacking, Data Type and Data-Sharing Policy," IZA Discussion Papers 15586, Institute of Labor Economics (IZA).
    3. Luigi Butera & Philip Grossman & Daniel Houser & John List & Marie-Claire Villeval, 2020. "A New Mechanism to Alleviate the Crises of Confidence in Science - With an Application to the Public Goods Game," Artefactual Field Experiments 00684, The Field Experiments Website.
    4. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    5. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    6. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    7. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2020. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics (RM/19/029-revised-)," Research Memorandum 014, Maastricht University, Graduate School of Business and Economics (GSBE).
    8. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    9. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    10. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2023. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    11. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    12. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    13. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    14. Brinkerink, Jasper & De Massis, Alfredo & Kellermanns, Franz, 2022. "One finding is no finding: Toward a replication culture in family business research," Journal of Family Business Strategy, Elsevier, vol. 13(4).
    15. Moritz A. Drupp & Menusch Khadjavi & Rudi Voss, 2024. "The Truth-Telling of Truth-Seekers: Evidence from Online Experiments with Scientists," CESifo Working Paper Series 10897, CESifo.
    16. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    17. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    18. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    19. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    20. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).

    More about this item

    Keywords

    Statistical power; statistical significance; meta-study; balanced randomization; caliper test;
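Among these keywords, the caliper test is the standard tool for detecting bias in the reporting of highly significant results: it compares how many reported test statistics fall just above versus just below a significance threshold, where, absent selective reporting, the two counts should be roughly balanced. The sketch below illustrates the idea; the z-statistics and the caliper width are hypothetical and do not reproduce the paper's specification.

```python
# Sketch of a caliper test around the 5% threshold (|z| = 1.96).
# The z-statistics below are made-up illustrative values.
from scipy.stats import binomtest

z_stats = [1.88, 1.97, 2.01, 1.99, 2.05, 1.91, 2.03, 1.98, 2.10, 1.94]
caliper = 0.20  # assumed half-width of the window around 1.96

in_window = [z for z in z_stats if abs(z - 1.96) <= caliper]
above = sum(z > 1.96 for z in in_window)

# Exact binomial test of H0: P(just above) = 0.5 within the caliper.
# A significant excess just above the threshold suggests selective
# reporting of "significant" results.
result = binomtest(above, n=len(in_window), p=0.5)
print(above, len(in_window), result.pvalue)
```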

    JEL classification:

    • C10 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - General
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.