Printed from https://ideas.repec.org/a/spr/jesaex/v6y2020i2d10.1007_s40881-020-00090-5.html

Power(ful) guidelines for experimental economists

Author

Listed:
  • Kathryn N. Vasilaky

    (California Polytechnic State University
    Columbia University)

  • J. Michelle Brock

    (European Bank for Reconstruction and Development and CEPR)

Abstract

Statistical power is an important consideration in the design phase of any experiment. This paper serves as a reference for experimental economists on power calculations. We synthesize many of the questions and issues frequently raised about power calculations and the literature that surrounds them. We provide practical coded examples and tools available for calculating power, and suggest when and how to report power calculations in published studies.
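The paper's own coded examples are not reproduced on this page. As an illustration of the kind of calculation it discusses, the sketch below computes an approximate per-arm sample size for a two-sample test of means using the standard normal-approximation formula, n ≈ 2((z_{1-α/2} + z_{1-β})/d)², where d is the standardized effect size (Cohen's d). The function name and the chosen inputs (d = 0.5, α = 0.05, 80% power) are illustrative assumptions, not values taken from the article.

```python
# Normal-approximation sample size for a two-sided, two-sample test of means.
# Illustrative sketch only -- not the paper's own code or notation.
import math
from statistics import NormalDist


def sample_size_per_arm(effect_size: float, alpha: float = 0.05,
                        power: float = 0.8) -> int:
    """Approximate n per treatment arm to detect a standardized mean
    difference `effect_size` (Cohen's d) at significance level `alpha`
    with the given power, using standard normal critical values."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for test size
    z_beta = NormalDist().inv_cdf(power)           # quantile for target power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)


# A "medium" effect of d = 0.5 at alpha = 0.05 and 80% power:
print(sample_size_per_arm(0.5))  # 63 per arm under the normal approximation
```

The normal approximation slightly understates the exact t-test requirement for small samples; dedicated tools (such as the powerBBK package cited below, or simulation-based approaches) handle more realistic designs.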

Suggested Citation

  • Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
  • Handle: RePEc:spr:jesaex:v:6:y:2020:i:2:d:10.1007_s40881-020-00090-5
    DOI: 10.1007/s40881-020-00090-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40881-020-00090-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s40881-020-00090-5?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Anderson, Michael L, 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Department of Agricultural & Resource Economics, UC Berkeley, Working Paper Series qt15n8j26f, Department of Agricultural & Resource Economics, UC Berkeley.
    2. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    3. Mariel McKenzie Finucane & Ignacio Martinez & Scott Cody, "undated". "What Works for Whom? A Bayesian Approach to Channeling Big Data Streams for Public Program Evaluation," Mathematica Policy Research Reports 982eef5914cb4e39b91da7114, Mathematica Policy Research.
    4. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    5. Bellemare, Charles & Bissonnette, Luc & Kröger, Sabine, 2014. "Statistical Power of Within and Between-Subjects Designs in Economic Experiments," IZA Discussion Papers 8583, Institute of Labor Economics (IZA).
    6. Denes Szucs & John P A Ioannidis, 2017. "Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-18, March.
    7. John List & Sally Sadoff & Mathis Wagner, 2011. "So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 439-457, November.
    8. Charness, Gary & Gneezy, Uri & Kuhn, Michael A., 2012. "Experimental methods: Between-subject and within-subject design," Journal of Economic Behavior & Organization, Elsevier, vol. 81(1), pages 1-8.
    9. Jeffrey M Wooldridge, 2010. "Econometric Analysis of Cross Section and Panel Data," MIT Press Books, The MIT Press, edition 2, volume 1, number 0262232588, December.
    10. Charles Bellemare & Luc Bissonnette & Sabine Kröger, 2016. "Simulating power of economic experiments: the powerBBK package," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 2(2), pages 157-168, November.
    11. Abhijit Banerjee & Esther Duflo & Amy Finkelstein & Lawrence F. Katz & Benjamin A. Olken & Anja Sautmann, 2020. "In Praise of Moderation: Suggestions for the Scope and Use of Pre-Analysis Plans for RCTs in Economics," NBER Working Papers 26993, National Bureau of Economic Research, Inc.
    12. Nikos Nikiforakis & Robert Slonim, 2015. "Editors’ preface: statistics, replications and null results," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 1(2), pages 127-131, December.
    13. Athey, Susan & Imbens, Guido W. & Bayati, Mohsen, 2019. "Optimal Experimental Design for Staggered Rollouts," Research Papers 3837, Stanford University, Graduate School of Business.
    14. Andreas Ortmann & Le Zhang, 2013. "Exploring the Meaning of Significance in Experimental Economics," Discussion Papers 2013-32, School of Economics, The University of New South Wales.
    15. Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, 2015. "The veil of experimental currency units in second price auctions," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 1(2), pages 182-196, December.
    16. Hoenig, J. M. & Heisey, D. M., 2001. "The Abuse of Power: The Pervasive Fallacy of Power Calculations for Data Analysis," The American Statistician, American Statistical Association, vol. 55, pages 19-24, February.
    17. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.
    18. Lenth, R. V., 2001. "Some Practical Guidelines for Effective Sample Size Determination," The American Statistician, American Statistical Association, vol. 55, pages 187-193, August.
    19. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    20. Johannes Ledolter, 2013. "Economic Field Experiments: Comments on Design Efficiency, Sample Size and Statistical Power," Journal of Economics and Management, College of Business, Feng Chia University, Taiwan, vol. 9(2), pages 271-290, July.
    21. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    22. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Bejarano, Hernán & Gillet, Joris & Rodriguez-Lara, Ismael, 2021. "Trust and trustworthiness after negative random shocks," Journal of Economic Psychology, Elsevier, vol. 86(C).
    2. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2021. "Editorial favoritism in the field of laboratory experimental economics (RM/20/014-revised-)," Research Memorandum 005, Maastricht University, Graduate School of Business and Economics (GSBE).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Weili Ding, 2020. "Laboratory experiments can pre-design to address power and selection issues," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 125-138, December.
    3. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    4. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2020. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics (RM/19/029-revised-)," Research Memorandum 014, Maastricht University, Graduate School of Business and Economics (GSBE).
    5. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    6. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2021. "Editorial favoritism in the field of laboratory experimental economics (RM/20/014-revised-)," Research Memorandum 005, Maastricht University, Graduate School of Business and Economics (GSBE).
    7. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    8. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2019. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics," Research Memorandum 029, Maastricht University, Graduate School of Business and Economics (GSBE).
    9. Guigonan S. Adjognon & Daan van Soest & Jonas Guthoff, 2021. "Reducing Hunger with Payments for Environmental Services (PES): Experimental Evidence from Burkina Faso," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(3), pages 831-857, May.
    10. Adjognon,Guigonan Serge & Nguyen Huy,Tung & Guthoff,Jonas Christoph & van Soest,Daan, 2022. "Incentivizing Social Learning for the Diffusion of Climate-Smart Agricultural Techniques," Policy Research Working Paper Series 10041, The World Bank.
    11. Iavor Bojinov & Ashesh Rambachan & Neil Shephard, 2021. "Panel experiments and dynamic causal effects: A finite population perspective," Quantitative Economics, Econometric Society, vol. 12(4), pages 1171-1196, November.
    12. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    13. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    14. Sebastian Jobjörnsson & Henning Schaak & Oliver Musshoff & Tim Friede, 2023. "Improving the statistical power of economic experiments using adaptive designs," Experimental Economics, Springer;Economic Science Association, vol. 26(2), pages 357-382, April.
    15. Dorner, Zack, 2019. "A behavioral rebound effect," Journal of Environmental Economics and Management, Elsevier, vol. 98(C).
    16. Leah H. Palm-Forster & Paul J. Ferraro & Nicholas Janusch & Christian A. Vossler & Kent D. Messer, 2019. "Behavioral and Experimental Agri-Environmental Research: Methodological Challenges, Literature Gaps, and Recommendations," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 719-742, July.
    17. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    18. Alice Tianbo Zhang & Sasmita Patnaik & Shaily Jha & Shalu Agrawal & Carlos F. Gould & Johannes Urpelainen, 2022. "Evidence of multidimensional gender inequality in energy services from a large-scale household survey in India," Nature Energy, Nature, vol. 7(8), pages 698-707, August.
    19. Roy Chen & Yan Chen & Yohanes E. Riyanto, 2021. "Best practices in replication: a case study of common information in coordination games," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 2-30, March.
    20. Agostinelli, Francesco & Avitabile, Ciro & Bobba, Matteo, 2021. "Enhancing Human Capital in Children: A Case Study on Scaling," TSE Working Papers 21-1196, Toulouse School of Economics (TSE), revised Oct 2023.

    More about this item

    Keywords

    Power; Experiments; Design; Significance;

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jesaex:v:6:y:2020:i:2:d:10.1007_s40881-020-00090-5. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of the provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.