Printed from https://ideas.repec.org/a/spr/jesaex/v6y2020i2d10.1007_s40881-020-00089-y.html

Laboratory experiments can pre-design to address power and selection issues

Author

Listed:
  • Weili Ding

    (Queen’s University)

Abstract

In this paper, motivated by aspects of preregistration plans, we discuss issues that we believe have important implications for how experiments are designed. To make valid inferences about the effects of a treatment possible, we first illustrate how economic theory can help allocate subjects across treatments in a manner that boosts statistical power. Using data from two laboratory experiments in which subject behavior deviated sharply from theory, we show that the ex-post subject allocation that maximizes statistical power is closer to these ex-ante calculations than to traditional designs that balance the number of subjects across treatments. Finally, we call for increased attention to (i) the appropriate levels of type I and type II errors for power calculations, and (ii) how experimenters consider balance, in part by properly handling over-subscription to sessions.
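The allocation idea the abstract alludes to — splitting subjects across treatments in proportion to the outcome standard deviations rather than equally, in the spirit of the List, Sadoff & Wagner (2011) rules of thumb cited below — can be sketched as follows. This is a minimal normal-approximation illustration under assumed parameter values, not the paper's actual procedure; the function names are the editor's own.

```python
from statistics import NormalDist

_Z = NormalDist()  # standard normal distribution

def power_two_sample(delta, sd1, sd2, n1, n2, alpha=0.05):
    """Normal-approximation power of a two-sided two-sample test
    for a true mean difference `delta` with arm sizes n1, n2."""
    se = (sd1 ** 2 / n1 + sd2 ** 2 / n2) ** 0.5
    z_crit = _Z.inv_cdf(1 - alpha / 2)
    return _Z.cdf(delta / se - z_crit)

def neyman_split(total_n, sd1, sd2):
    """Allocate subjects in proportion to the treatment-arm
    standard deviations (Neyman allocation for two arms)."""
    n1 = total_n * sd1 / (sd1 + sd2)
    return n1, total_n - n1

# Illustrative (assumed) values: one arm twice as noisy as the other.
p_equal = power_two_sample(0.5, 1.0, 2.0, 50, 50)   # balanced design, ~0.35
n1, n2 = neyman_split(100, 1.0, 2.0)                 # ~33.3 / ~66.7
p_optimal = power_two_sample(0.5, 1.0, 2.0, n1, n2)  # ~0.38
```

With the noisier arm receiving twice as many subjects, the same total sample yields higher power than an equal split — the direction of the gain the abstract's ex-ante calculations exploit.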

Suggested Citation

  • Weili Ding, 2020. "Laboratory experiments can pre-design to address power and selection issues," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 125-138, December.
  • Handle: RePEc:spr:jesaex:v:6:y:2020:i:2:d:10.1007_s40881-020-00089-y
    DOI: 10.1007/s40881-020-00089-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40881-020-00089-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s40881-020-00089-y?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Baron, David P. & Ferejohn, John A., 1989. "Bargaining in Legislatures," American Political Science Review, Cambridge University Press, vol. 83(4), pages 1181-1206, December.
    3. Ham, John C. & Kagel, John H. & Lehrer, Steven F., 2005. "Randomization, endogeneity and laboratory experiments: the role of cash balances in private value auctions," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 175-205.
    4. Matthew Embrey & Guillaume R. Fréchette & Steven F. Lehrer, 2015. "Bargaining and Reputation: An Experiment on Bargaining in the Presence of Behavioural Types," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 82(2), pages 608-631.
    5. Weili Ding & Steven F. Lehrer, 2010. "Estimating Treatment Effects from Contaminated Multiperiod Education Experiments: The Dynamic Impacts of Class Size Reductions," The Review of Economics and Statistics, MIT Press, vol. 92(1), pages 31-42, February.
    6. John List & Sally Sadoff & Mathis Wagner, 2011. "So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 439-457, November.
    7. Charness, Gary & Gneezy, Uri & Kuhn, Michael A., 2012. "Experimental methods: Between-subject and within-subject design," Journal of Economic Behavior & Organization, Elsevier, vol. 81(1), pages 1-8.
    8. Marco Casari & John C. Ham & John H. Kagel, 2007. "Selection Bias, Demographic Effects, and Ability Effects in Common Value Auction Experiments," American Economic Review, American Economic Association, vol. 97(4), pages 1278-1304, September.
    9. Ham, John C & LaLonde, Robert J, 1996. "The Effect of Sample Selection and Initial Conditions in Duration Models: Evidence from Experimental Data on Training," Econometrica, Econometric Society, vol. 64(1), pages 175-205, January.
    10. Weili Ding & Steven Lehrer, 2011. "Experimental estimates of the impacts of class size on test scores: robustness and heterogeneity," Education Economics, Taylor & Francis Journals, vol. 19(3), pages 229-252.
    11. Charles F. Manski, 2019. "Treatment Choice With Trial Data: Statistical Decision Theory Should Supplant Hypothesis Testing," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 296-304, March.
    12. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    13. Nikos Nikiforakis & Robert Slonim, 2015. "Editors’ preface: statistics, replications and null results," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 1(2), pages 127-131, December.
    14. Fréchette, Guillaume R. & Kagel, John H. & Lehrer, Steven F., 2003. "Bargaining in Legislatures: An Experimental Investigation of Open versus Closed Amendment Rules," American Political Science Review, Cambridge University Press, vol. 97(2), pages 221-232, May.
    15. Kagel, John H & Harstad, Ronald M & Levin, Dan, 1987. "Information Impact and Allocation Rules in Auctions with Affiliated Private Values: A Laboratory Study," Econometrica, Econometric Society, vol. 55(6), pages 1275-1304, November.
    16. Hoenig, J. M. & Heisey, D. M., 2001. "The Abuse of Power: The Pervasive Fallacy of Power Calculations for Data Analysis," The American Statistician, American Statistical Association, vol. 55, pages 19-24, February.
    17. Zacharias Maniadis & Fabio Tufano & John A. List, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Economic Journal, Royal Economic Society, vol. 127(605), pages 209-235, October.
    18. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    19. Dilip Abreu & Faruk Gul, 2000. "Bargaining and Reputation," Econometrica, Econometric Society, vol. 68(1), pages 85-118, January.
    20. Slonim, Robert & Wang, Carmen & Garbarino, Ellen & Merrett, Danielle, 2012. "Opting-In: Participation Biases in the Lab," IZA Discussion Papers 6865, Institute of Labor Economics (IZA).
    21. Roth, Alvin E., 1986. "Laboratory Experimentation in Economics," Economics and Philosophy, Cambridge University Press, vol. 2(2), pages 245-273, October.
    22. Engle, Robert F., 1984. "Wald, likelihood ratio, and Lagrange multiplier tests in econometrics," Handbook of Econometrics, in: Z. Griliches† & M. D. Intriligator (ed.), Handbook of Econometrics, edition 1, volume 2, chapter 13, pages 775-826, Elsevier.
    23. Gwowen Shieh, 2000. "On Power and Sample Size Calculations for Likelihood Ratio Tests in Generalized Linear Models," Biometrics, The International Biometric Society, vol. 56(4), pages 1192-1196, December.
    24. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    25. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    26. Slonim, Robert & Wang, Carmen & Garbarino, Ellen & Merrett, Danielle, 2013. "Opting-in: Participation bias in economic experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 43-70.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Jeffrey Penney, 2023. "Cautions when normalizing the dependent variable in a regression as a z‐score," Economic Inquiry, Western Economic Association International, vol. 61(2), pages 402-412, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
    3. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    4. Luigi Butera & Philip Grossman & Daniel Houser & John List & Marie-Claire Villeval, 2020. "A New Mechanism to Alleviate the Crises of Confidence in Science - With an Application to the Public Goods Game," Artefactual Field Experiments 00684, The Field Experiments Website.
    5. John List, 2021. "2021 Summary Data of Artefactual Field Experiments Published on Fieldexperiments.com," Artefactual Field Experiments 00749, The Field Experiments Website.
    6. Tremewan, James & Vanberg, Christoph, 2018. "Voting rules in multilateral bargaining: using an experiment to relax procedural assumptions," Working Papers 0651, University of Heidelberg, Department of Economics.
    7. John List, 2022. "2021 Summary Data of Natural Field Experiments Published on Fieldexperiments.com," Natural Field Experiments 00747, The Field Experiments Website.
    8. So, Tony, 2020. "Classroom experiments as a replication device," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 86(C).
    9. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    10. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    11. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    12. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    13. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    14. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    15. Roy Chen & Yan Chen & Yohanes E. Riyanto, 2021. "Best practices in replication: a case study of common information in coordination games," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 2-30, March.
    16. Grüner Sven, 2020. "Sample Size Calculation in Economic Experiments," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 240(6), pages 791-823, December.
    17. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    18. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    19. Javdani, Mohsen & Chang, Ha-Joon, 2019. "Who Said or What Said? Estimating Ideological Bias in Views Among Economists," MPRA Paper 91958, University Library of Munich, Germany.
    20. Karthik Muralidharan & Mauricio Romero & Kaspar Wüthrich, 2019. "Factorial Designs, Model Selection, and (Incorrect) Inference in Randomized Experiments," NBER Working Papers 26562, National Bureau of Economic Research, Inc.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jesaex:v:6:y:2020:i:2:d:10.1007_s40881-020-00089-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.