
Improving the statistical power of economic experiments using adaptive designs

Authors

  • Sebastian Jobjörnsson (University Medical Center Göttingen)
  • Henning Schaak (University of Natural Resources and Life Sciences)
  • Oliver Musshoff (Georg-August-Universität Göttingen)
  • Tim Friede (University Medical Center Göttingen)

Abstract

An important issue in many economic experiments is how the experimenter can ensure sufficient statistical power to reject one or more hypotheses. This paper illustrates how methods for testing multiple hypotheses simultaneously in adaptive, two-stage designs can improve the power of economic experiments. We provide a concise overview of the relevant theory and illustrate the method in three applications: a simulation study of a hypothetical experimental design and two illustrations based on data sets from previous experiments. The simulation results highlight the potential for sample size reductions while maintaining the power to reject at least one hypothesis and ensuring strong control of the overall Type I error probability.
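
The core mechanism the abstract describes can be sketched in a few lines: stage-wise p-values are merged with an inverse-normal combination test, and a multiplicity adjustment across hypotheses keeps the family-wise error rate at the nominal level even if the design is adapted at the interim analysis. The Python simulation below is a minimal illustration of this class of methods, not the authors' exact procedure; the number of arms, sample sizes, stage weights, and effect sizes are all hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2023)

alpha = 0.05                # target family-wise error rate (FWER)
k = 2                       # number of treatment arms / elementary hypotheses
n1, n2 = 60, 60             # hypothetical per-arm sample sizes for stages 1 and 2
w1 = w2 = np.sqrt(0.5)      # pre-specified stage weights, w1**2 + w2**2 == 1
true_effects = [0.0, 0.4]   # H1 is a true null; H2 has a standardized effect of 0.4

def stage_pvalues(n):
    """One-sided z-tests of each treatment arm against a shared control arm."""
    control = rng.normal(0.0, 1.0, n)
    z = np.array([(rng.normal(d, 1.0, n).mean() - control.mean()) / np.sqrt(2.0 / n)
                  for d in true_effects])
    return stats.norm.sf(z)

def combined_pvalues(p1, p2):
    """Inverse-normal combination of independent stage-wise p-values."""
    z = w1 * stats.norm.isf(p1) + w2 * stats.norm.isf(p2)
    return stats.norm.sf(z)

reps = 20_000
rejections = np.empty((reps, k), dtype=bool)
for r in range(reps):
    p_stage1 = stage_pvalues(n1)
    # In a genuinely adaptive design, n2 could be re-estimated here from the
    # stage-1 data; because the weights w1, w2 are fixed in advance, such an
    # adaptation does not invalidate the combination test.
    p_stage2 = stage_pvalues(n2)
    # Bonferroni over the k combined p-values gives strong FWER control.
    rejections[r] = combined_pvalues(p_stage1, p_stage2) <= alpha / k

print(f"Rejection rate for H1 (true null): {rejections[:, 0].mean():.4f}")
print(f"Power for H2 (effect 0.4):         {rejections[:, 1].mean():.4f}")
```

Under these hypothetical settings, the true null should be rejected at a rate near alpha/k = 0.025 (so the family-wise error rate stays below 0.05), while the arm with a standardized effect of 0.4 is rejected in roughly 85-90% of replications. The key design property is that the combination weights are fixed before the interim look, so stage-two adaptations such as sample-size re-estimation leave the error control intact.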

Suggested Citation

  • Sebastian Jobjörnsson & Henning Schaak & Oliver Musshoff & Tim Friede, 2023. "Improving the statistical power of economic experiments using adaptive designs," Experimental Economics, Springer;Economic Science Association, vol. 26(2), pages 357-382, April.
  • Handle: RePEc:kap:expeco:v:26:y:2023:i:2:d:10.1007_s10683-022-09773-8
    DOI: 10.1007/s10683-022-09773-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10683-022-09773-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10683-022-09773-8?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Stephen T. Ziliak & Deirdre N. McCloskey, 2004. "Size Matters: The Standard Error of Regressions in the American Economic Review," Econ Journal Watch, Econ Journal Watch, vol. 1(2), pages 331-358, August.
    2. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    3. Nikhil Bhat & Vivek F. Farias & Ciamac C. Moallemi & Deeksha Sinha, 2020. "Near-Optimal A-B Testing," Management Science, INFORMS, vol. 66(10), pages 4477-4495, October.
    4. Walter Lehmacher & Gernot Wassmer, 1999. "Adaptive Sample Size Calculations in Group Sequential Trials," Biometrics, The International Biometric Society, vol. 55(4), pages 1286-1290, December.
    5. John List & Sally Sadoff & Mathis Wagner, 2011. "So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 439-457, November.
    6. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    7. Charles Bellemare & Luc Bissonnette & Sabine Kröger, 2016. "Simulating power of economic experiments: the powerBBK package," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 2(2), pages 157-168, November.
    8. Brennan S Thompson & Matthew D Webb, 2019. "A simple, graphical approach to comparing multiple treatments," The Econometrics Journal, Royal Economic Society, vol. 22(2), pages 188-205.
    9. De Long, J Bradford & Lang, Kevin, 1992. "Are All Economic Hypotheses False?," Journal of Political Economy, University of Chicago Press, vol. 100(6), pages 1257-1272, December.
    10. Oliver Musshoff & Norbert Hirschauer, 2014. "Using business simulation games in regulatory impact analysis - the case of policies aimed at reducing nitrogen leaching," Applied Economics, Taylor & Francis Journals, vol. 46(25), pages 3049-3060, September.
    11. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    12. Maximilian Kasy & Anja Sautmann, 2021. "Adaptive Treatment Assignment in Experiments for Policy Choice," Econometrica, Econometric Society, vol. 89(1), pages 113-132, January.
    13. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2021. "Editorial favoritism in the field of laboratory experimental economics (RM/20/014-revised-)," Research Memorandum 005, Maastricht University, Graduate School of Business and Economics (GSBE).
    3. Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
    4. Leah H. Palm-Forster & Paul J. Ferraro & Nicholas Janusch & Christian A. Vossler & Kent D. Messer, 2019. "Behavioral and Experimental Agri-Environmental Research: Methodological Challenges, Literature Gaps, and Recommendations," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 719-742, July.
    5. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    6. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2020. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics (RM/19/029-revised-)," Research Memorandum 014, Maastricht University, Graduate School of Business and Economics (GSBE).
    7. Weili Ding, 2020. "Laboratory experiments can pre-design to address power and selection issues," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 125-138, December.
    8. Black, Bernard & Hollingsworth, Alex & Nunes, Letícia & Simon, Kosali, 2022. "Simulated power analyses for observational studies: An application to the Affordable Care Act Medicaid expansion," Journal of Public Economics, Elsevier, vol. 213(C).
    9. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    10. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2019. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics," Research Memorandum 029, Maastricht University, Graduate School of Business and Economics (GSBE).
    11. Karthik Muralidharan & Mauricio Romero & Kaspar Wüthrich, 2019. "Factorial Designs, Model Selection, and (Incorrect) Inference in Randomized Experiments," NBER Working Papers 26562, National Bureau of Economic Research, Inc.
    12. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    13. Campbell R. Harvey & Yan Liu, 2020. "False (and Missed) Discoveries in Financial Economics," Papers 2006.04269, arXiv.org.
    14. Vladasel, Theodor & Parker, Simon C. & Sloof, Randolph & van Praag, Mirjam C., 2022. "Revenue Drift, Incentives, and Effort Allocation in Social Enterprises," IZA Discussion Papers 15716, Institute of Labor Economics (IZA).
    15. Campbell R. Harvey & Yan Liu, 2020. "False (and Missed) Discoveries in Financial Economics," Journal of Finance, American Finance Association, vol. 75(5), pages 2503-2553, October.
    16. Ek, Claes, 2017. "Some causes are more equal than others? The effect of similarity on substitution in charitable giving," Journal of Economic Behavior & Organization, Elsevier, vol. 136(C), pages 45-62.
    17. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    18. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    19. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    20. Grüner Sven, 2020. "Sample Size Calculation in Economic Experiments," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 240(6), pages 791-823, December.

    More about this item

    Keywords

    Adaptive design; Multiple testing; Simulation study; Family-wise error rate; Experimental design.

    JEL classification:

    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:kap:expeco:v:26:y:2023:i:2:d:10.1007_s10683-022-09773-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.