
Going beyond simple sample size calculations: a practitioner's guide

Author

Listed:
  • Brendon McConnell

    (Institute for Fiscal Studies)

  • Marcos Vera-Hernandez

    (Institute for Fiscal Studies and University College London)

Abstract

Basic methods to compute required sample sizes are well understood and supported by widely available software. However, the sophistication of the methods commonly used has not kept pace with the complexity of commonly employed experimental designs. We compile available methods for sample size calculations for continuous and binary outcomes, with and without covariates, for both clustered and non-clustered RCTs. Formulae for both panel data and unbalanced designs are provided. Extensions include methods to: (1) optimise the sample when cost constraints are binding, (2) compute the power of a complex design by simulation, and (3) adjust calculations for multiple testing. View the accompanying sample size calculators for this paper.
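
Extension (2), power by simulation, can be sketched in a few lines. The paper's accompanying calculators are implemented in Stata; the sketch below is only a minimal Python illustration, assuming a two-arm cluster RCT with a continuous outcome analysed by a t-test on cluster means. The function name, default parameter values and data-generating process are illustrative assumptions, not the authors' code.

    import numpy as np
    from scipy import stats

    def simulated_power(n_clusters_per_arm=30, cluster_size=20, effect=0.25,
                        icc=0.05, alpha=0.05, n_sims=1000, seed=0):
        """Monte Carlo power for a two-arm cluster RCT with a continuous outcome.

        Outcomes are generated as y = effect * treat + cluster_effect + noise,
        with total variance 1 and intra-cluster correlation `icc`. Each
        simulated trial is analysed with a t-test on cluster means, a simple
        analysis that respects the clustered design.
        """
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_sims):
            cluster_means = []
            for arm in (0, 1):
                # Cluster-level random effects and individual-level noise.
                u = rng.normal(0.0, np.sqrt(icc), n_clusters_per_arm)
                e = rng.normal(0.0, np.sqrt(1.0 - icc),
                               (n_clusters_per_arm, cluster_size))
                y = arm * effect + u[:, None] + e
                cluster_means.append(y.mean(axis=1))
            _, p = stats.ttest_ind(cluster_means[1], cluster_means[0])
            rejections += p < alpha
        return rejections / n_sims

    if __name__ == "__main__":
        # Re-run with different effect sizes, ICCs or numbers of clusters
        # to trace out simulated power curves.
        print(f"Estimated power: {simulated_power():.2f}")

The same scheme extends to binary outcomes, covariate adjustment or unbalanced designs by changing the data-generating process and the analysis step.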

Suggested Citation

  • Brendon McConnell & Marcos Vera-Hernandez, 2015. "Going beyond simple sample size calculations: a practitioner's guide," IFS Working Papers W15/17, Institute for Fiscal Studies.
  • Handle: RePEc:ifs:ifsewp:15/17

    Download full text from publisher

    File URL: https://www.ifs.org.uk/uploads/publications/wps/WP201517_update_Sep15.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Anderson, Michael L, 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Department of Agricultural & Resource Economics, UC Berkeley, Working Paper Series qt15n8j26f, Department of Agricultural & Resource Economics, UC Berkeley.
    2. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    3. Kenneth F Schulz & Douglas G Altman & David Moher & for the CONSORT Group, 2010. "CONSORT 2010 Statement: Updated Guidelines for Reporting Parallel Group Randomised Trials," PLOS Medicine, Public Library of Science, vol. 7(3), pages 1-7, March.
    4. A. H. Feiveson, 2002. "Power by simulation," Stata Journal, StataCorp LP, vol. 2(2), pages 107-124, May.
    5. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    6. Pedro Carneiro & Rita Ginja, 2014. "Long-Term Impacts of Compensatory Preschool on Health and Behavior: Evidence from Head Start," American Economic Journal: Economic Policy, American Economic Association, vol. 6(4), pages 135-173, November.
    7. Richard Hooper, 2013. "Versatile sample-size calculation using simulation," Stata Journal, StataCorp LP, vol. 13(1), pages 21-38, March.
    8. John List & Sally Sadoff & Mathis Wagner, 2011. "So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 439-457, November.
    9. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    10. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    11. repec:mpr:mprres:6371 is not listed on IDEAS
    12. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    13. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.
    14. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    15. repec:feb:artefa:0087 is not listed on IDEAS
    16. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    17. McKenzie, David, 2012. "Beyond baseline and follow-up: The case for more T in experiments," Journal of Development Economics, Elsevier, vol. 99(2), pages 210-221.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Singh, Prakarsh & Mitra, Sandip, 2017. "Incentives, information and malnutrition: Evidence from an experiment in India," European Economic Review, Elsevier, vol. 93(C), pages 24-46.
    2. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    3. Bedoya Arguelles, Guadalupe & Bittarello, Luca & Davis, Jonathan Martin Villars & Mittag, Nikolas Karl, 2017. "Distributional impact analysis: toolkit and illustrations of impacts beyond the average treatment effect," Policy Research Working Paper Series 8139, The World Bank.
    4. Fafchamps, Marcel & Labonne, Julien, 2017. "Using Split Samples to Improve Inference on Causal Effects," Political Analysis, Cambridge University Press, vol. 25(4), pages 465-482, October.
    5. Timothy Gubler & Ian Larkin & Lamar Pierce, 2018. "Doing Well by Making Well: The Impact of Corporate Wellness Programs on Employee Productivity," Management Science, INFORMS, vol. 64(11), pages 4967-4987, November.
    6. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    7. Bernal, Pedro & Martinez, Sebastian, 2020. "In-kind incentives and health worker performance: Experimental evidence from El Salvador," Journal of Health Economics, Elsevier, vol. 70(C).
    8. Marianna Battaglia & Lara Lebedinski, 2017. "The curse of low expectations," The Economics of Transition, The European Bank for Reconstruction and Development, vol. 25(4), pages 681-721, October.
    9. Orazio Attanasio & Matthew Bird & Lina Cardona-Sosa & Pablo Lavado, 2019. "Freeing Financial Education via Tablets: Experimental Evidence from Colombia," NBER Working Papers 25929, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Eric Floyd & John A. List, 2016. "Using Field Experiments in Accounting and Finance," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 437-475, May.
    3. Yitayew, Asresu & Abdulai, Awudu & Yigezu, Yigezu A. & Deneke, Tilaye T. & Kassie, Girma T., 2021. "Impact of agricultural extension services on the adoption of improved wheat variety in Ethiopia: A cluster randomized controlled trial," World Development, Elsevier, vol. 146(C).
    4. Brown, Annette N. & Wood, Benjamin Douglas Kuflick, 2018. "Which tests not witch hunts: A diagnostic approach for conducting replication research," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 12, pages 1-26.
    5. Sant’Anna, Pedro H.C. & Zhao, Jun, 2020. "Doubly robust difference-in-differences estimators," Journal of Econometrics, Elsevier, vol. 219(1), pages 101-122.
    6. Apps, Patricia & Mendolia, Silvia & Walker, Ian, 2013. "The impact of pre-school on adolescents’ outcomes: Evidence from a recent English cohort," Economics of Education Review, Elsevier, vol. 37(C), pages 183-199.
    7. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    8. Marco Gonzalez-Navarro & Climent Quintana-Domeque, 2010. "Street Pavement: Results from an Infrastructure Experiment in Mexico," Working Papers 1247, Princeton University, Department of Economics, Industrial Relations Section.
    9. Guigonan S. Adjognon & Daan van Soest & Jonas Guthoff, 2021. "Reducing Hunger with Payments for Environmental Services (PES): Experimental Evidence from Burkina Faso," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(3), pages 831-857, May.
    10. Adjognon,Guigonan Serge & Nguyen Huy,Tung & Guthoff,Jonas Christoph & van Soest,Daan, 2022. "Incentivizing Social Learning for the Diffusion of Climate-Smart Agricultural Techniques," Policy Research Working Paper Series 10041, The World Bank.
    11. Mattoo, Aaditya & Cadot, Olivier & Gourdon, Julien & Fernandes, Ana Margarida, 2011. "Impact Evaluation of Trade Interventions: Paving the Way," CEPR Discussion Papers 8638, C.E.P.R. Discussion Papers.
    13. Bandiera, Oriana & Buehren, Niklas & Burgess, Robin & Goldstein, Markus P. & Gulesci, Selim & Rasul, Imran & Sulaiman, Munshi, 2015. "Women’s economic empowerment in action: evidence from a randomized control trial in Africa," ILO Working Papers 994874053402676, International Labour Organization.
    14. Salauddin Tauseef, 2022. "The Importance of Nutrition Education in Achieving Food Security and Adequate Nutrition of the Poor: Experimental Evidence from Bangladesh," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 84(1), pages 241-271, February.
    15. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    16. Omotilewa, Oluwatoba J. & Ricker-Gilbert, Jacob & Ainembabazi, John Herbert & Shively, Gerald E., 2018. "Does improved storage technology promote modern input use and food security? Evidence from a randomized trial in Uganda," Journal of Development Economics, Elsevier, vol. 135(C), pages 176-198.
    17. Steinert, Janina Isabel & Cluver, Lucie Dale & Meinck, Franziska & Doubt, Jenny & Vollmer, Sebastian, 2018. "Household economic strengthening through financial and psychosocial programming: Evidence from a field experiment in South Africa," Journal of Development Economics, Elsevier, vol. 134(C), pages 443-466.
    18. Schneider, Hilmar & Uhlendorff, Arne & Zimmermann, Klaus F., 2010. "Mit Workfare aus der Sozialhilfe? Lehren aus einem Modellprojekt [Out of social assistance with workfare? Lessons from a pilot project]," IZA Standpunkte 33, Institute of Labor Economics (IZA).
    19. Andrés Ham & Darío Maldonado & Michael Weintraub & Andrés Felipe Camacho & Daniela Gualtero, 2022. "Reducing Alcohol‐Related Violence with Bartenders: A Behavioral Field Experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(3), pages 731-761, June.
    20. Andrés Ham & Darío Maldonado & Michael Weintraub & Andrés Felipe Camacho & Daniela Gualtero, 2019. "Reducing Alcohol-Related Violence: A Field Experiment with Bartenders," Documentos de trabajo 17834, Escuela de Gobierno - Universidad de los Andes.

    More about this item

    Keywords

    Power analysis; Sample size calculations; Randomised Control Trials; Cluster Randomised Control Trial; Covariates; Cost Minimisation; Multiple outcomes; Simulation;

    JEL classification:

    • C8 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ifs:ifsewp:15/17. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Emma Hyman (email available below). General contact details of provider: https://edirc.repec.org/data/ifsssuk.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.