
In pursuit of balance: randomization in practice in development field experiments

Author

Listed:
  • Bruhn, Miriam
  • McKenzie, David

Abstract

Randomized experiments are increasingly used in development economics, so researchers now face the question not just of whether to randomize, but of how to do so. Pure random assignment guarantees that the treatment and control groups will have identical characteristics on average, but in any particular random allocation the two groups will differ along some dimensions. Methods used to pursue greater balance include stratification, pair-wise matching, and re-randomization. This paper presents new evidence on the randomization methods used in existing randomized experiments and carries out simulations to provide guidance for researchers. Three main results emerge. First, many researchers do not control for the method of randomization in their analysis. The authors show that this leads to tests with incorrect size and can result in lower power than if a pure random draw had been used. Second, they find that in samples of 300 or more, the different randomization methods perform similarly in terms of achieving balance on many future outcomes of interest. However, for very persistent outcome variables and in smaller samples, pair-wise matching and stratification perform best. Third, the analysis suggests that, on balance, the re-randomization methods common in practice are less desirable than other methods, such as matching.
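The stratified randomization the abstract discusses can be sketched in a few lines. This is a minimal illustration, not the authors' code; the unit ids, stratum labels, and function name are hypothetical. Units are grouped into strata on baseline characteristics, and assignment is randomized separately within each stratum, so treatment and control are balanced on the stratifying variables by construction:

```python
import random
from collections import defaultdict

def stratified_assignment(units, stratum_of, seed=0):
    """Assign half of each stratum to treatment (1), the rest to control (0).

    units: list of unit ids; stratum_of: dict mapping unit id -> stratum label.
    Returns a dict mapping unit id -> treatment indicator.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for u in units:
        strata[stratum_of[u]].append(u)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)           # random order within the stratum
        half = len(members) // 2       # first half treated, second half control
        for i, u in enumerate(members):
            assignment[u] = 1 if i < half else 0
    return assignment

# Hypothetical example: 8 units stratified by a binary baseline variable.
units = list(range(8))
stratum_of = {u: u % 2 for u in units}
assignment = stratified_assignment(units, stratum_of, seed=1)
```

The paper's first result corresponds to the analysis step: having randomized within strata, the estimating regression should include stratum indicators (fixed effects) as controls, since ignoring the randomization method yields tests with incorrect size.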

Suggested Citation

  • Bruhn, Miriam & McKenzie, David, 2008. "In pursuit of balance: randomization in practice in development field experiments," Policy Research Working Paper Series 4752, The World Bank.
  • Handle: RePEc:wbk:wbrwps:4752

    Download full text from publisher

    File URL: http://www-wds.worldbank.org/external/default/WDSContentServer/WDSP/IB/2008/10/15/000158349_20081015133501/Rendered/PDF/WPS4752.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Esther Duflo & Rema Hanna, 2005. "Monitoring Works: Getting Teachers to Come to School," Working Papers id:301, eSocialSciences.
    2. Abhijit Vinayak Banerjee & Alice H. Amsden & Robert H. Bates & Jagdish Bhagwati & Angus Deaton & Nicholas Stern, 2007. "Making Aid Work," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262026155, February.
    3. Abhijit V. Banerjee & Shawn Cole & Esther Duflo & Leigh Linden, 2007. "Remedying Education: Evidence from Two Randomized Experiments in India," The Quarterly Journal of Economics, Oxford University Press, vol. 122(3), pages 1235-1264.
    4. Tahir Andrabi & Jishnu Das & Asim Ijaz Khwaja & Tristan Zajonc, 2011. "Do Value-Added Estimates Add Value? Accounting for Learning Dynamics," American Economic Journal: Applied Economics, American Economic Association, vol. 3(3), pages 29-54, July.
    5. Dean Karlan & Martin Valdivia, 2011. "Teaching Entrepreneurship: Impact of Business Training on Microfinance Clients and Institutions," The Review of Economics and Statistics, MIT Press, vol. 93(2), pages 510-527, May.
    6. Martina Björkman & Jakob Svensson, 2009. "Power to the People: Evidence from a Randomized Field Experiment on Community-Based Monitoring in Uganda," The Quarterly Journal of Economics, Oxford University Press, vol. 124(2), pages 735-769.
    7. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    8. Glewwe, Paul & Kremer, Michael & Moulin, Sylvie & Zitzewitz, Eric, 2004. "Retrospective vs. prospective analyses of school inputs: the case of flip charts in Kenya," Journal of Development Economics, Elsevier, vol. 74(1), pages 251-268, June.
    9. Nava Ashraf & Dean Karlan & Wesley Yin, 2006. "Deposit Collectors," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(2), pages 1-24, March.
    10. Paul Glewwe & Albert Park & Meng Zhao, 2006. "The impact of eyeglasses on the academic performance of primary school students: Evidence from a randomized trial in rural China," Natural Field Experiments 00254, The Field Experiments Website.
    11. Claudio Ferraz & Frederico Finan, 2008. "Exposing Corrupt Politicians: The Effects of Brazil's Publicly Released Audits on Electoral Outcomes," The Quarterly Journal of Economics, Oxford University Press, vol. 123(2), pages 703-745.
    12. Nava Ashraf & Dean Karlan & Wesley Yin, 2006. "Tying Odysseus to the Mast: Evidence From a Commitment Savings Product in the Philippines," The Quarterly Journal of Economics, Oxford University Press, vol. 121(2), pages 635-672.
    13. Suresh de Mel & David McKenzie & Christopher Woodruff, 2009. "Returns to Capital in Microenterprises: Evidence from a Field Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 124(1), pages 423-423.
    14. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    15. Erica Field & Rohini Pande, 2008. "Repayment Frequency and Default in Microfinance: Evidence From India," Journal of the European Economic Association, MIT Press, vol. 6(2-3), pages 501-509, 04-05.
    16. Nava Ashraf & James Berry & Jesse M. Shapiro, 2010. "Can Higher Prices Stimulate Product Use? Evidence from a Field Experiment in Zambia," American Economic Review, American Economic Association, vol. 100(5), pages 2383-2413, December.
    17. repec:feb:artefa:0087 (reference not matched to an item on IDEAS).
    18. Michael Kremer, 2003. "Randomized Evaluations of Educational Programs in Developing Countries: Some Lessons," American Economic Review, American Economic Association, vol. 93(2), pages 102-106, May.
    19. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    20. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.
    21. Benjamin A. Olken, 2007. "Monitoring Corruption: Evidence from a Field Experiment in Indonesia," Journal of Political Economy, University of Chicago Press, vol. 115, pages 200-249.
    22. Marianne Bertrand & Simeon Djankov & Rema Hanna & Sendhil Mullainathan, 2007. "Obtaining a Driver's License in India: An Experimental Approach to Studying Corruption," The Quarterly Journal of Economics, Oxford University Press, vol. 122(4), pages 1639-1676.
    23. Gustavo J. Bobonis & Edward Miguel & Charu Puri-Sharma, 2006. "Anemia and School Participation," Journal of Human Resources, University of Wisconsin Press, vol. 41(4).
    24. Kosuke Imai & Gary King & Elizabeth A. Stuart, 2008. "Misunderstandings between experimentalists and observationalists about causal inference," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 171(2), pages 481-502, April.
    25. Skoufias, Emmanuel, 2005. "PROGRESA and its impacts on the welfare of rural households in Mexico," Research reports 139, International Food Policy Research Institute (IFPRI).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    2. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    3. Benjamin A. Olken, 2020. "Banerjee, Duflo, Kremer, and the Rise of Modern Development Economics," Scandinavian Journal of Economics, Wiley Blackwell, vol. 122(3), pages 853-878, July.
    4. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    5. William Easterly, 2009. "Can the West Save Africa?," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 373-447, June.
    6. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    7. David K. Evans & Arkadipta Ghosh, 2008. "Prioritizing Educational Investments in Children in the Developing World," Working Papers 587, RAND Corporation.
    8. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    9. Abhijit V. Banerjee & Esther Duflo, 2010. "Giving Credit Where It Is Due," Journal of Economic Perspectives, American Economic Association, vol. 24(3), pages 61-80, Summer.
    10. McKenzie, David, 2012. "Beyond baseline and follow-up: The case for more T in experiments," Journal of Development Economics, Elsevier, vol. 99(2), pages 210-221.
    11. David K. Evans & Arkadipta Ghosh, 2008. "Prioritizing Educational Investments in Children in the Developing World," Working Papers WR-587, RAND Corporation.
    12. World Bank, 2011. "Indonesia's PNPM Generasi Program : Final Impact Evaluation Report," World Bank Other Operational Studies 21595, The World Bank.
    13. Patrick J. McEwan, 2012. "Cost-effectiveness analysis of education and health interventions in developing countries," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 4(2), pages 189-213, June.
    14. Gisselquist, Rachel & Niño-Zarazúa, Miguel, 2013. "What can experiments tell us about how to improve governance?," MPRA Paper 49300, University Library of Munich, Germany.
    15. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," World Bank Research Observer, World Bank Group, vol. 33(1), pages 34-64.
    16. Temple, Jonathan R.W., 2010. "Aid and Conditionality," Handbook of Development Economics, in: Dani Rodrik & Mark Rosenzweig (ed.), Handbook of Development Economics, edition 1, volume 5, chapter 0, pages 4415-4523, Elsevier.
    17. Eduard Marinov, 2019. "The 2019 Nobel Prize in Economics," Economic Thought journal, Bulgarian Academy of Sciences - Economic Research Institute, issue 6, pages 78-116.
    18. Sylvain Chassang & Gerard Padro I Miquel & Erik Snowberg, 2012. "Selective Trials: A Principal-Agent Approach to Randomized Controlled Experiments," American Economic Review, American Economic Association, vol. 102(4), pages 1279-1309, June.
    19. Diether Beuermann & Maria Amelina, 2014. "Does Participatory Budgeting Improve Decentralized Public Service Delivery?," IDB Publications (Working Papers) 87095, Inter-American Development Bank.
    20. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.

    More about this item

    Keywords

    Statistical & Mathematical Sciences; Scientific Research & Science Parks; Science Education; Economic Theory & Research; Climate Change;

    JEL classification:

    • C83 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Survey Methods; Sampling Methods
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • O12 - Economic Development, Innovation, Technological Change, and Growth - - Economic Development - - - Microeconomic Analyses of Economic Development



