
Evaluation of Development Programs: Using Regressions to assess the Impact of Complex Interventions

Author

  • Chris Elbers (VU University Amsterdam)
  • Jan Willem Gunning (VU University Amsterdam)

Abstract

There is growing interest in extending project evaluation methods to the evaluation of programs: complex interventions involving multiple activities. In general, a program evaluation cannot be based on separate evaluations of its components, since interactions between the activities are likely to be important. We propose a measure of program impact, the total program effect (TPE), which is an extension of the average treatment effect on the treated (ATET). Regression techniques can be applied to observational data from a representative sample to estimate the TPE for complex interventions in the presence of selection effects and treatment heterogeneity. As an example, we present an estimate of the TPE for a rural water supply and sanitation program in Mozambique. Estimating the TPE from randomized controlled trials (RCTs) would appear to be an alternative; however, the scope for using RCTs in this context is limited. See also 'Evaluation of Development Programs: Randomized Controlled Trials or Regressions?', The World Bank Economic Review (2014), 28(3), 432-445.
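
To make the abstract's central quantities concrete, here is an illustrative sketch; the notation is an assumption based only on the abstract, not the paper's own definition. For a single binary treatment D, the ATET compares treated units' realized and counterfactual outcomes,

    \mathrm{ATET} = E[\, Y(1) - Y(0) \mid D = 1 \,],

and a TPE-style generalization replaces the single treatment with the vector T of program activities a unit actually received, so that interactions between activities are part of the estimand itself:

    \mathrm{TPE} = E[\, Y(T) - Y(0) \mid T \neq 0 \,].

The minimal Python sketch below shows one way such a bundle effect could be read off a regression with interaction terms, as the abstract suggests. The data file, the column names (y, water, sanitation, district) and the bare-bones handling of selection are assumptions for illustration only, not the authors' estimator.

    # Illustrative only: regress the outcome on activity indicators and their
    # interaction, then average the predicted gain of the bundle actually received.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("household_survey.csv")  # hypothetical representative sample

    # The interaction lets the effect of the full package differ from the sum
    # of the separate water and sanitation effects.
    model = smf.ols("y ~ water + sanitation + water:sanitation + C(district)",
                    data=df).fit()

    # TPE-style summary: average predicted gain, over households reached by at
    # least one activity, of the bundle they received relative to no program.
    treated = df[(df["water"] == 1) | (df["sanitation"] == 1)]
    no_program = treated.assign(water=0, sanitation=0)
    tpe_hat = (model.predict(treated) - model.predict(no_program)).mean()
    print(f"Illustrative TPE estimate: {tpe_hat:.3f}")

A real application would also need controls or panel methods to handle the selection effects the abstract mentions; the point here is only the shape of the calculation.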

Suggested Citation

  • Chris Elbers & Jan Willem Gunning, 2012. "Evaluation of Development Programs: Using Regressions to assess the Impact of Complex Interventions," Tinbergen Institute Discussion Papers 12-081/2, Tinbergen Institute.
  • Handle: RePEc:tin:wpaper:20120081

    Download full text from publisher

    File URL: https://papers.tinbergen.nl/12081.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    2. Chen, Shaohua & Mu, Ren & Ravallion, Martin, 2009. "Are there lasting impacts of aid to poor areas?," Journal of Public Economics, Elsevier, vol. 93(3-4), pages 512-528, April.
    3. Elbers, Chris & Gunning, Jan Willem & de Hoop, Kobus, 2009. "Assessing Sector-wide Programs with Statistical Impact Evaluation: A Methodological Proposal," World Development, Elsevier, vol. 37(2), pages 513-520, February.
    4. Chris Elbers & Samuel Godfrey & Jan Willem Gunning & Matteus van der Velden & Melinda Vigh, 2012. "Effectiveness of Large Scale Water and Sanitation Interventions: the One Million Initiative in Mozambique," Tinbergen Institute Discussion Papers 12-069/2, Tinbergen Institute.
    5. Martin Ravallion, 2012. "Fighting Poverty One Experiment at a Time: Poor Economics: A Radical Rethinking of the Way to Fight Global Poverty : Review Essay," Journal of Economic Literature, American Economic Association, vol. 50(1), pages 103-114, March.
    6. Martin Ravallion, 2009. "Evaluation in the Practice of Development," The World Bank Research Observer, World Bank, vol. 24(1), pages 29-53, March.
    7. Jishnu Das & Stefan Dercon & James Habyarimana & Pramila Krishnan, 2007. "Teacher Shocks and Student Learning: Evidence from Zambia," Journal of Human Resources, University of Wisconsin Press, vol. 42(4).
    8. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    9. Rodrik, Dani, 2008. "The New Development Economics: We Shall Experiment, but How Shall We Learn?," Working Paper Series rwp08-055, Harvard University, John F. Kennedy School of Government.
    10. James Heckman, 1997. "Instrumental Variables: A Study of Implicit Behavioral Assumptions Used in Making Program Evaluations," Journal of Human Resources, University of Wisconsin Press, vol. 32(3), pages 441-462.

    Citations

    Cited by:

    1. Jan Willem Gunning, 2012. "How Can Development NGOs Be Evaluated?," Working Papers P51, FERDI.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Chris Elbers & Jan Willem Gunning, 2014. "Evaluation of Development Programs: Randomized Controlled Trials or Regressions?," The World Bank Economic Review, World Bank, vol. 28(3), pages 432-445.
    2. Chris Elbers & Jan Willem Gunning, 2009. "Evaluation of Development Policy: Treatment versus Program Effects," Tinbergen Institute Discussion Papers 09-073/2, Tinbergen Institute.
    3. Donovan, Kevin P., 2018. "The rise of the randomistas: on the experimental turn in international aid," SocArXiv xygzb, Center for Open Science.
    4. Onur Altindag & Theodore J. Joyce & Julie A. Reeder, 2015. "Effects of Peer Counseling to Support Breastfeeding: Assessing the External Validity of a Randomized Field Experiment," NBER Working Papers 21013, National Bureau of Economic Research, Inc.
    5. Elbers, Chris & Gunning, Jan Willem, 2014. "Evaluation of non-governmental development organizations," WIDER Working Paper Series 026, World Institute for Development Economic Research (UNU-WIDER).
    6. Chris Elbers & Jan Willem Gunning, 2014. "Evaluation of Non-Governmental Development Organizations," WIDER Working Paper Series wp-2014-026, World Institute for Development Economic Research (UNU-WIDER).
    7. Temple, Jonathan R.W., 2010. "Aid and Conditionality," Handbook of Development Economics, in: Dani Rodrik & Mark Rosenzweig (ed.), Handbook of Development Economics, edition 1, volume 5, chapter 0, pages 4415-4523, Elsevier.
    8. Ayako Wakano & Hiroyuki Yamada & Daichi Shimamoto, 2017. "Does the Heterogeneity of Project Implementers Affect the Programme Participation of Beneficiaries?: Evidence from Rural Cambodia," Journal of Development Studies, Taylor & Francis Journals, vol. 53(1), pages 49-67, January.
    9. Martin Ravallion, 2012. "Fighting Poverty One Experiment at a Time: Poor Economics: A Radical Rethinking of the Way to Fight Global Poverty : Review Essay," Journal of Economic Literature, American Economic Association, vol. 50(1), pages 103-114, March.
    10. Hammer, Jeffrey & Spears, Dean, 2016. "Village sanitation and child health: Effects and external validity in a randomized field experiment in rural India," Journal of Health Economics, Elsevier, vol. 48(C), pages 135-148.
    11. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    12. Ralitza Dimova, 2019. "A Debate that Fatigues…: To Randomise or Not to Randomise; What’s the Real Question?," The European Journal of Development Research, Palgrave Macmillan;European Association of Development Research and Training Institutes (EADI), vol. 31(2), pages 163-168, April.
    13. Basu, Kaushik, 2013. "The method of randomization and the role of reasoned intuition," Policy Research Working Paper Series 6722, The World Bank.
    14. Yoshino, Naoyuki & Abidhadjaev, Umid, 2017. "An impact evaluation of investment in infrastructure: The case of a railway connection in Uzbekistan," Journal of Asian Economics, Elsevier, vol. 49(C), pages 1-11.
    15. Yoshino, Naoyuki & Abidhadjaev, Umid, 2016. "Impact of Infrastructure Investment on Tax: Estimating Spillover Effects of the Kyushu High-Speed Rail Line in Japan on Regional Tax Revenue," ADBI Working Papers 574, Asian Development Bank Institute.
    16. Van Klaveren, C. & De Wolf, I., 2013. "Systematic Reviews In Education Research: When Do Effect Studies Provide Evidence?," Working Papers 46, Top Institute for Evidence Based Education Research.
    17. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2017. "L'étalon-or des évaluations randomisées : économie politique des expérimentations aléatoires dans le domaine du développement" [The gold standard of randomized evaluations: the political economy of randomized experiments in development], Working Paper 753120cd-506f-4c5f-80ed-7, Agence française de développement.
    18. Smith, Lisa C. & Khan, Faheem & Frankenberger, Timothy R. & Wadud, A.K.M. Abdul, 2013. "Admissible Evidence in the Court of Development Evaluation? The Impact of CARE’s SHOUHARDO Project on Child Stunting in Bangladesh," World Development, Elsevier, vol. 41(C), pages 196-216.
    19. Ashish Arora & Michelle Gittelman & Sarah Kaplan & John Lynch & Will Mitchell & Nicolaj Siggelkow & Aaron K. Chatterji & Michael Findley & Nathan M. Jensen & Stephan Meier & Daniel Nielson, 2016. "Field experiments in strategy research," Strategic Management Journal, Wiley Blackwell, vol. 37(1), pages 116-132, January.
    20. Jacobus de Hoop & Furio C. Rosati, 2014. "Cash Transfers and Child Labor," The World Bank Research Observer, World Bank, vol. 29(2), pages 202-234.

    More about this item

    Keywords

    program evaluation; randomized controlled trials; policy evaluation; treatment heterogeneity; budget support; sector-wide programs; aid effectiveness;

    JEL classification:

    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • C33 - Mathematical and Quantitative Methods - - Multiple or Simultaneous Equation Models; Multiple Variables - - - Models with Panel Data; Spatio-temporal Models
    • O22 - Economic Development, Innovation, Technological Change, and Growth - - Development Planning and Policy - - - Project Analysis

