
Evaluation of Development Policy: Treatment versus Program Effects

Author

Listed:
  • Chris Elbers

    (VU University Amsterdam)

  • Jan Willem Gunning

    (VU University Amsterdam)

Abstract

There is growing interest, notably in development economics, in extending project evaluation methods to the evaluation of multiple interventions (“programs”). In program evaluations one is interested in the aggregate impact of a program rather than in the effect on individual beneficiaries. In many situations randomized controlled trials cannot identify this impact. We propose a measure of program impact, the total program effect (TPE), which is a generalization of the average treatment effect on the treated (ATET). We show how the TPE can be estimated.
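
The contrast between a per-beneficiary effect (the ATET) and an aggregate program-level effect can be made concrete with a small simulation. The sketch below is illustrative only: the variable names, the targeting rule, and the simple sum used as the program-level aggregate are assumptions made for exposition and are not the paper's definition or estimator of the TPE.

    # Illustrative sketch only -- not the paper's TPE estimator.
    # Contrasts the average treatment effect on the treated (ATET)
    # with a program-level aggregate, here taken to be the total gain
    # summed over the units the program actually reaches.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Heterogeneous gains: units with low baseline outcomes benefit more.
    baseline = rng.normal(50.0, 10.0, size=n)         # untreated potential outcome
    gain = np.clip(20.0 - 0.3 * baseline, 0.0, None)  # unit-specific treatment effect

    # Hypothetical targeting rule: the program reaches the poorest 30 percent.
    treated = baseline < np.quantile(baseline, 0.30)
    outcome = baseline + treated * gain               # observed outcome

    atet = gain[treated].mean()        # mean gain per participant
    total_gain = gain[treated].sum()   # aggregate gain of the roll-out
    naive = outcome[treated].mean() - outcome[~treated].mean()

    print(f"ATET (per participant):    {atet:8.2f}")
    print(f"Aggregate program gain:    {total_gain:8.2f}")
    print(f"Naive treated-control gap: {naive:8.2f}  (biased: reflects targeting)")

In this stylized setting the naive comparison of treated and untreated means is far from the ATET because participants are selected on low baseline outcomes, and scaling the per-participant effect by the number of beneficiaries gives only one simple program-level aggregate; the paper's TPE is defined more generally than this illustration.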

Suggested Citation

  • Chris Elbers & Jan Willem Gunning, 2009. "Evaluation of Development Policy: Treatment versus Program Effects," Tinbergen Institute Discussion Papers 09-073/2, Tinbergen Institute.
  • Handle: RePEc:tin:wpaper:20090073

    Download full text from publisher

    File URL: https://papers.tinbergen.nl/09073.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Elbers, Chris & Gunning, Jan Willem & de Hoop, Kobus, 2009. "Assessing Sector-wide Programs with Statistical Impact Evaluation: A Methodological Proposal," World Development, Elsevier, vol. 37(2), pages 513-520, February.
    2. James J. Heckman & Sergio Urzua & Edward Vytlacil, 2006. "Understanding Instrumental Variables in Models with Essential Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 88(3), pages 389-432, August.
    3. Martin Ravallion, 2009. "Evaluation in the Practice of Development," The World Bank Research Observer, World Bank, vol. 24(1), pages 29-53, March.
    4. Jishnu Das & Stefan Dercon & James Habyarimana & Pramila Krishnan, 2007. "Teacher Shocks and Student Learning: Evidence from Zambia," Journal of Human Resources, University of Wisconsin Press, vol. 42(4).
    5. Rodrik, Dani, 2008. "The New Development Economics: We Shall Experiment, but How Shall We Learn?," Working Paper Series rwp08-055, Harvard University, John F. Kennedy School of Government.
    6. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    7. Martin Ravallion & Emanuela Galasso & Teodoro Lazo & Ernesto Philipp, 2005. "What Can Ex-Participants Reveal about a Program’s Impact?," Journal of Human Resources, University of Wisconsin Press, vol. 40(1).
    8. James Heckman, 1997. "Instrumental Variables: A Study of Implicit Behavioral Assumptions Used in Making Program Evaluations," Journal of Human Resources, University of Wisconsin Press, vol. 32(3), pages 441-462.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Chris Elbers & Jan Willem Gunning, 2014. "Evaluation of Development Programs: Randomized Controlled Trials or Regressions?," The World Bank Economic Review, World Bank, vol. 28(3), pages 432-445.
    2. Chris Elbers & Jan Willem Gunning, 2012. "Evaluation of Development Programs: Using Regressions to assess the Impact of Complex Interventions," Tinbergen Institute Discussion Papers 12-081/2, Tinbergen Institute.
    3. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    4. Öberg, Stefan, 2018. "Instrumental variables based on twin births are by definition not valid (v.3.0)," SocArXiv zux9s, Center for Open Science.
    5. Breen, Richard & Ermisch, John, 2021. "Instrumental Variable Estimation in Demographic Studies: The LATE interpretation of the IV estimator with heterogenous effects," SocArXiv vx9m7, Center for Open Science.
    6. David McKenzie, 2010. "Impact Assessments in Finance and Private Sector Development: What Have We Learned and What Should We Learn?," The World Bank Research Observer, World Bank, vol. 25(2), pages 209-233, August.
    7. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    8. Ralitza Dimova, 2019. "A Debate that Fatigues…: To Randomise or Not to Randomise; What’s the Real Question?," The European Journal of Development Research, Palgrave Macmillan; European Association of Development Research and Training Institutes (EADI), vol. 31(2), pages 163-168, April.
    9. Van Klaveren, C. & De Wolf, I., 2013. "Systematic Reviews In Education Research: When Do Effect Studies Provide Evidence?," Working Papers 46, Top Institute for Evidence Based Education Research.
    10. Dionissi Aliprantis, 2013. "Covariates and causal effects: the problem of context," Working Papers (Old Series) 1310, Federal Reserve Bank of Cleveland.
    11. Rinku Murgai & Martin Ravallion & Dominique van de Walle, 2016. "Is Workfare Cost-effective against Poverty in a Poor Labor-Surplus Economy?," The World Bank Economic Review, World Bank, vol. 30(3), pages 413-445.
    12. Kyui, Natalia, 2016. "Expansion of higher education, employment and wages: Evidence from the Russian Transition," Labour Economics, Elsevier, vol. 39(C), pages 68-87.
    13. Guilhem Bascle, 2008. "Controlling for endogeneity with instrumental variables in strategic management research," Post-Print hal-00576795, HAL.
    14. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2015. "The gold standard for randomized evaluations: from discussion of method to political economy," Working Papers DT/2015/01, DIAL (Développement, Institutions et Mondialisation).
    15. Huber Martin & Wüthrich Kaspar, 2019. "Local Average and Quantile Treatment Effects Under Endogeneity: A Review," Journal of Econometric Methods, De Gruyter, vol. 8(1), pages 1-27, January.
    16. David K. Evans & Anna Popova, 2016. "What Really Works to Improve Learning in Developing Countries? An Analysis of Divergent Findings in Systematic Reviews," The World Bank Research Observer, World Bank, vol. 31(2), pages 242-270.
    17. Elbers, Chris & Gunning, Jan Willem, 2014. "Evaluation of non-governmental development organizations," WIDER Working Paper Series 026, World Institute for Development Economic Research (UNU-WIDER).
    18. Schennach, Susanne & White, Halbert & Chalak, Karim, 2012. "Local indirect least squares and average marginal effects in nonseparable structural systems," Journal of Econometrics, Elsevier, vol. 166(2), pages 282-302.
    19. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    20. Florent Bedecarrats & Isabelle Guérin & François Roubaud, 2017. "L'étalon-or des évaluations randomisées : du discours de la méthode à l'économie politique" [The gold standard of randomized evaluations: from the discourse of method to political economy], Working Papers ird-01445209, HAL.

    More about this item

    Keywords

    program evaluation; randomized controlled trials; policy evaluation; treatment heterogeneity; budget support; sector-wide programs; aid effectiveness

    JEL classification:

    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • C33 - Mathematical and Quantitative Methods - - Multiple or Simultaneous Equation Models; Multiple Variables - - - Models with Panel Data; Spatio-temporal Models
    • O22 - Economic Development, Innovation, Technological Change, and Growth - - Development Planning and Policy - - - Project Analysis

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:tin:wpaper:20090073. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Tinbergen Office +31 (0)10-4088900. General contact details of provider: https://edirc.repec.org/data/tinbenl.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.