
Evaluating Anti-Poverty Programs

  • Ravallion, Martin

The chapter critically reviews the methods available for the ex post counterfactual analysis of programs that are assigned exclusively to individuals, households or locations. The emphasis is on the problems encountered in applying these methods to anti-poverty programs in developing countries, drawing on examples from actual evaluations. Two main lessons emerge. Firstly, despite the claims of advocates, no single method dominates; rigorous, policy-relevant evaluations should be open-minded about methodology, adapting to the problem, setting and data constraints. Secondly, future efforts to draw useful lessons from evaluations call for more policy-relevant data and methods than used in the classic assessment of mean impact for those assigned to the program.
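
A note on terminology: the "mean impact for those assigned to the program" referred to above is what the potential-outcomes literature calls the average treatment effect on the treated (often written TT). A minimal sketch of the definition, in notation assumed here for illustration rather than taken from the chapter itself:

    % Average treatment effect on the treated; Y^T, Y^C, D are assumed notation, not quoted from the chapter
    \mathrm{TT} \equiv \mathbb{E}\bigl[\,Y^{T} - Y^{C} \mid D = 1\,\bigr]
                =      \mathbb{E}\bigl[\,Y^{T} \mid D = 1\,\bigr] - \mathbb{E}\bigl[\,Y^{C} \mid D = 1\,\bigr]

Here Y^T and Y^C denote an individual's outcome with and without the program, D = 1 indicates assignment to the program, and E[Y^C | D = 1] is the unobservable counterfactual mean. The evaluation methods represented in the reference list below (randomization, matching, double differences, instrumental variables, discontinuity designs) can be read as alternative ways of identifying that counterfactual term under different assumptions.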

File URL: http://www.sciencedirect.com/science/article/B7P5D-4RWXCH1-P/1/2c7e700530cdc01e6774950b0913e658
Download Restriction: Full text for ScienceDirect subscribers only

As access to this document is restricted, you may want to look for a different version under "Related research" (further below) or search elsewhere for another version.

This chapter was published in:
  • T. Paul Schultz & John A. Strauss (ed.), 2008. "Handbook of Development Economics," Handbook of Development Economics, Elsevier, edition 1, volume 4, number 5, January.
  • This item is provided by Elsevier in its series Handbook of Development Economics with number 5-59.
    Handle: RePEc:eee:devchp:5-59
    Contact details of provider: Web page: http://www.elsevier.com/wps/find/bookseriesdescription.cws_home/BS_HE/description

    References listed on IDEAS
    1. Paul Glewwe & Michael Kremer & Sylvie Moulin & Eric Zitzewitz, 2000. "Retrospective vs. Prospective Analyses of School Inputs: The Case of Flip Charts in Kenya," NBER Working Papers 8018, National Bureau of Economic Research, Inc.
    2. White, Howard, 2009. "Theory-Based Impact Evaluation," 3ie Publications 2009-3, International Initiative for Impact Evaluation (3ie).
    3. Marianne Bertrand & Esther Duflo & Sendhil Mullainathan, 2002. "How Much Should We Trust Differences-in-Differences Estimates?," NBER Working Papers 8841, National Bureau of Economic Research, Inc.
    4. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    5. Elizabeth King & Eric Bettinger & Erik Bloom & Joshua Angrist & Michael Kremer, 2002. "Vouchers for private schooling in Colombia: Evidence from a randomized natural experiment," Natural Field Experiments 00203, The Field Experiments Website.
    6. Jacoby, Hanan, 1997. "Is there an intrahousehold 'flypaper effect'?," FCND discussion papers 31, International Food Policy Research Institute (IFPRI).
    7. Emanuela Galasso & Martin Ravallion & Agustin Salvia, 2004. "Assisting the Transition from Workfare to Work: A Randomized Experiment," ILR Review, Cornell University, ILR School, vol. 58(1), pages 128-142, October.
    8. Stephen A. Woodbury, 2009. "Unemployment," Chapters, in: Labor and Employment Law and Economics, chapter 17, Edward Elgar Publishing.
    9. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    10. Ravallion, Martin & Chen, Shaohua, 2005. "Hidden impact? Household saving in response to a poor-area development project," Journal of Public Economics, Elsevier, vol. 89(11-12), pages 2183-2204, December.
    11. Rajeev H. Dehejia & Sadek Wahba, 1998. "Causal Effects in Non-Experimental Studies: Re-Evaluating the Evaluation of Training Programs," NBER Working Papers 6586, National Bureau of Economic Research, Inc.
    12. Ravallion, Martin, 1999. "Monitoring targeting performance when decentralized allocations to the poor are unobserved," Policy Research Working Paper Series 2080, The World Bank.
    13. Jalan, Jyotsna & Ravallion, Martin, 2003. "Does piped water reduce diarrhea for children in rural India?," Journal of Econometrics, Elsevier, vol. 112(1), pages 153-173, January.
    14. Dubin, Jeffrey A. & Rivers, Douglas, 1993. "Experimental estimates of the impact of wage subsidies," Journal of Econometrics, Elsevier, vol. 56(1-2), pages 219-242, March.
    15. Howard S. Bloom, 1984. "Accounting for No-Shows in Experimental Evaluation Designs," Evaluation Review, SAGE Publishing, vol. 8(2), pages 225-246, April.
    16. Esther Duflo, 2000. "Schooling and Labor Market Consequences of School Construction in Indonesia: Evidence from an Unusual Policy Experiment," NBER Working Papers 7860, National Bureau of Economic Research, Inc.
    17. Emanuela Galasso & Martin Ravallion, 2004. "Social Protection in a Crisis: Argentina's Plan Jefes y Jefas," World Bank Economic Review, World Bank Group, vol. 18(3), pages 367-399.
    18. Mark M. Pitt & Shahidur R. Khandker, 1998. "The Impact of Group-Based Credit Programs on Poor Households in Bangladesh: Does the Gender of Participants Matter?," Journal of Political Economy, University of Chicago Press, vol. 106(5), pages 958-996, October.
    19. Martin Ravallion & Emanuela Galasso & Teodoro Lazo & Ernesto Philipp, 2005. "What Can Ex-Participants Reveal about a Program’s Impact?," Journal of Human Resources, University of Wisconsin Press, vol. 40(1).
    20. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    21. John Newman & Menno Pradhan & Laura B. Rawlings & Geert Ridder & Ramiro Coa & Jose Luis Evia, 2002. "An Impact Evaluation of Education, Health, and Water Supply Investments by the Bolivian Social Investment Fund," World Bank Economic Review, World Bank Group, vol. 16(2), pages 241-274, August.
    22. Behrman, Jere R & Sengupta, Piyali & Todd, Petra, 2005. "Progressing through PROGRESA: An Impact Assessment of a School Subsidy Experiment in Rural Mexico," Economic Development and Cultural Change, University of Chicago Press, vol. 54(1), pages 237-275, October.
    23. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.
    24. Martin Ravallion & Gaurav Datt, 1995. "Is Targeting Through a Work Requirement Efficient? Some Evidence for Rural India," Monash Economics Working Papers archive-41, Monash University, Department of Economics.
    25. Joshua D. Angrist & Victor Lavy, 1999. "Using Maimonides' Rule to Estimate the Effect of Class Size on Scholastic Achievement," The Quarterly Journal of Economics, Oxford University Press, vol. 114(2), pages 533-575.
    26. Ashenfelter, Orley C, 1978. "Estimating the Effect of Training Programs on Earnings," The Review of Economics and Statistics, MIT Press, vol. 60(1), pages 47-57, February.
    27. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," NBER Working Papers 6699, National Bureau of Economic Research, Inc.
    28. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    29. Gary Burtless, 1985. "Are Targeted Wage Subsidies Harmful? Evidence from a Wage Voucher Experiment," ILR Review, Cornell University, ILR School, vol. 39(1), pages 105-114, October.
    30. Case, Anne & Deaton, Angus, 1998. "Large Cash Transfers to the Elderly in South Africa," Economic Journal, Royal Economic Society, vol. 108(450), pages 1330-1361, September.
    31. Carneiro, Pedro & Hansen, Karsten T & Heckman, James J, 2002. "Removing the veil of ignorance in assessing the distributional impacts of social policies," Working Paper Series 2002:2, IFAU - Institute for Evaluation of Labour Market and Education Policy.
    32. Joshua Angrist & Jinyong Hahn, 2004. "When to Control for Covariates? Panel Asymptotics for Estimates of Treatment Effects," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 58-72, February.
    33. Brian A. Jacob & Lars Lefgren, 2002. "Remedial Education and Student Achievement: A Regression-Discontinuity Analysis," NBER Working Papers 8918, National Bureau of Economic Research, Inc.
    34. Hoddinott, John & Skoufias, Emmanuel, 2003. "The impact of Progresa on food consumption," FCND briefs 150, International Food Policy Research Institute (IFPRI).
    35. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions : An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.
    36. Atkinson, A B, 1987. "On the Measurement of Poverty," Econometrica, Econometric Society, vol. 55(4), pages 749-764, July.
    37. Rosenzweig, Mark R & Wolpin, Kenneth I, 1986. "Evaluating the Effects of Optimally Distributed Public Programs: Child Health and Family Planning Interventions," American Economic Review, American Economic Association, vol. 76(3), pages 470-482, June.
    38. Anne Morrison Piehl & Suzanne J. Cooper & Anthony A. Braga & David M. Kennedy, 2003. "Testing for Structural Breaks in the Evaluation of Programs," The Review of Economics and Statistics, MIT Press, vol. 85(3), pages 550-558, August.
    39. Robert A. Moffitt, 2003. "The Role of Randomized Field Trials in Social Science Research: A Perspective from Evaluations of Reforms of Social Welfare Programs," NBER Technical Working Papers 0295, National Bureau of Economic Research, Inc.
    40. Joshua D. Angrist & Guido W. Imbens, 1995. "Identification and Estimation of Local Average Treatment Effects," NBER Technical Working Papers 0118, National Bureau of Economic Research, Inc.
    41. Raghav Gaiha & Katsushi Imai, 2002. "Rural Public Works and Poverty Alleviation--the case of the employment guarantee scheme in Maharashtra," International Review of Applied Economics, Taylor & Francis Journals, vol. 16(2), pages 131-151.
    42. Joshua D. Angrist & Alan B. Krueger, 2001. "Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments," Journal of Economic Perspectives, American Economic Association, vol. 15(4), pages 69-85, Fall.
    43. Michael Lokshin & Martin Ravallion, 2000. "Welfare Impacts of the 1998 Financial Crisis in Russia and the Response of the Public Safety Net," The Economics of Transition, The European Bank for Reconstruction and Development, vol. 8(2), pages 269-295, July.
    44. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    45. Sadoulet, Elisabeth & Janvry, Alain de & Davis, Benjamin, 2001. "Cash Transfer Programs with Income Multipliers: PROCAMPO in Mexico," World Development, Elsevier, vol. 29(6), pages 1043-1056, June.
    46. Sebastian Galiani & Paul Gertler & Ernesto Schargrodsky, 2002. "Water for Life: The Impact of the Privatization of Water Services on Child Mortality," Working Papers 54, Universidad de San Andres, Departamento de Economia, revised Sep 2005.
    47. Paul Gertler, 2004. "Do Conditional Cash Transfers Improve Child Health? Evidence from PROGRESA's Control Randomized Experiment," American Economic Review, American Economic Association, vol. 94(2), pages 336-341, May.
    48. Foster, James & Greer, Joel & Thorbecke, Erik, 1984. "A Class of Decomposable Poverty Measures," Econometrica, Econometric Society, vol. 52(3), pages 761-66, May.
    49. repec:fth:prinin:455 is not listed on IDEAS
    50. Rao, Vijayendra & Ibanez, Ana Maria, 2003. "The social impact of social funds in Jamaica - a mixed-methods analysis of participation, targeting, and collective action in community-driven development," Policy Research Working Paper Series 2970, The World Bank.
    51. Jeffrey Smith & Petra Todd, 2003. "Does Matching Overcome Lalonde's Critique of Nonexperimental Estimators?," University of Western Ontario, Centre for Human Capital and Productivity (CHCP) Working Papers 20035, University of Western Ontario, Centre for Human Capital and Productivity (CHCP).
    52. Schultz, T. Paul, 2004. "School subsidies for the poor: evaluating the Mexican Progresa poverty program," Journal of Development Economics, Elsevier, vol. 74(1), pages 199-250, June.
    53. Guido W. Imbens, 2004. "Nonparametric Estimation of Average Treatment Effects Under Exogeneity: A Review," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 4-29, February.
    54. Hahn, Jinyong & Todd, Petra & Van der Klaauw, Wilbert, 2001. "Identification and Estimation of Treatment Effects with a Regression-Discontinuity Design," Econometrica, Econometric Society, vol. 69(1), pages 201-209, January.
    55. Basu, Kaushik & Narayan, Ambar & Ravallion, Martin, 1999. "Is knowledge shared within households?," Policy Research Working Paper Series 2261, The World Bank.
    56. Joshua Angrist & Alan Krueger, 2001. "Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments," Working Papers 834, Princeton University, Department of Economics, Industrial Relations Section.
    57. Juan Jose Diaz & Sudhanshu Handa, 2006. "An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico’s PROGRESA Program," Journal of Human Resources, University of Wisconsin Press, vol. 41(2).
    58. Markus Frölich, 2004. "Finite-Sample Properties of Propensity-Score Matching and Weighting Estimators," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 77-90, February.
    59. Smith, Jeffrey & Todd, Petra, 2005. "Rejoinder," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 365-375.
    60. Esther Duflo, 2000. "Grandmothers and Granddaughters: Old Age Pension and Intra-household Allocation in South Africa," NBER Working Papers 8061, National Bureau of Economic Research, Inc.
    61. van de Walle, Dominique, 2002. "Choosing Rural Road Investments to Help Reduce Poverty," World Development, Elsevier, vol. 30(4), pages 575-589, April.
    62. van de Walle, Dominique, 2004. "Testing Vietnam's public safety net," Journal of Comparative Economics, Elsevier, vol. 32(4), pages 661-679, December.
    63. repec:pri:indrel:455 is not listed on IDEAS
    64. Jalan, Jyotsna & Ravallion, Martin, 1998. "Are there dynamic gains from a poor-area development program?," Journal of Public Economics, Elsevier, vol. 67(1), pages 65-85, January.
    65. Robert S. Chase, 2002. "Supporting Communities in Transition: The Impact of the Armenian Social Investment Fund," World Bank Economic Review, World Bank Group, vol. 16(2), pages 219-240, August.
    66. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 605-654.
    67. Deaton, Angus, 1995. "Data and econometric tools for development analysis," Handbook of Development Economics, in: Hollis Chenery & T.N. Srinivasan (ed.), Handbook of Development Economics, edition 1, volume 3, chapter 33, pages 1785-1882, Elsevier.
    68. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 487-535.
    69. Erich Battistin & Enrico Rettore, 2002. "Testing for programme effects in a regression discontinuity design with imperfect compliance," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 165(1), pages 39-57.
    70. Petra E. Todd & Jeffrey A. Smith, 2001. "Reconciling Conflicting Evidence on the Performance of Propensity-Score Matching Methods," American Economic Review, American Economic Association, vol. 91(2), pages 112-118, May.
    71. Christina Paxson & Norbert R. Schady, 2002. "The Allocation and Impact of Social Funds: Spending on School Infrastructure in Peru," World Bank Economic Review, World Bank Group, vol. 16(2), pages 297-319, August.
    72. Dehejia, Rajeev, 2005. "Practical propensity score matching: a reply to Smith and Todd," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 355-364.

    This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.