Printed from https://ideas.repec.org/p/idb/brikps/2999.html

An Assessment of Propensity Score Matching as a Non Experimental Impact Estimator: Evidence from Mexico's PROGRESA Program

Author

Listed:
  • Díaz, Juan José
  • Handa, Sudhanshu

Abstract

In this working paper the authors assess the reliability of propensity score matching (PSM) as a non-experimental estimator of the effect of treatment on the treated, exploiting experimental data from the Mexican anti-poverty program PROGRESA (Programa de Educación, Salud y Alimentación). The outcomes studied include food expenditure, child schooling, and child labor. The methodology compares the experimental impact estimates with those obtained from matched comparison samples drawn from a non-experimental national survey of household income and expenditure. The results show that simple cross-sectional matching produces significant bias for outcomes that are measured differently across the two data sources. Results are more encouraging for outcomes measured comparably across survey instruments, but even in this case there are indications of bias depending on the sample and matching method.
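The paper's validation logic can be illustrated with a minimal, self-contained sketch: estimate an experimental benchmark under random assignment, then re-estimate the same effect with propensity score matching on a comparison group subject to selection. All names, the data-generating process, and the single-covariate logistic model below are illustrative assumptions, not the paper's actual specification or data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
TRUE_EFFECT = 1.0  # assumed treatment effect in the synthetic model


def outcomes(x, treated, rng):
    # Outcome depends on the confounder x and on treatment status.
    return 2.0 * x + TRUE_EFFECT * treated + rng.normal(0.0, 0.5, size=x.shape)


# Experimental benchmark: random assignment, so a difference in means
# is unbiased for the effect of treatment on the treated.
x_exp = rng.normal(size=n)
t_exp = rng.integers(0, 2, size=n)
y_exp = outcomes(x_exp, t_exp, rng)
exp_att = y_exp[t_exp == 1].mean() - y_exp[t_exp == 0].mean()

# Non-experimental setting: the comparison group comes from a different
# population (selection on x), as with an independent national survey.
x_t = rng.normal(0.0, 1.0, size=n)   # treated households
x_c = rng.normal(1.0, 1.0, size=n)   # comparison households
y_t = outcomes(x_t, np.ones(n), rng)
y_c = outcomes(x_c, np.zeros(n), rng)

naive_att = y_t.mean() - y_c.mean()  # biased: ignores selection on x

# Propensity score P(treated | x), fit by logistic regression
# (plain gradient descent, to keep the sketch dependency-free).
X = np.column_stack([np.ones(2 * n), np.concatenate([x_t, x_c])])
t = np.concatenate([np.ones(n), np.zeros(n)])
w = np.zeros(2)
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - t) / len(t)
scores = 1.0 / (1.0 + np.exp(-X @ w))
s_t, s_c = scores[:n], scores[n:]

# Nearest-neighbour matching on the score, with replacement:
# each treated unit is paired with the closest comparison unit.
match = np.abs(s_t[:, None] - s_c[None, :]).argmin(axis=1)
psm_att = (y_t - y_c[match]).mean()

print(f"experimental ATT: {exp_att:.2f}")
print(f"naive ATT:        {naive_att:.2f}")
print(f"PSM ATT:          {psm_att:.2f}")
```

In this stylized setup the matching variable is the sole confounder, so PSM recovers something close to the experimental benchmark; the paper's point is precisely that this can fail when outcomes are measured differently across survey instruments.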

Suggested Citation

  • Díaz, Juan José & Handa, Sudhanshu, 2005. "An Assessment of Propensity Score Matching as a Non Experimental Impact Estimator: Evidence from Mexico's PROGRESA Program," IDB Publications (Working Papers) 2999, Inter-American Development Bank.
  • Handle: RePEc:idb:brikps:2999

    Download full text from publisher

    File URL: https://publications.iadb.org/publications/english/document/An-Assessment-of-Propensity-Score-Matching-as-a-Non-Experimental-Impact-Estimator-Evidence-from-Mexico-PROGRESA-Program.pdf
    Download Restriction: no


    References listed on IDEAS

    1. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 605-654.
    2. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    3. Barbara Sianesi, 2004. "An Evaluation of the Swedish System of Active Labor Market Programs in the 1990s," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 133-155, February.
    4. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    5. Charles Michalopoulos & Howard S. Bloom & Carolyn J. Hill, 2004. "Can Propensity-Score Methods Match the Findings from a Random Assignment Evaluation of Mandatory Welfare-to-Work Programs?," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 156-179, February.
    6. David I. Levine & Gary Painter, 2003. "The Schooling Costs of Teenage Out-of-Wedlock Childbearing: Analysis with a Within-School Propensity-Score-Matching Estimator," The Review of Economics and Statistics, MIT Press, vol. 85(4), pages 884-900, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sudhanshu Handa & John A. Maluccio, 2010. "Matching the Gold Standard: Comparing Experimental and Nonexperimental Evaluation Techniques for a Geographically Targeted Program," Economic Development and Cultural Change, University of Chicago Press, vol. 58(3), pages 415-447, April.
    2. Jason K. Luellen & William R. Shadish & M. H. Clark, 2005. "Propensity Scores," Evaluation Review, vol. 29(6), pages 530-558, December.
    3. Gonzalo Nunez-Chaim & Henry G. Overman & Capucine Riom, 2024. "Does subsidising business advice improve firm performance? Evidence from a large RCT," CEP Discussion Papers dp1977, Centre for Economic Performance, LSE.
    4. Guo, Shenyang & Barth, Richard P. & Gibbons, Claire, 2006. "Propensity score matching strategies for evaluating substance abuse services for child welfare clients," Children and Youth Services Review, Elsevier, vol. 28(4), pages 357-383, April.
    5. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    6. Kebede, Dereje & Emana, Bezabih & Tesfay, Girmay, 2023. "Impact of land acquisition for large-scale agricultural investments on food security status of displaced households: The case of Ethiopia," Land Use Policy, Elsevier, vol. 126(C).
    7. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    8. Centeno, Luis & Centeno, Mário & Novo, Álvaro A., 2009. "Evaluating job-search programs for old and young individuals: Heterogeneous impact on unemployment duration," Labour Economics, Elsevier, vol. 16(1), pages 12-25, January.
    9. Eva O. Arceo-Gómez & Raymundo M. Campos-Vázquez, 2014. "Teenage Pregnancy in Mexico: Evolution and Consequences," Latin American Journal of Economics-formerly Cuadernos de Economía, Instituto de Economía. Pontificia Universidad Católica de Chile., vol. 51(1), pages 109-146, May.
    10. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
    11. Alberto Abadie & Guido W. Imbens, 2008. "On the Failure of the Bootstrap for Matching Estimators," Econometrica, Econometric Society, vol. 76(6), pages 1537-1557, November.
    12. Aradhna Aggarwal, 2010. "Impact evaluation of India's ‘Yeshasvini’ community‐based health insurance programme," Health Economics, John Wiley & Sons, Ltd., vol. 19(S1), pages 5-35, September.
    13. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    14. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    15. Richard P. Nathan, 2008. "The role of random assignment in social policy research," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(2), pages 401-415.
    16. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, vol. 42(2), pages 147-175, April.
    17. Andrew P. Jaciw, 2016. "Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach," Evaluation Review, vol. 40(3), pages 199-240, June.
    18. Daniel Litwok, 2023. "Estimating the Impact of Emergency Assistance on Educational Progress for Low-Income Adults: Experimental and Nonexperimental Evidence," Evaluation Review, vol. 47(2), pages 231-263, April.
    19. Dettmann, E. & Becker, C. & Schmeißer, C., 2011. "Distance functions for matching in small samples," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1942-1960, May.
    20. Alexander Hijzen & Sébastien Jean & Thierry Mayer, 2011. "The effects at home of initiating production abroad: evidence from matched French firms," Review of World Economics (Weltwirtschaftliches Archiv), Springer;Institut für Weltwirtschaft (Kiel Institute for the World Economy), vol. 147(3), pages 457-483, September.

    More about this item

    Keywords

    WP-04/05;


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:idb:brikps:2999. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Felipe Herrera Library (email available below). General contact details of provider: https://edirc.repec.org/data/iadbbus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.