Printed from https://ideas.repec.org/a/ses/arsjes/2000-iii-2.html

A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies

Author

  • Jeffrey Smith

Abstract

This paper considers different methods for solving the evaluation problem. I highlight the role of heterogeneity in program impacts in defining evaluation parameters of interest and in interpreting estimated program impacts. I discuss the strengths and weaknesses of social experiments and conclude that they require careful implementation and interpretation. I review and critique two popular non-experimental evaluation methods: difference-in-differences and propensity score matching. I find that the former relies on assumptions at odds with the empirical data and that the latter is not a magical solution to all evaluation problems. Finally, I argue for the importance of paying attention to data quality and general equilibrium effects.
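The two non-experimental estimators the abstract critiques can be made concrete on simulated data. The sketch below is not from the paper; it is a minimal stdlib-only illustration, with a program impact built into the simulation, of (a) a difference-in-differences estimator, which removes a time-invariant level gap and a common trend, and (b) nearest-neighbor matching on the participation probability, which here uses the true propensity score, known only because the data are simulated.

```python
import bisect
import math
import random
import statistics

random.seed(0)
TRUE_IMPACT = 500.0  # the "program effect" built into the simulation

# --- Difference-in-differences -------------------------------------------
# Participants and comparisons share a common earnings trend but start at
# different levels; differencing twice removes both the level gap and the
# trend, leaving the impact.
n = 2000
trend = 1_000.0

def pre_post(base, impact):
    pre = base + random.gauss(0, 100)
    post = base + trend + impact + random.gauss(0, 100)
    return pre, post

participants = [pre_post(10_000, TRUE_IMPACT) for _ in range(n)]
comparisons = [pre_post(12_000, 0.0) for _ in range(n)]

did = (statistics.fmean(post for _, post in participants)
       - statistics.fmean(pre for pre, _ in participants)) \
    - (statistics.fmean(post for _, post in comparisons)
       - statistics.fmean(pre for pre, _ in comparisons))

# --- Propensity score matching -------------------------------------------
# Participation depends on a covariate x that also raises earnings, so the
# naive treated/untreated gap is biased. Matching each participant to the
# non-participant with the closest participation probability removes the
# part of the gap driven by x.
def score(x):
    return 1.0 / (1.0 + math.exp(-(x - 0.5)))

treated, untreated = [], []
for _ in range(4 * n):
    x = random.random()
    d = random.random() < score(x)
    y = 2_000.0 * x + (TRUE_IMPACT if d else 0.0) + random.gauss(0, 100)
    (treated if d else untreated).append((score(x), y))

untreated.sort()
ps = [p for p, _ in untreated]

def nearest_outcome(p):
    # nearest neighbor on the propensity score, matching with replacement
    i = bisect.bisect_left(ps, p)
    j = min((k for k in (i - 1, i) if 0 <= k < len(untreated)),
            key=lambda k: abs(ps[k] - p))
    return untreated[j][1]

att = statistics.fmean(y - nearest_outcome(p) for p, y in treated)
print(round(did), round(att))  # both close to TRUE_IMPACT
```

Both estimators recover the built-in impact here only because the simulation satisfies their identifying assumptions (a common trend for DiD, selection on the observed score for matching); the paper's point is precisely that real program data often violate them.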

Suggested Citation

  • Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
  • Handle: RePEc:ses:arsjes:2000-iii-2

    Download full text from publisher

    File URL: http://www.sjes.ch/papers/2000-III-2.pdf
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Howard S. Bloom, 1984. "Accounting for No-Shows in Experimental Evaluation Designs," Evaluation Review, vol. 8(2), pages 225-246, April.
    2. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 487-535.
    3. Davidson, Carl & Woodbury, Stephen A, 1993. "The Displacement Effect of Reemployment Bonus Programs," Journal of Labor Economics, University of Chicago Press, vol. 11(4), pages 575-605, October.
    4. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    5. Michael Lechner, 1999. "Nonparametric bounds on employment and income effects of continuous vocational training in East Germany," Econometrics Journal, Royal Economic Society, vol. 2(1), pages 1-28.
    6. Dan A. Black & Jeffrey A. Smith & Mark C. Berger & Brett J. Noel, 2002. "Is the Threat of Reemployment Services More Effective than the Services Themselves? Experimental Evidence from the UI System," NBER Working Papers 8825, National Bureau of Economic Research, Inc.
    7. David G. Blanchflower & Richard B. Freeman, 2000. "Youth Employment and Joblessness in Advanced Countries," NBER Books, National Bureau of Economic Research, Inc, number blan00-1, March.
    8. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    9. Martin Feldstein & James M. Poterba, 1996. "Empirical Foundations of Household Taxation," NBER Books, National Bureau of Economic Research, Inc, number feld96-1, March.
    10. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    11. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 605-654.
    12. Angrist, Joshua D. & Krueger, Alan B., 1999. "Empirical strategies in labor economics," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 23, pages 1277-1366, Elsevier.
    13. Rajeev H. Dehejia & Sadek Wahba, 1998. "Causal Effects in Non-Experimental Studies: Re-Evaluating the Evaluation of Training Programs," NBER Working Papers 6586, National Bureau of Economic Research, Inc.
    14. James Heckman & Lance Lochner & Christopher Taber, 1998. "Explaining Rising Wage Inequality: Explanations With A Dynamic General Equilibrium Model of Labor Earnings With Heterogeneous Agents," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 1(1), pages 1-58, January.
    15. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    16. James J. Heckman & Jeffrey Smith, 2000. "The Sensitivity of Experimental Impact Estimates (Evidence from the National JTPA Study)," NBER Chapters, in: Youth Employment and Joblessness in Advanced Countries, pages 331-356, National Bureau of Economic Research, Inc.
    17. Anders Forslund & Alan B. Krueger, 1997. "An Evaluation of the Swedish Active Labor Market Policy: New and Received Wisdom," NBER Chapters, in: The Welfare State in Transition: Reforming the Swedish Model, pages 267-298, National Bureau of Economic Research, Inc.
    18. Robert Moffitt, 1991. "Program Evaluation With Nonexperimental Data," Evaluation Review, vol. 15(3), pages 291-314, June.
    19. Dolton, Peter & O'Neill, Donal, 1996. "Unemployment Duration and the Restart Effect: Some Experimental Evidence," Economic Journal, Royal Economic Society, vol. 106(435), pages 387-400, March.
    20. Freeman, Richard B. & Topel, Robert H. & Swedenborg, Birgitta (ed.), 1997. "The Welfare State in Transition," National Bureau of Economic Research Books, University of Chicago Press, edition 1, number 9780226261782, December.
    21. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions : An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.
    22. Dan A. Black & Mark C. Berger & Jeffrey A. Smith & Brett J. Noel, 1999. "Is the Threat of Training More Effective Than Training Itself? Experimental Evidence from the UI System," University of Western Ontario, Departmental Research Report Series 9907, University of Western Ontario, Department of Economics.
    23. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    24. Gary Burtless & Larry L. Orr, 1986. "Are Classical Experiments Needed for Manpower Policy," Journal of Human Resources, University of Wisconsin Press, vol. 21(4), pages 606-639.
    25. Robert J. LaLonde & Rebecca Maynard, 1987. "How Precise Are Evaluations of Employment and Training Programs," Evaluation Review, vol. 11(4), pages 428-451, August.
    26. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
    27. Lechner, Michael, 1999. "Earnings and Employment Effects of Continuous Off-the-Job Training in East Germany after Unification," Journal of Business & Economic Statistics, American Statistical Association, vol. 17(1), pages 74-90, January.
    28. Howard S. Bloom & Larry L. Orr & Stephen H. Bell & George Cave & Fred Doolittle & Winston Lin & Johannes M. Bos, 1997. "The Benefits and Costs of JTPA Title II-A Programs: Key Findings from the National Job Training Partnership Act Study," Journal of Human Resources, University of Wisconsin Press, vol. 32(3), pages 549-576.
    29. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    30. Heckman, James J & Smith, Jeffrey A, 1999. "The Pre-programme Earnings Dip and the Determinants of Participation in a Social Programme. Implications for Simple Programme Evaluation Strategies," Economic Journal, Royal Economic Society, vol. 109(457), pages 313-348, July.
    31. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.
    32. Burt S. Barnow, 1987. "The Impact of CETA Programs on Earnings: A Review of the Literature," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 157-193.
    33. Rajeev H. Dehejia & Sadek Wahba, 2002. "Propensity Score-Matching Methods For Nonexperimental Causal Studies," The Review of Economics and Statistics, MIT Press, vol. 84(1), pages 151-161, February.
    34. James Heckman & Jeffrey Smith & Christopher Taber, 1998. "Accounting For Dropouts In Evaluations Of Social Programs," The Review of Economics and Statistics, MIT Press, vol. 80(1), pages 1-14, February.
    35. Bruce D. Meyer, 1995. "Lessons from the U.S. Unemployment Insurance Experiments," Journal of Economic Literature, American Economic Association, vol. 33(1), pages 91-131, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik : Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies: Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    2. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    3. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    4. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    5. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    6. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    7. Michael Lechner, 2002. "Mikroökonometrische Evaluation arbeitsmarktpolitischer Massnahmen (Microeconometric Evaluation of Labor Market Policy Measures)," University of St. Gallen Department of Economics working paper series 2002-20, Department of Economics, University of St. Gallen.
    8. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    9. Deborah A. Cobb‐Clark & Thomas Crossley, 2003. "Econometrics for Evaluations: An Introduction to Recent Developments," The Economic Record, The Economic Society of Australia, vol. 79(247), pages 491-511, December.
    10. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    11. Rothstein, Jesse & von Wachter, Till, 2016. "Social Experiments in the Labor Market," Institute for Research on Labor and Employment, Working Paper Series qt6605k20b, Institute of Industrial Relations, UC Berkeley.
    12. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    13. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    14. Bryson, Alex & Dorsett, Richard & Purdon, Susan, 2002. "The use of propensity score matching in the evaluation of active labour market policies," LSE Research Online Documents on Economics 4993, London School of Economics and Political Science, LSE Library.
    15. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    16. Christian Durán, 2004. "Evaluación microeconométrica de las políticas públicas de empleo: aspectos metodológicos (Microeconometric evaluation of public employment policies: methodological aspects)," Hacienda Pública Española / Review of Public Economics, IEF, vol. 170(3), pages 107-133, September.
    17. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    18. Dettmann, Eva & Becker, Claudia & Schmeißer, Christian, 2010. "Is there a Superior Distance Function for Matching in Small Samples?," IWH Discussion Papers 3/2010, Halle Institute for Economic Research (IWH).
    19. James Heckman & Salvador Navarro-Lozano, 2004. "Using Matching, Instrumental Variables, and Control Functions to Estimate Economic Choice Models," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 30-57, February.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.