Printed from https://ideas.repec.org/h/nbr/nberch/6810.html

The Sensitivity of Experimental Impact Estimates (Evidence from the National JTPA Study)

In: Youth Employment and Joblessness in Advanced Countries

Authors

Listed:
  • James J. Heckman
  • Jeffrey Smith

Abstract

The recent experimental evaluation of the U.S. Job Training Partnership Act (JTPA) program found negative effects of training on the earnings of disadvantaged male youth and no effect on the earnings of disadvantaged female youth. These findings provided justification for Congress to cut the budget of JTPA's youth component by over 80 percent. In this paper, we examine the sensitivity of the experimental impact estimates along several dimensions of construction and interpretation. We find that the statistical significance of the male youth estimates is extremely fragile and that the magnitudes of the estimates for both youth groups are sensitive to nearly all the factors we consider. In particular, accounting for experimental control group members who substitute training from other providers leads to a much more positive picture regarding the effectiveness of JTPA classroom training. Our study indicates the value of sensitivity analyses in experimental evaluations and illustrates that experimental impact estimates, like those from nonexperimental analyses, require careful interpretation if they are to provide a reliable guide to policymakers.
(This abstract was borrowed from another version of this item.)

Suggested Citation

  • James J. Heckman & Jeffrey Smith, 2000. "The Sensitivity of Experimental Impact Estimates (Evidence from the National JTPA Study)," NBER Chapters, in: Youth Employment and Joblessness in Advanced Countries, pages 331-356, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberch:6810

    Download full text from publisher

    File URL: http://www.nber.org/chapters/c6810.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Howard S. Bloom, 1984. "Accounting for No-Shows in Experimental Evaluation Designs," Evaluation Review, vol. 8(2), pages 225-246, April.
    2. Robert J. LaLonde, 1995. "The Promise of Public Sector-Sponsored Training Programs," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 149-168, Spring.
    3. Katherine P. Dickinson & Terry R. Johnson & Richard W. West, 1987. "An Analysis of the Sensitivity of Quasi-Experimental Net Impact Estimates of CETA Programs," Evaluation Review, vol. 11(4), pages 452-472, August.
    4. James Heckman & Jeffrey Smith & Christopher Taber, 1994. "Accounting for Dropouts in Evaluations of Social Experiments," NBER Technical Working Papers 0166, National Bureau of Economic Research, Inc.
    5. Laurie J. Bassi, 1984. "Estimating the Effect of Training Programs with Non-Random Selection," The Review of Economics and Statistics, MIT Press, vol. 66(1), pages 36-43, February.
    6. Theresa J. Devine & James J. Heckman, 1996. "The Economics of Eligibility Rules for a Social Program: A Study of the Job Training Partnership Act (JTPA)--A Summary Report," Canadian Journal of Economics, Canadian Economics Association, vol. 29(s1), pages 99-104, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Markus Frölich & Michael Lechner, 2004. "Regional treatment intensity as an instrument for the evaluation of labour market policies," University of St. Gallen Department of Economics working paper series 2004-08, Department of Economics, University of St. Gallen.
    2. Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute of Labor Economics (IZA).
    3. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    4. Rajeev Dehejia, 2000. "Was There a Riverside Miracle? A Framework for Evaluating Multi-Site Programs," NBER Working Papers 7844, National Bureau of Economic Research, Inc.
    5. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    6. Mitali Das, 2000. "Instrumental Variables Estimation of Nonparametric Models with Discrete Endogenous Regressors," Econometric Society World Congress 2000 Contributed Papers 1008, Econometric Society.
    7. Samina Sattar, "undated". "Evidence Scan of Work Experience Programs," Mathematica Policy Research Reports 685b2a650bdf4437b1de31c2e, Mathematica Policy Research.
    8. Agata Maida & Daniela Sonedda, 2019. "Getting out of the starting gate on the right foot: employment effects of investment in human capital," LABORatorio R. Revelli Working Papers Series 164, LABORatorio R. Revelli, Centre for Employment Studies.
    9. Carlos A. Flores & Oscar A. Mitnik, 2013. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1691-1707, December.
    10. Carolyn Heinrich & Jeffrey Wenger, 2002. "The Economic Contributions of James J. Heckman and Daniel L. McFadden," Review of Political Economy, Taylor & Francis Journals, vol. 14(1), pages 69-89.
    11. Smith, Jeffrey A. & Whalley, Alexander & Wilcox, Nathaniel T., 2020. "Are Program Participants Good Evaluators?," IZA Discussion Papers 13584, Institute of Labor Economics (IZA).
    12. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    2. Regner, Hakan, 2002. "A nonexperimental evaluation of training programs for the unemployed in Sweden," Labour Economics, Elsevier, vol. 9(2), pages 187-206, April.
    3. Lawrence F. Katz & Jeffrey B. Liebman, 2000. "Moving to Opportunity in Boston: Early Results of a Randomized Mobility Experiment," Working Papers 820, Princeton University, Department of Economics, Industrial Relations Section.
    4. Philip J. O'Connell, 1999. "Are they working? Market Orientation and the Effectiveness of Active Labour Market Programmes in Ireland," Papers WP105, Economic and Social Research Institute (ESRI).
    5. Lawrence F. Katz & Jeffrey R. Kling & Jeffrey B. Liebman, 2001. "Moving to Opportunity in Boston: Early Results of a Randomized Mobility Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 116(2), pages 607-654.
    6. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    7. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    8. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    9. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    10. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    11. Fortin, Bernard, 1997. "Dépendance à l’égard de l’aide sociale et réforme de la sécurité du revenu," L'Actualité Economique, Société Canadienne de Science Economique, vol. 73(4), pages 557-573, décembre.
    12. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    13. Gordon Hanson & Chen Liu & Craig McIntosh, 2017. "The Rise and Fall of U.S. Low-Skilled Immigration," Brookings Papers on Economic Activity, Economic Studies Program, The Brookings Institution, vol. 48(1, Spring), pages 83-168.
    14. Nicola Pavoni & G. L. Violante, 2007. "Optimal Welfare-to-Work Programs," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 74(1), pages 283-318.
    15. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    16. Alan B. Krueger, 2002. "Inequality, Too Much of a Good Thing," Working Papers 845, Princeton University, Department of Economics, Industrial Relations Section.
    17. Peter Z. Schochet & Ronald D'Amico & Jillian Berk & Sarah Dolfin & Nathan Wozny, "undated". "Estimated Impacts for Participants in the Trade Adjustment Assistance (TAA) Program Under the 2002 Amendments," Mathematica Policy Research Reports 582d8723f6884d4eb7a3f95a4, Mathematica Policy Research.
    18. Chabé-Ferret, Sylvain, 2017. "Should We Combine Difference In Differences with Conditioning on Pre-Treatment Outcomes?," TSE Working Papers 17-824, Toulouse School of Economics (TSE).
    19. Joshua D. Angrist, 2004. "Treatment effect heterogeneity in theory and practice," Economic Journal, Royal Economic Society, vol. 114(494), pages 52-83, March.
    20. Alan S. Blinder, 2007. "Offshoring: Big Deal, or Business as Usual?," Working Papers 149, Princeton University, Department of Economics, Center for Economic Policy Studies.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • H43 - Public Economics - - Publicly Provided Goods - - - Project Evaluation; Social Discount Rate

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberch:6810. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.