The Sensitivity of Experimental Impact Estimates: Evidence from the National JTPA Study
Abstract: The recent experimental evaluation of the U.S. Job Training Partnership Act (JTPA) program found negative effects of training on the earnings of disadvantaged male youth and no effect on the earnings of disadvantaged female youth. These findings provided justification for Congress to cut the budget of JTPA's youth component by over 80 percent. In this paper, we examine the sensitivity of the experimental impact estimates along several dimensions of construction and interpretation. We find that the statistical significance of the male youth estimates is extremely fragile and that the magnitudes of the estimates for both youth groups are sensitive to nearly all the factors we consider. In particular, accounting for experimental control group members who substitute training from other providers leads to a much more positive picture regarding the effectiveness of JTPA classroom training. Our study indicates the value of sensitivity analyses in experimental evaluations and illustrates that experimental impact estimates, like those from nonexperimental analyses, require careful interpretation if they are to provide a reliable guide to policymakers.
Bibliographic Info: Paper provided by National Bureau of Economic Research, Inc. in its series NBER Working Papers with number 6105.
Date of creation: Jul 1997
Publication status: published as Youth Employment and Joblessness in Advanced Countries, Blanchflower, David and Richard Freeman, eds., Chicago: University of Chicago Press, 2000, pp. 331-356.
Contact details of provider:
Postal: National Bureau of Economic Research, 1050 Massachusetts Avenue, Cambridge, MA 02138, U.S.A.
Web page: http://www.nber.org
Other versions of this item:
- James J. Heckman & Jeffrey Smith, 2000. "The Sensitivity of Experimental Impact Estimates (Evidence from the National JTPA Study)," NBER Chapters, in: Youth Employment and Joblessness in Advanced Countries, pages 331-356, National Bureau of Economic Research, Inc.
JEL classification:
- C93 - Mathematical and Quantitative Methods - Design of Experiments - Field Experiments
- H43 - Public Economics - Publicly Provided Goods - Project Evaluation; Social Discount Rate
References:
- James Heckman & Jeffrey Smith & Christopher Taber, 1994. "Accounting for Dropouts in Evaluations of Social Experiments," NBER Technical Working Papers 0166, National Bureau of Economic Research, Inc.
- Heckman, J. & Smith, J. & Taber, C., 1994. "Accounting for Dropouts in Evaluations of Social Experiments," University of Chicago - Economics Research Center 94-3, Chicago - Economics Research Center.
- Charles Mallar & Stuart Kerachsky & Craig Thornton & David Long, 1982. "Evaluation of the Impact of the Job Corps Program: Third Follow-Up Report," Mathematica Policy Research Reports 2737, Mathematica Policy Research.
- Robert J. LaLonde, 1995. "The Promise of Public Sector-Sponsored Training Programs," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 149-168, Spring.
- Bassi, Laurie J, 1984. "Estimating the Effect of Training Programs with Non-Random Selection," The Review of Economics and Statistics, MIT Press, vol. 66(1), pages 36-43, February.
- Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," UWO Department of Economics Working Papers 20006, University of Western Ontario, Department of Economics.
- Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
- Mitali Das, 2000. "Instrumental Variables Estimation of Nonparametric Models with Discrete Endogenous Regressors," Econometric Society World Congress 2000 Contributed Papers 1008, Econometric Society.
- Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
- Rajeev Dehejia, 2000. "Was There a Riverside Miracle? A Framework for Evaluating Multi-Site Programs," NBER Working Papers 7844, National Bureau of Economic Research, Inc.
- Frölich, Markus & Lechner, Michael, 2004. "Regional Treatment Intensity as an Instrument for the Evaluation of Labour Market Policies," CEPR Discussion Papers 4304, C.E.P.R. Discussion Papers.
- Frölich, Markus & Lechner, Michael, 2004. "Regional Treatment Intensity as an Instrument for the Evaluation of Labour Market Policies," IZA Discussion Papers 1095, Institute for the Study of Labor (IZA).
- Markus Frölich & Michael Lechner, 2004. "Regional Treatment Intensity as an Instrument for the Evaluation of Labour Market Policies," University of St. Gallen Department of Economics working paper series 2004-08, Department of Economics, University of St. Gallen.
- Carolyn Heinrich & Jeffrey Wenger, 2002. "The Economic Contributions of James J. Heckman and Daniel L. McFadden," Review of Political Economy, Taylor & Francis Journals, vol. 14(1), pages 69-89.
- Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute for the Study of Labor (IZA).