The Sensitivity of Experimental Impact Estimates: Evidence from the National JTPA Study
The recent experimental evaluation of the U.S. Job Training Partnership Act (JTPA) program found negative effects of training on the earnings of disadvantaged male youth and no effect on the earnings of disadvantaged female youth. These findings provided justification for Congress to cut the budget of JTPA's youth component by over 80 percent. In this paper, we examine the sensitivity of the experimental impact estimates along several dimensions of construction and interpretation. We find that the statistical significance of the male youth estimates is extremely fragile and that the magnitudes of the estimates for both youth groups are sensitive to nearly all the factors we consider. In particular, accounting for experimental control group members who substitute training from other providers leads to a much more positive picture regarding the effectiveness of JTPA classroom training. Our study indicates the value of sensitivity analyses in experimental evaluations and illustrates that experimental impact estimates, like those from nonexperimental analyses, require careful interpretation if they are to provide a reliable guide to policymakers.
Date of creation: Jul 1997
Publication status: Published as a chapter in Youth Employment and Joblessness in Advanced Countries, Blanchflower, David and Richard Freeman, eds., Chicago: University of Chicago Press, 2000, pp. 331-356.
Provider: National Bureau of Economic Research, 1050 Massachusetts Avenue, Cambridge, MA 02138, U.S.A.
Web page: http://www.nber.org