Choosing Among Alternative Nonexperimental Methods for Estimating the Impact of Social Programs: The Case of Manpower Training
Abstract: The recent literature on evaluating manpower training programs demonstrates that alternative nonexperimental estimators of the same program produce an array of estimates of program impact. These findings have led to the call for experiments to be used to perform credible program evaluations. Missing in all of the recent pessimistic analyses of nonexperimental methods is any systematic discussion of how to choose among competing estimators. This paper explores the value of simple specification tests in selecting an appropriate nonexperimental estimator. A reanalysis of the National Supported Work Demonstration data previously analyzed by proponents of social experiments reveals that a simple testing procedure eliminates the range of nonexperimental estimators at variance with the experimental estimates of program impact.
Bibliographic Info: Paper provided by National Bureau of Economic Research, Inc in its series NBER Working Papers with number 2861.
Date of creation: Feb 1989
Contact details of provider:
Postal: National Bureau of Economic Research, 1050 Massachusetts Avenue Cambridge, MA 02138, U.S.A.
Web page: http://www.nber.org
Other versions of this item:
- Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
References:
- Gary Burtless & Larry L. Orr, 1986. "Are Classical Experiments Needed for Manpower Policy?," Journal of Human Resources, University of Wisconsin Press, vol. 21(4), pages 606-639.
- Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
- Burt S. Barnow, 1987. "The Impact of CETA Programs on Earnings: A Review of the Literature," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 157-193.
- LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-20, September.
- Ashenfelter, Orley & Card, David, 1985. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," The Review of Economics and Statistics, MIT Press, vol. 67(4), pages 648-60, November.
- Orley Ashenfelter & David Card, 1984. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," NBER Working Papers 1489, National Bureau of Economic Research, Inc.
- Orley Ashenfelter & David Card, 1984. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," Working Papers 554, Princeton University, Department of Economics, Industrial Relations Section.
This item has more than 25 citations; to prevent cluttering this page, they are listed on a separate page.