Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training
Abstract
The recent literature on evaluating manpower training programs demonstrates that alternative nonexperimental estimators of the same program produce an array of estimates of program impact. These findings have led to calls for experiments to be used to perform credible program evaluations. Missing from all of the recent pessimistic analyses of nonexperimental methods is any systematic discussion of how to choose among competing estimators. This paper explores the value of simple specification tests in selecting an appropriate nonexperimental estimator. A reanalysis of the National Supported Work Demonstration data previously analyzed by proponents of social experiments reveals that a simple testing procedure eliminates the range of nonexperimental estimators that are at variance with the experimental estimates of program impact.
Bibliographic Info
Paper provided by the Chicago Economics Research Center in its series University of Chicago - Economics Research Center, number 88-12.
Length: 27 pages
Date of creation: 1988
Date of revision:
Contact details of provider:
Postal: UNIVERSITY OF CHICAGO, ECONOMICS RESEARCH CENTER, NORC, CHICAGO ILLINOIS 60637 U.S.A.
Web page: http://economics.uchicago.edu/research.shtml
More information through EDIRC
Keywords: econometric models; evaluation; training programmes; manpower
Other versions of this item:
- James J. Heckman, 1989. "Choosing Among Alternative Nonexperimental Methods for Estimating the Impact of Social Programs: The Case of Manpower Training," NBER Working Papers 2861, National Bureau of Economic Research, Inc.
References:
- Ashenfelter, Orley & Card, David, 1985. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," The Review of Economics and Statistics, MIT Press, vol. 67(4), pages 648-60, November.
- Orley Ashenfelter & David Card, 1984. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," NBER Working Papers 1489, National Bureau of Economic Research, Inc.
- Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
- LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-20, September.
- Burt S. Barnow, 1987. "The Impact of CETA Programs on Earnings: A Review of the Literature," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 157-193.
- Gary Burtless & Larry L. Orr, 1986. "Are Classical Experiments Needed for Manpower Policy?," Journal of Human Resources, University of Wisconsin Press, vol. 21(4), pages 606-639.
This item has more than 25 citations. To prevent cluttering this page, these citations are listed on a separate page.