
The Sensitivity of Experimental Impact Estimates: Evidence from the National JTPA Study

Author

Listed:
  • James J. Heckman
  • Jeffrey A. Smith

Abstract

The recent experimental evaluation of the U.S. Job Training Partnership Act (JTPA) program found negative effects of training on the earnings of disadvantaged male youth and no effect on the earnings of disadvantaged female youth. These findings provided justification for Congress to cut the budget of JTPA's youth component by over 80 percent. In this paper, we examine the sensitivity of the experimental impact estimates along several dimensions of construction and interpretation. We find that the statistical significance of the male youth estimates is extremely fragile and that the magnitudes of the estimates for both youth groups are sensitive to nearly all the factors we consider. In particular, accounting for experimental control group members who substitute training from other providers leads to a much more positive picture regarding the effectiveness of JTPA classroom training. Our study indicates the value of sensitivity analyses in experimental evaluations and illustrates that experimental impact estimates, like those from nonexperimental analyses, require careful interpretation if they are to provide a reliable guide to policymakers.
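
As a purely illustrative aside, the sketch below shows the kind of rescaling the abstract alludes to: dividing an experimental intention-to-treat (ITT) contrast by the treatment-control gap in training receipt, in the spirit of Bloom (1984) and of adjustments for dropout and control-group substitution. The function name and all numbers are hypothetical; this is a minimal sketch of the general idea in Python, not the estimator used in the paper.

    # Minimal sketch (hypothetical values): rescale an intention-to-treat
    # contrast by the treatment-control difference in training receipt.
    def adjusted_impact(mean_treat, mean_control, enroll_treat, enroll_control):
        """Return the ITT contrast and a per-trainee rescaling of it."""
        itt = mean_treat - mean_control             # raw experimental contrast
        takeup_gap = enroll_treat - enroll_control  # net difference in training receipt
        if takeup_gap <= 0:
            raise ValueError("treatment group must receive more training than controls")
        return itt, itt / takeup_gap

    # Hypothetical illustration: substitution by controls shrinks the take-up
    # gap, so the same ITT contrast implies a larger impact per trainee.
    itt, per_trainee = adjusted_impact(
        mean_treat=6500.0,      # mean post-program earnings, treatment group
        mean_control=6400.0,    # mean post-program earnings, control group
        enroll_treat=0.65,      # share of treatment group receiving training
        enroll_control=0.30,    # share of controls trained by other providers
    )
    print(f"ITT: {itt:.0f}; impact per trainee: {per_trainee:.0f}")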

Suggested Citation

  • James J. Heckman & Jeffrey A. Smith, 1997. "The Sensitivity of Experimental Impact Estimates: Evidence from the National JTPA Study," NBER Working Papers 6105, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:6105
    Note: LS

    Download full text from publisher

    File URL: http://www.nber.org/papers/w6105.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Katherine P. Dickinson & Terry R. Johnson & Richard W. West, 1987. "An Analysis of the Sensitivity of Quasi-Experimental Net Impact Estimates of CETA Programs," Evaluation Review, vol. 11(4), pages 452-472, August.
    2. James Heckman & Jeffrey Smith & Christopher Taber, 1994. "Accounting for Dropouts in Evaluations of Social Experiments," NBER Technical Working Papers 0166, National Bureau of Economic Research, Inc.
    3. Howard S. Bloom, 1984. "Accounting for No-Shows in Experimental Evaluation Designs," Evaluation Review, vol. 8(2), pages 225-246, April.
    4. Bassi, Laurie J, 1984. "Estimating the Effect of Training Programs with Non-Random Selection," The Review of Economics and Statistics, MIT Press, vol. 66(1), pages 36-43, February.
    5. Robert J. LaLonde, 1995. "The Promise of Public Sector-Sponsored Training Programs," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 149-168, Spring.
    6. repec:mpr:mprres:2737 is not listed on IDEAS
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Markus Frölich & Michael Lechner, 2004. "Regional treatment intensity as an instrument for the evaluation of labour market policies," University of St. Gallen Department of Economics working paper series 2004-08, Department of Economics, University of St. Gallen.
    2. Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute for the Study of Labor (IZA).
    3. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    4. Rajeev Dehejia, 2000. "Was There a Riverside Miracle? A Framework for Evaluating Multi-Site Programs," NBER Working Papers 7844, National Bureau of Economic Research, Inc.
    5. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    6. Carolyn Heinrich & Jeffrey Wenger, 2002. "The Economic Contributions of James J. Heckman and Daniel L. McFadden," Review of Political Economy, Taylor & Francis Journals, vol. 14(1), pages 69-89.
    7. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    8. Mitali Das, 2000. "Instrumental Variables Estimation of Nonparametric Models with Discrete Endogenous Regressors," Econometric Society World Congress 2000 Contributed Papers 1008, Econometric Society.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • H43 - Public Economics - - Publicly Provided Goods - - - Project Evaluation; Social Discount Rate


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:6105. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Joanne Lustig. General contact details of provider: http://edirc.repec.org/data/nberrus.html.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.