
Predicting the Efficacy of Future Training Programs Using Past Experiences


  • V. Joseph Hotz
  • Guido W. Imbens
  • Julie H. Mortimer


We investigate the problem of predicting the average effect of a new training program using experiences with previous implementations. There are two principal complications in doing so. First, the population in which the new program will be implemented may differ from the population in which the old program was implemented. Second, the two programs may differ in the mix of their components. With sufficient detail on the characteristics of the two populations and sufficient overlap in their distributions, one may be able to adjust for differences due to the first complication. Dealing with the second difficulty requires data on the exact treatments the individuals received. However, even when the mix of components differs across training programs, comparisons of control units in the two populations, who were excluded from participating in any of the programs, should not be affected. To investigate the empirical importance of these issues, we compare four job training programs implemented in the mid-1980s in different parts of the U.S. We find that adjusting for pre-training earnings and individual characteristics removes most of the differences between control units, but that even after such adjustments, post-training earnings for trainees are not comparable. We surmise that differences in treatment components across the training programs are the likely cause, and that more detail on the specific services provided by these programs is necessary to predict the effects of future programs. We also conclude that, given such effect heterogeneity, it is essential that even experimental evaluations of training programs record pre-training earnings and individual characteristics, in order to render the extrapolation of the results to different locations more credible.
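The prediction exercise described in the abstract can be illustrated with a minimal sketch. The code below is illustrative only and is not the authors' implementation: the simulated data, variable names, and linear specifications are assumptions. It (i) fits a model of control-group outcomes on pre-training earnings and an individual characteristic at the old program's site, (ii) notes where the comparability check for control units at a new site would go, and (iii) averages estimated conditional treatment effects over the new site's covariate distribution to predict the new program's average effect.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def simulate_covariates(n, earnings_shift):
    # Columns: pre-training earnings and age (illustrative characteristics).
    return np.column_stack([
        rng.normal(10.0 + earnings_shift, 3.0, n),
        rng.normal(30.0, 8.0, n),
    ])

n = 500
X_old_controls = simulate_covariates(n, 0.0)   # controls at the old program's site
X_old_trainees = simulate_covariates(n, 0.0)   # trainees at the old program's site
X_new = simulate_covariates(n, 2.0)            # target population for the new program

# Control outcomes depend only on covariates; trainee outcomes add a
# heterogeneous effect that varies with pre-training earnings.
y_controls = 5 + 0.8 * X_old_controls[:, 0] + 0.1 * X_old_controls[:, 1] + rng.normal(0, 1, n)
effect = 2.0 + 0.3 * (X_old_trainees[:, 0] - 10.0)
y_trainees = (5 + 0.8 * X_old_trainees[:, 0] + 0.1 * X_old_trainees[:, 1]
              + effect + rng.normal(0, 1, n))

# Step 1: model control outcomes given pre-training earnings and characteristics.
control_model = LinearRegression().fit(X_old_controls, y_controls)

# Step 2 (comparability check): given observed control outcomes at the new site,
# a small gap between them and control_model.predict(X_new) after adjustment is
# the precondition for extrapolating across locations.

# Step 3: estimate the conditional effect and average it over the new site's
# covariate distribution to predict the new program's average effect.
trainee_model = LinearRegression().fit(X_old_trainees, y_trainees)
predicted_effect = trainee_model.predict(X_new) - control_model.predict(X_new)
print(f"Predicted average effect at the new site: {predicted_effect.mean():.2f}")

As in the abstract, this kind of adjustment requires sufficient overlap in the covariate distributions of the two populations; differences in program components would appear as a residual gap that covariates cannot explain.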

Suggested Citation

  • V. Joseph Hotz & Guido W. Imbens & Julie H. Mortimer, 1999. "Predicting the Efficacy of Future Training Programs Using Past Experiences," NBER Technical Working Papers 0238, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberte:0238




