The Long-Term Gains from GAIN: A Re-Analysis of the Impacts of the California GAIN Program
As part of recent reforms of welfare programs in the U.S., many states and localities have refocused their Welfare-to-Work programs from an emphasis on human capital acquisition (i.e., providing basic education and vocational training) to an emphasis on "work-first" (i.e., moving welfare recipients into unsubsidized employment as quickly as possible). This change in emphasis has been motivated, in part, by results from the experimental evaluation, conducted by the Manpower Demonstration Research Corporation (MDRC), of California's Greater Avenues to Independence (GAIN) programs during the early 1990s. That evaluation found that, compared to programs in other counties that emphasized skill accumulation, the work-first program in Riverside County had larger effects on employment, earnings, and welfare receipt. In addition, the Riverside program was cheaper per recipient than the other programs. This paper reexamines the GAIN program from two complementary perspectives. First, the authors extend the earlier analysis through nine years post-randomization, the longest follow-up of any randomized training program, and find that the stronger impacts of Riverside County's work-first program tend to shrink, whereas the weaker impacts of the human capital programs in Alameda and Los Angeles Counties tend to remain constant or even grow over time. Second, the authors develop and implement methods that allow the comparison of programs implemented by random assignment in different places despite striking differences in the composition of the participant populations. On a substantive level, the authors' reexamination of the GAIN experiment leads them to conclude that although the work-first programs were more successful than the human capital accumulation programs in the early years, this relative advantage disappears in later years.
On a methodological level, the authors' results suggest that, at least in this welfare context, these methods are a promising approach both for the estimation of program effects from non-experimental data and for extrapolating program results from one location to a different location with a different population mix.
Date of creation: Mar 2001
Web page: http://www.rand.org/pubs/