Building in an Evaluation Component for Active Labor Market Programs: A Practitioner's Guide
The guide outlines the main evaluation challenges associated with ALMPs and shows how to obtain rigorous impact estimates using two leading evaluation approaches. The most credible and straightforward method is a randomized design, in which a group of potential participants is randomly divided into a treatment group and a control group. Random assignment ensures that the two groups would have had similar experiences in the post-program period in the absence of the intervention; the observed post-program difference therefore yields a reliable estimate of the program impact. The second approach is a difference-in-differences design, which compares the change in outcomes between the participant group and a selected comparison group from before the program to after its completion. In general, the outcomes of the comparison group may differ from those of the participant group even in the absence of the intervention. If the difference observed prior to the program would have persisted in its absence, however, then the change in the outcome gap between the two groups yields a reliable estimate of the program impact. The guide reviews the steps in the design and implementation of ALMPs, and in the subsequent analysis of program data, that ensure a rigorous and informative impact evaluation using either technique.
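The two estimators described in the abstract can be sketched in a few lines. This is a minimal illustration with made-up summary numbers (all values below are hypothetical, not from the guide); a real evaluation would work with micro data and report standard errors.

```python
# --- Randomized design ---
# With random assignment, the simple post-program difference in mean
# outcomes (e.g., employment rates) is an unbiased impact estimate.
treatment_mean = 0.55   # hypothetical post-program mean, treatment group
control_mean = 0.45     # hypothetical post-program mean, control group
rct_estimate = treatment_mean - control_mean

# --- Difference-in-differences design ---
# Compare the change over time for participants against the change for a
# selected comparison group; the change in the gap is the impact estimate.
treated_pre, treated_post = 0.40, 0.55        # participants, before/after
comparison_pre, comparison_post = 0.45, 0.50  # comparison group, before/after

treated_change = treated_post - treated_pre
comparison_change = comparison_post - comparison_pre
did_estimate = treated_change - comparison_change

print(round(rct_estimate, 2))  # difference in means
print(round(did_estimate, 2))  # change in the outcome gap
```

Both estimates rest on the identifying assumptions stated in the abstract: comparable groups under randomization, and a stable pre-program gap (parallel trends) under difference-in-differences.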
Date of creation: Oct 2011
Contact details of provider: Postal: 1300 New York Avenue, NW, Washington, DC 20577
Web page: http://www.iadb.org/spd