Printed from https://ideas.repec.org/a/wly/jpamgt/v25y2006i3p523-552.html

Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?

Author

Listed:
  • David H. Greenberg

    (University of Maryland, Baltimore County)

  • Charles Michalopoulos

    (MDRC, New York City)

  • Philip K. Robins

    (University of Miami, Florida)

Abstract

This paper uses meta-analysis to investigate whether random assignment (experimental) evaluations of voluntary government-funded training programs for the disadvantaged have produced different conclusions than nonexperimental evaluations. The data comprise several hundred estimates from 31 evaluations of 15 programs that operated between 1964 and 1998. The results suggest that experimental and nonexperimental evaluations yield similar conclusions about the effectiveness of training programs, but that estimates of average effects for youth, and possibly for men, may have been larger in experimental studies. The results also suggest that variation among nonexperimental estimates of program effects is similar to variation among experimental estimates for men and youth, but not for women (for whom it appears to be larger), although small sample sizes make the estimated differences somewhat imprecise for all three groups. The policy implications of the findings are discussed. © 2006 by the Association for Public Policy Analysis and Management
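The two comparisons the abstract describes — whether the designs give similar average effects, and whether nonexperimental estimates vary more — can be sketched in a few lines. The numbers below are invented for illustration only; they are not estimates from the paper.

```python
import statistics

# Hypothetical effect estimates (e.g., annual earnings gains in dollars)
# from the two evaluation designs. Invented numbers, not from the paper.
experimental = [850, 1200, 400, 950, 700, 1100]
nonexperimental = [900, 300, 1500, 650, 1250, 200]

def summarize(estimates):
    """Return the mean and sample variance of a list of effect estimates."""
    return statistics.mean(estimates), statistics.variance(estimates)

exp_mean, exp_var = summarize(experimental)
nonexp_mean, nonexp_var = summarize(nonexperimental)

# 1) Do the two designs give similar average effects?
mean_gap = exp_mean - nonexp_mean
# 2) Is variation among nonexperimental estimates larger (ratio > 1)?
variance_ratio = nonexp_var / exp_var

print(f"mean gap: {mean_gap:.0f}, variance ratio: {variance_ratio:.2f}")
# prints: mean gap: 67, variance ratio: 3.19
```

With these made-up inputs the average effects are close while the nonexperimental estimates are roughly three times as dispersed — the pattern the paper reports for women, though the paper's actual comparisons use meta-analytic methods on several hundred estimates.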

Suggested Citation

  • David H. Greenberg & Charles Michalopoulos & Philip K. Robins, 2006. "Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(3), pages 523-552.
  • Handle: RePEc:wly:jpamgt:v:25:y:2006:i:3:p:523-552
    DOI: 10.1002/pam.20190

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1002/pam.20190
    File Function: Link to full text; subscription required
    Download Restriction: no

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. repec:mpr:mprres:3694 is not listed on IDEAS
    3. Steven Glazerman & Dan M. Levy & David Myers, undated. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," Mathematica Policy Research Reports 7c8bd68ac8db47caa57c70ee1, Mathematica Policy Research.
    4. Ashenfelter, Orley C, 1978. "Estimating the Effect of Training Programs on Earnings," The Review of Economics and Statistics, MIT Press, vol. 60(1), pages 47-57, February.
    5. Peter Z. Schochet & John Burghardt & Steven Glazerman, 2000. "National Job Corps Study: The Short-Term Impacts of Job Corps on Participants, Employment, and Related Outcomes," Mathematica Policy Research Reports 547380e5101c4fd88dd00bc4e, Mathematica Policy Research.
    6. Burt S. Barnow, 1987. "The Impact of CETA Programs on Earnings: A Review of the Literature," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 157-193.
    7. Daniel Friedlander & David H. Greenberg & Philip K. Robins, 1997. "Evaluating Government Training Programs for the Economically Disadvantaged," Journal of Economic Literature, American Economic Association, vol. 35(4), pages 1809-1855, December.
    8. Bassi, Laurie J, 1984. "Estimating the Effect of Training Programs with Non-Random Selection," The Review of Economics and Statistics, MIT Press, vol. 66(1), pages 36-43, February.
    9. Charles Michalopoulos & Howard S. Bloom & Carolyn J. Hill, 2004. "Can Propensity-Score Methods Match the Findings from a Random Assignment Evaluation of Mandatory Welfare-to-Work Programs?," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 156-179, February.
    10. James J. Heckman & Hidehiko Ichimura & Petra Todd, 1998. "Matching As An Econometric Evaluation Estimator," Review of Economic Studies, Oxford University Press, vol. 65(2), pages 261-294.
    11. Robert S. Gay & Michael E. Borus, 1980. "Validating Performance Indicators for Employment and Training Programs," Journal of Human Resources, University of Wisconsin Press, vol. 15(1), pages 29-48.
    12. Howard S. Bloom, 1984. "Estimating the Effect of Job-Training Programs, Using Longitudinal Data: Ashenfelter's Findings Reconsidered," Journal of Human Resources, University of Wisconsin Press, vol. 19(4), pages 544-556.
    13. Kiefer, Nicholas M., 1978. "Federally subsidized occupational training and the employment and earnings of male trainees," Journal of Econometrics, Elsevier, vol. 8(1), pages 111-125, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Henrik Hansen & Ninja Ritter Klejnstrup & Ole Winckler Andersen, 2011. "A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries," IFRO Working Paper 2011/16, University of Copenhagen, Department of Food and Resource Economics.
    2. Gary King & Emmanuela Gakidou & Nirmala Ravishankar & Ryan T. Moore & Jason Lakin & Manett Vargas & Martha María Téllez-Rojo & Juan Eugenio Hernández Ávila & Mauricio Hernández Ávila & Héctor Hernánde, 2007. "A “politically robust” experimental design for public policy evaluation, with application to the Mexican Universal Health Insurance program," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 479-506.
    3. Anders Stenberg & Olle Westerlund, 2015. "The long-term earnings consequences of general vs. specific training of the unemployed," IZA Journal of European Labor Studies, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-26, December.
    4. Carla Haelermans & Lex Borghans, 2012. "Wage Effects of On-the-Job Training: A Meta-Analysis," British Journal of Industrial Relations, London School of Economics, vol. 50(3), pages 502-528, September.
    5. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," CID Working Papers 364, Center for International Development at Harvard University.
    6. Andersson, Fredrik & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    7. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    8. Fredrik Andersson & Harry J. Holzer & Julia I. Lane & David Rosenblum & Jeffrey Andrew Smith, 2016. "Does Federally-Funded Job Training Work? Non-experimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," CESifo Working Paper Series 6071, CESifo Group Munich.
    9. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:jpamgt:v:25:y:2006:i:3:p:523-552. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Wiley Content Delivery. General contact details of provider: http://www3.interscience.wiley.com/journal/34787/home.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.