Printed from https://ideas.repec.org/a/wly/jpamgt/v25y2006i3p523-552.html

Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?

Author

Listed:
  • David H. Greenberg

    (University of Maryland, Baltimore County)

  • Charles Michalopoulos

    (MDRC, New York City)

  • Philip K. Robins

    (University of Miami, Florida)

Abstract

This paper uses meta-analysis to investigate whether random assignment (or experimental) evaluations of voluntary government-funded training programs for the disadvantaged have produced different conclusions than nonexperimental evaluations. The data include several hundred estimates from 31 evaluations of 15 programs that operated between 1964 and 1998. The results suggest that experimental and nonexperimental evaluations yield similar conclusions about the effectiveness of training programs, but that estimates of average effects for youth and possibly men might have been larger in experimental studies. The results also suggest that variation among nonexperimental estimates of program effects is similar to variation among experimental estimates for men and youth, but not for women (for whom it seems to be larger), although small sample sizes make the estimated differences somewhat imprecise for all three groups. The policy implications of the findings are discussed. © 2006 by the Association for Public Policy Analysis and Management
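The paper's two core comparisons can be illustrated in miniature: does the average program effect differ between experimental and nonexperimental estimates, and is the variation among nonexperimental estimates larger? The sketch below uses entirely hypothetical effect estimates (not the paper's data) to show how those two quantities would be computed.

```python
import statistics

# Hypothetical annual-earnings impact estimates ($) -- illustrative only,
# NOT the estimates analyzed in Greenberg, Michalopoulos & Robins (2006).
experimental = [450, 620, 300, 510, 700, 380, 560]
nonexperimental = [400, 900, -150, 620, 1100, 250, 480]

def summarize(estimates):
    """Mean and sample variance of a set of program-effect estimates."""
    return statistics.mean(estimates), statistics.variance(estimates)

exp_mean, exp_var = summarize(experimental)
non_mean, non_var = summarize(nonexperimental)

# Question 1: do the two methods yield similar average effects?
mean_gap = exp_mean - non_mean

# Question 2: is variation among nonexperimental estimates larger?
variance_ratio = non_var / exp_var

print(f"mean gap: {mean_gap:.1f}")
print(f"variance ratio (nonexperimental / experimental): {variance_ratio:.2f}")
```

With these made-up numbers the average effects are close while the nonexperimental estimates are far more dispersed, mirroring the pattern the paper reports for women; the actual meta-analysis additionally weights estimates and accounts for sampling error, which this sketch omits.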

Suggested Citation

  • David H. Greenberg & Charles Michalopoulos & Philip K. Robins, 2006. "Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(3), pages 523-552.
  • Handle: RePEc:wly:jpamgt:v:25:y:2006:i:3:p:523-552
    DOI: 10.1002/pam.20190

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1002/pam.20190
    File Function: Link to full text; subscription required
    Download Restriction: no

    File URL: https://libkey.io/10.1002/pam.20190?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. LaLonde, Robert J., 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, vol. 589(1), pages 63-93, September.
    3. Ashenfelter, Orley C., 1978. "Estimating the Effect of Training Programs on Earnings," The Review of Economics and Statistics, MIT Press, vol. 60(1), pages 47-57, February.
    4. Peter Z. Schochet & John Burghardt & Steven Glazerman, 2000. "National Job Corps Study: The Short-Term Impacts of Job Corps on Participants, Employment, and Related Outcomes," Mathematica Policy Research Reports, Mathematica Policy Research.
    5. Burt S. Barnow, 1987. "The Impact of CETA Programs on Earnings: A Review of the Literature," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 157-193.
    6. David Greenberg & Robert Meyer & Charles Michalopoulos & Michael Wiseman, 2003. "Explaining Variation in the Effects of Welfare-To-Work Programs," Evaluation Review, vol. 27(4), pages 359-394, August.
    7. Amy Zambrowski & Anne Gordon, "undated". "Evaluation of the Minority Female Single Parent Demonstration: Fifth-Year Impacts at CET," Mathematica Policy Research Reports, Mathematica Policy Research.
    8. Daniel Friedlander & David H. Greenberg & Philip K. Robins, 1997. "Evaluating Government Training Programs for the Economically Disadvantaged," Journal of Economic Literature, American Economic Association, vol. 35(4), pages 1809-1855, December.
    9. Bassi, Laurie J., 1984. "Estimating the Effect of Training Programs with Non-Random Selection," The Review of Economics and Statistics, MIT Press, vol. 66(1), pages 36-43, February.
    10. Charles Michalopoulos & Howard S. Bloom & Carolyn J. Hill, 2004. "Can Propensity-Score Methods Match the Findings from a Random Assignment Evaluation of Mandatory Welfare-to-Work Programs?," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 156-179, February.
    11. James J. Heckman & Hidehiko Ichimura & Petra Todd, 1998. "Matching As An Econometric Evaluation Estimator," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 65(2), pages 261-294.
    12. Robert S. Gay & Michael E. Borus, 1980. "Validating Performance Indicators for Employment and Training Programs," Journal of Human Resources, University of Wisconsin Press, vol. 15(1), pages 29-48.
    13. Howard S. Bloom, 1984. "Estimating the Effect of Job-Training Programs, Using Longitudinal Data: Ashenfelter's Findings Reconsidered," Journal of Human Resources, University of Wisconsin Press, vol. 19(4), pages 544-556.
    14. Kiefer, Nicholas M., 1978. "Federally subsidized occupational training and the employment and earnings of male trainees," Journal of Econometrics, Elsevier, vol. 8(1), pages 111-125, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Katz, Lawrence & Duncan, Greg J. & Kling, Jeffrey R. & Kessler, Ronald C. & Ludwig, Jens & Sanbonmatsu, Lisa & Liebman, Jeffrey B., 2008. "What Can We Learn about Neighborhood Effects from the Moving to Opportunity Experiment?," Scholarly Articles 2766959, Harvard University Department of Economics.
    2. Anders Stenberg & Olle Westerlund, 2015. "The long-term earnings consequences of general vs. specific training of the unemployed," IZA Journal of European Labor Studies, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-26, December.
    3. Carla Haelermans & Lex Borghans, 2012. "Wage Effects of On-the-Job Training: A Meta-Analysis," British Journal of Industrial Relations, London School of Economics, vol. 50(3), pages 502-528, September.
    4. Oh, Sehun & DiNitto, Diana M. & Powers, Daniel A., 2020. "A longitudinal evaluation of government-sponsored job skills training and basic employment services among U.S. baby boomers with economic disadvantages," Evaluation and Program Planning, Elsevier, vol. 82(C).
    5. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," CID Working Papers 364, Center for International Development at Harvard University.
    6. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    7. Fredrik Andersson & Harry J. Holzer & Julia I. Lane & David Rosenblum & Jeffrey Smith, 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," NBER Working Papers 19446, National Bureau of Economic Research, Inc.
    8. Carolyn J. Heinrich, 2008. "Advancing public sector performance analysis," Applied Stochastic Models in Business and Industry, John Wiley & Sons, vol. 24(5), pages 373-389, September.
    9. Reynold V. Galope, 2016. "A Different Certification Effect of the Small Business Innovation Research (SBIR) Program," Economic Development Quarterly, , vol. 30(4), pages 371-383, November.
    10. Henrik Hansen & Ninja Ritter Klejnstrup & Ole Winckler Andersen, 2011. "A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries," IFRO Working Paper 2011/16, University of Copenhagen, Department of Food and Resource Economics.
    11. Gary King & Emmanuela Gakidou & Nirmala Ravishankar & Ryan T. Moore & Jason Lakin & Manett Vargas & Martha María Téllez-Rojo & Juan Eugenio Hernández Ávila & Mauricio Hernández Ávila & Héctor Hernánde, 2007. "A “politically robust” experimental design for public policy evaluation, with application to the Mexican Universal Health Insurance program," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 479-506.
    12. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. O'Higgins, Niall, 2001. "Youth unemployment and employment policy: a global perspective," MPRA Paper 23698, University Library of Munich, Germany.
    2. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    3. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    4. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    5. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    6. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    7. Orley Ashenfelter & David E. Bloom & Gordon B. Dahl, 2013. "Lawyers as Agents of the Devil in a Prisoner's Dilemma Game," Journal of Empirical Legal Studies, John Wiley & Sons, vol. 10(3), pages 399-423, September.
    8. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    9. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    10. Cansino Muñoz-Repiso, José Manuel & Sánchez Braza, Antonio, 2011. "Effectiveness of Public Training Programs in Reducing the Time Needed to Find a Job," Estudios de Economia Aplicada, vol. 29, April.
    11. Helena Holmlund & Olmo Silva, 2014. "Targeting Noncognitive Skills to Improve Cognitive Outcomes: Evidence from a Remedial Education Intervention," Journal of Human Capital, University of Chicago Press, vol. 8(2), pages 126-160.
    12. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    13. Barros, Ricardo Paes de, 2010. "The Impact of Social Interventions: Nonparametric Identification from Choice-Based Samples," Brazilian Review of Econometrics, Sociedade Brasileira de Econometria - SBE, vol. 30(2), December.
    14. David Card & Pablo Ibarrarán & Ferdinando Regalia & David Rosas-Shady & Yuri Soares, 2011. "The Labor Market Impacts of Youth Training in the Dominican Republic," Journal of Labor Economics, University of Chicago Press, vol. 29(2), pages 267-300.
    15. Janice Tripney & Jorge Hombrados & Mark Newman & Kimberly Hovish & Chris Brown & Katarzyna Steinka‐Fry & Eric Wilkey, 2013. "Technical and Vocational Education and Training (TVET) Interventions to Improve the Employability and Employment of Young People in Low‐ and Middle‐Income Countries: A Systematic Review," Campbell Systematic Reviews, John Wiley & Sons, vol. 9(1), pages 1-171.
    16. Robert LaLonde & Daniel Sullivan, 2010. "Vocational Training," NBER Chapters, in: Targeting Investments in Children: Fighting Poverty When Resources Are Limited, pages 323-349, National Bureau of Economic Research, Inc.
    17. Cockx, Bart & Van der Linden, Bruno & Karaa, Adel, 1998. "Active Labour Market Policies and Job Tenure," Oxford Economic Papers, Oxford University Press, vol. 50(4), pages 685-708, October.
    18. Raaum, Oddbjørn & Torp, Hege & Zhang, Tao, 2003. "Do individual programme effects exceed the costs? Norwegian evidence on long run effects of labour market training," Memorandum 15/2002, Oslo University, Department of Economics.
    19. Chabé-Ferret, Sylvain, 2017. "Should We Combine Difference In Differences with Conditioning on Pre-Treatment Outcomes?," TSE Working Papers 17-824, Toulouse School of Economics (TSE).
    20. Tommaso Nannicini, 2007. "Simulation-based sensitivity analysis for matching estimators," Stata Journal, StataCorp LP, vol. 7(3), pages 334-350, September.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:jpamgt:v:25:y:2006:i:3:p:523-552. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www3.interscience.wiley.com/journal/34787/home .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.