
What Works at Scale? A Framework to Scale Up Workforce Development Programs

Author

  • Alexander Ruder

Abstract

Workforce development policymakers have access to a growing catalog of training programs evaluated with rigorous randomized controlled trials. This evidence base identifies programs that work in specific geographic and temporal contexts but may not necessarily work in other contexts or at a scale sufficient to meet regional workforce needs. The author examines a sample of recent randomized controlled trials of workforce development programs and reports the extent to which this body of evidence informs policymakers about what works at scale. The author finds that most programs are implemented at a small scale, use nonrandom samples from the population of interest, and are concentrated in the most populous urban areas and U.S. states. The author then discusses a method to help state and local policymakers, technical colleges, training providers, and other workforce development organizations adopt evidence-based policies in their local contexts and at scale. The two-step method consists of a check on the assumptions in a program's theory of change and an assessment of how sensitive projected results are to violations of those assumptions, such as program completion rates. The author provides an example of the method applied to a hypothetical metropolitan area that seeks to adopt an evidence-based training program for youth with barriers to employment.
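
The sensitivity assessment in the abstract's two-step method lends itself to a simple illustration. The Python sketch below is not the paper's implementation, and the effect size, completion rates, and population figure are hypothetical placeholders; it merely shows the kind of calculation involved, projecting how many additional participants a program would place into employment as the assumed completion rate falls below the rate observed in an original trial.

    # Hypothetical sensitivity check: how much does a projected result
    # degrade if the completion rate at scale falls below the rate
    # observed in the original trial? All figures are illustrative
    # assumptions, not values from the paper.

    trial_effect = 0.12       # assumed employment-rate gain per program completer (hypothetical)
    target_population = 2000  # assumed number of youth served in the metro area (hypothetical)

    print(f"{'completion rate':>16} {'projected gain':>15} {'extra employed':>15}")
    for completion_rate in (0.80, 0.70, 0.60, 0.50, 0.40):
        # Scale the per-completer effect by the share of enrollees who finish.
        projected_gain = trial_effect * completion_rate
        extra_employed = projected_gain * target_population
        print(f"{completion_rate:>16.0%} {projected_gain:>15.3f} {extra_employed:>15.0f}")

Under these assumed numbers, a drop in the completion rate from 80 percent to 40 percent halves the projected count of additional employed participants, the kind of result the method is designed to surface before a program is adopted at scale.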

Suggested Citation

  • Alexander Ruder, 2019. "What Works at Scale? A Framework to Scale Up Workforce Development Programs," FRB Atlanta Community and Economic Development Discussion Paper 2019-1, Federal Reserve Bank of Atlanta.
  • Handle: RePEc:fip:fedacd:2019-01
    DOI: 10.29338/dp2019-01

    Download full text from publisher

    File URL: https://www.frbatlanta.org/-/media/documents/community-development/publications/discussion-papers/2019/01-what-works-at-scale-a-framework-to-scale-up-workforce-development-programs-2019-06-21.pdf
    File Function: Full text
    Download Restriction: no


    References listed on IDEAS

    1. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    2. Robert B. Olsen & Stephen H. Bell & Austin Nichols, 2018. "Using Preferred Applicant Random Assignment (PARA) to Reduce Randomization Bias in Randomized Trials of Discretionary Programs," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 37(1), pages 167-180, January.
    3. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    4. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    5. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    6. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    7. Amanda E. Kowalski, 2018. "How to Examine External Validity Within an Experiment," NBER Working Papers 24834, National Bureau of Economic Research, Inc.
    8. Angus Deaton & Nancy Cartwright, 2016. "Understanding and Misunderstanding Randomized Controlled Trials," Working Paper, Princeton University, Woodrow Wilson School of Public and International Affairs, Research Program in Development Studies.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    2. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    3. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Corduneanu-Huci, Cristina & Dorsch, Michael T. & Maarek, Paul, 2021. "The politics of experimentation: Political competition and randomized controlled trials," Journal of Comparative Economics, Elsevier, vol. 49(1), pages 1-21.
    6. Andrew Dustan & Stanislao Maldonado & Juan Manuel Hernandez-Agramonte, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Working Papers 136, Peruvian Economic Association.
    7. Omar Al‐Ubaydli & John A. List & Dana Suskind, 2020. "2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 61(4), pages 1387-1409, November.
    8. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    9. Faraz Usmani & Marc Jeuland & Subhrendu K. Pattanayak, 2018. "NGOs and the effectiveness of interventions," WIDER Working Paper Series wp-2018-59, World Institute for Development Economic Research (UNU-WIDER).
    10. Sutherland, Alex & Ariel, Barak & Farrar, William & De Anda, Randy, 2017. "Post-experimental follow-ups—Fade-out versus persistence effects: The Rialto police body-worn camera experiment four years on," Journal of Criminal Justice, Elsevier, vol. 53(C), pages 110-116.
    11. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    12. Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
    13. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    14. Donald Moynihan, 2018. "A great schism approaching? Towards a micro and macro public administration," Journal of Behavioral Public Administration, Center for Experimental and Behavioral Public Administration, vol. 1(1).
    15. Williams, Martin J., 2020. "Beyond ‘context matters’: Context and external validity in impact evaluation," World Development, Elsevier, vol. 127(C).
    16. Andrew Dustan & Juan Manuel Hernandez-Agramonte & Stanislao Maldonado, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Natural Field Experiments 00664, The Field Experiments Website.
    17. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    18. Andor, Mark A. & Gerster, Andreas & Peters, Jörg & Schmidt, Christoph M., 2020. "Social Norms and Energy Conservation Beyond the US," Journal of Environmental Economics and Management, Elsevier, vol. 103(C).
    19. Potash, Eric, 2018. "Randomization bias in field trials to evaluate targeting methods," Economics Letters, Elsevier, vol. 167(C), pages 131-135.
    20. Dustan, Andrew & Hernandez-Agramonte, Juan Manuel & Maldonado, Stanislao, 2023. "Motivating bureaucrats with behavioral insights when state capacity is weak: Evidence from large-scale field experiments in Peru," Journal of Development Economics, Elsevier, vol. 160(C).

    More about this item

    Keywords

    workforce development; human capital; skills; provision and effects of welfare programs

    JEL classification:

    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs
    • J08 - Labor and Demographic Economics - - General - - - Labor Economics Policies
    • J24 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Human Capital; Skills; Occupational Choice; Labor Productivity

