
The case for evaluating training programs with randomized trials

Author

Listed:
  • Ashenfelter, Orley

Abstract

This brief paper presents the reasons that I have come to conclude that the evaluation of the economic benefits of training programs will be greatly enhanced by the use of classical experimental methods. In particular, I am convinced that some of these training programs should be operated so that control and experimental groups are selected by random assignment (randomized trials). It follows that a simple comparison of earnings, employment, and other outcomes between control and experimental groups subsequent to participation in the experimental program will provide a simple and credible estimate of program success (or failure). The principal reason why randomized trials should be used in this field is that too much of the non-experimental estimation of the effects of training programs seems dependent on elements of model specification that cannot be subjected to powerful statistical tests. Moreover, these specification tests are merely necessary and not sufficient for the acceptability of a particular non-experimental estimation method, as an extensive example due to LaLonde demonstrates.
(This abstract was borrowed from another version of this item.)
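The estimator the abstract describes is just the difference in mean outcomes between randomly assigned groups. The following is a minimal sketch of that comparison; the earnings figures, sample sizes, and the `difference_in_means` helper are hypothetical illustrations, not taken from the paper.

```python
import random
import statistics

def difference_in_means(treated, control):
    """Estimate the average program effect as the simple difference in
    mean outcomes between the experimental and control groups, with a
    standard error assuming the two samples are independent."""
    diff = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(treated) / len(treated)
          + statistics.variance(control) / len(control)) ** 0.5
    return diff, se

# Hypothetical illustration: simulated post-program earnings where the
# true program effect is 1000, under random assignment of 500 per group.
random.seed(0)
control = [random.gauss(15000, 3000) for _ in range(500)]
treated = [random.gauss(16000, 3000) for _ in range(500)]
effect, se = difference_in_means(treated, control)
```

Because assignment is random, the two groups differ only by chance, so no model specification is needed: `effect` is a credible estimate of the program's impact, up to sampling error of roughly `se`. This is precisely the simplicity the abstract contrasts with specification-dependent non-experimental methods.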

Suggested Citation

  • Ashenfelter, Orley, 1987. "The case for evaluating training programs with randomized trials," Economics of Education Review, Elsevier, vol. 6(4), pages 333-338, August.
  • Handle: RePEc:eee:ecoedu:v:6:y:1987:i:4:p:333-338

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/0272-7757(87)90016-1
    Download Restriction: Full text for ScienceDirect subscribers only

As access to this document is restricted, you may want to search for a different version of it.


    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    2. Schneider Hilmar & Zimmermann Klaus F. & Uhlendorff Arne, 2013. "Ökonometrie vs. Projektdesign: Lehren aus der Evaluation eines Modellprojekts zur Umsetzung des Workfare-Konzepts," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 233(1), pages 65-85, February.
    3. David Card & Jochen Kluve & Andrea Weber, 2010. "Active Labour Market Policy Evaluations: A Meta-Analysis," Economic Journal, Royal Economic Society, vol. 120(548), pages 452-477, November.
    4. David Card & Jochen Kluve & Andrea Weber, 2009. "Active Labor Market Policy Evaluations – A Meta-analysis," Ruhr Economic Papers 0086, Rheinisch-Westfälisches Institut für Wirtschaftsforschung, Ruhr-Universität Bochum, Universität Dortmund, Universität Duisburg-Essen.
    5. Schneider, Hilmar & Uhlendorff, Arne & Zimmermann, Klaus F., 2010. "Mit Workfare aus der Sozialhilfe? Lehren aus einem Modellprojekt," IZA Standpunkte 33, Institute for the Study of Labor (IZA).
    6. Aistov, Andrey & Aleksandrova, Ekaterina, 2016. "Time-distributed difference-in-differences approach: The case of wage returns to training," Applied Econometrics, Publishing House "SINERGIA PRESS", vol. 43, pages 5-28.
    7. Arni, Patrick, 2012. "Kausale Evaluation von Pilotprojekten: Die Nutzung von Randomisierung in der Praxis," IZA Standpunkte 52, Institute for the Study of Labor (IZA).
    8. Charles Bellemare & Steeve Marchand & Bruce Shearer, 2016. "Structural Estimation and Experiments: Applications to Contracting Models," Journal of Institutional and Theoretical Economics (JITE), Mohr Siebeck, Tübingen, vol. 172(2), pages 342-363, June.
    9. Deborah Peikes & Sean Orzol & Lorenzo Moreno & Nora Paxton, "undated". "State Partnership Initiative: Selection of Comparison Groups for the Evaluation and Selected Impact Estimates," Mathematica Policy Research Reports f8760335b9ab4a39bdf2c3533, Mathematica Policy Research.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ecoedu:v:6:y:1987:i:4:p:333-338. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Dana Niculescu. General contact details of provider: http://www.elsevier.com/locate/econedurev.


    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.