
Randomization and Social Policy Evaluation Revisited

Author

  • James J. Heckman

Abstract

This paper considers the evidence on the effectiveness and limitations of randomized controlled trials in economics. I revisit my earlier paper, "Randomization and Social Policy Evaluation," and update its message. I present a brief history of randomization in economics and identify two waves of enthusiasm for the method, labeled "Two Awakenings" because of the near-religious zeal associated with both. I briefly summarize the lessons of the first wave and forecast that the same lessons will be learned in the second.
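
As context for the abstract (a minimal sketch in my own notation, not drawn from the paper): with potential outcomes $Y_1$ (participation) and $Y_0$ (nonparticipation) and a participation indicator $D \in \{0,1\}$, the naive comparison of participant and nonparticipant means decomposes as

$$E[Y_1 \mid D=1] - E[Y_0 \mid D=0] \;=\; \underbrace{E[Y_1 - Y_0 \mid D=1]}_{\text{treatment on the treated}} \;+\; \underbrace{E[Y_0 \mid D=1] - E[Y_0 \mid D=0]}_{\text{selection bias}},$$

so that random assignment, by making $D$ independent of $(Y_0, Y_1)$, drives the selection-bias term to zero. The limitations the paper revisits (for example, randomization bias, substitution and dropout bias, and heterogeneity in impacts beyond the mean, themes of several references listed below) concern cases in which this ideal is not attained in practice.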

Suggested Citation

  • James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberte:0107

    Download full text from publisher

    File URL: http://www.nber.org/papers/t0107.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Wu, De-Min, 1973. "Alternative Tests of Independence Between Stochastic Regressors and Disturbances," Econometrica, Econometric Society, vol. 41(4), pages 733-750, July.
    2. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    3. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 605-654.
    4. Richard Arnott & Joseph E. Stiglitz, 1988. "Randomization with Asymmetric Information," RAND Journal of Economics, The RAND Corporation, vol. 19(3), pages 344-362, Autumn.
    5. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 487-535.
    6. Manski, C.F., 1990. "The Selection Problem," Working papers 90-12, Wisconsin Madison - Social Systems.
7. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, Oxford University Press, vol. 131(4), pages 1795-1848.
    8. James J. Heckman, 2008. "Econometric Causality," International Statistical Review, International Statistical Institute, vol. 76(1), pages 1-27, April.
    9. Heckman, James J, 1978. "Dummy Endogenous Variables in a Simultaneous Equation System," Econometrica, Econometric Society, vol. 46(4), pages 931-959, July.
10. Hausman, Jerry A., 1978. "Specification Tests in Econometrics," Econometrica, Econometric Society, vol. 46(6), pages 1251-1271, November.
    11. Heckman, James J & Honore, Bo E, 1990. "The Empirical Content of the Roy Model," Econometrica, Econometric Society, vol. 58(5), pages 1121-1149, September.
    12. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 115(2), pages 651-694.
    13. Ashenfelter, Orley & Card, David, 1985. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," The Review of Economics and Statistics, MIT Press, vol. 67(4), pages 648-660, November.
    14. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    15. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
16. James Heckman & Lance Lochner & Christopher Taber, 1998. "Explaining Rising Wage Inequality: Explorations with a Dynamic General Equilibrium Model of Labor Earnings with Heterogeneous Agents," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 1(1), pages 1-58, January.
    17. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    18. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
19. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions: An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.
    20. Conlisk, John, 1973. "Choice of Response Functional Form in Designing Subsidy Experiments," Econometrica, Econometric Society, vol. 41(4), pages 643-656, July.
    21. James J. Heckman, 2008. "Causalidad econométrica," Monetaria, Centro de Estudios Monetarios Latinoamericanos, CEMLA, vol. 0(3), pages 291-338, julio-sep.
    22. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    2. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
3. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    4. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    5. Jones A.M & Rice N, 2009. "Econometric Evaluation of Health Policies," Health, Econometrics and Data Group (HEDG) Working Papers 09/09, HEDG, c/o Department of Economics, University of York.
    6. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.
    7. Deborah A. Cobb‐Clark & Thomas Crossley, 2003. "Econometrics for Evaluations: An Introduction to Recent Developments," The Economic Record, The Economic Society of Australia, vol. 79(247), pages 491-511, December.
    8. James J. Heckman, 2005. "Micro Data, Heterogeneity and the Evaluation of Public Policy Part 2," The American Economist, Sage Publications, vol. 49(1), pages 16-44, March.
    9. Gordon B. Dahl, 2002. "Mobility and the Return to Education: Testing a Roy Model with Multiple Markets," Econometrica, Econometric Society, vol. 70(6), pages 2367-2420, November.
    10. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    11. James Heckman & Salvador Navarro-Lozano, 2004. "Using Matching, Instrumental Variables, and Control Functions to Estimate Economic Choice Models," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 30-57, February.
    12. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    13. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    14. Rothstein, Jesse & von Wachter, Till, 2016. "Social Experiments in the Labor Market," Institute for Research on Labor and Employment, Working Paper Series qt6605k20b, Institute of Industrial Relations, UC Berkeley.
    15. Hujer, Reinhard & Wellner, Marc, 2000. "The Effects of Public Sector Sponsored Training on Individual Employment Performance in East Germany," IZA Discussion Papers 141, Institute of Labor Economics (IZA).
    16. Arthur Lewbel, 2019. "The Identification Zoo: Meanings of Identification in Econometrics," Journal of Economic Literature, American Economic Association, vol. 57(4), pages 835-903, December.
    17. Aakvik, Arild & Heckman, James J. & Vytlacil, Edward J., 2005. "Estimating treatment effects for discrete outcomes when responses to treatment vary: an application to Norwegian vocational rehabilitation programs," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 15-51.
    18. Rothstein, Jesse & von Wachter, Till, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt6605k20b, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    19. Rothstein, J & von Wachter, T, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt7957p9g6, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    20. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberte:0107. See general information about how to correct material in RePEc.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the provider. General contact details of provider: https://edirc.repec.org/data/nberrus.html.


    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.