
Randomization and Social Policy Evaluation Revisited

Author

  • James J. Heckman (University of Chicago)

Abstract

This paper examines the case for randomized controlled trials in economics. I revisit my previous paper, "Randomization and Social Policy Evaluation," and update its message. I present a brief summary of the history of randomization in economics. I identify two waves of enthusiasm for the method as "Two Awakenings" because of the near-religious zeal associated with each wave. The First Wave substantially contributed to the development of microeconometrics because of the flawed nature of the experimental evidence it produced. The Second Wave has improved experimental designs to avoid some of the technical statistical issues identified by econometricians in the wake of the First Wave. However, the deep conceptual issues about the parameters being estimated, their economic interpretation, and the policy relevance of the experimental results have not been addressed in the Second Wave.
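The distinction drawn here, between technical statistical problems that better designs can fix and the conceptual question of which parameter an experiment actually identifies, can be made concrete in the standard potential-outcomes notation of the evaluation literature cited below. The following is an editorial sketch under common simplifying assumptions, not an excerpt from the paper.

% Editorial sketch (assumption: standard potential-outcomes notation; not taken from the paper itself).
% Each person has a treated outcome Y_1 and an untreated outcome Y_0; D = 1 indicates participation.
\[
  Y = D\,Y_1 + (1 - D)\,Y_0 .
\]
% Random assignment among would-be participants identifies the mean impact on participants,
\[
  \Delta_{\mathrm{TT}} = E[\,Y_1 - Y_0 \mid D = 1\,],
\]
% provided the randomization does not change who applies or how they behave.
% Many policy questions, such as extending a program to new populations, instead require
% parameters like the population average effect,
\[
  \Delta_{\mathrm{ATE}} = E[\,Y_1 - Y_0\,],
\]
% which generally differs from \(\Delta_{\mathrm{TT}}\) when agents select into the program
% on their anticipated gains \(Y_1 - Y_0\).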

Suggested Citation

  • James J. Heckman, 2020. "Randomization and Social Policy Evaluation Revisited," Working Papers 2020-001, Human Capital and Economic Opportunity Working Group.
  • Handle: RePEc:hka:wpaper:2020-001
    Note: ECI

    Download full text from publisher

    File URL: http://humcap.uchicago.edu/RePEc/hka/wpaper/Heckman_2019_randomization-social-policy-evaluation-revisited.pdf
    File Function: First version, December 9, 2019
    Download Restriction: no

    File URL: http://humcap.uchicago.edu/RePEc/hka/wpaper/Heckman_2020_randomization-soc-pol-revisited.pdf
    File Function: Second version, December 12, 2019
    Download Restriction: no


    References listed on IDEAS

    1. Wu, De-Min, 1973. "Alternative Tests of Independence Between Stochastic Regressors and Disturbances," Econometrica, Econometric Society, vol. 41(4), pages 733-750, July.
    2. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    3. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 605-654.
    4. Richard Arnott & Joseph E. Stiglitz, 1988. "Randomization with Asymmetric Information," RAND Journal of Economics, The RAND Corporation, vol. 19(3), pages 344-362, Autumn.
    5. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 487-535.
    6. Manski, C.F., 1990. "The Selection Problem," Working papers 90-12, Wisconsin Madison - Social Systems.
    7. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 131(4), pages 1795-1848.
    8. James J. Heckman, 2008. "Econometric Causality," International Statistical Review, International Statistical Institute, vol. 76(1), pages 1-27, April.
    9. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    10. Heckman, James J, 1978. "Dummy Endogenous Variables in a Simultaneous Equation System," Econometrica, Econometric Society, vol. 46(4), pages 931-959, July.
    11. Hausman, Jerry, 2015. "Specification tests in econometrics," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 38(2), pages 112-134.
    12. Heckman, James J & Honore, Bo E, 1990. "The Empirical Content of the Roy Model," Econometrica, Econometric Society, vol. 58(5), pages 1121-1149, September.
    13. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(2), pages 651-694.
    14. Heckman, James J. & Vytlacil, Edward J., 2007. "Econometric Evaluation of Social Programs, Part I: Causal Models, Structural Models and Econometric Policy Evaluation," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 70, Elsevier.
    15. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    16. Ashenfelter, Orley & Card, David, 1985. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," The Review of Economics and Statistics, MIT Press, vol. 67(4), pages 648-660, November.
    17. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    18. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
    19. James Heckman & Lance Lochner & Christopher Taber, 1998. "Explaining Rising Wage Inequality: Explanations With A Dynamic General Equilibrium Model of Labor Earnings With Heterogeneous Agents," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 1(1), pages 1-58, January.
    20. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    21. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    22. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions : An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.
    23. Conlisk, John, 1973. "Choice of Response Functional Form in Designing Subsidy Experiments," Econometrica, Econometric Society, vol. 41(4), pages 643-656, July.
    24. James J. Heckman, 2008. "Causalidad econométrica," Monetaria, CEMLA, vol. 0(3), pages 291-338, julio-sep.
    25. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    2. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    3. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    4. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    5. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    6. Jones, A.M. & Rice, N., 2009. "Econometric Evaluation of Health Policies," Health, Econometrics and Data Group (HEDG) Working Papers 09/09, HEDG, c/o Department of Economics, University of York.
    7. James J. Heckman, 2005. "Micro Data, Heterogeneity and the Evaluation of Public Policy Part 2," The American Economist, Sage Publications, vol. 49(1), pages 16-44, March.
    8. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    9. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    10. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik: Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies: Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    11. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.
    12. Deborah A. Cobb‐Clark & Thomas Crossley, 2003. "Econometrics for Evaluations: An Introduction to Recent Developments," The Economic Record, The Economic Society of Australia, vol. 79(247), pages 491-511, December.
    13. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    14. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    15. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik: Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies: Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    16. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    17. Heckman, James, 2001. "Accounting for Heterogeneity, Diversity and General Equilibrium in Evaluating Social Programmes," Economic Journal, Royal Economic Society, vol. 111(475), pages 654-699, November.
    18. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    19. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    20. Rothstein, Jesse & Von Wachter, Till, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt7957p9g6, Department of Economics, Institute for Business and Economic Research, UC Berkeley.

    More about this item

    Keywords

    field experiments; randomized controlled trials;

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hka:wpaper:2020-001. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Jennifer Pachon (email available below). General contact details of provider: https://edirc.repec.org/data/mfichus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.
