
Assessing the Case for Social Experiments

Author

Listed:
  • James J. Heckman
  • Jeffrey A. Smith

Abstract

This paper analyzes the method of social experiments. The assumptions that justify the experimental method are exposited. Parameters of interest in evaluating social programs are discussed. The authors show how experiments sometimes serve as instrumental variables to identify program impacts. The most favorable case for experiments ignores variability across persons in response to treatments received and assumes that mean impacts of a program are the main object of interest in conducting an evaluation. Experiments do not identify the distribution of program gains unless additional assumptions are maintained. Evidence on the validity of the assumptions used to justify social experiments is presented.
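
The abstract's central identification point can be illustrated with a short simulation (a minimal sketch, not taken from the article; all parameter values below are assumed purely for illustration). Randomized assignment reveals the marginal distributions of treated and untreated outcomes, so it identifies the mean impact; but two settings with identical marginals, and hence identical experimental data, can imply very different distributions of individual gains.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    # Illustrative (assumed) marginals for potential outcomes:
    # untreated Y0 ~ N(10, 2^2), treated Y1 ~ N(11, 2^2), so the true mean impact is 1.
    mu0, mu1, sd = 10.0, 11.0, 2.0

    # World A: every person gains exactly 1 (Y1 and Y0 move together).
    u = rng.normal(size=n)
    y0_a, y1_a = mu0 + sd * u, mu1 + sd * u

    # World B: Y1 and Y0 are independent draws with the same marginals.
    y0_b = mu0 + sd * rng.normal(size=n)
    y1_b = mu1 + sd * rng.normal(size=n)

    for label, y0, y1 in [("common gain", y0_a, y1_a), ("independent", y0_b, y1_b)]:
        d = rng.integers(0, 2, size=n).astype(bool)      # randomized assignment
        mean_impact = y1[d].mean() - y0[~d].mean()       # identified by the experiment
        share_gaining = (y1 > y0).mean()                 # requires the joint distribution
        print(f"{label:12s} mean impact ~ {mean_impact:4.2f}, "
              f"share with positive gain ~ {share_gaining:4.2f}")

Both worlds yield an experimental mean-impact estimate near 1, yet the share of participants who actually gain is 1.00 in the first world and roughly 0.64 in the second. The experiment alone cannot distinguish them, which is the sense in which the distribution of program gains is not identified without additional assumptions.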

Suggested Citation

  • James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
  • Handle: RePEc:aea:jecper:v:9:y:1995:i:2:p:85-110
    Note: DOI: 10.1257/jep.9.2.85

    Download full text from publisher

    File URL: http://www.aeaweb.org/articles.php?doi=10.1257/jep.9.2.85
    Download Restriction: no

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. Angrist, J.D. & Imbens, G.W., 1991. "Sources of Identifying Information in Evaluation Models," Harvard Institute of Economic Research Working Papers 1568, Harvard - Institute of Economic Research.
    3. James Heckman & Jeffrey Smith & Christopher Taber, 1994. "Accounting for Dropouts in Evaluations of Social Experiments," NBER Technical Working Papers 0166, National Bureau of Economic Research, Inc.
    4. Kydland, Finn E & Prescott, Edward C, 1991. "The Econometrics of the General Equilibrium Approach to Business Cycles," Scandinavian Journal of Economics, Wiley Blackwell, vol. 93(2), pages 161-178.
    5. Ashenfelter, Orley C, 1978. "Estimating the Effect of Training Programs on Earnings," The Review of Economics and Statistics, MIT Press, vol. 60(1), pages 47-57, February.
    6. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
    7. Gary Burtless & Larry L. Orr, 1986. "Are Classical Experiments Needed for Manpower Policy," Journal of Human Resources, University of Wisconsin Press, vol. 21(4), pages 606-639.
    8. Heckman, James J, 1990. "Varieties of Selection Bias," American Economic Review, American Economic Association, vol. 80(2), pages 313-318, May.
    9. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    2. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    3. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    4. Carlos A. Flores & Oscar A. Mitnik, 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," Working Papers 2010-10, University of Miami, Department of Economics.
    5. Battistin, Erich & Rettore, Enrico, 2008. "Ineligibles and eligible non-participants as a double comparison group in regression-discontinuity designs," Journal of Econometrics, Elsevier, vol. 142(2), pages 715-730, February.
    6. Erich Battistin & Enrico Rettore, 2003. "Another look at the regression discontinuity design," CeMMAP working papers CWP01/03, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    7. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.
    8. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    9. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    10. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    11. Ichimura, Hidehiko & Todd, Petra E., 2007. "Implementing Nonparametric and Semiparametric Estimators," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 74, Elsevier.
    12. Timothy J. Bartik, 1999. "Federal Policy Toward State and Local Economic Development in the 1990s," Book chapters authored by Upjohn Institute researchers, in: RD Norton (ed.), The Millennial City: Classic Readings on U.S. Urban Policy, volume 12, pages 235-251, W.E. Upjohn Institute for Employment Research.
    13. Hidehiko Ichimura & Christopher R. Taber, 2000. "Direct Estimation of Policy Impacts," NBER Technical Working Papers 0254, National Bureau of Economic Research, Inc.
    14. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    15. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    16. Regner, Hakan, 2002. "A nonexperimental evaluation of training programs for the unemployed in Sweden," Labour Economics, Elsevier, vol. 9(2), pages 187-206, April.
    17. Metcalf, Charles E., 1997. "The Advantages of Experimental Designs for Evaluating Sex Education Programs," Children and Youth Services Review, Elsevier, vol. 19(7), pages 507-523, November.
    18. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    19. Dehejia, Rajeev, 2015. "Experimental and Non-Experimental Methods in Development Economics: A Porous Dialectic," Journal of Globalization and Development, De Gruyter, vol. 6(1), pages 47-69, June.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:aea:jecper:v:9:y:1995:i:2:p:85-110. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Michael P. Albert (email available below). General contact details of provider: https://edirc.repec.org/data/aeaaaea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.