Printed from https://ideas.repec.org/a/iza/izawol/journly2018n436.html

The usefulness of experiments

Author

Listed:
  • Jeffrey A. Smith

    (University of Wisconsin and NBER, USA, and IZA, Germany)

Abstract

Non-experimental evaluations of programs compare individuals who choose to participate in a program to individuals who do not. Such comparisons run the risk of conflating non-random selection into the program with its causal effects. By randomly assigning individuals to participate in the program or not, experimental evaluations remove the potential for non-random selection to bias comparisons of participants and non-participants. In so doing, they provide compelling causal evidence of program effects. At the same time, experiments are not a panacea, and require careful design and interpretation.
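The selection problem the abstract describes can be illustrated with a small simulation (a minimal sketch with made-up numbers, not taken from the article): when higher-ability individuals are more likely to enroll, a naive comparison of participants to non-participants overstates the program's effect, while a coin-flip assignment recovers it.

```python
import random

random.seed(0)

def simulate(n=100_000, true_effect=2.0):
    """Compare a naive participant/non-participant contrast with an
    experimental one. All parameter values are illustrative assumptions,
    not figures from the article."""
    naive_treated, naive_control = [], []
    exp_treated, exp_control = [], []
    for _ in range(n):
        ability = random.gauss(0, 1)   # unobserved ability
        noise = random.gauss(0, 1)
        # Non-experimental: high-ability individuals select into the program,
        # so participation is correlated with the outcome even absent any effect.
        enrolls = ability > 0
        y = 10 + ability + (true_effect if enrolls else 0) + noise
        (naive_treated if enrolls else naive_control).append(y)
        # Experimental: a coin flip decides participation, breaking the
        # link between ability and treatment status.
        assigned = random.random() < 0.5
        y_rct = 10 + ability + (true_effect if assigned else 0) + noise
        (exp_treated if assigned else exp_control).append(y_rct)
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(naive_treated) - mean(naive_control),
            mean(exp_treated) - mean(exp_control))

naive_gap, experimental_gap = simulate()
# The naive gap conflates the causal effect with selection on ability;
# the experimental gap is close to the true effect of 2.0.
```

Here the naive estimate exceeds the true effect by roughly the ability gap between self-selected participants and non-participants, which is exactly the bias that randomization removes.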

Suggested Citation

  • Jeffrey A. Smith, 2018. "The usefulness of experiments," IZA World of Labor, Institute of Labor Economics (IZA), pages 436-436, May.
  • Handle: RePEc:iza:izawol:journl:y:2018:n:436

    Download full text from publisher

    File URL: https://wol.iza.org/uploads/articles/436/pdfs/the-usefulness-of-experiments.pdf
    Download Restriction: no

    File URL: https://wol.iza.org/articles/the-usefulness-of-experiments
    Download Restriction: no

    References listed on IDEAS

    1. Markus Frölich & Michael Lechner & Heidi Steiger, 2003. "Statistically Assisted Programme Selection - International Experiences and Potential Benefits for Switzerland," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 139(III), pages 311-331, September.
    2. Jeffrey Smith & Jeremy Lise & Shannon N. Seitz, 2003. "Equilibrium Policy Experiments And The Evaluation Of Social Programs," Working Paper 1012, Economics Department, Queen's University.
    3. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
    4. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    5. Gary Burtless & Larry L. Orr, 1986. "Are Classical Experiments Needed for Manpower Policy," Journal of Human Resources, University of Wisconsin Press, vol. 21(4), pages 606-639.
    6. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    7. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    8. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(2), pages 651-694.
    9. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    10. Dan A. Black & Jeffrey A. Smith & Mark C. Berger & Brett J. Noel, 2003. "Is the Threat of Reemployment Services More Effective Than the Services Themselves? Evidence from Random Assignment in the UI System," American Economic Review, American Economic Association, vol. 93(4), pages 1313-1327, September.
    11. Heckman, James J & Smith, Jeffrey A, 1999. "The Pre-programme Earnings Dip and the Determinants of Participation in a Social Programme. Implications for Simple Programme Evaluation Strategies," Economic Journal, Royal Economic Society, vol. 109(457), pages 313-348, July.
    12. Burt S. Barnow, 1987. "The Impact of CETA Programs on Earnings: A Review of the Literature," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 157-193.
    13. Djebbari, Habiba & Smith, Jeffrey, 2008. "Heterogeneous impacts in PROGRESA," Journal of Econometrics, Elsevier, vol. 145(1-2), pages 64-80, July.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Jian Ming Luo & Chi Fung Lam & Hongyu Wang, 2021. "Exploring the Relationship Between Hedonism, Tourist Experience, and Revisit Intention in Entertainment Destination," SAGE Open, , vol. 11(4), pages 21582440211, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    2. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    3. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    4. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    5. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    6. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    7. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik : Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies: Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    8. van der Klaauw, Bas, 2014. "From micro data to causality: Forty years of empirical labor economics," Labour Economics, Elsevier, vol. 30(C), pages 88-97.
    9. Barbara Sianesi, 2013. "Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA," IFS Working Papers W13/15, Institute for Fiscal Studies.
    10. Christian Durán, 2004. "Evaluación microeconométrica de las políticas públicas de empleo: aspectos metodológicos," Hacienda Pública Española / Review of Public Economics, IEF, vol. 170(3), pages 107-133, september.
    11. Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
    12. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    13. Martin Biewen & Bernd Fitzenberger & Aderonke Osikominu & Marie Paul, 2014. "The Effectiveness of Public-Sponsored Training Revisited: The Importance of Data and Methodological Choices," Journal of Labor Economics, University of Chicago Press, vol. 32(4), pages 837-897.
    14. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    15. Miguel Angel Malo & Fernando Muñoz-Bullón, 2006. "Employment promotion measures and the quality of the job match for persons with disabilities," Hacienda Pública Española / Review of Public Economics, IEF, vol. 179(4), pages 79-111, September.
    16. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    17. Rothstein, Jesse & Von Wachter, Till, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt7957p9g6, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    18. Jeffrey Smith, 2022. "Treatment Effect Heterogeneity," Evaluation Review, , vol. 46(5), pages 652-677, October.
    19. Michael Lechner, 2002. "Mikroökonometrische Evaluation arbeitsmarktpolitischer Massnahmen (Microeconometric Evaluation of Labor Market Policy Measures)," University of St. Gallen Department of Economics working paper series 2002-20, Department of Economics, University of St. Gallen.

    More about this item

    Keywords

    experiment; random assignment; causality; evaluation;

    JEL classification:

    • C52 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Model Evaluation, Validation, and Selection
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iza:izawol:journl:y:2018:n:436. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Doing so allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Institute of Labor Economics (IZA) (email available below). General contact details of provider: https://edirc.repec.org/data/izaaade.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.