Printed from https://ideas.repec.org/a/iab/iabzaf/v43i2p091-105.html

Setting up social experiments: the good, the bad, and the ugly

Author

Listed:
  • Barnow, Burt S.

    (George Washington University, Trachtenberg School of Public Policy and Public Administration, Washington, DC, USA)

Abstract

"It is widely agreed that randomized controlled trials - social experiments - are the gold standard for evaluating social programs. There are, however, many important issues that cannot be tested using social experiments, and often things go wrong when conducting social experiments. This paper explores these issues and offers suggestions on ways to deal with commonly encountered problems. Social experiments are preferred because random assignment assures that any differences between the treatment and control groups are due to the intervention and not some other factor; also, the results of social experiments are more easily explained and accepted by policy officials. Experimental evaluations often lack external validity and cannot control for entry effects, scale and general equilibrium effects, and aspects of the intervention that were not randomly assigned. Experiments can also lead to biased impact estimates if the control group changes its behavior or if changing the number selected changes the impact. Other problems with conducting social experiments include increased time and cost, and legal and ethical issues related to excluding people from the treatment. Things that sometimes go wrong in social experiments include programs cheating on random assignment, and participants and/or staff not understanding the intervention rules. The random assignment evaluation of the Job Training Partnership Act in the United States is used as a case study to illustrate the issues." (Author's abstract, IAB-Doku)
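The abstract's core methodological point can be illustrated with a minimal simulation (not from the paper; all numbers and variable names are illustrative assumptions): under random assignment, the treatment-control difference in means recovers the true effect, whereas letting participants self-select on their baseline outcome inflates it.

```python
# Illustrative sketch: random assignment vs. self-selection.
# All parameters (effect size, sample size, distributions) are hypothetical.
import random
import statistics

random.seed(0)
N = 20_000
TRUE_EFFECT = 5.0

# Each person's outcome absent treatment (e.g. earnings, arbitrary units).
baseline = [random.gauss(50.0, 10.0) for _ in range(N)]

def observed(y0: float, treated: bool) -> float:
    """Observed outcome: baseline plus the treatment effect if treated."""
    return y0 + TRUE_EFFECT if treated else y0

# 1) Random assignment: treatment status is independent of the baseline,
#    so the two groups are comparable and the mean difference is unbiased.
flags = [random.random() < 0.5 for _ in range(N)]
t = [observed(y0, f) for y0, f in zip(baseline, flags) if f]
c = [observed(y0, f) for y0, f in zip(baseline, flags) if not f]
rct_estimate = statistics.mean(t) - statistics.mean(c)

# 2) Self-selection: only people with above-median baselines enroll,
#    so the naive comparison mixes the effect with pre-existing differences.
cutoff = statistics.median(baseline)
sel = [y0 > cutoff for y0 in baseline]
t2 = [observed(y0, f) for y0, f in zip(baseline, sel) if f]
c2 = [observed(y0, f) for y0, f in zip(baseline, sel) if not f]
naive_estimate = statistics.mean(t2) - statistics.mean(c2)

print(f"true effect:               {TRUE_EFFECT:.2f}")
print(f"randomized estimate:       {rct_estimate:.2f}")   # close to the truth
print(f"self-selection 'estimate': {naive_estimate:.2f}") # badly inflated
```

With a large sample, the randomized estimate lands near the true effect while the self-selected comparison overstates it by roughly the baseline gap between enrollees and non-enrollees.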

Suggested Citation

  • Barnow, Burt S., 2010. "Setting up social experiments: the good, the bad, and the ugly," Zeitschrift für ArbeitsmarktForschung - Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 43(2), pages 91-105.
  • Handle: RePEc:iab:iabzaf:v:43:i:2:p:091-105
    DOI: 10.1007/s12651-010-0042-6

    Download full text from publisher

    File URL: https://doi.org/10.1007/s12651-010-0042-6
    Download Restriction: no

    File URL: https://libkey.io/10.1007/s12651-010-0042-6?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Jeffrey Smith & Jeremy Lise & Shannon N. Seitz, 2003. "Equilibrium Policy Experiments And The Evaluation Of Social Programs," Working Paper 1012, Economics Department, Queen's University.
    2. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 115(2), pages 651-694.
    3. Burt S. Barnow, 2005. "The ethics of federal social program evaluation: A response to Jan Blustein," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 24(4), pages 846-848.
4. Joshua D. Angrist & Alan B. Krueger, 1991. "Does Compulsory School Attendance Affect Schooling and Earnings?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 106(4), pages 979-1014.
    5. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Brussig Martin & Schwarzkopf Manuela, 2012. "Eingliederungsgutscheine: Zwischen Empowerment und Stigmatisierung / Re-employment bonuses as vouchers: Between empowerment and stigmatization," Arbeit, De Gruyter, vol. 21(1), pages 39-51, March.
    2. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 49(3), pages 871-905, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Barnow, Burt S., 2010. "Setting up social experiments: the good, the bad, and the ugly," Zeitschrift für ArbeitsmarktForschung - Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 43(2), pages 91-105.
    2. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    3. Alan B. Krueger, 2002. "Inequality, Too Much of a Good Thing," Working Papers 845, Princeton University, Department of Economics, Industrial Relations Section.
    4. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    5. Dan A. Black & Lars Skipper & Jeffrey A. Smith, 2023. "Firm Training," CESifo Working Paper Series 10268, CESifo.
    6. van der Klaauw, Bas, 2014. "From micro data to causality: Forty years of empirical labor economics," Labour Economics, Elsevier, vol. 30(C), pages 88-97.
    7. Betcherman, Gordon & Olivas, Karina & Dar, Amit, 2004. "Impacts of active labor market programs: new evidence from evaluations with particular attention to developing and transition countries," Social Protection Discussion Papers and Notes 29142, The World Bank.
    8. Jeffrey Smith, 2022. "Treatment Effect Heterogeneity," Evaluation Review, vol. 46(5), pages 652-677, October.
    9. Jeffrey A. Smith, 2018. "The usefulness of experiments," IZA World of Labor, Institute of Labor Economics (IZA), pages 436-436, May.
    10. Matthias Doepke & Fabrizio Zilibotti, 2005. "The Macroeconomics of Child Labor Regulation," American Economic Review, American Economic Association, vol. 95(5), pages 1492-1524, December.
    11. Eckstein, Zvi & Zilcha, Itzhak, 1994. "The effects of compulsory schooling on growth, income distribution and welfare," Journal of Public Economics, Elsevier, vol. 54(3), pages 339-359, July.
    12. G. Bellettini & C. Berti Ceroni, 2000. "Compulsory schooling laws and the cure against child labor," Working Papers 394, Dipartimento Scienze Economiche, Universita' di Bologna.
    13. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    14. Sonja Fagernäs, 2011. "Protection through Proof of Age. Birth Registration and Child Labor in Early 20th Century USA," Working Paper Series 2311, Department of Economics, University of Sussex Business School.
    15. Battaglia, Marianna & Lebedinski, Lara, 2015. "Equal Access to Education: An Evaluation of the Roma Teaching Assistant Program in Serbia," World Development, Elsevier, vol. 76(C), pages 62-81.
    16. Meng, Xin & Zhao, Guochang, 2021. "The long shadow of a large scale education interruption: The intergenerational effect," Labour Economics, Elsevier, vol. 71(C).
    17. Riddell, Chris & Riddell, W. Craig, 2016. "When Can Experimental Evidence Mislead? A Re-Assessment of Canada's Self Sufficiency Project," IZA Discussion Papers 9939, Institute of Labor Economics (IZA).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iab:iabzaf:v:43:i:2:p:091-105. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: IAB, Geschäftsbereich Wissenschaftliche Fachinformation und Bibliothek (email available below). General contact details of provider: https://edirc.repec.org/data/iabbbde.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.