
Dealing with randomisation bias in a social experiment: the case of ERA

Author

Listed:
  • Barbara Sianesi

    (Institute for Fiscal Studies)

Abstract

One of the most powerful critiques of the use of randomised experiments in the social sciences is the possibility that individuals might react to the randomisation itself, thereby rendering the causal inference from the experiment irrelevant for policy purposes. In this paper we set out a theoretical framework for the systematic consideration of “randomisation bias”, and provide what is to our knowledge the first empirical evidence on this form of bias in an actual social experiment, the UK Employment Retention and Advancement (ERA) study. Specifically, we empirically test the extent to which random assignment has affected the process of participation in the ERA study. We further propose a non-experimental way of assessing the extent to which the treatment effects estimated from the experimental sample are representative of the impacts that would have been experienced by the population that would have been exposed to the programme under routine operation. We consider both the case of administrative outcome measures available for the entire relevant sample and the case of survey-based outcome measures. For survey outcomes we extend our estimators to also account for selective non-response based on observed characteristics. For both administrative and survey data, we further extend our proposed estimators to deal with the nonlinear case of binary outcomes.
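The reweighting logic the abstract describes — adjusting experimental impact estimates to the covariate distribution of the population that would have participated under routine operation — can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's actual estimator: the function name, data frames (`exp_df`, `target_df`) and column names are hypothetical, and a simple logit membership model with odds weights stands in for whatever matching-based adjustment the paper develops.

```python
# Hedged sketch: reweight an experimental impact estimate to the covariate
# distribution of a target (routine-participation) population. All names and
# modelling choices below are illustrative assumptions, not the paper's.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def reweighted_impact(exp_df, target_df, covariates, outcome, treat):
    """Mean impact in the target population, estimated by weighting
    experimental observations with the odds of target membership."""
    # Stack the experimental sample (S=0) and the target population (S=1)
    # and model membership as a function of observed covariates.
    stacked = pd.concat([exp_df.assign(S=0), target_df.assign(S=1)])
    memb = LogisticRegression(max_iter=1000).fit(stacked[covariates], stacked["S"])
    p = memb.predict_proba(exp_df[covariates])[:, 1]  # P(S=1 | X) for experimentals
    w = p / (1 - p)                                   # odds weights toward the target mix
    t = (exp_df[treat] == 1).values
    # Weighted difference in mean outcomes between the randomised-in and
    # randomised-out groups, under the target covariate distribution.
    y1 = np.average(exp_df.loc[t, outcome], weights=w[t])
    y0 = np.average(exp_df.loc[~t, outcome], weights=w[~t])
    return y1 - y0

# Illustrative call (column names are hypothetical):
# effect = reweighted_impact(exp_df, target_df,
#                            covariates=["age", "prior_earnings", "district"],
#                            outcome="employed", treat="assigned")
```

A weighted difference in means is used here purely for brevity; with a binary outcome one would replace it with a nonlinear (e.g. logit-based) analogue of the kind the abstract alludes to.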

Suggested Citation

  • Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
  • Handle: RePEc:ifs:ifsewp:14/10

    Download full text from publisher

    File URL: https://www.ifs.org.uk/uploads/publications/wps/wp201410.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Bell, Stephen H. & Orr, Larry L., 2002. "Screening (and creaming?) applicants to job training programs: the AFDC homemaker-home health aide demonstrations," Labour Economics, Elsevier, vol. 9(2), pages 279-301, April.
    2. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    3. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 605-654.
    4. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    5. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    6. Lechner, Michael & Smith, Jeffrey, 2007. "What is the value added by caseworkers?," Labour Economics, Elsevier, vol. 14(2), pages 135-151, April.
    7. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(2), pages 651-694.
    8. Richard Blundell & Monica Costa Dias & Costas Meghir & John Van Reenen, 2004. "Evaluating the Employment Impact of a Mandatory Job Search Program," Journal of the European Economic Association, MIT Press, vol. 2(4), pages 569-606, June.
    9. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    10. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
    11. James J. Heckman & Jeffrey A. Smith, 1998. "Evaluating the Welfare State," NBER Working Papers 6542, National Bureau of Economic Research, Inc.
    12. Hotz, V. Joseph & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    13. Guido W. Imbens, 2004. "Nonparametric Estimation of Average Treatment Effects Under Exogeneity: A Review," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 4-29, February.
    14. Martin Huber, 2012. "Identification of Average Treatment Effects in Social Experiments Under Alternative Forms of Attrition," Journal of Educational and Behavioral Statistics, vol. 37(3), pages 443-474, June.
    15. Card, David & Sullivan, Daniel G, 1988. "Measuring the Effect of Subsidized Training Programs on Movements in and out of Employment," Econometrica, Econometric Society, vol. 56(3), pages 497-530, May.
    16. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
    17. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    18. Heckman, James J & Smith, Jeffrey A, 1999. "The Pre-programme Earnings Dip and the Determinants of Participation in a Social Programme. Implications for Simple Programme Evaluation Strategies," Economic Journal, Royal Economic Society, vol. 109(457), pages 313-348, July.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    2. Jeremy Lise & Shannon Seitz & Jeffrey Smith, 2015. "Evaluating search and matching models using experimental data," IZA Journal of Labor Economics, Springer; Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-35, December.
    3. Barbara Sianesi, 2016. "‘Randomisation bias’ in the medical literature: a review," IFS Working Papers W16/23, Institute for Fiscal Studies.
    4. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 49(3), pages 871-905, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    2. Barbara Sianesi, 2013. "Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA," IFS Working Papers W13/15, Institute for Fiscal Studies.
    3. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    4. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    5. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    6. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    7. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    8. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    9. Martin Biewen & Bernd Fitzenberger & Aderonke Osikominu & Marie Paul, 2014. "The Effectiveness of Public-Sponsored Training Revisited: The Importance of Data and Methodological Choices," Journal of Labor Economics, University of Chicago Press, vol. 32(4), pages 837-897.
    10. Aakvik, Arild & Heckman, James J. & Vytlacil, Edward J., 2005. "Estimating treatment effects for discrete outcomes when responses to treatment vary: an application to Norwegian vocational rehabilitation programs," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 15-51.
    11. Michael Lechner, 2002. "Mikroökonometrische Evaluation arbeitsmarktpolitischer Massnahmen (Microeconometric Evaluation of Labour Market Policy Measures)," University of St. Gallen Department of Economics working paper series 2002 2002-20, Department of Economics, University of St. Gallen.
    12. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    13. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    14. Marco Caliendo & Sabine Kopeinig, 2008. "Some Practical Guidance For The Implementation Of Propensity Score Matching," Journal of Economic Surveys, Wiley Blackwell, vol. 22(1), pages 31-72, February.
    15. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    16. Miguel Angel Malo & Fernando Muñoz-Bullón, 2006. "Employment promotion measures and the quality of the job match for persons with disabilities," Hacienda Pública Española / Review of Public Economics, IEF, vol. 179(4), pages 79-111, September.
    17. James J. Heckman & Jeffrey A. Smith, 2004. "The Determinants of Participation in a Social Program: Evidence from a Prototypical Job Training Program," Journal of Labor Economics, University of Chicago Press, vol. 22(2), pages 243-298, April.
    18. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    19. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik: Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies: Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    20. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ifs:ifsewp:14/10. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Emma Hyman (email available below). General contact details of provider: https://edirc.repec.org/data/ifsssuk.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.