
Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA

Author

Listed:
  • Barbara Sianesi

    (Institute for Fiscal Studies)

Abstract

We highlight the importance of randomisation bias, a situation in which the process of participation in a social experiment is itself affected by the randomisation. We illustrate how this happened in the UK Employment Retention and Advancement (ERA) experiment, in which over one quarter of the eligible population was not represented. Our objective is to quantify the impact that ERA would have had on the full eligible population, and to assess how this impact relates to the experimental impact estimated on the potentially selected subgroup of study participants. We show that the matching assumption typically required to identify the average treatment effect of interest is made up of two parts. One part remains testable under the experiment even in the presence of randomisation bias, and offers a way to correct the non-experimental estimates should they fail the test. The other part rests on what we argue is a very weak assumption, at least in the case of ERA. We apply these ideas to the ERA programme and show the power of this strategy. Further exploiting the experiment, we assess, in our application, the validity of the claim often made in the literature that knowledge of long and detailed labour market histories can control for most of the selection bias in the evaluation of labour market interventions. Finally, for the case of survey-based outcomes, we develop a reweighting estimator that takes account of both non-participation and non-response.
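
The two parts of the identifying assumption can be made explicit. The notation below is our own reading of the abstract, not taken from the paper: let $Y^1$ and $Y^0$ denote outcomes with and without ERA, $X$ the observed characteristics, and $P$ an indicator for participation in the experiment. Extending the experimental impact to the full eligible population requires $E[Y^1 \mid X, P=0] = E[Y^1 \mid X, P=1]$, which is implied by the conjunction of

    \[
    \underbrace{E[Y^0 \mid X, P=0] = E[Y^0 \mid X, P=1]}_{\text{testable: } Y^0 \text{ observed for non-participants and randomised-out controls}}
    \quad \text{and} \quad
    \underbrace{E[Y^1 - Y^0 \mid X, P=0] = E[Y^1 - Y^0 \mid X, P=1]}_{\text{untestable: equality of average gains given } X}
    \]

since the two equalities add up to the required condition on $Y^1$. On this reading, a failure of the first, testable equality measures selection on no-treatment outcomes and indicates the size of the correction to apply to the non-experimental estimates.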
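
The reweighting estimator mentioned for survey-based outcomes can be sketched as double inverse-probability weighting: responding participants are weighted by the inverse of their estimated probability of (i) joining the experiment and (ii) answering the survey, so the weighted sample represents the full eligible population. The sketch below is a minimal illustration on simulated data, not the paper's implementation; the logistic models, variable names, and data-generating process are all assumptions made for the example.

    # Double-reweighting sketch: correct an experimental contrast for both
    # non-participation and survey non-response (illustrative, simulated data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 20_000

    # Eligible population: X drives participation, survey response and outcomes.
    X = rng.normal(size=(n, 2))
    P = rng.binomial(1, 1 / (1 + np.exp(-(0.3 + 0.8 * X[:, 0]))))  # joined the experiment
    D = np.where(P == 1, rng.binomial(1, 0.5, n), 0)               # random assignment among participants
    Y = 1.0 + X[:, 0] + 0.5 * X[:, 1] + 2.0 * D + rng.normal(size=n)
    R = np.where(P == 1, rng.binomial(1, 1 / (1 + np.exp(-(1.0 + 0.6 * X[:, 1])))), 0)  # survey respondent

    # Model participation on all eligibles and response among participants only.
    e_part = LogisticRegression().fit(X, P).predict_proba(X)[:, 1]
    e_resp = LogisticRegression().fit(X[P == 1], R[P == 1]).predict_proba(X)[:, 1]

    # Weight responding participants by the inverse probability of being observed.
    obs = (P == 1) & (R == 1)
    w, d, y = 1.0 / (e_part[obs] * e_resp[obs]), D[obs], Y[obs]

    # Weighted treatment-control contrast estimates the impact for all eligibles.
    impact = (np.average(y[d == 1], weights=w[d == 1])
              - np.average(y[d == 0], weights=w[d == 0]))
    print(f"reweighted impact estimate: {impact:.2f} (true effect: 2.0)")

In this simulation, selection into participation and response depends only on X, which is exactly the kind of condition the paper's testable part probes; were it violated, the weights alone would not remove the bias.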

Suggested Citation

  • Barbara Sianesi, 2013. "Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA," IFS Working Papers W13/15, Institute for Fiscal Studies.
  • Handle: RePEc:ifs:ifsewp:13/15

    Download full text from publisher

    File URL: http://www.ifs.org.uk/wps/wp201315.pdf
    Download Restriction: no

    References listed on IDEAS

    1. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    2. Markus Frölich, 2004. "Programme Evaluation with Multiple Treatments," Journal of Economic Surveys, Wiley Blackwell, vol. 18(2), pages 181-224, April.
    3. Lechner, Michael & Smith, Jeffrey, 2007. "What is the value added by caseworkers?," Labour Economics, Elsevier, vol. 14(2), pages 135-151, April.
    4. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    5. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    6. Martin Huber, 2012. "Identification of Average Treatment Effects in Social Experiments Under Alternative Forms of Attrition," Journal of Educational and Behavioral Statistics, vol. 37(3), pages 443-474, June.
    7. Card, David & Sullivan, Daniel G, 1988. "Measuring the Effect of Subsidized Training Programs on Movements in and out of Employment," Econometrica, Econometric Society, vol. 56(3), pages 497-530, May.
    8. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(2), pages 651-694.
    9. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
    10. Dubin, Jeffrey A. & Rivers, Douglas, 1993. "Experimental estimates of the impact of wage subsidies," Journal of Econometrics, Elsevier, vol. 56(1-2), pages 219-242, March.
    11. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    12. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    13. Heckman, James J & Smith, Jeffrey A, 1999. "The Pre-programme Earnings Dip and the Determinants of Participation in a Social Programme. Implications for Simple Programme Evaluation Strategies," Economic Journal, Royal Economic Society, vol. 109(457), pages 313-348, July.
    14. Martin Huber, 2010. "Identification of average treatment effects in social experiments under different forms of attrition," University of St. Gallen Department of Economics working paper series 2010 2010-22, Department of Economics, University of St. Gallen.
    15. Hotz, V. Joseph & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    16. Guido W. Imbens, 2004. "Nonparametric Estimation of Average Treatment Effects Under Exogeneity: A Review," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 4-29, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    2. Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
    3. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    4. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    5. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    6. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    7. Miguel Angel Malo & Fernando Muñoz-Bullón, 2006. "Employment promotion measures and the quality of the job match for persons with disabilities," Hacienda Pública Española / Review of Public Economics, IEF, vol. 179(4), pages 79-111, September.
    8. James J. Heckman & Jeffrey A. Smith, 2004. "The Determinants of Participation in a Social Program: Evidence from a Prototypical Job Training Program," Journal of Labor Economics, University of Chicago Press, vol. 22(2), pages 243-298, April.
    9. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    10. Malo, Miguel A., 2005. "Job matching quality effects of employment promotion measures for people with disabilities," DEE - Working Papers. Business Economics. WB wb055315, Universidad Carlos III de Madrid. Departamento de Economía de la Empresa.
    11. Christian Durán, 2004. "Evaluación microeconométrica de las políticas públicas de empleo: aspectos metodológicos," Hacienda Pública Española / Review of Public Economics, IEF, vol. 170(3), pages 107-133, September.
    12. Cerqua, Augusto & Urwin, Peter & Thomson, Dave & Bibby, David, 2020. "Evaluation of education and training impacts for the unemployed: Challenges of new data," Labour Economics, Elsevier, vol. 67(C).
    13. Carlos A. Flores & Oscar A. Mitnik, 2013. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1691-1707, December.
    14. Flores-Lagunes, Alfonso & Gonzalez, Arturo & Neumann, Todd C., 2005. "Learning but Not Earning? The Value of Job Corps Training for Hispanic Youths," IZA Discussion Papers 1638, Institute of Labor Economics (IZA).
    15. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    16. Alfonso Flores‐Lagunes & Arturo Gonzalez & Todd Neumann, 2010. "Learning But Not Earning? The Impact Of Job Corps Training On Hispanic Youth," Economic Inquiry, Western Economic Association International, vol. 48(3), pages 651-667, July.
    17. Michael Lechner, 2002. "Mikroökonometrische Evaluation arbeitsmarktpolitischer Massnahmen," University of St. Gallen Department of Economics working paper series 2002 2002-20, Department of Economics, University of St. Gallen.
    18. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    19. Jeffrey A. Smith, 2018. "The usefulness of experiments," IZA World of Labor, Institute of Labor Economics (IZA), pages 436-436, May.
    20. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.

    More about this item

    Keywords

    social experiments; sample selection; treatment effects; matching methods; reweighting estimators

    JEL classification:

    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • J18 - Labor and Demographic Economics - - Demographic Economics - - - Public Policy
    • J38 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Public Policy


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ifs:ifsewp:13/15. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us creating those links by adding the relevant references in the same way as above, for each refering item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Emma Hyman (email available below). General contact details of provider: https://edirc.repec.org/data/ifsssuk.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.