IDEAS home Printed from https://ideas.repec.org/a/eee/econom/v198y2017i1p41-64.html

Evidence of randomisation bias in a large-scale social experiment: The case of ERA

Author

Listed:
  • Sianesi, Barbara

Abstract

We set out a theoretical framework for the systematic consideration of ‘randomisation bias’, estimate the causal impact of randomisation on participation patterns in an actual trial, and propose a non-experimental way of assessing the extent to which the experimental impacts are representative of the impacts that would have been experienced by the study sample that would have been obtained in the absence of random assignment. We also extend our estimator to deal with binary outcomes and to account for selective survey non-response, and explore partial and point identification of the parameter of interest under alternative assumptions on the selection process.

Suggested Citation

  • Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
  • Handle: RePEc:eee:econom:v:198:y:2017:i:1:p:41-64
    DOI: 10.1016/j.jeconom.2017.01.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S030440761730012X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jeconom.2017.01.003?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bell, Stephen H. & Orr, Larry L., 2002. "Screening (and creaming?) applicants to job training programs: the AFDC homemaker-home health aide demonstrations," Labour Economics, Elsevier, vol. 9(2), pages 279-301, April.
    2. Manski, Charles F, 1990. "Nonparametric Bounds on Treatment Effects," American Economic Review, American Economic Association, vol. 80(2), pages 319-323, May.
    3. James J. Heckman & Sergio Urzua & Edward Vytlacil, 2006. "Understanding Instrumental Variables in Models with Essential Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 88(3), pages 389-432, August.
    4. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2008. "Nonparametric Tests for Treatment Effect Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 90(3), pages 389-405, August.
    5. Charles F. Manski, 1997. "Monotone Treatment Response," Econometrica, Econometric Society, vol. 65(6), pages 1311-1334, November.
    6. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 605-654.
    7. Martin Huber, 2012. "Identification of Average Treatment Effects in Social Experiments Under Alternative Forms of Attrition," Journal of Educational and Behavioral Statistics, vol. 37(3), pages 443-474, June.
    8. Lechner, Michael, 2011. "The Estimation of Causal Effects by Difference-in-Difference Methods," Foundations and Trends(R) in Econometrics, now publishers, vol. 4(3), pages 165-224, November.
    9. Lorraine Dearden & Carl Emmerson & Costas Meghir, 2009. "Conditional Cash Transfers and School Dropout Rates," Journal of Human Resources, University of Wisconsin Press, vol. 44(4).
    10. Lechner, Michael & Smith, Jeffrey, 2007. "What is the value added by caseworkers?," Labour Economics, Elsevier, vol. 14(2), pages 135-151, April.
    11. Jeffrey M. Wooldridge, 2002. "Inverse probability weighted M-estimators for sample selection, attrition, and stratification," Portuguese Economic Journal, Springer;Instituto Superior de Economia e Gestao, vol. 1(2), pages 117-139, August.
    12. Card, David & Sullivan, Daniel G, 1988. "Measuring the Effect of Subsidized Training Programs on Movements in and out of Employment," Econometrica, Econometric Society, vol. 56(3), pages 497-530, May.
    13. Kreider, Brent & Pepper, John V., 2007. "Disability and Employment: Reevaluating the Evidence in Light of Reporting Errors," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 432-441, June.
    14. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 115(2), pages 651-694.
    15. Howard S. Bloom, 1984. "Accounting for No-Shows in Experimental Evaluation Designs," Evaluation Review, vol. 8(2), pages 225-246, April.
    16. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
    17. Richard Blundell & Monica Costa Dias & Costas Meghir & John Van Reenen, 2004. "Evaluating the Employment Impact of a Mandatory Job Search Program," Journal of the European Economic Association, MIT Press, vol. 2(4), pages 569-606, June.
    18. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
    19. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    20. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    21. Thierry Kamionka & Guy Lacroix, 2008. "Assessing the External Validity of an Experimental Wage Subsidy," Annals of Economics and Statistics, GENES, issue 91-92, pages 357-384.
    22. Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
    23. Dubin, Jeffrey A. & Rivers, Douglas, 1993. "Experimental estimates of the impact of wage subsidies," Journal of Econometrics, Elsevier, vol. 56(1-2), pages 219-242, March.
    24. Lavergne, Pascal, 2001. "An equality test across nonparametric regressions," Journal of Econometrics, Elsevier, vol. 103(1-2), pages 307-344, July.
    25. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    26. Charles F. Manski & John V. Pepper, 2000. "Monotone Instrumental Variables, with an Application to the Returns to Schooling," Econometrica, Econometric Society, vol. 68(4), pages 997-1012, July.
    27. Guido W. Imbens & Charles F. Manski, 2004. "Confidence Intervals for Partially Identified Parameters," Econometrica, Econometric Society, vol. 72(6), pages 1845-1857, November.
    28. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    29. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    31. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    32. Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute of Labor Economics (IZA).
    33. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    34. Heckman, James J & Smith, Jeffrey A, 1999. "The Pre-programme Earnings Dip and the Determinants of Participation in a Social Programme. Implications for Simple Programme Evaluation Strategies," Economic Journal, Royal Economic Society, vol. 109(457), pages 313-348, July.
    35. James J. Heckman & Jeffrey A. Smith, 1998. "Evaluating the Welfare State," NBER Working Papers 6542, National Bureau of Economic Research, Inc.
    36. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    37. Charles F. Manski & John V. Pepper, 2009. "More on monotone instrumental variables," Econometrics Journal, Royal Economic Society, vol. 12(s1), pages 200-216, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.
    Cited by:

    1. Jeffrey A. Smith, 2018. "The usefulness of experiments," IZA World of Labor, Institute of Labor Economics (IZA), pages 436-436, May.
    2. Potash, Eric, 2018. "Randomization bias in field trials to evaluate targeting methods," Economics Letters, Elsevier, vol. 167(C), pages 131-135.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    2. Barbara Sianesi, 2013. "Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA," IFS Working Papers W13/15, Institute for Fiscal Studies.
    3. Michael Lechner, 2002. "Mikroökonometrische Evaluation arbeitsmarktpolitischer Massnahmen," University of St. Gallen Department of Economics working paper series 2002 2002-20, Department of Economics, University of St. Gallen.
    4. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    5. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    6. Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
    7. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    8. Aakvik, Arild & Heckman, James J. & Vytlacil, Edward J., 2005. "Estimating treatment effects for discrete outcomes when responses to treatment vary: an application to Norwegian vocational rehabilitation programs," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 15-51.
    9. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    10. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    11. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    12. Vishal Kamat, 2017. "Identifying the Effects of a Program Offer with an Application to Head Start," Papers 1711.02048, arXiv.org, revised Jul 2021.
    13. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    14. Martin Biewen & Bernd Fitzenberger & Aderonke Osikominu & Marie Paul, 2014. "The Effectiveness of Public-Sponsored Training Revisited: The Importance of Data and Methodological Choices," Journal of Labor Economics, University of Chicago Press, vol. 32(4), pages 837-897.
    15. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    16. Hector Chade & Gustavo Ventura, 1998. "Taxes and Marriage: A Two-Sided Search Analysis," UWO Department of Economics Working Papers 9819, University of Western Ontario, Department of Economics.
    17. Christelis, Dimitris & Dobrescu, Loretti I., 2020. "The causal effect of social activities on cognition: Evidence from 20 European countries," Social Science & Medicine, Elsevier, vol. 247(C).
    18. Arild Aakvik & James J. Heckman & Edward J. Vytlacil, 2000. "Treatment Effects for Discrete Outcomes when Responses to Treatment Vary Among Observationally Identical Persons: An Application to Norwegian ..," NBER Technical Working Papers 0262, National Bureau of Economic Research, Inc.
    19. Tsunao Okumura & Emiko Usui, 2014. "Concave‐monotone treatment response and monotone treatment selection: With an application to the returns to schooling," Quantitative Economics, Econometric Society, vol. 5, pages 175-194, March.
    20. Magnac, Thierry, 2000. "L'apport de la microéconométrie à l'évaluation des politiques publiques," Cahiers d'Economie et de Sociologie Rurales (CESR), Institut National de la Recherche Agronomique (INRA), vol. 54.

    More about this item

    Keywords

    Social experiments; Randomisation bias; Sample selection; Treatment effects; Matching methods; Reweighting estimators; Partial identification; External validity;

    JEL classification:

    • C14 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Semiparametric and Nonparametric Methods: General
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • J18 - Labor and Demographic Economics - - Demographic Economics - - - Public Policy
    • J38 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Public Policy

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:econom:v:198:y:2017:i:1:p:41-64. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/jeconom .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.