Printed from https://ideas.repec.org/a/bla/jorssa/v184y2021i2p732-760.html

Missing, presumed different: Quantifying the risk of attrition bias in education evaluations

Authors

  • Ben Weidmann
  • Luke Miratrix

Abstract

We estimate the magnitude of attrition bias for 10 randomized controlled trials (RCTs) in education. We exploit a unique feature of administrative school data in England that allows us to analyse post-test academic outcomes for nearly all students, including those who originally dropped out of the trials. We find that the typical magnitude of attrition bias is 0.015 effect size units (ES), with no estimate greater than 0.034 ES. This suggests that, in practice, the risk of attrition bias is limited. However, the risk should not be ignored, as we find some evidence against the common 'Missing At Random' assumption: attrition appears to be more problematic for treated units. We recommend that researchers incorporate uncertainty due to attrition bias and perform sensitivity analyses based on the types of attrition mechanisms observed in practice.
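The mechanism the abstract describes can be illustrated with a minimal simulation. This sketch is not the paper's data or code: the true effect (0.20 ES), the attrition rates, and the dropout rule are all assumptions chosen for illustration. It shows how outcome-dependent (non-MAR) attrition concentrated among treated units shifts a complete-case estimate away from the full-sample estimate by a few hundredths of an effect size unit.

```python
import numpy as np

# Hypothetical RCT in effect-size (SD) units; none of these numbers
# come from the paper.
rng = np.random.default_rng(0)
n = 200_000
treat = rng.integers(0, 2, size=n)                # random assignment
y = rng.normal(0.0, 1.0, size=n) + 0.20 * treat   # assumed true effect: 0.20 ES

# Non-MAR attrition: low-scoring treated students drop out more often,
# mirroring the finding that attrition is more problematic for treated units.
p_drop = np.where((treat == 1) & (y < -0.5), 0.18, 0.10)
observed = rng.random(n) >= p_drop

def diff_in_means(y, treat, mask):
    """Difference-in-means ATE estimate over the units selected by `mask`."""
    return y[mask & (treat == 1)].mean() - y[mask & (treat == 0)].mean()

full_ate = diff_in_means(y, treat, np.ones(n, dtype=bool))  # all students
cc_ate = diff_in_means(y, treat, observed)                  # completers only
attrition_bias = cc_ate - full_ate
print(f"full-sample ATE {full_ate:.3f}, complete-case ATE {cc_ate:.3f}, "
      f"bias {attrition_bias:+.3f} ES")
```

Because dropout selectively removes low-scoring treated students, the complete-case estimate is biased upward by roughly 0.03 ES under these assumed rates, which is the order of magnitude the paper reports as a worst case.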

Suggested Citation

  • Ben Weidmann & Luke Miratrix, 2021. "Missing, presumed different: Quantifying the risk of attrition bias in education evaluations," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 732-760, April.
  • Handle: RePEc:bla:jorssa:v:184:y:2021:i:2:p:732-760
    DOI: 10.1111/rssa.12677

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssa.12677
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssa.12677?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. Manski, Charles F, 1990. "Nonparametric Bounds on Treatment Effects," American Economic Review, American Economic Association, vol. 80(2), pages 319-323, May.
    2. Dalia Ghanem & Sarojini Hirshleifer & Karen Ortiz-Becerra, 2019. "Testing Attrition Bias in Field Experiments," Working Papers 202218, University of California at Riverside, Department of Economics, revised Oct 2022.
    3. John Deke & Hanley Chiang, 2017. "The WWC Attrition Standard," Evaluation Review, vol. 41(2), pages 130-154, April.
    4. Harvey Goldstein & James R. Carpenter & William J. Browne, 2014. "Fitting multilevel multivariate models with missing data in responses and covariates that may include interactions and non-linear terms," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 177(2), pages 553-564, February.
    5. Rolf Sundberg, 2003. "Conditional statistical inference and quantification of relevance," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 65(1), pages 299-315, February.
    6. David Greenberg & Burt S. Barnow, 2014. "Flaws in Evaluations of Social Programs," Evaluation Review, vol. 38(5), pages 359-387, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rafkin, Charlie & Shreekumar, Advik & Vautrey, Pierre-Luc, 2021. "When guidance changes: Government stances and public beliefs," Journal of Public Economics, Elsevier, vol. 196(C).
    2. Guigonan S. Adjognon & Daan van Soest & Jonas Guthoff, 2021. "Reducing Hunger with Payments for Environmental Services (PES): Experimental Evidence from Burkina Faso," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(3), pages 831-857, May.
    3. Kaitlin Anderson & Gema Zamarro & Jennifer Steele & Trey Miller, 2021. "Comparing Performance of Methods to Deal With Differential Attrition in Randomized Experimental Evaluations," Evaluation Review, vol. 45(1-2), pages 70-104, February.
    4. John Deke & Hanley Chiang, 2017. "The WWC Attrition Standard," Evaluation Review, vol. 41(2), pages 130-154, April.
    5. Annie Alcid & Erwin Bulte & Robert Lensink & Aussi Sayinzoga & Mark Treurniet, 2023. "Short- and Medium-term Impacts of Employability Training: Evidence from a Randomised Field Experiment in Rwanda," Journal of African Economies, Centre for the Study of African Economies, vol. 32(3), pages 296-328.
    6. Fulya Ersoy, 2021. "Returns to effort: experimental evidence from an online language platform," Experimental Economics, Springer;Economic Science Association, vol. 24(3), pages 1047-1073, September.
    7. Tarek Azzam & Michael Bates & David Fairris, 2019. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202002, University of California at Riverside, Department of Economics.
    8. Victor Aguirregabiria, 2006. "Another Look at the Identification of Dynamic Discrete Decision Processes: With an Application to Retirement Behavior," 2006 Meeting Papers 169, Society for Economic Dynamics.
    9. Turner, Alex J. & Fichera, Eleonora & Sutton, Matt, 2021. "The effects of in-utero exposure to influenza on mental health and mortality risk throughout the life-course," Economics & Human Biology, Elsevier, vol. 43(C).
    10. Kjetil Bjorvatn & Alexander W. Cappelen & Linda Helgesson Sekei & Erik Ø. Sørensen & Bertil Tungodden, 2020. "Teaching Through Television: Experimental Evidence on Entrepreneurship Education in Tanzania," Management Science, INFORMS, vol. 66(6), pages 2308-2325, June.
    11. Card, David & Rothstein, Jesse, 2007. "Racial segregation and the black-white test score gap," Journal of Public Economics, Elsevier, vol. 91(11-12), pages 2158-2184, December.
    12. Charles F. Manski & John V. Pepper, 2018. "How Do Right-to-Carry Laws Affect Crime Rates? Coping with Ambiguity Using Bounded-Variation Assumptions," The Review of Economics and Statistics, MIT Press, vol. 100(2), pages 232-244, May.
    13. Stefan Boes, 2013. "Nonparametric analysis of treatment effects in ordered response models," Empirical Economics, Springer, vol. 44(1), pages 81-109, February.
    14. Pablo Lavado & Gonzalo Rivera, 2016. "Identifying Treatment Effects with Data Combination and Unobserved Heterogeneity," Working Papers 79, Peruvian Economic Association.
    15. Sascha O. Becker & Marco Caliendo, 2007. "Sensitivity analysis for average treatment effects," Stata Journal, StataCorp LP, vol. 7(1), pages 71-83, February.
    16. Xintong Wang & Carlos A. Flores & Alfonso Flores-Lagunes, 2020. "The Effects of Vietnam-Era Military Service on the Long-Term Health of Veterans: A Bounds Analysis," Center for Policy Research Working Papers 234, Center for Policy Research, Maxwell School, Syracuse University.
    17. Andrew Chesher & Adam M. Rosen, 2021. "Counterfactual Worlds," Annals of Economics and Statistics, GENES, issue 142, pages 311-335.
    18. Manski, Charles, 1994. "Simultaneity with Downward Sloping Demand," SFB 373 Discussion Papers 1994,29, Humboldt University of Berlin, Interdisciplinary Research Project 373: Quantification and Simulation of Economic Processes.
    19. Joshua D. Angrist, 2004. "Treatment effect heterogeneity in theory and practice," Economic Journal, Royal Economic Society, vol. 114(494), pages 52-83, March.
    20. Arnaud Chevalier & Gauthier Lanot, 2004. "Monotonicity and the Roy Model," Manchester School, University of Manchester, vol. 72(4), pages 560-567, July.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssa:v:184:y:2021:i:2:p:732-760. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references, in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.