
Missing, presumed different: Quantifying the risk of attrition bias in education evaluations

Author

Listed:
  • Ben Weidmann
  • Luke Miratrix

Abstract

We estimate the magnitude of attrition bias for 10 randomized controlled trials (RCTs) in education. We exploit a unique feature of administrative school data in England that allows us to analyse post‐test academic outcomes for nearly all students, including those who dropped out of the RCTs. We find that the typical magnitude of attrition bias is 0.015 effect size units (ES), with no estimate greater than 0.034 ES. This suggests that, in practice, the risk of attrition bias is limited. However, this risk should not be ignored, as we find some evidence against the common ‘Missing At Random’ assumption: attrition appears to be more problematic for treated units. We recommend that researchers incorporate uncertainty due to attrition bias and perform sensitivity analyses based on the types of attrition mechanisms observed in practice.
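The quantity the abstract describes — attrition bias measured in effect size units — can be sketched in a small simulation. This is a hypothetical illustration, not the authors' code or data: the sample sizes, true effect, and attrition rates below are invented, and attrition bias is computed as the gap between the treatment-effect estimate on the full sample and the estimate on respondents only, in outcome standard deviations.

```python
# Hypothetical sketch: quantify attrition bias as the difference between
# the effect size (ES) estimate using all units and the ES estimate using
# only non-attriters. Outcomes are drawn with SD = 1, so mean differences
# are already in ES units.
import random
from statistics import mean

random.seed(42)

N = 20_000        # units per arm (illustrative)
TRUE_ES = 0.20    # true treatment effect in SD units

control = [random.gauss(0.0, 1.0) for _ in range(N)]
treatment = [random.gauss(TRUE_ES, 1.0) for _ in range(N)]

def attrit(outcomes, rate_low_scorers, rate_high_scorers):
    """Outcome-dependent attrition: drop each unit with a probability that
    depends on its post-test score, which violates 'Missing At Random'
    whenever the two rates differ."""
    kept = []
    for y in outcomes:
        rate = rate_low_scorers if y < 0 else rate_high_scorers
        if random.random() > rate:
            kept.append(y)
    return kept

# Attrition at random in the control arm, but outcome-linked and more
# severe among low-scoring treated units (the MNAR pattern of concern).
control_resp = attrit(control, 0.10, 0.10)
treat_resp = attrit(treatment, 0.25, 0.05)

es_full = mean(treatment) - mean(control)        # nearly unbiased for TRUE_ES
es_resp = mean(treat_resp) - mean(control_resp)  # what an analyst observes
attrition_bias = es_resp - es_full

print(f"full-sample ES estimate:     {es_full:.3f}")
print(f"respondent-only ES estimate: {es_resp:.3f}")
print(f"attrition bias:              {attrition_bias:+.3f} ES")
```

Because control attrition is independent of outcomes, the control respondent mean is unchanged in expectation; the outcome-linked attrition in the treated arm preferentially removes low scorers, inflating the respondent-only estimate and producing a positive bias.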

Suggested Citation

  • Ben Weidmann & Luke Miratrix, 2021. "Missing, presumed different: Quantifying the risk of attrition bias in education evaluations," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 732-760, April.
  • Handle: RePEc:bla:jorssa:v:184:y:2021:i:2:p:732-760
    DOI: 10.1111/rssa.12677

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssa.12677
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssa.12677?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rafkin, Charlie & Shreekumar, Advik & Vautrey, Pierre-Luc, 2021. "When guidance changes: Government stances and public beliefs," Journal of Public Economics, Elsevier, vol. 196(C).
    2. Guigonan S. Adjognon & Daan van Soest & Jonas Guthoff, 2021. "Reducing Hunger with Payments for Environmental Services (PES): Experimental Evidence from Burkina Faso," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(3), pages 831-857, May.
    3. Kaitlin Anderson & Gema Zamarro & Jennifer Steele & Trey Miller, 2021. "Comparing Performance of Methods to Deal With Differential Attrition in Randomized Experimental Evaluations," Evaluation Review, vol. 45(1-2), pages 70-104, February.
    4. John Deke & Hanley Chiang, 2017. "The WWC Attrition Standard," Evaluation Review, vol. 41(2), pages 130-154, April.
    5. Annie Alcid & Erwin Bulte & Robert Lensink & Aussi Sayinzoga & Mark Treurniet, 2023. "Short- and Medium-term Impacts of Employability Training: Evidence from a Randomised Field Experiment in Rwanda," Journal of African Economies, Centre for the Study of African Economies, vol. 32(3), pages 296-328.
    6. Fulya Ersoy, 2021. "Returns to effort: experimental evidence from an online language platform," Experimental Economics, Springer; Economic Science Association, vol. 24(3), pages 1047-1073, September.
    7. Tarek Azzam & Michael Bates & David Fairris, 2019. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202002, University of California at Riverside, Department of Economics.
    8. Turner, Alex J. & Fichera, Eleonora & Sutton, Matt, 2021. "The effects of in-utero exposure to influenza on mental health and mortality risk throughout the life-course," Economics & Human Biology, Elsevier, vol. 43(C).
    9. Charles F. Manski & John V. Pepper, 2018. "How Do Right-to-Carry Laws Affect Crime Rates? Coping with Ambiguity Using Bounded-Variation Assumptions," The Review of Economics and Statistics, MIT Press, vol. 100(2), pages 232-244, May.
    10. Stefan Boes, 2013. "Nonparametric analysis of treatment effects in ordered response models," Empirical Economics, Springer, vol. 44(1), pages 81-109, February.
    11. Pablo Lavado & Gonzalo Rivera, 2016. "Identifying Treatment Effects with Data Combination and Unobserved Heterogeneity," Working Papers 79, Peruvian Economic Association.
    12. Wang, Xintong & Flores, Carlos A. & Flores-Lagunes, Alfonso, 2025. "The effects of Vietnam-era military service on the long-term health of veterans: A bounds analysis," Journal of Health Economics, Elsevier, vol. 101(C).
    13. Alberto Abadie & Guido W. Imbens, 2002. "Simple and Bias-Corrected Matching Estimators for Average Treatment Effects," NBER Technical Working Papers 0283, National Bureau of Economic Research, Inc.
    14. Giorgio Brunello & Dimitris Christelis & Anna Sanz‐de‐Galdeano & Anastasia Terskaya, 2024. "Does college selectivity reduce obesity? A partial identification approach," Health Economics, John Wiley & Sons, Ltd., vol. 33(10), pages 2306-2320, October.
    15. Monique De Haan & Edwin Leuven, 2020. "Head Start and the Distribution of Long-Term Education and Labor Market Outcomes," Journal of Labor Economics, University of Chicago Press, vol. 38(3), pages 727-765.
    16. Eric J. Tchetgen Tchetgen & Kathleen E. Wirth, 2017. "A general instrumental variable framework for regression analysis with outcome missing not at random," Biometrics, The International Biometric Society, vol. 73(4), pages 1123-1131, December.
    17. Santos, Andres, 2011. "Instrumental variable methods for recovering continuous linear functionals," Journal of Econometrics, Elsevier, vol. 161(2), pages 129-146, April.
    18. Susan Athey & Raj Chetty & Guido Imbens, 2020. "Using Experiments to Correct for Selection in Observational Studies," Papers 2006.09676, arXiv.org, revised May 2025.
    19. Norbert Schady & Jere Behrman & Maria Caridad Araujo & Rodrigo Azuero & Raquel Bernal & David Bravo & Florencia Lopez-Boo & Karen Macours & Daniela Marshall & Christina Paxson & Renos Vakis, 2015. "Wealth Gradients in Early Childhood Cognitive Development in Five Latin American Countries," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 446-463.
    20. McGovern, Mark E. & Canning, David & Bärnighausen, Till, 2018. "Accounting for non-response bias using participation incentives and survey design: An application using gift vouchers," Economics Letters, Elsevier, vol. 171(C), pages 239-244.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssa:v:184:y:2021:i:2:p:732-760. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.