
Sample Attrition Bias in Randomized Experiments: A Tale of Two Surveys

Author

  • Behaghel, Luc (Paris School of Economics)
  • Crépon, Bruno (CREST)
  • Gurgand, Marc (Paris School of Economics)
  • Le Barbanchon, Thomas (Bocconi University)

Abstract

The randomized trial literature has helped renew the field of microeconometric policy evaluation by emphasizing identification issues raised by endogenous program participation. Measurement and attrition issues have perhaps received less attention. This paper analyzes the dramatic impact of sample attrition in a large job search experiment. We take advantage of two independent surveys on the same initial sample of 8,000 persons. The first is a long telephone survey that had a strikingly low and unbalanced response rate of about 50%. The second combines administrative data with a short telephone survey targeted at those leaving the unemployment registers; this enriched data source has a balanced and much higher response rate (about 80%). Naive estimates that neglect non-response yield puzzlingly different results across the two sources. Using the enriched administrative data as a benchmark, we find evidence that estimates from the long telephone survey lack both external and internal validity. We turn to existing methods for bounding effects in the presence of sample selection and extend them to the context of randomization with imperfect compliance. The bounds obtained from the two surveys are compatible, but those from the long telephone survey are somewhat uninformative. We conclude by discussing the consequences for data collection strategies.
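The bounding exercise described in the abstract follows the logic of Lee (2009)-style trimming bounds: when random assignment affects who responds to the survey, the arm with the higher response rate is trimmed so that the two arms are compared on populations of the same size. The sketch below is a minimal illustration of that idea under the monotonicity assumption that assignment can only increase response; it does not reproduce the paper's extension to randomization with imperfect compliance, and the function name lee_bounds and its arguments are illustrative rather than taken from the paper.

    import numpy as np

    def lee_bounds(y, treat, respond):
        # Trimming bounds (Lee 2009) on the average treatment effect for
        # "always-responders" in a randomized design where the outcome y
        # is observed only when respond == 1.  Illustrative sketch only.
        y, treat, respond = map(np.asarray, (y, treat, respond))
        resp_t = respond[treat == 1].mean()
        resp_c = respond[treat == 0].mean()
        # The arm with the higher response rate is trimmed; this sketch
        # assumes it is the treatment arm (swap the arms otherwise).
        assert resp_t >= resp_c, "control responds more often: swap the arms"
        p = (resp_t - resp_c) / resp_t           # share of marginal respondents
        y_t = np.sort(y[(treat == 1) & (respond == 1)])
        y_c = y[(treat == 0) & (respond == 1)]
        k = int(np.floor(p * len(y_t)))          # observations to trim
        upper = y_t[k:].mean() - y_c.mean()      # drop the lowest p share
        lower = y_t[:len(y_t) - k].mean() - y_c.mean()  # drop the highest p share
        return lower, upper

The width of the resulting interval grows with the trimming share p, that is, with how unbalanced response is across arms; this is consistent with the abstract's observation that the bounds obtained from the low-response, unbalanced telephone survey are somewhat uninformative.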

Suggested Citation

  • Behaghel, Luc & Crépon, Bruno & Gurgand, Marc & Le Barbanchon, Thomas, 2009. "Sample Attrition Bias in Randomized Experiments: A Tale of Two Surveys," IZA Discussion Papers 4162, Institute of Labor Economics (IZA).
  • Handle: RePEc:iza:izadps:dp4162

    Download full text from publisher

    File URL: https://docs.iza.org/dp4162.pdf
    Download Restriction: no

    References listed on IDEAS

    1. David Card & Raj Chetty & Andrea Weber, 2007. "The Spike at Benefit Exhaustion: Leaving the Unemployment System or Starting a New Job?," American Economic Review, American Economic Association, vol. 97(2), pages 113-118, May.
    2. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    3. Ashenfelter, Orley & Ashmore, David & Deschenes, Olivier, 2005. "Do unemployment insurance recipients actively seek work? Evidence from randomized trials in four U.S. States," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 53-75.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Luc Behaghel & Bruno Crépon & Marc Gurgand, 2014. "Private and Public Provision of Counseling to Job Seekers: Evidence from a Large Controlled Experiment," American Economic Journal: Applied Economics, American Economic Association, vol. 6(4), pages 142-174, October.
    2. Dmitry Taubinsky & Alex Rees-Jones, 2018. "Attention Variation and Welfare: Theory and Evidence from a Tax Salience Experiment," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(4), pages 2462-2496.
    3. Glenn W. Harrison & Morten I. Lau & Hong Il Yoo, 2020. "Risk Attitudes, Sample Selection, and Attrition in a Longitudinal Field Experiment," The Review of Economics and Statistics, MIT Press, vol. 102(3), pages 552-568, July.
    4. Damon Jones & Aprajit Mahajan, 2015. "Time-Inconsistency and Saving: Experimental Evidence from Low-Income Tax Filers," NBER Working Papers 21272, National Bureau of Economic Research, Inc.
    5. Nadia Siddiqui & Vikki Boliver & Stephen Gorard, 2019. "Reliability of Longitudinal Social Surveys of Access to Higher Education: The Case of Next Steps in England," Social Inclusion, Cogitatio Press, vol. 7(1), pages 80-89.
    6. Markus Dertwinkel-Kalt & Katrin Köhler & Mirjam R. J. Lange & Tobias Wenzel, 2017. "Demand Shifts Due to Salience Effects: Experimental Evidence," Journal of the European Economic Association, European Economic Association, vol. 15(3), pages 626-653.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bart COCKX & Muriel DEJEMEPPE & Andrey LAUNOV & Bruno VAN DER LINDEN, 2011. "Monitoring, Sanctions and Front-Loading of Job Search in a Non-Stationary Model," LIDAM Discussion Papers IRES 2011042, Université catholique de Louvain, Institut de Recherches Economiques et Sociales (IRES).
    2. Bart Cockx & Muriel Dejemeppe, 2010. "The Threat of Monitoring Job Search. A Discontinuity Design," CESifo Working Paper Series 3267, CESifo.
    3. Petrongolo, Barbara, 2009. "The long-term effects of job search requirements: Evidence from the UK JSA reform," Journal of Public Economics, Elsevier, vol. 93(11-12), pages 1234-1253, December.
    4. Morescalchi Andrea & Paruolo Paolo, 2020. "Too Much Stick for the Carrot? Job Search Requirements and Search Behaviour of Unemployment Benefit Claimants," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 20(1), pages 1-21, January.
    5. Pathric Hägglund, 2014. "Experimental Evidence From Active Placement Efforts Among Unemployed in Sweden," Evaluation Review, , vol. 38(3), pages 191-216, June.
    6. María Laura Alzúa & Guillermo Cruces & Carolina Lopez, 2016. "Long-Run Effects Of Youth Training Programs: Experimental Evidence From Argentina," Economic Inquiry, Western Economic Association International, vol. 54(4), pages 1839-1859, October.
    7. Fevang, Elisabeth & Hardoy, Inés & Røed, Knut, 2013. "Getting Disabled Workers Back to Work: How Important Are Economic Incentives?," IZA Discussion Papers 7137, Institute of Labor Economics (IZA).
    8. Patrick Arni & Rafael Lalive & Jan C. Van Ours, 2013. "How Effective Are Unemployment Benefit Sanctions? Looking Beyond Unemployment Exit," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 28(7), pages 1153-1178, November.
    9. Cairo, Sofie & Mahlstedt, Robert, 2021. "Transparency of the Welfare System and Labor Market Outcomes of Unemployed Workers," IZA Discussion Papers 14940, Institute of Labor Economics (IZA).
    10. Peter Z. Schochet & Ronald D'Amico & Jillian Berk & Sarah Dolfin & Nathan Wozny, "undated". "Estimated Impacts for Participants in the Trade Adjustment Assistance (TAA) Program Under the 2002 Amendments," Mathematica Policy Research Reports 582d8723f6884d4eb7a3f95a4, Mathematica Policy Research.
    11. Atonu Rabbani, 2017. "Can Leaders Promote Better Health Behavior? Learning from a Sanitation and Hygiene Communication Experiment in Rural Bangladesh," Working Papers id:11904, eSocialSciences.
    12. Martin, Will, 2021. "Tools for measuring the full impacts of agricultural interventions," IFPRI-MCC technical papers 2, International Food Policy Research Institute (IFPRI).
    13. Bennmarker, Helge & Skans, Oskar Nordström & Vikman, Ulrika, 2013. "Workfare for the old and long-term unemployed," Labour Economics, Elsevier, vol. 25(C), pages 25-34.
    14. Albanese, Andrea & Picchio, Matteo & Ghirelli, Corinna, 2020. "Timed to Say Goodbye: Does Unemployment Benefit Eligibility Affect Worker Layoffs?," Labour Economics, Elsevier, vol. 65(C).
    15. Deshpande, Ashwini & Desrochers, Alain & Ksoll, Christopher & Shonchoy, Abu S., 2017. "The Impact of a Computer-based Adult Literacy Program on Literacy and Numeracy: Evidence from India," World Development, Elsevier, vol. 96(C), pages 451-473.
    16. Sule Alan & Gyongyi Loranth, 2013. "Subprime Consumer Credit Demand: Evidence from a Lender's Pricing Experiment," The Review of Financial Studies, Society for Financial Studies, vol. 26(9), pages 2353-2374.
    17. Dincecco, Mark & Katz, Gabriel, 2012. "State Capacity and Long-Run Performance," MPRA Paper 38299, University Library of Munich, Germany.
    18. Karnani, Mohit, 2016. "Freshmen teachers and college major choice: Evidence from a random assignment in Chile," MPRA Paper 76062, University Library of Munich, Germany.
    19. Laura Abramovsky & Orazio Attanasio & Kai Barron & Pedro Carneiro & George Stoye, 2016. "Challenges to Promoting Social Inclusion of the Extreme Poor: Evidence from a Large-Scale Experiment in Colombia," Economía Journal, The Latin American and Caribbean Economic Association - LACEA, vol. 0(Spring 20), pages 89-141, April.
    20. Sriroop Chaudhuri & Mimi Roy & Louis M. McDonald & Yves Emendack, 2021. "Reflections on farmers’ social networks: a means for sustainable agricultural development?," Environment, Development and Sustainability: A Multidisciplinary Approach to the Theory and Practice of Sustainable Development, Springer, vol. 23(3), pages 2973-3008, March.

    More about this item

    Keywords

    randomized evaluation; survey non response; bounds;

    JEL classification:

    • C31 - Mathematical and Quantitative Methods - - Multiple or Simultaneous Equation Models; Multiple Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models; Quantile Regressions; Social Interaction Models
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • J68 - Labor and Demographic Economics - - Mobility, Unemployment, Vacancies, and Immigrant Workers - - - Public Policy




    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.