
Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials

Author

Listed:
  • Tarek Azzam
  • Michael Bates

    (Department of Economics, University of California Riverside)

  • David Fairris

Abstract

Voluntary selection into experimental samples is ubiquitous and may lead researchers to question the external validity of experimental findings. We introduce tests for sample selection on unobserved variables to discern the generalizability of randomized control trials. We estimate the impact of a learning community on first-year college retention using an RCT, and employ our tests in this setting. Intent-to-treat and local-average-treatment-effect estimates reveal no discernible programmatic effects. Our tests indicate that the experimental sample is positively selected on unobserved characteristics, suggesting limited external validity. Finally, we compare observational and experimental estimates, considering the internal and external validity of both approaches to reflect on within-study comparisons themselves.
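
For context on the two estimands named in the abstract: under random assignment, the intent-to-treat (ITT) and local-average-treatment-effect (LATE) parameters have standard definitions (see Imbens & Angrist 1994 in the references below). The notation here is a generic sketch rather than the paper's own, with Z the randomized invitation into the learning community, D actual participation, and Y first-year retention:

    \text{ITT} = E[Y \mid Z=1] - E[Y \mid Z=0],
    \qquad
    \text{LATE} = \frac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]}.

Under the usual instrument-validity and monotonicity assumptions, the LATE is the ITT rescaled by the difference in take-up rates and identifies the effect only for compliers within the experimental sample.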

Suggested Citation

  • Tarek Azzam & Michael Bates & David Fairris, 2020. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202022, University of California at Riverside, Department of Economics, revised Jul 2020.
  • Handle: RePEc:ucr:wpaper:202022

    Download full text from publisher

    File URL: https://economics.ucr.edu/repec/ucr/wpaper/202022.pdf
    File Function: First version, 2020
    Download Restriction: no

    File URL: https://economics.ucr.edu/repec/ucr/wpaper/202022R.pdf
    File Function: Revised version, 2020
    Download Restriction: no

    References listed on IDEAS

    1. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    2. Brian A. Jacob, 2004. "Public Housing, Housing Vouchers, and Student Achievement: Evidence from Public Housing Demolitions in Chicago," American Economic Review, American Economic Association, vol. 94(1), pages 233-258, March.
    3. Gourieroux, Christian & Monfort, Alain & Trognon, Alain, 1984. "Pseudo Maximum Likelihood Methods: Theory," Econometrica, Econometric Society, vol. 52(3), pages 681-700, May.
    4. Hausman, Jerry A & Wise, David A, 1979. "Attrition Bias in Experimental and Panel Data: The Gary Income Maintenance Experiment," Econometrica, Econometric Society, vol. 47(2), pages 455-473, March.
    5. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    6. James J. Heckman & Sergio Urzua & Edward Vytlacil, 2006. "Understanding Instrumental Variables in Models with Essential Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 88(3), pages 389-432, August.
    7. Eric Chyn, 2018. "Moved to Opportunity: The Long-Run Effects of Public Housing Demolition on Children," American Economic Review, American Economic Association, vol. 108(10), pages 3028-3056, October.
    8. Steven Glazerman & Daniel Mayer & Paul Decker, 2006. "Alternative routes to teaching: The impacts of Teach for America on student achievement and other outcomes," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(1), pages 75-96.
    9. Jeremy Lise & Shannon Seitz & Jeffrey Smith, 2015. "Evaluating search and matching models using experimental data," IZA Journal of Labor Economics, Springer; Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-35, December.
    10. Heckman, James J. & Urzúa, Sergio, 2010. "Comparing IV with structural models: What simple IV can and cannot identify," Journal of Econometrics, Elsevier, vol. 156(1), pages 27-37, May.
    11. Gourieroux, Christian & Monfort, Alain & Trognon, Alain, 1984. "Pseudo Maximum Likelihood Methods: Applications to Poisson Models," Econometrica, Econometric Society, vol. 52(3), pages 701-720, May.
    12. Flavio Cunha & James J. Heckman & Susanne M. Schennach, 2010. "Estimating the Technology of Cognitive and Noncognitive Skill Formation," Econometrica, Econometric Society, vol. 78(3), pages 883-931, May.
    13. repec:mpr:mprres:3694 is not listed on IDEAS
    14. Dean Karlan & Jonathan Zinman, 2009. "Observing Unobservables: Identifying Information Asymmetries With a Consumer Credit Field Experiment," Econometrica, Econometric Society, vol. 77(6), pages 1993-2008, November.
    15. James Heckman & Flavio Cunha, 2007. "The Technology of Skill Formation," American Economic Review, American Economic Association, vol. 97(2), pages 31-47, May.
    16. Peter M. Steiner & Yongnam Kim, 2016. "The Mechanics of Omitted Variable Bias: Bias Amplification and Cancellation of Offsetting Biases," Journal of Causal Inference, De Gruyter, vol. 4(2), pages 1, September.
    17. Huber, Martin, 2013. "A simple test for the ignorability of non-compliance in experiments," Economics Letters, Elsevier, vol. 120(3), pages 389-391.
    18. Bruce D. Meyer & Wallace K. C. Mok & James X. Sullivan, 2015. "Household Surveys in Crisis," Journal of Economic Perspectives, American Economic Association, vol. 29(4), pages 199-226, Fall.
    19. repec:mpr:mprres:4761 is not listed on IDEAS
    20. James J. Heckman & Edward Vytlacil, 2005. "Structural Equations, Treatment Effects, and Econometric Policy Evaluation," Econometrica, Econometric Society, vol. 73(3), pages 669-738, May.
    21. Christian N. Brinch & Magne Mogstad & Matthew Wiswall, 2017. "Beyond LATE with a Discrete Instrument," Journal of Political Economy, University of Chicago Press, vol. 125(4), pages 985-1039.
    22. Krueger, Alan B & Whitmore, Diane M, 2001. "The Effect of Attending a Small Class in the Early Grades on College-Test Taking and Middle School Test Results: Evidence from Project STAR," Economic Journal, Royal Economic Society, vol. 111(468), pages 1-28, January.
    23. Sebastian Calónico & Jeffrey Smith, 2017. "The Women of the National Supported Work Demonstration," Journal of Labor Economics, University of Chicago Press, vol. 35(S1), pages 65-97.
    24. Raj Chetty & John N. Friedman & Nathaniel Hilger & Emmanuel Saez & Diane Whitmore Schanzenbach & Danny Yagan, 2011. "How Does Your Kindergarten Classroom Affect Your Earnings? Evidence from Project Star," The Quarterly Journal of Economics, Oxford University Press, vol. 126(4), pages 1593-1660.
    25. Sarojini Hirshleifer & Dalia Ghanem & Karen Ortiz-Becerra, 2019. "Testing for Attrition Bias in Field Experiments," Working Papers 201919, University of California at Riverside, Department of Economics, revised Aug 2019.
    26. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    27. Sebastian Galiani & Patrick J. McEwan & Brian Quistorff, 2017. "External and Internal Validity of a Geographic Quasi-Experiment Embedded in a Cluster-Randomized Experiment," Advances in Econometrics, in: Matias D. Cattaneo & Juan Carlos Escanciano (ed.), Regression Discontinuity Designs, volume 38, pages 195-236, Emerald Publishing Ltd.
    28. Christopher R. Walters, 2018. "The Demand for Effective Charter Schools," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2179-2223.
    29. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, Oxford University Press, vol. 130(3), pages 1117-1165.
    30. Marinho Bertanha & Guido W. Imbens, 2020. "External Validity in Fuzzy Regression Discontinuity Designs," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 38(3), pages 593-612, July.
    31. Russell, Lauren, 2017. "Can learning communities boost success of women and minorities in STEM? Evidence from the Massachusetts Institute of Technology," Economics of Education Review, Elsevier, vol. 61(C), pages 98-111.
    32. Philip Oreopoulos & Daniel Lang & Joshua Angrist, 2009. "Incentives and Services for College Achievement: Evidence from a Randomized Trial," American Economic Journal: Applied Economics, American Economic Association, vol. 1(1), pages 136-163, January.
    33. Ernst Fehr & Lorenz Goette, 2007. "Do Workers Work More if Wages Are High? Evidence from a Randomized Field Experiment," American Economic Review, American Economic Association, vol. 97(1), pages 298-317, March.
    34. Raj Chetty & Nathaniel Hendren & Lawrence F. Katz, 2016. "The Effects of Exposure to Better Neighborhoods on Children: New Evidence from the Moving to Opportunity Experiment," American Economic Review, American Economic Association, vol. 106(4), pages 855-902, April.
    35. Julie Hotchkiss & Robert Moore & M. Melinda Pitts, 2006. "Freshman Learning Communities, College Performance, and Retention," Education Economics, Taylor & Francis Journals, vol. 14(2), pages 197-210.
    36. Paloyo, Alfredo R. & Rogan, Sally & Siminski, Peter, 2016. "The effect of supplemental instruction on academic performance: An encouragement design experiment," Economics of Education Review, Elsevier, vol. 55(C), pages 57-69.
    37. Onur Altindag & Theodore J. Joyce & Julie A. Reeder, 2015. "Effects of Peer Counseling to Support Breastfeeding: Assessing the External Validity of a Randomized Field Experiment," NBER Working Papers 21013, National Bureau of Economic Research, Inc.
    38. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    39. Katz, Lawrence & Duncan, Greg J. & Kling, Jeffrey R. & Kessler, Ronald C. & Ludwig, Jens & Sanbonmatsu, Lisa & Liebman, Jeffrey B., 2008. "What Can We Learn about Neighborhood Effects from the Moving to Opportunity Experiment?," Scholarly Articles 2766959, Harvard University Department of Economics.
    40. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    41. Dan A. Black & Joonhwi Joo & Robert LaLonde & Jeffrey Andrew Smith & Evan J. Taylor, 2017. "Simple Tests for Selection: Learning More from Instrumental Variables," CESifo Working Paper Series 6392, CESifo.
    42. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    43. David S. Lee, 2009. "Training, Wages, and Sample Selection: Estimating Sharp Bounds on Treatment Effects," Review of Economic Studies, Oxford University Press, vol. 76(3), pages 1071-1102.
    44. Ghanem, Dalia & Hirshleifer, Sarojini & Ortiz-Becerra, Karen, 2019. "Testing Attrition Bias in Field Experiments," 2019 Annual Meeting, July 21-23, Atlanta, Georgia 291215, Agricultural and Applied Economics Association.
    45. Imbens, Guido W. & Rubin, Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881, November.
    46. Wooldridge, Jeffrey M., 2014. "Quasi-maximum likelihood estimation and testing for nonlinear models with endogenous explanatory variables," Journal of Econometrics, Elsevier, vol. 182(1), pages 226-234.
    47. repec:hrv:faseco:30367426 is not listed on IDEAS
    48. Edward Vytlacil, 2002. "Independence, Monotonicity, and Latent Index Models: An Equivalence Result," Econometrica, Econometric Society, vol. 70(1), pages 331-341, January.
    49. Erin Hartman & Richard Grieve & Roland Ramsahai & Jasjeet S. Sekhon, 2015. "From sample average treatment effect to population average treatment effect on the treated: combining experimental with observational studies to estimate population treatment effects," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 178(3), pages 757-778, June.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Dan A. Black & Joonhwi Joo & Robert LaLonde & Jeffrey Andrew Smith & Evan J. Taylor, 2017. "Simple Tests for Selection: Learning More from Instrumental Variables," CESifo Working Paper Series 6392, CESifo.
    2. Sarojini Hirshleifer & Dalia Ghanem & Karen Ortiz-Becerra, 2019. "Testing for Attrition Bias in Field Experiments," Working Papers 201919, University of California at Riverside, Department of Economics, revised Aug 2019.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; an illustrative sketch of this co-citation heuristic appears after the list below.
    1. Possebom, Vitor, 2018. "Sharp bounds on the MTE with sample selection," MPRA Paper 89785, University Library of Munich, Germany.
    2. Huber, Martin & Wüthrich, Kaspar, 2017. "Evaluating local average and quantile treatment effects under endogeneity based on instruments: a review," FSES Working Papers 479, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    3. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    4. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    5. Huber, Martin, 2019. "An introduction to flexible methods for policy evaluation," FSES Working Papers 504, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    6. Dan A. Black & Joonhwi Joo & Robert LaLonde & Jeffrey Andrew Smith & Evan J. Taylor, 2017. "Simple Tests for Selection: Learning More from Instrumental Variables," CESifo Working Paper Series 6392, CESifo.
    7. Dionissi Aliprantis, 2011. "Assessing the evidence on neighborhood effects from moving to opportunity," Working Papers (Old Series) 1101, Federal Reserve Bank of Cleveland.
    8. Committee, Nobel Prize, 2021. "Answering causal questions using observational data," Nobel Prize in Economics documents 2021-2, Nobel Prize Committee.
    9. Dalla-Zuanna, Antonio & Liu, Kai, 2019. "Understanding Program Complementarities: Estimating the Dynamic Effects of a Training Program with Multiple Alternatives," IZA Discussion Papers 12839, Institute of Labor Economics (IZA).
    10. Sebastian Galiani & Juan Pantano, 2021. "Structural Models: Inception and Frontier," NBER Working Papers 28698, National Bureau of Economic Research, Inc.
    11. Joshua D. Angrist & Sarah R. Cohodes & Susan M. Dynarski & Parag A. Pathak & Christopher R. Walters, 2016. "Stand and Deliver: Effects of Boston's Charter High Schools on College Preparation, Entry, and Choice," Journal of Labor Economics, University of Chicago Press, vol. 34(2), pages 275-318.
    12. Martin Huber & Kaspar Wüthrich, 2019. "Local Average and Quantile Treatment Effects Under Endogeneity: A Review," Journal of Econometric Methods, De Gruyter, vol. 8(1), pages 1-27, January.
    13. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    14. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    15. Cornelissen, Thomas & Dustmann, Christian & Raute, Anna & Schönberg, Uta, 2016. "From LATE to MTE: Alternative methods for the evaluation of policy interventions," Labour Economics, Elsevier, vol. 41(C), pages 47-60.
    16. Denni Tommasi & Arthur Lewbel & Rossella Calvi, 2017. "LATE with Mismeasured or Misspecified Treatment: An application to Women's Empowerment in India," Working Papers ECARES ECARES 2017-27, ULB -- Universite Libre de Bruxelles.
    17. Heckman, James J. & Humphries, John Eric & Veramendi, Gregory, 2016. "Dynamic treatment effects," Journal of Econometrics, Elsevier, vol. 191(2), pages 276-292.
    18. Magne Mogstad & Andres Santos & Alexander Torgovitsky, 2018. "Using Instrumental Variables for Inference About Policy Relevant Treatment Parameters," Econometrica, Econometric Society, vol. 86(5), pages 1589-1619, September.
    19. Blaise Melly & Kaspar Wüthrich, 2016. "Local quantile treatment effects," Diskussionsschriften dp1605, Universitaet Bern, Departement Volkswirtschaft.
    20. Amanda E. Kowalski, 2018. "Behavior within a Clinical Trial and Implications for Mammography Guidelines," NBER Working Papers 25049, National Bureau of Economic Research, Inc.
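
    As a minimal illustration of the co-citation heuristic described above, the sketch below scores candidate items by how many references and how many citing works they share with this paper. The handles, names, and scoring rule are illustrative assumptions; this is not the ranking code actually used by RePEc or CitEc.

        # Illustrative sketch only: an item's relatedness score is the number of
        # shared references plus the number of shared citing works.
        # All handles below are placeholders, not real RePEc identifiers.
        def relatedness(this_refs, this_citers, other_refs, other_citers):
            return (len(set(this_refs) & set(other_refs))
                    + len(set(this_citers) & set(other_citers)))

        this_refs = {"handle-ref-A", "handle-ref-B", "handle-ref-C"}
        this_citers = {"handle-citer-X"}
        candidate = {"refs": {"handle-ref-A", "handle-ref-B"},
                     "citers": {"handle-citer-X"}}

        print(relatedness(this_refs, this_citers,
                          candidate["refs"], candidate["citers"]))  # prints 3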

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucr:wpaper:202022. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Kelvin Mac. General contact details of provider: https://edirc.repec.org/data/deucrus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.