Printed from https://ideas.repec.org/p/ucr/wpaper/202022.html

Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials

Author

Listed:
  • Tarek Azzam
  • Michael Bates (Department of Economics, University of California Riverside)
  • David Fairris
Abstract

Voluntary selection into experimental samples is ubiquitous and may lead researchers to question the external validity of experimental findings. We introduce tests for sample selection on unobserved variables to discern the generalizability of randomized control trials. We estimate the impact of a learning community on first-year college retention using an RCT, and employ our tests in this setting. Intent-to-treat and local-average-treatment-effect estimates reveal no discernible programmatic effects. Our tests reveal that the experimental sample is positively selected on unobserved characteristics, suggesting limited external validity. Finally, we compare observational and experimental estimates, considering the internal and external validity of both approaches to reflect on within-study comparisons themselves.
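The intent-to-treat and local-average-treatment-effect estimators named in the abstract can be illustrated with a minimal sketch on simulated data. Everything below (sample size, compliance rate, the 0.02 effect, all variable names) is hypothetical and not drawn from the paper; the LATE is computed as the standard Wald ratio of the ITT to the first stage, which with a single binary instrument is numerically identical to two-stage least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulated RCT with one-sided noncompliance: students are randomly
# assigned (z) an invitation to a learning community, but only some of
# those assigned actually participate (d).
z = rng.integers(0, 2, n)                        # random assignment
compliance = rng.random(n) < 0.6                 # latent willingness to enroll
d = z * compliance                               # realized treatment take-up

# Retention outcome: 0.80 baseline rate plus a hypothetical 0.02
# treatment effect for participants.
y = (rng.random(n) < 0.80 + 0.02 * d).astype(float)

# Intent-to-treat: difference in mean retention by random assignment.
itt = y[z == 1].mean() - y[z == 0].mean()

# First stage: effect of assignment on take-up (compliance rate here).
first_stage = d[z == 1].mean() - d[z == 0].mean()

# LATE via the Wald estimator: ITT scaled by the first stage.
late = itt / first_stage

print(f"ITT  = {itt:.4f}")
print(f"LATE = {late:.4f}")
```

Because noncompliance dilutes the assignment effect, the ITT is roughly the compliance rate times the LATE; the Wald ratio undoes that dilution for compliers. Sample selection on unobservables, the paper's focus, would enter through who volunteers into the experimental sample in the first place and is not modeled in this sketch.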

Suggested Citation

  • Tarek Azzam & Michael Bates & David Fairris, 2020. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202022, University of California at Riverside, Department of Economics, revised Jul 2020.
  • Handle: RePEc:ucr:wpaper:202022

    Download full text from publisher

    File URL: https://economics.ucr.edu/repec/ucr/wpaper/202022.pdf
    File Function: First version, 2020
    Download Restriction: no

    File URL: https://economics.ucr.edu/repec/ucr/wpaper/202022R.pdf
    File Function: Revised version, 2020
    Download Restriction: no


    References listed on IDEAS

    1. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    2. Brian A. Jacob, 2004. "Public Housing, Housing Vouchers, and Student Achievement: Evidence from Public Housing Demolitions in Chicago," American Economic Review, American Economic Association, vol. 94(1), pages 233-258, March.
    3. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    4. Steven Glazerman & Daniel Mayer & Paul Decker, 2006. "Alternative routes to teaching: The impacts of Teach for America on student achievement and other outcomes," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(1), pages 75-96.
    5. James Heckman & Flavio Cunha, 2007. "The Technology of Skill Formation," American Economic Review, American Economic Association, vol. 97(2), pages 31-47, May.
    6. Bruce D. Meyer & Wallace K. C. Mok & James X. Sullivan, 2015. "Household Surveys in Crisis," Journal of Economic Perspectives, American Economic Association, vol. 29(4), pages 199-226, Fall.
    7. repec:mpr:mprres:4761 is not listed on IDEAS
    8. James J. Heckman & Edward Vytlacil, 2005. "Structural Equations, Treatment Effects, and Econometric Policy Evaluation," Econometrica, Econometric Society, vol. 73(3), pages 669-738, May.
    9. Raj Chetty & John N. Friedman & Nathaniel Hilger & Emmanuel Saez & Diane Whitmore Schanzenbach & Danny Yagan, 2011. "How Does Your Kindergarten Classroom Affect Your Earnings? Evidence from Project Star," The Quarterly Journal of Economics, Oxford University Press, vol. 126(4), pages 1593-1660.
    10. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, Oxford University Press, vol. 130(3), pages 1117-1165.
    11. Christian N. Brinch & Magne Mogstad & Matthew Wiswall, 2017. "Beyond LATE with a Discrete Instrument," Journal of Political Economy, University of Chicago Press, vol. 125(4), pages 985-1039.
    12. Marinho Bertanha & Guido W. Imbens, 2020. "External Validity in Fuzzy Regression Discontinuity Designs," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 38(3), pages 593-612, July.
    13. Philip Oreopoulos & Daniel Lang & Joshua Angrist, 2009. "Incentives and Services for College Achievement: Evidence from a Randomized Trial," American Economic Journal: Applied Economics, American Economic Association, vol. 1(1), pages 136-163, January.
    14. Ernst Fehr & Lorenz Goette, 2007. "Do Workers Work More if Wages Are High? Evidence from a Randomized Field Experiment," American Economic Review, American Economic Association, vol. 97(1), pages 298-317, March.
    15. Heckman, James J. & Urzúa, Sergio, 2010. "Comparing IV with structural models: What simple IV can and cannot identify," Journal of Econometrics, Elsevier, vol. 156(1), pages 27-37, May.
    16. Huber, Martin, 2013. "A simple test for the ignorability of non-compliance in experiments," Economics Letters, Elsevier, vol. 120(3), pages 389-391.
    17. Paloyo, Alfredo R. & Rogan, Sally & Siminski, Peter, 2016. "The effect of supplemental instruction on academic performance: An encouragement design experiment," Economics of Education Review, Elsevier, vol. 55(C), pages 57-69.
    18. Flavio Cunha & James J. Heckman & Susanne M. Schennach, 2010. "Estimating the Technology of Cognitive and Noncognitive Skill Formation," Econometrica, Econometric Society, vol. 78(3), pages 883-931, May.
    19. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    20. Katz, Lawrence & Duncan, Greg J. & Kling, Jeffrey R. & Kessler, Ronald C. & Ludwig, Jens & Sanbonmatsu, Lisa & Liebman, Jeffrey B., 2008. "What Can We Learn about Neighborhood Effects from the Moving to Opportunity Experiment?," Scholarly Articles 2766959, Harvard University Department of Economics.
    21. Jeremy Lise & Shannon Seitz & Jeffrey Smith, 2015. "Evaluating search and matching models using experimental data," IZA Journal of Labor Economics, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-35, December.
    22. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    23. Dalia Ghanem & Sarojini Hirshleifer & Karen Ortiz-Becerra, 2019. "Testing for Attrition Bias in Field Experiments," Working Papers 202010, University of California at Riverside, Department of Economics, revised Mar 2020.
    24. James J. Heckman & Sergio Urzua & Edward Vytlacil, 2006. "Understanding Instrumental Variables in Models with Essential Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 88(3), pages 389-432, August.
    25. Dan A. Black & Joonhwi Joo & Robert LaLonde & Jeffrey Andrew Smith & Evan J. Taylor, 2017. "Simple Tests for Selection: Learning More from Instrumental Variables," CESifo Working Paper Series 6392, CESifo.
26. Sebastian Galiani & Patrick J. McEwan & Brian Quistorff, 2017. "External and Internal Validity of a Geographic Quasi-Experiment Embedded in a Cluster-Randomized Experiment," Advances in Econometrics, in: Matias D. Cattaneo & Juan Carlos Escanciano (ed.), Regression Discontinuity Designs, volume 38, pages 195-236, Emerald Publishing Ltd.
    27. Dean Karlan & Jonathan Zinman, 2009. "Observing Unobservables: Identifying Information Asymmetries With a Consumer Credit Field Experiment," Econometrica, Econometric Society, vol. 77(6), pages 1993-2008, November.
    28. David S. Lee, 2009. "Training, Wages, and Sample Selection: Estimating Sharp Bounds on Treatment Effects," Review of Economic Studies, Oxford University Press, vol. 76(3), pages 1071-1102.
    29. Krueger, Alan B & Whitmore, Diane M, 2001. "The Effect of Attending a Small Class in the Early Grades on College-Test Taking and Middle School Test Results: Evidence from Project STAR," Economic Journal, Royal Economic Society, vol. 111(468), pages 1-28, January.
    30. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    31. Erin Hartman & Richard Grieve & Roland Ramsahai & Jasjeet S. Sekhon, 2015. "From sample average treatment effect to population average treatment effect on the treated: combining experimental with observational studies to estimate population treatment effects," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 178(3), pages 757-778, June.
    32. Gourieroux, Christian & Monfort, Alain & Trognon, Alain, 1984. "Pseudo Maximum Likelihood Methods: Theory," Econometrica, Econometric Society, vol. 52(3), pages 681-700, May.
    33. Hausman, Jerry A & Wise, David A, 1979. "Attrition Bias in Experimental and Panel Data: The Gary Income Maintenance Experiment," Econometrica, Econometric Society, vol. 47(2), pages 455-473, March.
    34. Eric Chyn, 2018. "Moved to Opportunity: The Long-Run Effects of Public Housing Demolition on Children," American Economic Review, American Economic Association, vol. 108(10), pages 3028-3056, October.
    35. Gourieroux, Christian & Monfort, Alain & Trognon, Alain, 1984. "Pseudo Maximum Likelihood Methods: Applications to Poisson Models," Econometrica, Econometric Society, vol. 52(3), pages 701-720, May.
    36. repec:mpr:mprres:3694 is not listed on IDEAS
    37. Steiner Peter M. & Kim Yongnam, 2016. "The Mechanics of Omitted Variable Bias: Bias Amplification and Cancellation of Offsetting Biases," Journal of Causal Inference, De Gruyter, vol. 4(2), pages 1, September.
    38. Sebastian Calónico & Jeffrey Smith, 2017. "The Women of the National Supported Work Demonstration," Journal of Labor Economics, University of Chicago Press, vol. 35(S1), pages 65-97.
    39. Russell, Lauren, 2017. "Can learning communities boost success of women and minorities in STEM? Evidence from the Massachusetts Institute of Technology," Economics of Education Review, Elsevier, vol. 61(C), pages 98-111.
    40. Raj Chetty & Nathaniel Hendren & Lawrence F. Katz, 2016. "The Effects of Exposure to Better Neighborhoods on Children: New Evidence from the Moving to Opportunity Experiment," American Economic Review, American Economic Association, vol. 106(4), pages 855-902, April.
    41. Julie Hotchkiss & Robert Moore & M. Melinda Pitts, 2006. "Freshman Learning Communities, College Performance, and Retention," Education Economics, Taylor & Francis Journals, vol. 14(2), pages 197-210.
    42. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    43. Ghanem, Dalia & Hirshleifer, Sarojini & Ortiz-Becerra, Karen, 2019. "Testing Attrition Bias in Field Experiments," 2019 Annual Meeting, July 21-23, Atlanta, Georgia 291215, Agricultural and Applied Economics Association.
    44. Imbens,Guido W. & Rubin,Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881, December.
    45. Christopher R. Walters, 2018. "The Demand for Effective Charter Schools," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2179-2223.
    46. Wooldridge, Jeffrey M., 2014. "Quasi-maximum likelihood estimation and testing for nonlinear models with endogenous explanatory variables," Journal of Econometrics, Elsevier, vol. 182(1), pages 226-234.
    47. repec:hrv:faseco:30367426 is not listed on IDEAS
    48. Edward Vytlacil, 2002. "Independence, Monotonicity, and Latent Index Models: An Equivalence Result," Econometrica, Econometric Society, vol. 70(1), pages 331-341, January.

Citations

Citations are extracted by the CitEc Project.

Cited by:

    1. Dan A. Black & Joonhwi Joo & Robert LaLonde & Jeffrey Andrew Smith & Evan J. Taylor, 2017. "Simple Tests for Selection: Learning More from Instrumental Variables," CESifo Working Paper Series 6392, CESifo.
    2. Sarojini Hirshleifer & Dalia Ghanem & Karen Ortiz-Becerra, 2019. "Testing for Attrition Bias in Field Experiments," Working Papers 201919, University of California at Riverside, Department of Economics, revised Aug 2019.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucr:wpaper:202022. See general information about how to correct material in RePEc.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Kelvin Mac. General contact details of provider: http://edirc.repec.org/data/deucrus.html.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.


IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.