
Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials

Authors

Listed:
  • Tarek Azzam
  • Michael Bates (Department of Economics, University of California Riverside)
  • David Fairris

Abstract

Voluntary selection into experimental samples is ubiquitous and may lead researchers to question the external validity of experimental findings. We introduce tests for sample selection on unobserved variables to discern the generalizability of randomized control trials. We estimate the impact of a learning community on first-year college retention using an RCT, and employ our tests in this setting. Intent-to-treat and local-average-treatment-effect estimates reveal no discernible programmatic effects. Our tests reveal that the experimental sample is positively selected on unobserved characteristics, suggesting limited external validity. Finally, we compare observational and experimental estimates, considering the internal and external validity of both approaches to reflect on within-study comparisons themselves.
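
For readers who want the estimands concretely, the following is a minimal sketch (not the authors' code) of how intent-to-treat (ITT) and local-average-treatment-effect (LATE) estimates are typically computed from an RCT with imperfect compliance, as in this paper's setting. The column names (assigned, participated, retained), the take-up rate, and the simulated data are hypothetical illustrations, not values from the study.

```python
# Minimal illustrative sketch: ITT and Wald/LATE estimates from an RCT with
# imperfect compliance. Variable names and parameters are hypothetical.
import numpy as np
import pandas as pd

def itt_and_late(df):
    """ITT: mean outcome difference by random assignment.
    LATE: ITT rescaled by the first-stage difference in take-up (Wald estimator)."""
    z1 = df[df["assigned"] == 1]
    z0 = df[df["assigned"] == 0]
    itt = z1["retained"].mean() - z0["retained"].mean()
    first_stage = z1["participated"].mean() - z0["participated"].mean()
    late = itt / first_stage  # valid under the usual IV/LATE assumptions
    return itt, late

# Simulated data in which the program has no true effect, mirroring the
# paper's null ITT/LATE findings only qualitatively.
rng = np.random.default_rng(0)
n = 2000
assigned = rng.integers(0, 2, n)                        # random assignment
participated = (assigned == 1) & (rng.random(n) < 0.6)  # imperfect take-up
retained = (rng.random(n) < 0.85).astype(int)           # outcome unrelated to treatment
df = pd.DataFrame({"assigned": assigned,
                   "participated": participated.astype(int),
                   "retained": retained})
print(itt_and_late(df))
```

With full compliance the ITT and LATE coincide; with partial take-up the Wald ratio rescales the ITT by the difference in participation rates between assignment arms, which is numerically equivalent to two-stage least squares with assignment as the instrument.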

Suggested Citation

  • Tarek Azzam & Michael Bates & David Fairris, 2020. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202022, University of California at Riverside, Department of Economics, revised Jul 2020.
  • Handle: RePEc:ucr:wpaper:202022

    Download full text from publisher

    File URL: https://economics.ucr.edu/repec/ucr/wpaper/202022.pdf
    File Function: First version, 2020
    Download Restriction: no

    File URL: https://economics.ucr.edu/repec/ucr/wpaper/202022R.pdf
    File Function: Revised version, 2020
    Download Restriction: no


    References listed on IDEAS

    1. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    2. Brian A. Jacob, 2004. "Public Housing, Housing Vouchers, and Student Achievement: Evidence from Public Housing Demolitions in Chicago," American Economic Review, American Economic Association, vol. 94(1), pages 233-258, March.
    3. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    4. Steven Glazerman & Daniel Mayer & Paul Decker, 2006. "Alternative routes to teaching: The impacts of Teach for America on student achievement and other outcomes," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(1), pages 75-96.
    5. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    6. Dalia Ghanem & Sarojini Hirshleifer & Karen Ortiz-Becerra, 2019. "Testing Attrition Bias in Field Experiments," Working Papers 202218, University of California at Riverside, Department of Economics, revised Oct 2022.
    7. James Heckman & Flavio Cunha, 2007. "The Technology of Skill Formation," American Economic Review, American Economic Association, vol. 97(2), pages 31-47, May.
    8. Amanda Kowalski, 2016. "Doing more when you're running LATE: Applying marginal treatment effect methods to examine treatment effect heterogeneity in experiments," Artefactual Field Experiments 00560, The Field Experiments Website.
    9. Bruce D. Meyer & Wallace K. C. Mok & James X. Sullivan, 2015. "Household Surveys in Crisis," Journal of Economic Perspectives, American Economic Association, vol. 29(4), pages 199-226, Fall.
    11. James J. Heckman & Edward Vytlacil, 2005. "Structural Equations, Treatment Effects, and Econometric Policy Evaluation," Econometrica, Econometric Society, vol. 73(3), pages 669-738, May.
    12. Steiner Peter M. & Kim Yongnam, 2016. "The Mechanics of Omitted Variable Bias: Bias Amplification and Cancellation of Offsetting Biases," Journal of Causal Inference, De Gruyter, vol. 4(2), pages 1-22, September.
    13. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, , vol. 589(1), pages 63-93, September.
    14. Raj Chetty & John N. Friedman & Nathaniel Hilger & Emmanuel Saez & Diane Whitmore Schanzenbach & Danny Yagan, 2011. "How Does Your Kindergarten Classroom Affect Your Earnings? Evidence from Project Star," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 126(4), pages 1593-1660.
    15. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    16. Christian N. Brinch & Magne Mogstad & Matthew Wiswall, 2017. "Beyond LATE with a Discrete Instrument," Journal of Political Economy, University of Chicago Press, vol. 125(4), pages 985-1039.
    17. Marinho Bertanha & Guido W. Imbens, 2020. "External Validity in Fuzzy Regression Discontinuity Designs," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 38(3), pages 593-612, July.
    18. Black, Dan A. & Joo, Joonhwi & LaLonde, Robert & Smith, Jeffrey A. & Taylor, Evan J., 2022. "Simple Tests for Selection: Learning More from Instrumental Variables," Labour Economics, Elsevier, vol. 79(C).
    19. Philip Oreopoulos & Daniel Lang & Joshua Angrist, 2009. "Incentives and Services for College Achievement: Evidence from a Randomized Trial," American Economic Journal: Applied Economics, American Economic Association, vol. 1(1), pages 136-163, January.
    20. Ernst Fehr & Lorenz Goette, 2007. "Do Workers Work More if Wages Are High? Evidence from a Randomized Field Experiment," American Economic Review, American Economic Association, vol. 97(1), pages 298-317, March.
    21. Amy Finkelstein & Sarah Taubman & Bill Wright & Mira Bernstein & Jonathan Gruber & Joseph P. Newhouse & Heidi Allen & Katherine Baicker, 2012. "The Oregon Health Insurance Experiment: Evidence from the First Year," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(3), pages 1057-1106.
    22. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    23. Heckman, James J. & Urzúa, Sergio, 2010. "Comparing IV with structural models: What simple IV can and cannot identify," Journal of Econometrics, Elsevier, vol. 156(1), pages 27-37, May.
    24. Huber, Martin, 2013. "A simple test for the ignorability of non-compliance in experiments," Economics Letters, Elsevier, vol. 120(3), pages 389-391.
    25. Paloyo, Alfredo R. & Rogan, Sally & Siminski, Peter, 2016. "The effect of supplemental instruction on academic performance: An encouragement design experiment," Economics of Education Review, Elsevier, vol. 55(C), pages 57-69.
    26. Flavio Cunha & James J. Heckman & Susanne M. Schennach, 2010. "Estimating the Technology of Cognitive and Noncognitive Skill Formation," Econometrica, Econometric Society, vol. 78(3), pages 883-931, May.
    27. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    28. Katz, Lawrence & Duncan, Greg J. & Kling, Jeffrey R. & Kessler, Ronald C. & Ludwig, Jens & Sanbonmatsu, Lisa & Liebman, Jeffrey B., 2008. "What Can We Learn about Neighborhood Effects from the Moving to Opportunity Experiment?," Scholarly Articles 2766959, Harvard University Department of Economics.
    29. Jeremy Lise & Shannon Seitz & Jeffrey Smith, 2015. "Evaluating search and matching models using experimental data," IZA Journal of Labor Economics, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-35, December.
    30. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    31. Dalia Ghanem & Sarojini Hirshleifer & Karen Ortiz-Becerra, 2019. "Testing for Attrition Bias in Field Experiments," Working Papers 202010, University of California at Riverside, Department of Economics, revised Mar 2020.
    32. James J. Heckman & Sergio Urzua & Edward Vytlacil, 2006. "Understanding Instrumental Variables in Models with Essential Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 88(3), pages 389-432, August.
    33. Sebastian Galiani & Patrick J. McEwan & Brian Quistorff, 2017. "External and Internal Validity of a Geographic Quasi-Experiment Embedded in a Cluster-Randomized Experiment," Advances in Econometrics, in: Regression Discontinuity Designs, volume 38, pages 195-236, Emerald Group Publishing Limited.
    34. Dean Karlan & Jonathan Zinman, 2009. "Observing Unobservables: Identifying Information Asymmetries With a Consumer Credit Field Experiment," Econometrica, Econometric Society, vol. 77(6), pages 1993-2008, November.
    35. David S. Lee, 2009. "Training, Wages, and Sample Selection: Estimating Sharp Bounds on Treatment Effects," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 76(3), pages 1071-1102.
    36. Krueger, Alan B & Whitmore, Diane M, 2001. "The Effect of Attending a Small Class in the Early Grades on College-Test Taking and Middle School Test Results: Evidence from Project STAR," Economic Journal, Royal Economic Society, vol. 111(468), pages 1-28, January.
    37. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    38. Erin Hartman & Richard Grieve & Roland Ramsahai & Jasjeet S. Sekhon, 2015. "From sample average treatment effect to population average treatment effect on the treated: combining experimental with observational studies to estimate population treatment effects," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 178(3), pages 757-778, June.
    39. Gourieroux, Christian & Monfort, Alain & Trognon, Alain, 1984. "Pseudo Maximum Likelihood Methods: Theory," Econometrica, Econometric Society, vol. 52(3), pages 681-700, May.
    40. Hausman, Jerry A & Wise, David A, 1979. "Attrition Bias in Experimental and Panel Data: The Gary Income Maintenance Experiment," Econometrica, Econometric Society, vol. 47(2), pages 455-473, March.
    41. Eric Chyn, 2018. "Moved to Opportunity: The Long-Run Effects of Public Housing Demolition on Children," American Economic Review, American Economic Association, vol. 108(10), pages 3028-3056, October.
    43. Gourieroux, Christian & Monfort, Alain & Trognon, Alain, 1984. "Pseudo Maximum Likelihood Methods: Applications to Poisson Models," Econometrica, Econometric Society, vol. 52(3), pages 701-720, May.
    46. Sebastian Calónico & Jeffrey Smith, 2017. "The Women of the National Supported Work Demonstration," Journal of Labor Economics, University of Chicago Press, vol. 35(S1), pages 65-97.
    47. Christopher R. Walters, 2018. "The Demand for Effective Charter Schools," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2179-2223.
    48. Russell, Lauren, 2017. "Can learning communities boost success of women and minorities in STEM? Evidence from the Massachusetts Institute of Technology," Economics of Education Review, Elsevier, vol. 61(C), pages 98-111.
    50. Raj Chetty & Nathaniel Hendren & Lawrence F. Katz, 2016. "The Effects of Exposure to Better Neighborhoods on Children: New Evidence from the Moving to Opportunity Experiment," American Economic Review, American Economic Association, vol. 106(4), pages 855-902, April.
    51. Julie Hotchkiss & Robert Moore & M. Melinda Pitts, 2006. "Freshman Learning Communities, College Performance, and Retention," Education Economics, Taylor & Francis Journals, vol. 14(2), pages 197-210.
    52. Onur Altindag & Theodore J. Joyce & Julie A. Reeder, 2015. "Effects of Peer Counseling to Support Breastfeeding: Assessing the External Validity of a Randomized Field Experiment," NBER Working Papers 21013, National Bureau of Economic Research, Inc.
    53. Amanda E. Kowalski, 2018. "How to Examine External Validity Within an Experiment," NBER Working Papers 24834, National Bureau of Economic Research, Inc.
    54. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    55. Imbens,Guido W. & Rubin,Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881, September.
    56. Wooldridge, Jeffrey M., 2014. "Quasi-maximum likelihood estimation and testing for nonlinear models with endogenous explanatory variables," Journal of Econometrics, Elsevier, vol. 182(1), pages 226-234.
    57. Edward Vytlacil, 2002. "Independence, Monotonicity, and Latent Index Models: An Equivalence Result," Econometrica, Econometric Society, vol. 70(1), pages 331-341, January.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Dalia Ghanem & Sarojini Hirshleifer & Karen Ortiz-Becerra, 2019. "Testing Attrition Bias in Field Experiments," Working Papers 202218, University of California at Riverside, Department of Economics, revised Oct 2022.
    2. Black, Dan A. & Joo, Joonhwi & LaLonde, Robert & Smith, Jeffrey A. & Taylor, Evan J., 2022. "Simple Tests for Selection: Learning More from Instrumental Variables," Labour Economics, Elsevier, vol. 79(C).
    3. Dalia Ghanem & Sarojini Hirshleifer & Karen Ortiz-Becerra, 2019. "Testing for Attrition Bias in Field Experiments," Working Papers 202010, University of California at Riverside, Department of Economics, revised Mar 2020.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Azzam, Tarek & Bates, Michael D. & Fairris, David, 2022. "Do learning communities increase first year college retention? Evidence from a randomized control trial," Economics of Education Review, Elsevier, vol. 89(C).
    2. Black, Dan A. & Joo, Joonhwi & LaLonde, Robert & Smith, Jeffrey A. & Taylor, Evan J., 2022. "Simple Tests for Selection: Learning More from Instrumental Variables," Labour Economics, Elsevier, vol. 79(C).
    3. Possebom, Vitor, 2018. "Sharp bounds on the MTE with sample selection," MPRA Paper 89785, University Library of Munich, Germany.
    4. Patrick Kline & Christopher R. Walters, 2019. "On Heckits, LATE, and Numerical Equivalence," Econometrica, Econometric Society, vol. 87(2), pages 677-696, March.
    5. Bartalotti, Otávio & Kédagni, Désiré & Possebom, Vitor, 2023. "Identifying marginal treatment effects in the presence of sample selection," Journal of Econometrics, Elsevier, vol. 234(2), pages 565-584.
    6. Huber, Martin & Wüthrich, Kaspar, 2017. "Evaluating local average and quantile treatment effects under endogeneity based on instruments: a review," FSES Working Papers 479, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    7. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    8. Peter Hull & Michal Kolesár & Christopher Walters, 2022. "Labor by design: contributions of David Card, Joshua Angrist, and Guido Imbens," Scandinavian Journal of Economics, Wiley Blackwell, vol. 124(3), pages 603-645, July.
    9. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    10. Akabayashi, Hideo & Ruberg, Tim & Shikishima, Chizuru & Yamashita, Jun, 2023. "Education-oriented and care-oriented preschools: Implications on child development," Labour Economics, Elsevier, vol. 84(C).
    11. Amanda E Kowalski, 2023. "Behaviour within a Clinical Trial and Implications for Mammography Guidelines," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 90(1), pages 432-462.
    12. Dionissi Aliprantis, 2017. "Assessing the evidence on neighborhood effects from Moving to Opportunity," Empirical Economics, Springer, vol. 52(3), pages 925-954, May.
    13. Huber Martin & Wüthrich Kaspar, 2019. "Local Average and Quantile Treatment Effects Under Endogeneity: A Review," Journal of Econometric Methods, De Gruyter, vol. 8(1), pages 1-27, January.
    14. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    15. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    16. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    17. Cornelissen, Thomas & Dustmann, Christian & Raute, Anna & Schönberg, Uta, 2016. "From LATE to MTE: Alternative methods for the evaluation of policy interventions," Labour Economics, Elsevier, vol. 41(C), pages 47-60.
    18. Magne Mogstad & Andres Santos & Alexander Torgovitsky, 2018. "Using Instrumental Variables for Inference About Policy Relevant Treatment Parameters," Econometrica, Econometric Society, vol. 86(5), pages 1589-1619, September.
    19. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 131(4), pages 1795-1848.
    20. Sebastian Galiani & Juan Pantano, 2021. "Structural Models: Inception and Frontier," NBER Working Papers 28698, National Bureau of Economic Research, Inc.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucr:wpaper:202022. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Kelvin Mac (email available below). General contact details of provider: https://edirc.repec.org/data/deucrus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.