
An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials

Author

Listed:
  • Andrew P. Jaciw
  • Li Lin
  • Boya Ma

Abstract

Background: Prior research has investigated design parameters for assessing average program impacts on achievement outcomes with cluster randomized trials (CRTs). Less is known about the parameters that matter for assessing differential impacts.

Objectives: This article develops a statistical framework for designing CRTs to assess differences in impact among student subgroups and presents initial estimates of the critical parameters.

Research design: Effect sizes and minimum detectable effect sizes for average and differential impacts are calculated before and after conditioning on the effects of covariates, using results from several CRTs. The relative sensitivity for detecting average and differential impacts is also examined.

Subjects: Student outcomes from six CRTs are analyzed.

Measures: Achievement in math, science, reading, and writing.

Results: The ratio of the between-cluster variance in the slope of the moderator to the total variance, the “moderator gap variance ratio,” is important for designing studies to detect differences in impact between student subgroups. This quantity is the analogue of the intraclass correlation coefficient (ICC). Typical values were .02 for gender and .04 for socioeconomic status. In many of the studies considered, estimates of differential impact were larger than estimates of average impact, and after conditioning on the effects of covariates, similar power was achieved for detecting average and differential impacts of the same size.

Conclusions: Measuring differential impacts is important for addressing questions of equity and generalizability and for guiding the interpretation of subgroup impact findings. Adequate power for doing so is, in some cases, achievable with CRTs designed to measure average impacts. Continued collection of parameters for assessing differential impacts is the next step.
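To make the key quantity concrete: the moderator gap variance ratio is the between-cluster variance of the moderator's slope (for example, the school-to-school variation in the male-female achievement gap) divided by the total outcome variance, in the same way that the ICC is the between-cluster share of total variance. The sketch below shows where such a ratio would enter a minimum detectable effect size (MDES) calculation. It uses the standard two-level CRT formula for average impacts (see, e.g., the Schochet report listed under References) and then, purely to illustrate the ICC analogy drawn in the abstract, substitutes the gap variance ratio for the ICC. The design values (J = 40 schools, n = 60 students per school, ICC = .20) are hypothetical placeholders, and the differential-impact function is an assumed analogue, not the authors' exact expression.

```python
from math import sqrt
from scipy.stats import t as t_dist


def mdes_average(J, n, icc, power=0.80, alpha=0.05, P=0.5,
                 r2_between=0.0, r2_within=0.0):
    """MDES (in standard deviation units) for the average impact in a
    two-level CRT: J clusters of n students, proportion P treated, and
    covariates explaining r2_between / r2_within of each variance
    component. Standard formula from the CRT power literature."""
    df = J - 2  # degrees of freedom for a cluster-level treatment contrast
    multiplier = t_dist.ppf(1 - alpha / 2, df) + t_dist.ppf(power, df)
    variance = (icc * (1 - r2_between) / (P * (1 - P) * J)
                + (1 - icc) * (1 - r2_within) / (P * (1 - P) * J * n))
    return multiplier * sqrt(variance)


def mdes_differential(J, n, gap_variance_ratio, **kwargs):
    """Illustrative analogue for a differential impact: per the abstract,
    the moderator gap variance ratio plays the role that the ICC plays
    for average impacts, so this sketch simply substitutes one for the
    other. It ignores the extra within-cluster noise of estimating a
    subgroup gap inside each cluster, so it is optimistic; it is meant
    only to show where the parameter enters, not to reproduce the
    article's exact power results."""
    return mdes_average(J, n, icc=gap_variance_ratio, **kwargs)


# Hypothetical design; .02 and .04 are the typical gap variance ratios
# the abstract reports for gender and socioeconomic status.
print(f"average impact MDES:          {mdes_average(40, 60, icc=0.20):.3f}")
print(f"gender-gap differential MDES: {mdes_differential(40, 60, 0.02):.3f}")
print(f"SES-gap differential MDES:    {mdes_differential(40, 60, 0.04):.3f}")
```

Because the gap variance ratios reported in the abstract (.02, .04) are well below typical achievement ICCs, this simplified substitution yields smaller MDES values for differential impacts; the abstract's more careful conclusion is that power comparable to that for average impacts is achievable in some, though not all, cases.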

Suggested Citation

  • Andrew P. Jaciw & Li Lin & Boya Ma, 2016. "An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials," Evaluation Review, vol. 40(5), pages 410-443, October.
  • Handle: RePEc:sae:evarev:v:40:y:2016:i:5:p:410-443
    DOI: 10.1177/0193841X16659600

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X16659600
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X16659600?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    References listed on IDEAS

    1. Gelman, Andrew & Stern, Hal, 2006. "The Difference Between “Significant” and “Not Significant” is not Itself Statistically Significant," The American Statistician, American Statistical Association, vol. 60, pages 328-331, November.
    2. Peter Z. Schochet, "undated". "Statistical Power for Random Assignment Evaluations of Education Programs," Mathematica Policy Research Reports 6749d31ad72d4acf988f7dce5, Mathematica Policy Research.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. World Bank, 2017. "Pre-Primary Education in Mongolia," World Bank Publications - Reports 26402, The World Bank Group.
    2. Howley, P. & Moro, M. & Waqas, M. & Delaney, L. & Heron, T., 2018. "Immigration and self-reported well-being in the UK," Health, Econometrics and Data Group (HEDG) Working Papers 18/12, HEDG, c/o Department of Economics, University of York.
    3. Diana M. Hechavarría & Steven A. Brieger, 2022. "Practice rather than preach: cultural practices and female social entrepreneurship," Small Business Economics, Springer, vol. 58(2), pages 1131-1151, February.
    4. Thomas Neise & Franziska Sohns & Moritz Breul & Javier Revilla Diez, 2022. "The effect of natural disasters on FDI attraction: a sector-based analysis over time and space," Natural Hazards: Journal of the International Society for the Prevention and Mitigation of Natural Hazards, Springer;International Society for the Prevention and Mitigation of Natural Hazards, vol. 110(2), pages 999-1023, January.
    5. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick et al., 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    6. Asongu, Simplice & Odhiambo, Nicholas, 2020. "The role of governance in quality education in sub-Saharan Africa," MPRA Paper 107497, University Library of Munich, Germany.
    7. Elizabeth Tipton & Robert B. Olsen, "undated". "Enhancing the Generalizability of Impact Studies in Education," Mathematica Policy Research Reports 35d5625333dc480aba9765b3b, Mathematica Policy Research.
    8. Ruomeng Cui & Jun Li & Dennis J. Zhang, 2020. "Reducing Discrimination with Reviews in the Sharing Economy: Evidence from Field Experiments on Airbnb," Management Science, INFORMS, vol. 66(3), pages 1071-1094, March.
    9. Lukas Haffert, 2019. "War mobilization or war destruction? The unequal rise of progressive taxation revisited," The Review of International Organizations, Springer, vol. 14(1), pages 59-82, March.
    10. Michael A. Allen & Michael E. Flynn & Julie VanDusky-Allen, 2017. "Regions of Hierarchy and Security: US Troop Deployments, Spatial Relations, and Defense Burdens," International Interactions, Taylor & Francis Journals, vol. 43(3), pages 397-423, May.
    11. Duxbury, Scott W, 2019. "Mediation and Moderation in Statistical Network Models," SocArXiv 9bs4u, Center for Open Science.
    12. Robin Jacob & Marie-Andree Somers & Pei Zhu & Howard Bloom, 2016. "The Validity of the Comparative Interrupted Time Series Design for Evaluating the Effect of School-Level Interventions," Evaluation Review, vol. 40(3), pages 167-198, June.
    13. Christopher Rhoads, 2017. "Coherent Power Analysis in Multilevel Studies Using Parameters From Surveys," Journal of Educational and Behavioral Statistics, vol. 42(2), pages 166-194, April.
    14. Rebecca A. Maynard, 2006. "Presidential address: Evidence-based decision making: What will it take for the decision makers to care?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(2), pages 249-265.
    15. Roberta Rutigliano & Gøsta Esping-Andersen, 2018. "Partnership Choice and Childbearing in Norway and Spain," European Journal of Population, Springer;European Association for Population Studies, vol. 34(3), pages 367-386, August.
    16. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    17. Deborah Peikes & Stacy Dale & Eric Lundquist & Janice Genevro & David Meyers, 2011. "Building the Evidence Base for the Medical Home: What Sample and Sample Size Do Studies Need?," Mathematica Policy Research Reports 5814eb8219b24982af7f7536c, Mathematica Policy Research.
    18. David Spiegelhalter, 2017. "Trust in numbers," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(4), pages 948-965, October.
    19. Peter Howley & Muhammad Waqas & Mirko Moro & Liam Delaney & Tony Heron, 2020. "It’s Not All about the Economy Stupid! Immigration and Subjective Well-Being in England," Work, Employment & Society, British Sociological Association, vol. 34(5), pages 919-936, October.
    20. John Deke, 2016. "Design and Analysis Considerations for Cluster Randomized Controlled Trials That Have a Small Number of Clusters," Evaluation Review, vol. 40(5), pages 444-486, October.
