Printed from https://ideas.repec.org/a/sae/jedbes/v48y2023i4p420-453.html

Is It Who You Are or Where You Are? Accounting for Compositional Differences in Cross-Site Treatment Effect Variation

Author

Listed:
  • Benjamin Lu

    (University of California, Berkeley)

  • Eli Ben-Michael

    (Carnegie Mellon University)

  • Avi Feller

    (University of California, Berkeley)

  • Luke Miratrix

    (Harvard University)

Abstract

In multisite trials, learning about treatment effect variation across sites is critical for understanding where and for whom a program works. Unadjusted comparisons, however, capture "compositional" differences in the distributions of unit-level features as well as "contextual" differences in site-level features, including possible differences in program implementation. Our goal in this article is to adjust site-level estimates for differences in the distribution of observed unit-level features: If we can reweight (or "transport") each site to have a common distribution of observed unit-level covariates, the remaining treatment effect variation captures contextual and unobserved compositional differences across sites. This allows us to make apples-to-apples comparisons across sites, parceling out the amount of cross-site effect variation explained by systematic differences in populations served. In this article, we develop a framework for transporting effects using approximate balancing weights, where the weights are chosen to directly optimize unit-level covariate balance between each site and the common target distribution. We first develop our approach for the general setting of transporting the effect of a single-site trial. We then extend our method to multisite trials, assess its performance via simulation, and use it to analyze a series of multisite trials of adult education and vocational training programs. In our application, we find that distributional differences are potentially masking cross-site variation. Our method is available in the balancer R package.
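The abstract describes choosing weights so that each site's unit-level covariate distribution matches a common target. As an illustrative sketch only — not the balancer package's actual implementation — the related entropy-balancing idea (Hainmueller, 2012, cited below) solves for weights of the form w_i ∝ exp(x_i · λ) whose weighted covariate means exactly match the target means. The toy data and the plain gradient-descent solver here are assumptions for illustration:

```python
import math

def entropy_balance(X, target, steps=2000, lr=0.5):
    """Entropy-balancing sketch: find weights w_i proportional to
    exp(x_i . lam) so the weighted mean of each covariate equals
    `target`, by gradient descent on the balancing dual.
    X: list of covariate vectors (one per unit); target: list of means.
    """
    d = len(target)
    lam = [0.0] * d
    w = [1.0 / len(X)] * len(X)
    for _ in range(steps):
        # weights proportional to exp(x_i . lam), normalized stably
        logits = [sum(l * x for l, x in zip(lam, xi)) for xi in X]
        m = max(logits)
        exps = [math.exp(v - m) for v in logits]
        Z = sum(exps)
        w = [e / Z for e in exps]
        # dual gradient: weighted covariate mean minus the target mean
        grad = [sum(wi * xi[j] for wi, xi in zip(w, X)) - target[j]
                for j in range(d)]
        lam = [l - lr * g for l, g in zip(lam, grad)]
    return w

# Hypothetical site with one binary covariate: site mean 0.25,
# reweighted to a common target mean of 0.5.
X = [[0.0], [0.0], [0.0], [1.0]]
w = entropy_balance(X, [0.5])
```

After reweighting, the site's weighted covariate mean matches the target, so remaining cross-site effect differences are no longer attributable to this observed covariate. The paper's approximate balancing weights relax the exact-balance constraint shown here.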

Suggested Citation

  • Benjamin Lu & Eli Ben-Michael & Avi Feller & Luke Miratrix, 2023. "Is It Who You Are or Where You Are? Accounting for Compositional Differences in Cross-Site Treatment Effect Variation," Journal of Educational and Behavioral Statistics, vol. 48(4), pages 420-453, August.
  • Handle: RePEc:sae:jedbes:v:48:y:2023:i:4:p:420-453
    DOI: 10.3102/10769986231155427

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.3102/10769986231155427
    Download Restriction: no

    File URL: https://libkey.io/10.3102/10769986231155427?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    2. Hainmueller, Jens, 2012. "Entropy Balancing for Causal Effects: A Multivariate Reweighting Method to Produce Balanced Samples in Observational Studies," Political Analysis, Cambridge University Press, vol. 20(1), pages 25-46, January.
    3. Eli Ben-Michael & Avi Feller & Jesse Rothstein, 2021. "The Augmented Synthetic Control Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 116(536), pages 1789-1803, October.
    4. Carlos A. Flores & Oscar A. Mitnik, 2013. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1691-1707, December.
    5. Shu Yang & Peng Ding, 2018. "Asymptotic inference of causal effects with observational studies trimmed by the estimated propensity scores," Biometrika, Biometrika Trust, vol. 105(2), pages 487-493.
    6. Naoki Egami & Erin Hartman, 2021. "Covariate selection for generalizing experimental results: Application to a large‐scale development program in Uganda," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(4), pages 1524-1548, October.
    7. Christopher R. Walters, 2015. "Inputs in the Production of Early Childhood Human Capital: Evidence from Head Start," American Economic Journal: Applied Economics, American Economic Association, vol. 7(4), pages 76-102, October.
    8. Guido W. Imbens, 2015. "Matching Methods in Practice: Three Examples," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 373-419.
    9. Jesse Rothstein, 2010. "Teacher Quality in Educational Production: Tracking, Decay, and Student Achievement," The Quarterly Journal of Economics, Oxford University Press, vol. 125(1), pages 175-214.
    10. Kara E. Rudolph & Mark J. van der Laan, 2017. "Robust estimation of encouragement design intervention effects transported across sites," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(5), pages 1509-1525, November.
    11. José R. Zubizarreta, 2015. "Stable Weights that Balance Covariates for Estimation With Incomplete Outcome Data," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(511), pages 910-922, September.
    12. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    13. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    14. Michael J. Weiss & Howard S. Bloom & Thomas Brock, 2014. "A Conceptual Framework For Studying The Sources Of Variation In Program Effects," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(3), pages 778-808, June.
    15. Xinwei Ma & Jingshen Wang, 2020. "Robust Inference Using Inverse Probability Weighting," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(532), pages 1851-1860, December.
    16. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    17. Howard S. Bloom & Rebecca Unterman & Pei Zhu & Sean F. Reardon, 2020. "Lessons from New York City's Small Schools of Choice about High School Features that Promote Graduation for Disadvantaged Students," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(3), pages 740-771, June.
    18. D’Amour, Alexander & Ding, Peng & Feller, Avi & Lei, Lihua & Sekhon, Jasjeet, 2021. "Overlap in observational studies with high-dimensional covariates," Journal of Econometrics, Elsevier, vol. 221(2), pages 644-654.
    19. Djebbari, Habiba & Smith, Jeffrey, 2008. "Heterogeneous impacts in PROGRESA," Journal of Econometrics, Elsevier, vol. 145(1-2), pages 64-80, July.
    20. Dehejia, Rajeev H, 2003. "Was There a Riverside Miracle? A Hierarchical Framework for Evaluating Programs with Grouped Data," Journal of Business & Economic Statistics, American Statistical Association, vol. 21(1), pages 1-11, January.
    21. Howard S. Bloom & Carolyn J. Hill & James A. Riccio, 2003. "Linking program implementation and effectiveness: Lessons from a pooled sample of welfare-to-work experiments," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 22(4), pages 551-575.
    22. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    23. Heller, Ruth & Rosenbaum, Paul R. & Small, Dylan S., 2009. "Split Samples and Design Sensitivity in Observational Studies," Journal of the American Statistical Association, American Statistical Association, vol. 104(487), pages 1090-1101.
    24. Peng Ding & Avi Feller & Luke Miratrix, 2019. "Decomposing Treatment Effect Variation," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(525), pages 304-317, January.
    25. Issa J. Dahabreh & Sarah E. Robertson & Eric J. Tchetgen & Elizabeth A. Stuart & Miguel A. Hernán, 2019. "Generalizing causal inferences from individuals in randomized trials to all trial‐eligible individuals," Biometrics, The International Biometric Society, vol. 75(2), pages 685-694, June.
    26. Xinkun Nie & Guido Imbens & Stefan Wager, 2021. "Covariate Balancing Sensitivity Analysis for Extrapolating Randomized Trials across Locations," Papers 2112.04723, arXiv.org.
    27. King, Gary & Zeng, Langche, 2006. "The Dangers of Extreme Counterfactuals," Political Analysis, Cambridge University Press, vol. 14(2), pages 131-159, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ganesh Karapakula, 2023. "Stable Probability Weighting: Large-Sample and Finite-Sample Estimation and Inference Methods for Heterogeneous Causal Effects of Multivalued Treatments Under Limited Overlap," Papers 2301.05703, arXiv.org, revised Jan 2023.
    2. Dasom Lee & Shu Yang & Lin Dong & Xiaofei Wang & Donglin Zeng & Jianwen Cai, 2023. "Improving trial generalizability using observational studies," Biometrics, The International Biometric Society, vol. 79(2), pages 1213-1225, June.
    3. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    4. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    5. Huber, Martin, 2019. "An introduction to flexible methods for policy evaluation," FSES Working Papers 504, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    6. Brett R. Gordon & Florian Zettelmeyer & Neha Bhargava & Dan Chapsky, 2019. "A Comparison of Approaches to Advertising Measurement: Evidence from Big Field Experiments at Facebook," Marketing Science, INFORMS, vol. 38(2), pages 193-225, March.
    7. Jason J. Sauppe & Sheldon H. Jacobson, 2017. "The role of covariate balance in observational studies," Naval Research Logistics (NRL), John Wiley & Sons, vol. 64(4), pages 323-344, June.
    8. Ashis Das & Jed Friedman & Eeshani Kandpal, 2018. "Does involvement of local NGOs enhance public service delivery? Cautionary evidence from a malaria‐prevention program in India," Health Economics, John Wiley & Sons, Ltd., vol. 27(1), pages 172-188, January.
    9. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    10. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    11. Dennis Shen & Peng Ding & Jasjeet Sekhon & Bin Yu, 2022. "Same Root Different Leaves: Time Series and Cross-Sectional Methods in Panel Data," Papers 2207.14481, arXiv.org, revised Oct 2022.
    12. Michael C. Knaus, 2021. "A double machine learning approach to estimate the effects of musical practice on student’s skills," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(1), pages 282-300, January.
    13. Dana Rotz & Brian Goesling & Molly Crofton & Jennifer Manlove & Kate Welti, "undated". "Final Impacts of Teen PEP (Teen Prevention Education Program) in New Jersey and North Carolina High Schools," Mathematica Policy Research Reports 40fc2bb74d874e59a8e424638, Mathematica Policy Research.
    14. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute of Labor Economics (IZA).
    15. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    16. Guido W. Imbens, 2022. "Causality in Econometrics: Choice vs Chance," Econometrica, Econometric Society, vol. 90(6), pages 2541-2566, November.
    17. Adesina, Adedoyin & Akogun, Oladele & Dillon, Andrew & Friedman, Jed & Njobdi, Sani & Serneels, Pieter, 2017. "Robustness and External Validity: What do we Learn from Repeated Study Designs over Time?," 2018 Allied Social Sciences Association (ASSA) Annual Meeting, January 5-7, 2018, Philadelphia, Pennsylvania 266292, Agricultural and Applied Economics Association.
    18. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    19. Dehejia Rajeev, 2015. "Experimental and Non-Experimental Methods in Development Economics: A Porous Dialectic," Journal of Globalization and Development, De Gruyter, vol. 6(1), pages 47-69, June.
    20. Rahul Singh & Liyuan Xu & Arthur Gretton, 2020. "Kernel Methods for Causal Functions: Dose, Heterogeneous, and Incremental Response Curves," Papers 2010.04855, arXiv.org, revised Oct 2022.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:jedbes:v:48:y:2023:i:4:p:420-453. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.