
Covariate Balancing Sensitivity Analysis for Extrapolating Randomized Trials across Locations

Author

Listed:
  • Xinkun Nie
  • Guido Imbens
  • Stefan Wager

Abstract

The ability to generalize experimental results from randomized controlled trials (RCTs) across locations is crucial for informing policy decisions in targeted regions. Such generalization is often hindered by a lack of identifiability due to unmeasured effect modifiers, which compromise the direct transport of treatment effect estimates from one location to another. We build upon sensitivity analysis in observational studies and propose an optimization procedure that allows us to obtain bounds on the treatment effects in targeted regions. Furthermore, we construct more informative bounds by balancing on the moments of covariates. In simulation experiments, we show that the covariate balancing approach is promising for obtaining sharper identification intervals.
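The moment-balancing step mentioned in the abstract can be illustrated with a small sketch. The snippet below is not the authors' procedure (which additionally optimizes worst-case bounds over unmeasured effect modifiers to produce an identification interval); it only shows the balancing idea: reweight trial units so that their covariate moments match a target location, then form a reweighted treatment-effect estimate. All names (X_trial, X_target, T, Y) and the simulated data are hypothetical, used purely for illustration.

    # Minimal sketch, assuming trial covariates X_trial, treatment T, outcomes Y,
    # and target-location covariates X_target. Entropy-balancing-style weights
    # match trial covariate means to the target means; the paper's sensitivity
    # bounds over unmeasured effect modifiers are NOT implemented here.
    import numpy as np
    from scipy.optimize import minimize

    def balancing_weights(X_trial, X_target):
        """Weights on trial units whose weighted covariate means equal the
        target-location covariate means (dual of an entropy-balancing problem)."""
        target_means = X_target.mean(axis=0)
        Xc = X_trial - target_means          # center trial covariates at target moments

        def dual(lam):
            # log-sum-exp dual objective; its minimizer enforces the moment constraints
            return np.log(np.exp(Xc @ lam).sum())

        res = minimize(dual, np.zeros(X_trial.shape[1]), method="BFGS")
        w = np.exp(Xc @ res.x)
        return w / w.sum()

    def transported_ate(X_trial, T, Y, X_target):
        """Weighted difference in means, transporting the trial estimate to the
        target covariate distribution (ignores unmeasured effect modifiers)."""
        w = balancing_weights(X_trial, X_target)
        w1, w0 = w * T, w * (1 - T)
        return (w1 @ Y) / w1.sum() - (w0 @ Y) / w0.sum()

    # Toy usage on simulated data: the treatment effect is modified by an
    # observed covariate whose mean differs between the trial and the target site.
    rng = np.random.default_rng(0)
    X_trial = rng.normal(size=(500, 3))
    X_target = rng.normal(loc=0.5, size=(800, 3))        # shifted covariates at target
    T = rng.binomial(1, 0.5, size=500)
    Y = X_trial @ np.array([1.0, -0.5, 0.2]) + T * (1.0 + X_trial[:, 0]) + rng.normal(size=500)
    print(transported_ate(X_trial, T, Y, X_target))       # close to the target-site effect of about 1.5

Under the paper's setup, one would further optimize such a reweighted estimate over a class of tilts bounded by a sensitivity parameter, yielding an interval rather than a point estimate; the sketch stops at the balance-only, point-identified quantity.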

Suggested Citation

  • Xinkun Nie & Guido Imbens & Stefan Wager, 2021. "Covariate Balancing Sensitivity Analysis for Extrapolating Randomized Trials across Locations," Papers 2112.04723, arXiv.org.
  • Handle: RePEc:arx:papers:2112.04723

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2112.04723
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Paul R. Rosenbaum, 2011. "A New u-Statistic with Superior Design Sensitivity in Matched Observational Studies," Biometrics, The International Biometric Society, vol. 67(3), pages 1017-1027, September.
    2. Rajeev Dehejia & Cristian Pop-Eleches & Cyrus Samii, 2021. "From Local to Global: External Validity in a Fertility Natural Experiment," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(1), pages 217-243, January.
    3. Robert B. Olsen & Larry L. Orr & Stephen H. Bell & Elizabeth A. Stuart, 2013. "External Validity in Policy Evaluations That Choose Sites Purposively," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 32(1), pages 107-121, January.
    4. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.
    5. Paul R. Rosenbaum, 2014. "Weighted M-statistics With Superior Design Sensitivity in Matched Observational Studies With Multiple Controls," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(507), pages 1145-1158, September.
    6. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    7. Guido W. Imbens, 2003. "Sensitivity to Exogeneity Assumptions in Program Evaluation," American Economic Review, American Economic Association, vol. 93(2), pages 126-132, May.
    8. Tan, Zhiqiang, 2006. "A Distributional Approach for Causal Inference Using Propensity Scores," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1619-1637, December.
    9. Hainmueller, Jens, 2012. "Entropy Balancing for Causal Effects: A Multivariate Reweighting Method to Produce Balanced Samples in Observational Studies," Political Analysis, Cambridge University Press, vol. 20(1), pages 25-46, January.
    10. Jelena Bradic & Stefan Wager & Yinchu Zhu, 2019. "Sparsity Double Robust Inference of Average Treatment Effects," Papers 1905.00744, arXiv.org.
    11. Bryan S. Graham & Cristine Campos de Xavier Pinto & Daniel Egel, 2016. "Efficient Estimation of Data Combination Models by the Method of Auxiliary-to-Study Tilting (AST)," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 34(2), pages 288-301, April.
    12. Qingyuan Zhao & Dylan S. Small & Bhaswar B. Bhattacharya, 2019. "Sensitivity analysis for inverse probability weighting estimators via the percentile bootstrap," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 81(4), pages 735-761, September.
    13. Kosuke Imai & Marc Ratkovic, 2014. "Covariate balancing propensity score," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(1), pages 243-263, January.
    14. Rosenbaum, Paul R., 2010. "Design Sensitivity and Efficiency in Observational Studies," Journal of the American Statistical Association, American Statistical Association, vol. 105(490), pages 692-702.
    15. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    16. Orazio Attanasio & Adriana Kugler & Costas Meghir, 2011. "Subsidizing Vocational Training for Disadvantaged Youth in Colombia: Evidence from a Randomized Trial," American Economic Journal: Applied Economics, American Economic Association, vol. 3(3), pages 188-220, July.
    17. Elizabeth A. Stuart & Stephen R. Cole & Catherine P. Bradshaw & Philip J. Leaf, 2011. "The use of propensity scores to assess the generalizability of results from randomized trials," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 174(2), pages 369-386, April.
    18. Colin B. Fogarty & Dylan S. Small, 2016. "Sensitivity Analysis for Multiple Comparisons in Matched Observational Studies Through Quadratically Constrained Linear Programming," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1820-1830, October.
    19. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Lihua Lei & Roshni Sahoo & Stefan Wager, 2023. "Policy Learning under Biased Sample Selection," Papers 2304.11735, arXiv.org.
    2. Jacob Dorn & Kevin Guo & Nathan Kallus, 2021. "Doubly-Valid/Doubly-Sharp Sensitivity Analysis for Causal Inference with Unmeasured Confounding," Papers 2112.11449, arXiv.org, revised Jul 2022.
    3. Ashesh Rambachan & Amanda Coston & Edward Kennedy, 2022. "Robust Design and Evaluation of Predictive Algorithms under Unobserved Confounding," Papers 2212.09844, arXiv.org, revised Aug 2023.
    4. Bénédicte Colnet & Julie Josse & Gaël Varoquaux & Erwan Scornet, 2022. "Causal effect on a target population: A sensitivity analysis to handle missing covariates," Journal of Causal Inference, De Gruyter, vol. 10(1), pages 372-414, January.
    5. Benjamin Lu & Eli Ben-Michael & Avi Feller & Luke Miratrix, 2023. "Is It Who You Are or Where You Are? Accounting for Compositional Differences in Cross-Site Treatment Effect Variation," Journal of Educational and Behavioral Statistics, , vol. 48(4), pages 420-453, August.
    6. Apoorva Lal & Wenjing Zheng & Simon Ejdemyr, 2023. "A Framework for Generalization and Transportation of Causal Estimates Under Covariate Shift," Papers 2301.04776, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Andrews, Isaiah & Oster, Emily, 2019. "A simple approximation for evaluating external validity bias," Economics Letters, Elsevier, vol. 178(C), pages 58-62.
    2. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    3. Isaiah Andrews & Emily Oster, 2017. "A Simple Approximation for Evaluating External Validity Bias," NBER Working Papers 23826, National Bureau of Economic Research, Inc.
    4. Frederico Finan & Demian Pouzo, 2021. "Reinforcing RCTs with Multiple Priors while Learning about External Validity," Papers 2112.09170, arXiv.org, revised Mar 2023.
    5. Takuya Ishihara & Toru Kitagawa, 2021. "Evidence Aggregation for Treatment Choice," Papers 2108.06473, arXiv.org.
    6. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    7. Dasom Lee & Shu Yang & Lin Dong & Xiaofei Wang & Donglin Zeng & Jianwen Cai, 2023. "Improving trial generalizability using observational studies," Biometrics, The International Biometric Society, vol. 79(2), pages 1213-1225, June.
    8. Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
    9. Shixiao Zhang & Peisong Han & Changbao Wu, 2023. "Calibration Techniques Encompassing Survey Sampling, Missing Data Analysis and Causal Inference," International Statistical Review, International Statistical Institute, vol. 91(2), pages 165-192, August.
    10. Rajeev Dehejia & Cristian Pop-Eleches & Cyrus Samii, 2021. "From Local to Global: External Validity in a Fertility Natural Experiment," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(1), pages 217-243, January.
    11. Siyu Heng & Hyunseung Kang & Dylan S. Small & Colin B. Fogarty, 2021. "Increasing power for observational studies of aberrant response: An adaptive approach," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 482-504, July.
    12. Ashis Das & Jed Friedman & Eeshani Kandpal, 2018. "Does involvement of local NGOs enhance public service delivery? Cautionary evidence from a malaria‐prevention program in India," Health Economics, John Wiley & Sons, Ltd., vol. 27(1), pages 172-188, January.
    13. Daido Kido, 2022. "Distributionally Robust Policy Learning with Wasserstein Distance," Papers 2205.04637, arXiv.org, revised Aug 2022.
    14. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    15. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    16. Michael C. Knaus, 2021. "A double machine learning approach to estimate the effects of musical practice on student’s skills," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(1), pages 282-300, January.
    17. Benjamin Lu & Eli Ben-Michael & Avi Feller & Luke Miratrix, 2023. "Is It Who You Are or Where You Are? Accounting for Compositional Differences in Cross-Site Treatment Effect Variation," Journal of Educational and Behavioral Statistics, , vol. 48(4), pages 420-453, August.
    18. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    19. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    20. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.

