
A powerful approach to the study of moderate effect modification in observational studies

Author

Listed:
  • Kwonsang Lee
  • Dylan S. Small
  • Paul R. Rosenbaum

Abstract

Effect modification means that the magnitude or stability of a treatment effect varies as a function of an observed covariate. Generally, larger and more stable treatment effects are insensitive to larger biases from unmeasured covariates, so a causal conclusion may be considerably firmer if this pattern is noted when it occurs. We propose a new strategy, called the submax method, that combines exploratory and confirmatory efforts to determine whether there is stronger evidence of causality (that is, greater insensitivity to unmeasured confounding) in some subgroups of individuals. It uses the joint distribution of test statistics that split the data in various ways based on certain observed covariates. For L binary covariates, the method splits the population L times into two subpopulations, perhaps first men and women, perhaps then smokers and nonsmokers, computing a test statistic from each subpopulation, and appends the test statistic for the whole population, making 2L+1 test statistics in total. Although L binary covariates define 2^L interaction groups, only 2L+1 tests are performed, and at least L+1 of these tests use at least half of the data. The submax method achieves the highest design sensitivity and the highest Bahadur efficiency of its component tests. Moreover, the form of the test is sufficiently tractable that its large-sample power may be studied analytically. Simulations suggest that the submax method outperforms an approach based on CART when effect modification is of moderate size. An observational study of the effects of physical activity on survival, using data from the NHANES I epidemiologic follow-up survey, illustrates the method. The method is implemented in the R package submax, which contains the NHANES example. An online Appendix provides simulation results and further analysis of the example.
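To make the construction concrete, the following is a minimal sketch in R of the submax idea as described in the abstract. It is written from that description, not from the submax package: the function name submax_sketch, the simple standardized mean statistic, and the Monte Carlo critical value are illustrative stand-ins for the signed rank statistics, sensitivity-analysis bounds, and analytic joint normal approximation used in the paper.

# Illustrative sketch only; NOT the API of the submax package.
# For L binary covariates, form the overall statistic plus one statistic for
# each half of each split (2L+1 in all), standardize them, and compare their
# maximum to a critical value from the joint normal approximation, which
# accounts for the correlation induced by overlapping groups.

library(mvtnorm)   # rmvnorm() for draws from the joint normal approximation

submax_sketch <- function(d, x, alpha = 0.05, nsim = 50000) {
  # d: matched-pair treated-minus-control differences (one per pair)
  # x: n x L matrix of binary (0/1) pair-level covariates
  n <- length(d)
  # group indicators: whole population, then both halves of each split
  groups <- c(list(rep(TRUE, n)),
              unlist(lapply(seq_len(ncol(x)), function(j)
                list(x[, j] == 0, x[, j] == 1)), recursive = FALSE))
  # standardized statistic in each group (a simple mean-based stand-in for
  # the signed rank statistics of the paper); larger z favors a positive effect
  z <- vapply(groups, function(g)
    sum(d[g]) / sqrt(sum(g) * var(d)), numeric(1))
  # correlation of the group sums implied by the overlapping group indicators
  G <- sapply(groups, as.numeric)                       # n x (2L+1)
  R <- crossprod(G) / sqrt(outer(colSums(G), colSums(G)))
  # Monte Carlo critical value for the maximum of the 2L+1 correlated statistics
  sims <- rmvnorm(nsim, sigma = R)
  crit <- unname(quantile(apply(sims, 1, max), 1 - alpha))
  list(statistics = z, max_statistic = max(z),
       critical_value = crit, reject = max(z) > crit)
}

# Example use with hypothetical data:
# set.seed(1)
# d <- rnorm(200, mean = 0.3)                           # pair differences
# x <- cbind(female = rbinom(200, 1, 0.5),
#            smoker = rbinom(200, 1, 0.4))
# submax_sketch(d, x)

In an actual analysis one would use the submax package itself and carry out the sensitivity analysis to unmeasured confounding developed in the paper; the sketch only shows how the 2L+1 correlated statistics and the critical value for their maximum fit together.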

Suggested Citation

  • Kwonsang Lee & Dylan S. Small & Paul R. Rosenbaum, 2018. "A powerful approach to the study of moderate effect modification in observational studies," Biometrics, The International Biometric Society, vol. 74(4), pages 1161-1170, December.
  • Handle: RePEc:bla:biomet:v:74:y:2018:i:4:p:1161-1170
    DOI: 10.1111/biom.12884

    Download full text from publisher

    File URL: https://doi.org/10.1111/biom.12884
    Download Restriction: no


    References listed on IDEAS

    1. Jesse Y. Hsu & José R. Zubizarreta & Dylan S. Small & Paul R. Rosenbaum, 2015. "Strong control of the familywise error rate in observational studies that discover effect modification by exploratory methods," Biometrika, Biometrika Trust, vol. 102(4), pages 767-782.
    2. Stefan Wager & Susan Athey, 2018. "Estimation and Inference of Heterogeneous Treatment Effects using Random Forests," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(523), pages 1228-1242, July.
    3. Peter B. Gilbert & Ronald J. Bosch & Michael G. Hudgens, 2003. "Sensitivity Analysis for the Assessment of Causal Vaccine Effects on Viral Load in HIV Vaccine Trials," Biometrics, The International Biometric Society, vol. 59(3), pages 531-541, September.
    4. Paul R. Rosenbaum, 2004. "Design sensitivity in observational studies," Biometrika, Biometrika Trust, vol. 91(1), pages 153-164, March.
    5. Jesse Y. Hsu & Dylan S. Small & Paul R. Rosenbaum, 2013. "Effect Modification and Design Sensitivity in Observational Studies," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 108(501), pages 135-148, March.
    6. Paul R. Rosenbaum, 2007. "Sensitivity Analysis for m-Estimates, Tests, and Confidence Intervals in Matched Observational Studies," Biometrics, The International Biometric Society, vol. 63(2), pages 456-464, June.
    7. Rosenbaum, Paul R. & Silber, Jeffrey H., 2009. "Sensitivity Analysis for Equivalence and Difference in an Observational Study of Neonatal Intensive Care Units," Journal of the American Statistical Association, American Statistical Association, vol. 104(486), pages 501-511.
    8. Paul R. Rosenbaum, 2015. "Bahadur Efficiency of Sensitivity Analyses in Observational Studies," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(509), pages 205-217, March.
    9. Goeman Jelle J. & Finos Livio, 2012. "The Inheritance Procedure: Multiple Testing of Tree-structured Hypotheses," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 11(1), pages 1-18, January.
    10. Joseph L. Gastwirth & Abba M. Krieger & Paul R. Rosenbaum, 2000. "Asymptotic separability in sensitivity analysis," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 62(3), pages 545-555.
    11. Colin B. Fogarty & Dylan S. Small, 2016. "Sensitivity Analysis for Multiple Comparisons in Matched Observational Studies Through Quadratically Constrained Linear Programming," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1820-1830, October.
    12. P. R. Rosenbaum, 2012. "Testing one hypothesis twice in observational studies," Biometrika, Biometrika Trust, vol. 99(4), pages 763-774.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Paul R. Rosenbaum, 2015. "Bahadur Efficiency of Sensitivity Analyses in Observational Studies," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(509), pages 205-217, March.
    2. Siyu Heng & Hyunseung Kang & Dylan S. Small & Colin B. Fogarty, 2021. "Increasing power for observational studies of aberrant response: An adaptive approach," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 482-504, July.
    3. Paul R. Rosenbaum, 2023. "A second evidence factor for a second control group," Biometrics, The International Biometric Society, vol. 79(4), pages 3968-3980, December.
    4. Paul R. Rosenbaum, 2013. "Impact of Multiple Matched Controls on Design Sensitivity in Observational Studies," Biometrics, The International Biometric Society, vol. 69(1), pages 118-127, March.
    5. Paul R. Rosenbaum, 2015. "Some Counterclaims Undermine Themselves in Observational Studies," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1389-1398, December.
    6. Siyu Heng & Dylan S. Small & Paul R. Rosenbaum, 2020. "Finding the strength in a weak instrument in a study of cognitive outcomes produced by Catholic high schools," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(3), pages 935-958, June.
    7. Paul R. Rosenbaum, 2023. "Sensitivity analyses informed by tests for bias in observational studies," Biometrics, The International Biometric Society, vol. 79(1), pages 475-487, March.
    8. Paul R. Rosenbaum & Dylan S. Small, 2017. "An adaptive Mantel–Haenszel test for sensitivity analysis in observational studies," Biometrics, The International Biometric Society, vol. 73(2), pages 422-430, June.
    9. Samuel D. Pimentel & Dylan S. Small & Paul R. Rosenbaum, 2016. "Constructed Second Control Groups and Attenuation of Unmeasured Biases," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(515), pages 1157-1167, July.
    10. Ruoqi Yu, 2021. "Evaluating and improving a matched comparison of antidepressants and bone density," Biometrics, The International Biometric Society, vol. 77(4), pages 1276-1288, December.
    11. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    12. William Arbour, 2021. "Can Recidivism be Prevented from Behind Bars? Evidence from a Behavioral Program," Working Papers tecipa-683, University of Toronto, Department of Economics.
    13. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    14. Dimitris Bertsimas & Agni Orfanoudaki & Rory B. Weiner, 2020. "Personalized treatment for coronary artery disease patients: a machine learning approach," Health Care Management Science, Springer, vol. 23(4), pages 482-506, December.
    15. Nicolaj N. Mühlbach, 2020. "Tree-based Synthetic Control Methods: Consequences of moving the US Embassy," CREATES Research Papers 2020-04, Department of Economics and Business Economics, Aarhus University.
    16. Kyle Colangelo & Ying-Ying Lee, 2019. "Double debiased machine learning nonparametric inference with continuous treatments," CeMMAP working papers CWP72/19, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    17. Shonosuke Sugasawa & Hisashi Noma, 2021. "Efficient screening of predictive biomarkers for individual treatment selection," Biometrics, The International Biometric Society, vol. 77(1), pages 249-257, March.
    18. Ruoxuan Xiong & Allison Koenecke & Michael Powell & Zhu Shen & Joshua T. Vogelstein & Susan Athey, 2021. "Federated Causal Inference in Heterogeneous Observational Data," Papers 2107.11732, arXiv.org, revised Apr 2023.
    19. Stephen Jarvis & Olivier Deschenes & Akshaya Jha, 2022. "The Private and External Costs of Germany’s Nuclear Phase-Out," Journal of the European Economic Association, European Economic Association, vol. 20(3), pages 1311-1346.
    20. Hayakawa, Kazunobu & Keola, Souknilanh & Silaphet, Korrakoun & Yamanouchi, Kenta, 2022. "Estimating the impacts of international bridges on foreign firm locations: a machine learning approach," IDE Discussion Papers 847, Institute of Developing Economies, Japan External Trade Organization(JETRO).

