IDEAS home Printed from https://ideas.repec.org/a/plo/pone00/0208795.html

Sensitivity analyses for effect modifiers not observed in the target population when generalizing treatment effects from a randomized controlled trial: Assumptions, models, effect scales, data scenarios, and implementation details

Author

Listed:
  • Trang Quynh Nguyen
  • Benjamin Ackerman
  • Ian Schmid
  • Stephen R Cole
  • Elizabeth A Stuart

Abstract

Background: Randomized controlled trials are often used to inform policy and practice for broad populations. The average treatment effect (ATE) for a target population, however, may differ from the ATE observed in a trial if there are effect modifiers whose distribution in the target population differs from that in the trial. Methods exist to use trial data to estimate the target population ATE, provided the distributions of treatment effect modifiers are observed in both the trial and target population—an assumption that may not hold in practice. Methods: The proposed sensitivity analyses address the situation where a treatment effect modifier is observed in the trial but not the target population. These methods are based on an outcome model or the combination of such a model and weighting adjustment for observed differences between the trial sample and target population. They accommodate several types of outcome models: linear models (including single time outcome and pre- and post-treatment outcomes) for additive effects, and models with log or logit link for multiplicative effects. We clarify the methods’ assumptions and provide detailed implementation instructions. Illustration: We illustrate the methods using an example generalizing the effects of an HIV treatment regimen from a randomized trial to a relevant target population. Conclusion: These methods allow researchers and decision-makers to have more appropriate confidence when drawing conclusions about target population effects.
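The core idea for the additive (linear) effect scale can be sketched as follows: subgroup ATEs are estimated within levels of the effect modifier in the trial, and the unknown distribution of that modifier in the target population is treated as a sensitivity parameter to be varied over plausible values. This is a minimal illustrative sketch, not the authors' implementation; the function name, the subgroup effect values, and the prevalence grid are all assumptions made up for illustration.

```python
# Hypothetical sketch of the additive-scale sensitivity analysis described in
# the abstract: the trial estimates subgroup ATEs within levels of a binary
# effect modifier Z, but Z's distribution in the target population is not
# observed, so it is varied as a sensitivity parameter.

def target_ate_additive(subgroup_ates, target_prevalence):
    """Combine trial subgroup ATEs using an assumed distribution of the
    effect modifier in the target population (additive effect scale)."""
    assert abs(sum(target_prevalence) - 1.0) < 1e-9, "prevalences must sum to 1"
    return sum(p * ate for p, ate in zip(target_prevalence, subgroup_ates))

# Illustrative trial-estimated ATEs within Z = 0 and Z = 1 (made-up numbers).
subgroup_ates = [2.0, 5.0]

# Sensitivity analysis: vary the assumed prevalence of Z = 1 in the target.
for p1 in (0.2, 0.5, 0.8):
    ate = target_ate_additive(subgroup_ates, [1.0 - p1, p1])
    print(f"Assumed Pr(Z=1) = {p1:.1f} -> target ATE = {ate:.2f}")
```

Reporting the target ATE across a range of assumed prevalences shows how sensitive the generalized conclusion is to the unobserved modifier's distribution; the paper extends this idea to weighted adjustment for observed covariates and to multiplicative (log and logit link) effect scales.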

Suggested Citation

  • Trang Quynh Nguyen & Benjamin Ackerman & Ian Schmid & Stephen R Cole & Elizabeth A Stuart, 2018. "Sensitivity analyses for effect modifiers not observed in the target population when generalizing treatment effects from a randomized controlled trial: Assumptions, models, effect scales, data scenarios, and implementation details," PLOS ONE, Public Library of Science, vol. 13(12), pages 1-17, December.
  • Handle: RePEc:plo:pone00:0208795
    DOI: 10.1371/journal.pone.0208795

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0208795
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0208795&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0208795?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Michael J. Weiss & Howard S. Bloom & Thomas Brock, 2014. "A Conceptual Framework For Studying The Sources Of Variation In Program Effects," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(3), pages 778-808, June.
    2. Westreich, D. & Edwards, J.K. & Rogawski, E.T. & Hudgens, M.G. & Stuart, E.A. & Cole, S.R., 2016. "Causal impact: Epidemiological approaches for a public health of consequence," American Journal of Public Health, American Public Health Association, vol. 106(6), pages 1011-1012.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Benjamin Lu & Eli Ben-Michael & Avi Feller & Luke Miratrix, 2023. "Is It Who You Are or Where You Are? Accounting for Compositional Differences in Cross-Site Treatment Effect Variation," Journal of Educational and Behavioral Statistics, , vol. 48(4), pages 420-453, August.
    2. Elizabeth Tipton & Robert B. Olsen, "undated". "Enhancing the Generalizability of Impact Studies in Education," Mathematica Policy Research Reports 35d5625333dc480aba9765b3b, Mathematica Policy Research.
    3. Xu Qin & Jonah Deutsch & Guanglei Hong, 2021. "Unpacking Complex Mediation Mechanisms And Their Heterogeneity Between Sites In A Job Corps Evaluation," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(1), pages 158-190, January.
    4. Ronald Herrera & Ursula Berger & Ondine S. Von Ehrenstein & Iván Díaz & Stella Huber & Daniel Moraga Muñoz & Katja Radon, 2017. "Estimating the Causal Impact of Proximity to Gold and Copper Mines on Respiratory Diseases in Chilean Children: An Application of Targeted Maximum Likelihood Estimation," IJERPH, MDPI, vol. 15(1), pages 1-15, December.
    5. Christopher Rhoads, 2017. "Coherent Power Analysis in Multilevel Studies Using Parameters From Surveys," Journal of Educational and Behavioral Statistics, , vol. 42(2), pages 166-194, April.
    6. Buhl-Wiggers, Julie & Kerwin, Jason & Muñoz-Morales, Juan S. & Smith, Jeffrey A. & Thornton, Rebecca L., 2020. "Some Children Left Behind: Variation in the Effects of an Educational Intervention," IZA Discussion Papers 13598, Institute of Labor Economics (IZA).
    7. Adam Gamoran, 2018. "Evidence-Based Policy in the Real World: A Cautionary View," The ANNALS of the American Academy of Political and Social Science, , vol. 678(1), pages 180-191, July.
    8. Sarah Dolfin & Nan Maxwell & Ankita Patnaik, "undated". "WHD Compliance Strategies: Directions for Future Research," Mathematica Policy Research Reports b7a5ca876e0b448f9b9c0850e, Mathematica Policy Research.
    9. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    10. Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
    11. Robert C. Granger, 2018. "The Roles Foundations Are Playing in the Evidence-Based Policy Movement," The ANNALS of the American Academy of Political and Social Science, , vol. 678(1), pages 145-154, July.
    12. Esterling, Kevin M. & Brady, David & Schwitzgebel, Eric, 2023. "The Necessity of Construct and External Validity for Generalized Causal Claims," I4R Discussion Paper Series 18, The Institute for Replication (I4R).
    13. Rafael Quintana, 2023. "Embracing complexity in social science research," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(1), pages 15-38, February.
    14. Robert Ammerman & Anne Duggan & John List & Lauren Supplee & Dana Suskind, 2021. "The role of open science practices in scaling evidence-based prevention programs," Natural Field Experiments 00741, The Field Experiments Website.
    15. Nianbo Dong & Benjamin M. Kelcey, 2020. "A Review of Causality in a Social World: Moderation, Mediation, and Spill-Over," Journal of Educational and Behavioral Statistics, , vol. 45(3), pages 374-378, June.
    16. Jeffrey Smith, 2022. "Treatment Effect Heterogeneity," Evaluation Review, , vol. 46(5), pages 652-677, October.
    17. Philip M. Gleason, "undated". "What's the Secret Ingredient? Searching for Policies and Practices that Make Charter Schools Successful," Mathematica Policy Research Reports eea6e24d9bf1409f92f60ae29, Mathematica Policy Research.
    18. Xu Qin & Guanglei Hong, 2017. "A Weighting Method for Assessing Between-Site Heterogeneity in Causal Mediation Mechanism," Journal of Educational and Behavioral Statistics, , vol. 42(3), pages 308-340, June.
    19. Jason Shumberger & Akheil Singla, 2022. "Are tax and expenditure limitations constraining institutions or institutionally irrelevant? Evidence from Minnesota," Public Budgeting & Finance, Wiley Blackwell, vol. 42(4), pages 3-33, December.
    20. Howard S. Bloom & Rebecca Unterman & Pei Zhu & Sean F. Reardon, 2020. "Lessons from New York City's Small Schools of Choice about High School Features that Promote Graduation for Disadvantaged Students," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(3), pages 740-771, June.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0208795. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.