OR Forum---A POMDP Approach to Personalize Mammography Screening Decisions

Author

Listed:
  • Turgay Ayer

    (H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332)

  • Oguzhan Alagoz

    (Department of Industrial and Systems Engineering, University of Wisconsin-Madison, Madison, Wisconsin 53706)

  • Natasha K. Stout

    (Department of Population Medicine, Harvard Medical School/Harvard Pilgrim Health Care Institute, Boston, Massachusetts 02115)

Abstract

Breast cancer is the most common nonskin cancer and the second leading cause of cancer death in U.S. women. Although mammography is the most effective modality for breast cancer screening, it has several potential risks, including high false-positive rates. Therefore, the balance of benefits and risks, which depend on personal characteristics, is critical in designing a mammography screening schedule. In contrast to prior research and existing guidelines that consider population-based screening recommendations, we propose a personalized mammography screening policy based on the prior screening history and personal risk characteristics of women. We formulate a finite-horizon, partially observable Markov decision process (POMDP) model for this problem. Our POMDP model incorporates two methods of detection (self or screen), age-specific unobservable disease progression, and age-specific mammography test characteristics. We solve this POMDP optimally after setting transition probabilities to values estimated from a validated microsimulation model. Additional published data is used to specify other model inputs such as sensitivity and specificity of test results. Our results show that our proposed personalized screening schedules outperform the existing guidelines with respect to the total expected quality-adjusted life years, while significantly decreasing the number of mammograms and false-positives. We also report the lifetime risk of developing undetected invasive cancer associated with each screening scenario.
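
The abstract describes a finite-horizon POMDP with hidden disease states, wait/screen actions, and imperfect mammography observations. The sketch below is a minimal, hypothetical illustration of that modeling pattern, not the authors' calibrated model: it collapses the disease process to two hidden states (healthy, undetected cancer), uses made-up onset, sensitivity/specificity, and reward values, and solves by backward induction on a discretized belief grid rather than the exact alpha-vector methods of Smallwood and Sondik (1973). All names and numbers here are assumptions chosen for illustration only.

    # Toy finite-horizon POMDP for a screening decision (illustrative only).
    # Hidden states: 0 = healthy, 1 = undetected cancer.  Actions: wait or mammogram.
    import numpy as np

    P_ONSET = 0.01            # hypothetical annual probability of disease onset
    SENS, SPEC = 0.85, 0.90   # hypothetical mammography sensitivity / specificity
    R_WAIT = np.array([1.0, 0.7])    # one-period reward (QALY-like) by hidden state
    R_MAMMO = np.array([0.99, 0.95]) # mammogram: small disutility, earlier-detection credit
    HORIZON = 20                     # number of annual decision epochs
    GRID = np.linspace(0.0, 1.0, 201)  # discretized belief: P(undetected cancer)

    def solve():
        V = np.zeros_like(GRID)                        # terminal value is zero
        policy = np.zeros((HORIZON, GRID.size), dtype=int)
        for t in reversed(range(HORIZON)):             # backward induction over epochs
            V_next = np.empty_like(V)
            for i, b in enumerate(GRID):
                state_probs = np.array([1.0 - b, b])
                b_prior = b + (1.0 - b) * P_ONSET      # belief after one year of possible onset
                # Wait: no observation, so the belief simply drifts forward.
                q_wait = R_WAIT @ state_probs + np.interp(b_prior, GRID, V)
                # Mammogram: expectation over the test result.  A positive result is
                # crudely assumed to trigger treatment, resetting the belief to 0;
                # a negative result updates the belief by Bayes' rule.
                p_pos = b_prior * SENS + (1.0 - b_prior) * (1.0 - SPEC)
                b_neg = b_prior * (1.0 - SENS) / (1.0 - p_pos)
                cont = p_pos * np.interp(0.0, GRID, V) + (1.0 - p_pos) * np.interp(b_neg, GRID, V)
                q_mammo = R_MAMMO @ state_probs + cont
                V_next[i] = max(q_wait, q_mammo)
                policy[t, i] = int(q_mammo > q_wait)
            V = V_next
        return V, policy

    if __name__ == "__main__":
        V, policy = solve()
        if policy[0].any():
            threshold = GRID[np.argmax(policy[0])]
            print(f"At the first epoch, screen when P(cancer) >= {threshold:.3f}")
        else:
            print("At the first epoch, screening is never preferred under these toy inputs")

Running the sketch prints an illustrative belief threshold above which screening is preferred at the first epoch. The paper's actual policies come from a richer, age-dependent model with self-detection and screen-detection, calibrated to a validated microsimulation model.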

Suggested Citation

  • Turgay Ayer & Oguzhan Alagoz & Natasha K. Stout, 2012. "OR Forum---A POMDP Approach to Personalize Mammography Screening Decisions," Operations Research, INFORMS, vol. 60(5), pages 1019-1034, October.
  • Handle: RePEc:inm:oropre:v:60:y:2012:i:5:p:1019-1034
    DOI: 10.1287/opre.1110.1019
    Download full text from publisher: http://dx.doi.org/10.1287/opre.1110.1019 (no download restriction)

    References listed on IDEAS

    1. George E. Monahan, 1982. "State of the Art---A Survey of Partially Observable Markov Decision Processes: Theory, Models, and Algorithms," Management Science, INFORMS, vol. 28(1), pages 1-16, January.
    2. Lisa M. Maillart & Julie Simmons Ivy & Scott Ransom & Kathleen Diehl, 2008. "Assessing Dynamic Breast Cancer Screening Policies," Operations Research, INFORMS, vol. 56(6), pages 1411-1427, December.
    3. Richard D. Smallwood & Edward J. Sondik, 1973. "The Optimal Control of Partially Observable Markov Processes over a Finite Horizon," Operations Research, INFORMS, vol. 21(5), pages 1071-1088, October.
    4. James N. Eagle, 1984. "The Optimal Search for a Moving Target When the Search Path Is Constrained," Operations Research, INFORMS, vol. 32(5), pages 1107-1115, October.
    5. Frank A. Sonnenberg & J. Robert Beck, 1993. "Markov Models in Medical Decision Making," Medical Decision Making, , vol. 13(4), pages 322-338, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Malek Ebadi & Raha Akhavan-Tabatabaei, 2021. "Personalized Cotesting Policies for Cervical Cancer Screening: A POMDP Approach," Mathematics, MDPI, vol. 9(6), pages 1-20, March.
    2. Turgay Ayer & Oguzhan Alagoz & Natasha K. Stout & Elizabeth S. Burnside, 2016. "Heterogeneity in Women’s Adherence and Its Role in Optimal Breast Cancer Screening Policies," Management Science, INFORMS, vol. 62(5), pages 1339-1362, May.
    3. Ali Hajjar & Oguzhan Alagoz, 2023. "Personalized Disease Screening Decisions Considering a Chronic Condition," Management Science, INFORMS, vol. 69(1), pages 260-282, January.
    4. Zong-Zhi Lin & James C. Bean & Chelsea C. White, 2004. "A Hybrid Genetic/Optimization Algorithm for Finite-Horizon, Partially Observed Markov Decision Processes," INFORMS Journal on Computing, INFORMS, vol. 16(1), pages 27-38, February.
    5. Yanling Chang & Alan Erera & Chelsea White, 2015. "A leader–follower partially observed, multiobjective Markov game," Annals of Operations Research, Springer, vol. 235(1), pages 103-128, December.
    6. Jue Wang, 2016. "Minimizing the false alarm rate in systems with transient abnormality," Naval Research Logistics (NRL), John Wiley & Sons, vol. 63(4), pages 320-334, June.
    7. Turgay Ayer, 2015. "Inverse optimization for assessing emerging technologies in breast cancer screening," Annals of Operations Research, Springer, vol. 230(1), pages 57-85, July.
    8. Givon, Moshe & Grosfeld-Nir, Abraham, 2008. "Using partially observed Markov processes to select optimal termination time of TV shows," Omega, Elsevier, vol. 36(3), pages 477-485, June.
    9. Jue Wang & Chi-Guhn Lee, 2015. "Multistate Bayesian Control Chart Over a Finite Horizon," Operations Research, INFORMS, vol. 63(4), pages 949-964, August.
    10. Williams, Byron K., 2009. "Markov decision processes in natural resources management: Observability and uncertainty," Ecological Modelling, Elsevier, vol. 220(6), pages 830-840.
    11. Yanling Chang & Alan Erera & Chelsea White, 2015. "Value of information for a leader–follower partially observed Markov game," Annals of Operations Research, Springer, vol. 235(1), pages 129-153, December.
    12. Chiel van Oosterom & Lisa M. Maillart & Jeffrey P. Kharoufeh, 2017. "Optimal maintenance policies for a safety‐critical system and its deteriorating sensor," Naval Research Logistics (NRL), John Wiley & Sons, vol. 64(5), pages 399-417, August.
    13. Hao Zhang, 2010. "Partially Observable Markov Decision Processes: A Geometric Technique and Analysis," Operations Research, INFORMS, vol. 58(1), pages 214-228, February.
    14. Chernonog, Tatyana & Avinadav, Tal & Ben-Zvi, Tal, 2016. "A two-state partially observable Markov decision process with three actions," European Journal of Operational Research, Elsevier, vol. 254(3), pages 957-967.
    15. Serin, Yasemin, 1995. "A nonlinear programming model for partially observable Markov decision processes: Finite horizon case," European Journal of Operational Research, Elsevier, vol. 86(3), pages 549-564, November.
    16. Saghafian, Soroush, 2018. "Ambiguous partially observable Markov decision processes: Structural results and applications," Journal of Economic Theory, Elsevier, vol. 178(C), pages 1-35.
    17. Williams, Byron K., 2011. "Resolving structural uncertainty in natural resources management using POMDP approaches," Ecological Modelling, Elsevier, vol. 222(5), pages 1092-1102.
    18. Lyn C. Thomas & James N. Eagle, 1995. "Criteria and approximate methods for path‐constrained moving‐target search problems," Naval Research Logistics (NRL), John Wiley & Sons, vol. 42(1), pages 27-38, February.
    19. Otten, Maarten & Timmer, Judith & Witteveen, Annemieke, 2020. "Stratified breast cancer follow-up using a continuous state partially observable Markov decision process," European Journal of Operational Research, Elsevier, vol. 281(2), pages 464-474.
    20. Zehra Önen Dumlu & Serpil Sayın & İbrahim Hakan Gürvit, 2023. "Screening for preclinical Alzheimer’s disease: Deriving optimal policies using a partially observable Markov model," Health Care Management Science, Springer, vol. 26(1), pages 1-20, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:oropre:v:60:y:2012:i:5:p:1019-1034. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher. General contact details of provider: https://edirc.repec.org/data/inforea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.