
Using partially observed Markov processes to select optimal termination time of TV shows

Author

Listed:
  • Givon, Moshe
  • Grosfeld-Nir, Abraham

Abstract

This paper presents a method for optimal control of a running television show. The problem is formulated as a partially observed Markov decision process (POMDP). A show can be in a "good" state, i.e., it should be continued, or it can be in a "bad" state and therefore it should be changed. The ratings of a show are modeled as a stochastic process that depends on the show's state. An optimal rule for a continue/change decision, which maximizes the expected present value of profits from selling advertising time, is expressed in terms of the prior probability of the show being in the good state. The optimal rule depends on the size of the investment in changing a show, the difference in revenues between a "good" and a "bad" show and the number of time periods remaining until the end of the planning horizon. The application of the method is illustrated with simulated ratings as well as real data.
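The decision problem sketched in the abstract lends itself to a small numerical illustration: a finite-horizon dynamic program over the belief p = P(show is in the good state), with a continue/change choice each period. The sketch below is an illustration only, not the authors' implementation; the high/low rating signal, the parameter values, and the function names are assumptions introduced for the example (the paper models ratings as a continuous state-dependent process).

```python
# Minimal sketch of a two-state continue/change POMDP over the belief
# p = P(show is good).  All parameters and the discrete high/low rating
# signal are illustrative assumptions, not the paper's model.

import numpy as np

R_GOOD, R_BAD = 10.0, 4.0            # per-period ad revenue in the good / bad state (assumed)
CHANGE_COST = 15.0                   # investment required to replace the show (assumed)
P_NEW_GOOD = 0.6                     # prior that a newly launched show is good (assumed)
P_STAY_GOOD = 0.9                    # P(good next period | good now) if continued (assumed)
P_HIGH = {"good": 0.8, "bad": 0.3}   # P(high rating | state) (assumed)
BETA = 0.95                          # one-period discount factor (assumed)
HORIZON = 20                         # periods until the end of the planning horizon

GRID = np.linspace(0.0, 1.0, 201)    # discretized belief p = P(show is good)


def belief_update(p, high_rating):
    """Bayes update of P(good) after observing this period's rating,
    followed by the good-to-bad transition into the next period."""
    like_g = P_HIGH["good"] if high_rating else 1.0 - P_HIGH["good"]
    like_b = P_HIGH["bad"] if high_rating else 1.0 - P_HIGH["bad"]
    posterior = p * like_g / (p * like_g + (1.0 - p) * like_b)
    return posterior * P_STAY_GOOD   # a bad show is assumed to stay bad


def solve():
    """Backward induction over the belief grid; returns, for each number of
    remaining periods, the smallest belief at which 'continue' beats 'change'."""
    value = np.zeros_like(GRID)      # terminal value: nothing earned after the horizon
    cutoffs = []
    for _ in range(HORIZON):
        cont = np.empty_like(GRID)
        for i, p in enumerate(GRID):
            revenue = p * R_GOOD + (1.0 - p) * R_BAD
            p_high = p * P_HIGH["good"] + (1.0 - p) * P_HIGH["bad"]
            # Continue: collect this period's revenue, observe the rating, update the belief.
            cont[i] = revenue + BETA * (
                p_high * np.interp(belief_update(p, True), GRID, value)
                + (1.0 - p_high) * np.interp(belief_update(p, False), GRID, value)
            )
        # Change: pay the investment, then run a fresh show starting from its prior.
        change = -CHANGE_COST + np.interp(P_NEW_GOOD, GRID, cont)
        value = np.maximum(cont, change)
        keep = cont >= change
        cutoffs.append(GRID[np.argmax(keep)] if keep.any() else None)
    return cutoffs[::-1]             # cutoffs[t] = belief threshold with t+1 periods left


if __name__ == "__main__":
    thresholds = solve()
    print("Continue whenever P(good) exceeds:", [round(c, 3) for c in thresholds[:5]])
```

Under these assumptions the optimal policy has the control-limit form described in the abstract: continue whenever the prior probability of the good state exceeds a cutoff that depends on the change cost, the revenue gap between states, and the number of periods remaining.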

Suggested Citation

  • Givon, Moshe & Grosfeld-Nir, Abraham, 2008. "Using partially observed Markov processes to select optimal termination time of TV shows," Omega, Elsevier, vol. 36(3), pages 477-485, June.
  • Handle: RePEc:eee:jomega:v:36:y:2008:i:3:p:477-485

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0305-0483(06)00020-X
    Download Restriction: Full text for ScienceDirect subscribers only

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jeffrey H. Horen, 1980. "Scheduling of Network Television Programs," Management Science, INFORMS, vol. 26(4), pages 354-370, April.
    2. Roland T. Rust & Mark I. Alpert, 1984. "An Audience Flow Model of Television Viewing Choice," Marketing Science, INFORMS, vol. 3(2), pages 113-124.
    3. William S. Lovejoy, 1987. "Ordered Solutions for Dynamic Programs," Mathematics of Operations Research, INFORMS, vol. 12(2), pages 269-276, May.
    4. S. Christian Albright, 1979. "Structural Results for Partially Observable Markov Decision Processes," Operations Research, INFORMS, vol. 27(5), pages 1041-1053, October.
    5. Richard D. Smallwood & Edward J. Sondik, 1973. "The Optimal Control of Partially Observable Markov Processes over a Finite Horizon," Operations Research, INFORMS, vol. 21(5), pages 1071-1088, October.
    6. James T. Treharne & Charles R. Sox, 2002. "Adaptive Inventory Control for Nonstationary Demand and Partial Information," Management Science, INFORMS, vol. 48(5), pages 607-624, May.
    7. Abraham Grosfeld-Nir, 1996. "A Two-State Partially Observable Markov Decision Process with Uniformly Distributed Observations," Operations Research, INFORMS, vol. 44(3), pages 458-463, June.
    8. Douglas J. White, 1985. "Real Applications of Markov Decision Processes," Interfaces, INFORMS, vol. 15(6), pages 73-83, December.
    9. George E. Monahan, 1982. "State of the Art---A Survey of Partially Observable Markov Decision Processes: Theory, Models, and Algorithms," Management Science, INFORMS, vol. 28(1), pages 1-16, January.
    10. Daniel E. Lane, 1989. "A Partially Observable Model of Decision Making by Fishermen," Operations Research, INFORMS, vol. 37(2), pages 240-254, April.
    11. James N. Eagle, 1984. "The Optimal Search for a Moving Target When the Search Path Is Constrained," Operations Research, INFORMS, vol. 32(5), pages 1107-1115, October.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Danaher, Peter & Dagger, Tracey, 2012. "Using a nested logit model to forecast television ratings," International Journal of Forecasting, Elsevier, vol. 28(3), pages 607-622.
    2. Pérez-Gladish, B. & Gonzalez, I. & Bilbao-Terol, A. & Arenas-Parra, M., 2010. "Planning a TV advertising campaign: A crisp multiobjective programming model from fuzzy basic data," Omega, Elsevier, vol. 38(1-2), pages 84-94, February.
    3. Danaher, Peter J. & Dagger, Tracey S. & Smith, Michael S., 2011. "Forecasting television ratings," International Journal of Forecasting, Elsevier, vol. 27(4), pages 1215-1240, October.
    4. Chernonog, Tatyana & Avinadav, Tal & Ben-Zvi, Tal, 2016. "A two-state partially observable Markov decision process with three actions," European Journal of Operational Research, Elsevier, vol. 254(3), pages 957-967.
    5. Ballings, Michel & Van den Poel, Dirk & Bogaert, Matthias, 2016. "Social media optimization: Identifying an optimal strategy for increasing network size on Facebook," Omega, Elsevier, vol. 59(PA), pages 15-25.
    6. Abraham Grosfeld‐Nir & Eyal Cohen & Yigal Gerchak, 2007. "Production to order and off‐line inspection when the production process is partially observable," Naval Research Logistics (NRL), John Wiley & Sons, vol. 54(8), pages 845-858, December.
    7. Chiel van Oosterom & Lisa M. Maillart & Jeffrey P. Kharoufeh, 2017. "Optimal maintenance policies for a safety‐critical system and its deteriorating sensor," Naval Research Logistics (NRL), John Wiley & Sons, vol. 64(5), pages 399-417, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Chernonog, Tatyana & Avinadav, Tal & Ben-Zvi, Tal, 2016. "A two-state partially observable Markov decision process with three actions," European Journal of Operational Research, Elsevier, vol. 254(3), pages 957-967.
    2. Grosfeld-Nir, Abraham, 2007. "Control limits for two-state partially observable Markov decision processes," European Journal of Operational Research, Elsevier, vol. 182(1), pages 300-304, October.
    3. Hao Zhang, 2010. "Partially Observable Markov Decision Processes: A Geometric Technique and Analysis," Operations Research, INFORMS, vol. 58(1), pages 214-228, February.
    4. Shoshana Anily & Abraham Grosfeld-Nir, 2006. "An Optimal Lot-Sizing and Offline Inspection Policy in the Case of Nonrigid Demand," Operations Research, INFORMS, vol. 54(2), pages 311-323, April.
    5. Zong-Zhi Lin & James C. Bean & Chelsea C. White, 2004. "A Hybrid Genetic/Optimization Algorithm for Finite-Horizon, Partially Observed Markov Decision Processes," INFORMS Journal on Computing, INFORMS, vol. 16(1), pages 27-38, February.
    6. Abraham Grosfeld‐Nir & Eyal Cohen & Yigal Gerchak, 2007. "Production to order and off‐line inspection when the production process is partially observable," Naval Research Logistics (NRL), John Wiley & Sons, vol. 54(8), pages 845-858, December.
    7. Williams, Byron K., 2009. "Markov decision processes in natural resources management: Observability and uncertainty," Ecological Modelling, Elsevier, vol. 220(6), pages 830-840.
    8. Chiel van Oosterom & Lisa M. Maillart & Jeffrey P. Kharoufeh, 2017. "Optimal maintenance policies for a safety‐critical system and its deteriorating sensor," Naval Research Logistics (NRL), John Wiley & Sons, vol. 64(5), pages 399-417, August.
    9. Malek Ebadi & Raha Akhavan-Tabatabaei, 2021. "Personalized Cotesting Policies for Cervical Cancer Screening: A POMDP Approach," Mathematics, MDPI, vol. 9(6), pages 1-20, March.
    10. Yanling Chang & Alan Erera & Chelsea White, 2015. "A leader–follower partially observed, multiobjective Markov game," Annals of Operations Research, Springer, vol. 235(1), pages 103-128, December.
    11. Williams, Byron K., 2011. "Resolving structural uncertainty in natural resources management using POMDP approaches," Ecological Modelling, Elsevier, vol. 222(5), pages 1092-1102.
    12. Turgay Ayer & Oguzhan Alagoz & Natasha K. Stout & Elizabeth S. Burnside, 2016. "Heterogeneity in Women’s Adherence and Its Role in Optimal Breast Cancer Screening Policies," Management Science, INFORMS, vol. 62(5), pages 1339-1362, May.
    13. Turgay Ayer & Oguzhan Alagoz & Natasha K. Stout, 2012. "OR Forum---A POMDP Approach to Personalize Mammography Screening Decisions," Operations Research, INFORMS, vol. 60(5), pages 1019-1034, October.
    14. Yossi Aviv & Amit Pazgal, 2005. "A Partially Observed Markov Decision Process for Dynamic Pricing," Management Science, INFORMS, vol. 51(9), pages 1400-1416, September.
    15. Ali Hajjar & Oguzhan Alagoz, 2023. "Personalized Disease Screening Decisions Considering a Chronic Condition," Management Science, INFORMS, vol. 69(1), pages 260-282, January.
    16. Armando Z. Milioni & Stanley R. Pliska, 1988. "Optimal inspection under semi‐markovian deterioration: Basic results," Naval Research Logistics (NRL), John Wiley & Sons, vol. 35(5), pages 373-392, October.
    17. Abhijit Gosavi, 2009. "Reinforcement Learning: A Tutorial Survey and Recent Advances," INFORMS Journal on Computing, INFORMS, vol. 21(2), pages 178-192, May.
    18. Yanling Chang & Alan Erera & Chelsea White, 2015. "Value of information for a leader–follower partially observed Markov game," Annals of Operations Research, Springer, vol. 235(1), pages 129-153, December.
    19. Anyan Qi & Hyun-Soo Ahn & Amitabh Sinha, 2017. "Capacity Investment with Demand Learning," Operations Research, INFORMS, vol. 65(1), pages 145-164, February.
    20. Ciriaco Valdez‐Flores & Richard M. Feldman, 1989. "A survey of preventive maintenance models for stochastically deteriorating single‐unit systems," Naval Research Logistics (NRL), John Wiley & Sons, vol. 36(4), pages 419-446, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jomega:v:36:y:2008:i:3:p:477-485. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/375/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.