
A Utility Criterion for Markov Decision Processes

Contents:

Author Info

  • Stratton C. Jaquette

    (Systems Control, Inc., Palo Alto)


    Abstract

    Optimality criteria for Markov decision processes have historically been based on a risk neutral formulation of the decision maker's preferences. An explicit utility formulation, incorporating both risk and time preference and based on some results in the axiomatic theory of choice under uncertainty, is developed. This forms an optimality criterion called utility optimality with constant aversion to risk. The objective is to maximize the expected utility using an exponential utility function. Implicit in the formulation is an interpretation of the decision process which is not sequential. It is shown that optimal policies exist which are not necessarily stationary for an infinite horizon stationary Markov decision process with finite state and action spaces. An example is given.
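The criterion described in the abstract — maximizing expected utility under an exponential utility function with constant risk aversion — can be illustrated with a finite-horizon backward-induction sketch. The code below is not the paper's algorithm; the transition array `P`, reward matrix `R`, and risk-aversion coefficient `gamma_risk` are assumptions for demonstration. It uses the standard fact that maximizing E[-exp(-γ · total reward)] is equivalent to minimizing E[exp(-γ · total reward)], which factors multiplicatively over stages:

```python
import numpy as np

def risk_sensitive_backward_induction(P, R, gamma_risk, T):
    """Finite-horizon policy maximizing E[-exp(-gamma_risk * total reward)].

    P: (A, S, S) array, P[a, s, s'] = Pr(next state s' | state s, action a)
    R: (A, S) array of immediate rewards r(s, a)
    gamma_risk: constant absolute risk-aversion coefficient (> 0)
    T: horizon length (number of decision stages)

    Returns (policy, W): policy[t, s] is an optimal action at stage t in
    state s, and W[t, s] = min over policies of E[exp(-gamma_risk *
    reward-to-go)]; the maximal expected utility from (t, s) is -W[t, s].
    """
    A, S, _ = P.shape
    W = np.ones((T + 1, S))            # terminal stage: exp(0) = 1
    policy = np.zeros((T, S), dtype=int)
    for t in range(T - 1, -1, -1):
        # Q[a, s] = exp(-gamma * r(s, a)) * sum_{s'} P[a, s, s'] * W[t+1, s']
        Q = np.exp(-gamma_risk * R) * (P @ W[t + 1])
        policy[t] = Q.argmin(axis=0)   # minimizing W maximizes utility
        W[t] = Q.min(axis=0)
    return policy, W

# Hypothetical two-state, two-action example: action 0 stays in the
# high-reward state 0; action 1 moves to state 0 or 1 with equal chance.
P = np.array([[[1.0, 0.0], [1.0, 0.0]],
              [[0.5, 0.5], [0.5, 0.5]]])
R = np.array([[1.0, 0.0],
              [1.0, 0.0]])
policy, W = risk_sensitive_backward_induction(P, R, gamma_risk=1.0, T=3)
```

Note that, as the abstract indicates, this finite-horizon recursion does not in general yield a stationary optimal policy when the horizon is taken to infinity, in contrast with the risk-neutral discounted case.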

    Download Info

    If you experience problems downloading a file, first check that you have the proper application to view it. In case of further problems, read the IDEAS help page. Note that these files are not on the IDEAS site. Please be patient, as the files may be large.
    File URL: http://dx.doi.org/10.1287/mnsc.23.1.43
    Download Restriction: no

    Bibliographic Info

    Article provided by INFORMS in its journal Management Science.

    Volume (Year): 23 (1976)
    Issue (Month): 1 (September)
    Pages: 43-49

    Handle: RePEc:inm:ormnsc:v:23:y:1976:i:1:p:43-49

    Contact details of provider:
    Postal: 7240 Parkway Drive, Suite 300, Hanover, MD 21076 USA
    Phone: +1-443-757-3500
    Fax: +1-443-757-3515
    Web page: http://www.informs.org/
    More information through EDIRC

    Related research


    References

    No references listed on IDEAS
    You can help add them by filling out this form.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:
    1. Takayuki Osogami, 2012. "Iterated risk measures for risk-sensitive Markov decision processes with discounted cost," Papers 1202.3755, arXiv.org.
    2. Krishnamurthy Iyer & Nandyala Hemachandra, 2010. "Sensitivity analysis and optimal ultimately stationary deterministic policies in some constrained discounted cost models," Computational Statistics, Springer, vol. 71(3), pages 401-425, June.
    3. Rolando Cavazos-Cadena, 2009. "Solutions of the average cost optimality equation for finite Markov decision chains: risk-sensitive and risk-neutral criteria," Computational Statistics, Springer, vol. 70(3), pages 541-566, December.
    4. Rolando Cavazos-Cadena, 2010. "Optimality equations and inequalities in a class of risk-sensitive average cost Markov decision chains," Computational Statistics, Springer, vol. 71(1), pages 47-84, February.
    5. Monahan, George E. & Sobel, Matthew J., 1997. "Risk-Sensitive Dynamic Market Share Attraction Games," Games and Economic Behavior, Elsevier, vol. 20(2), pages 149-160, August.
    6. Karel Sladký, 2013. "Risk-Sensitive and Mean Variance Optimality in Markov Decision Processes," Czech Economic Review, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, vol. 7(3), pages 146-161, November.

    Lists

    This item is not listed on Wikipedia, on a reading list or among the top items on IDEAS.

    Statistics

    Access and download statistics

    Corrections

    When requesting a correction, please mention this item's handle: RePEc:inm:ormnsc:v:23:y:1976:i:1:p:43-49. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: (Mirko Janc).

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If references are entirely missing, you can add them using this form.

    If the full references list an item that is present in RePEc, but the system did not link to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.