Optimality for Controlled Jump Processes: A Simple Approach
This note presents a very simple method for deriving the necessary optimality conditions for the optimal control of jump (point) processes. By means of Bellman's principle of optimality, the original stochastic control problem is transformed into a simple optimization problem. The derivation is considerably simpler than existing derivations in the literature.
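The transformation the abstract alludes to can be illustrated with the standard Hamilton–Jacobi–Bellman (dynamic programming) equation for a controlled jump process. The notation below (controlled intensity \(\lambda\), jump kernel \(Q\), running cost \(f\), terminal cost \(\Phi\)) is an illustrative assumption of a typical setup, not necessarily the formulation used in the paper itself:

```latex
% Controlled jump process on [0,T]: the state x_t jumps with controlled
% intensity \lambda(t,x,u) and post-jump distribution Q(dy | t,x,u).
% V(t,x) is the value function for running cost f and terminal cost \Phi.
\[
  \frac{\partial V}{\partial t}(t,x)
  + \min_{u \in U} \Big\{ f(t,x,u)
  + \lambda(t,x,u) \int \big( V(t,y) - V(t,x) \big)\, Q(dy \mid t,x,u) \Big\}
  = 0,
\qquad
  V(T,x) = \Phi(x).
\]
% Bellman's principle reduces the stochastic control problem to the
% pointwise (in (t,x)) minimization over u inside the braces.
```

In this sketch, necessary optimality conditions follow by requiring the control \(u\) to attain the pointwise minimum at each \((t,x)\), which is the "simple optimization problem" the abstract mentions.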
To our knowledge, this item is not available for download. To find out whether it is available, there are three options:
1. Check below under "Related research" whether another version of this item is available online.
2. Check on the provider's web page whether it is in fact available.
3. Perform a search for a similarly titled item that would be available.
Volume (Year): 3 (1993)
Issue (Month): 4 (October)
Contact details of provider: Web page: http://link.springer.de/link/service/journals/00199/index.htm
Order information: Web: http://link.springer.de/orders.htm
When requesting a correction, please mention this item's handle: RePEc:spr:joecth:v:3:y:1993:i:4:p:765-74. See general information about how to correct material in RePEc.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Guenther Eichhorn or Christopher F Baum.