An Equivalence between Continuous and Discrete Time Markov Decision Processes

Richard F. Serfozo
Operations Research
Vol. 27, No. 3 (May - Jun., 1979), pp. 616-620
Published by: INFORMS
Stable URL: http://www.jstor.org/stable/170221
Page Count: 5

Abstract

A continuous time Markov decision process with uniformly bounded transition rates is shown to be equivalent to a simpler discrete time Markov decision process for both the discounted and average reward criteria on an infinite horizon. This result clarifies some earlier work in this area.
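For context, the equivalence described in the abstract is of the kind usually obtained through the uniformization construction. The sketch below uses that standard construction with assumed notation (transition rates q(j|i,a), reward rate r(i,a), discount rate \alpha, uniform rate bound \Lambda); it is illustrative rather than a quotation from the paper. If the rates satisfy \(\sup_{i,a} \sum_{j \ne i} q(j \mid i, a) \le \Lambda < \infty\), then a discrete time process with the same discounted optimal stationary policies can be defined by

\[
\tilde p(j \mid i, a) = \frac{q(j \mid i, a)}{\Lambda} \quad (j \ne i), \qquad
\tilde p(i \mid i, a) = 1 - \frac{1}{\Lambda} \sum_{j \ne i} q(j \mid i, a),
\]
\[
\tilde r(i, a) = \frac{r(i, a)}{\alpha + \Lambda}, \qquad
\beta = \frac{\Lambda}{\alpha + \Lambda},
\]

so that the continuous time \(\alpha\)-discounted value function coincides with the \(\beta\)-discounted value function of the discrete time process; a similar identification, with rewards scaled by \(1/\Lambda\), is available for the average reward criterion.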
