Optimal Control of Noisy Finite-State Markov Processes

Adrian Segall

Research output: Contribution to journal › Article › peer-review

14 Scopus citations

Abstract

This paper treats the problem of optimal control of finite-state Markov processes observed in noise. Two types of noisy observations are considered: additive white Gaussian noise and jump-type observations. Sufficient conditions for the optimality of a control law are obtained, analogous to the stochastic Hamilton-Jacobi equation for perfectly observed Markov processes. An illustrative example concludes the paper.
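The setting the abstract describes can be made concrete with a small simulation: a controlled finite-state Markov chain whose state is seen only through additive white Gaussian noise, together with the standard Bayes (HMM) filter that a controller would act on. This is a minimal illustrative sketch, not the paper's construction; the two-state chain, the control-dependent transition rates, and the noise level are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def transition_matrix(u, dt=0.01):
    """One-step transition matrix for an illustrative two-state chain.
    The control u shifts the transition rates (assumed form, not the
    paper's model)."""
    lam = 1.0 + u   # rate of leaving state 0
    mu = 2.0 - u    # rate of leaving state 1
    return np.array([[1 - lam * dt, lam * dt],
                     [mu * dt, 1 - mu * dt]])

def simulate(u, n_steps=1000, sigma=0.5, dt=0.01):
    """Simulate the chain under a constant control u and return the
    state path x and noisy observations y_k = x_k + Gaussian noise."""
    P = transition_matrix(u, dt)
    x = np.zeros(n_steps, dtype=int)
    for k in range(1, n_steps):
        x[k] = rng.choice(2, p=P[x[k - 1]])
    y = x + sigma * rng.standard_normal(n_steps)
    return x, y

def filter_posterior(y, u, sigma=0.5, dt=0.01):
    """Discrete-time Bayes filter: posterior probability of each state
    given the noisy observations, under control u."""
    P = transition_matrix(u, dt)
    pi = np.array([1.0, 0.0])          # assumed prior: start in state 0
    post = np.empty((len(y), 2))
    for k, yk in enumerate(y):
        if k > 0:
            pi = pi @ P                # predict through the chain
        lik = np.exp(-(yk - np.arange(2)) ** 2 / (2 * sigma ** 2))
        pi = pi * lik
        pi /= pi.sum()                 # Bayes update and normalize
        post[k] = pi
    return post

x, y = simulate(u=0.5)
post = filter_posterior(y, u=0.5)
```

In the partially observed problem, the optimality conditions are stated in terms of this conditional distribution rather than the (unobserved) state itself, which is why the filtered posterior is the natural object for the control law.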

Original language: English
Pages (from-to): 179-186
Number of pages: 8
Journal: IEEE Transactions on Automatic Control
Volume: 22
Issue number: 2
DOIs
State: Published - Apr 1977
Externally published: Yes

