Abstract
This paper treats the problem of optimal control of finite-state Markov processes observed in noise. Two types of noisy observations are considered: additive white Gaussian noise and jump-type observations. Sufficient conditions for the optimality of a control law are obtained, analogous to the stochastic Hamilton-Jacobi equation for perfectly observed Markov processes. An illustrative example concludes the paper.
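For the perfectly observed case that the abstract uses as its point of comparison, the stochastic Hamilton-Jacobi (dynamic programming) equation for a finite-state controlled Markov chain reduces, in discrete time, to the familiar Bellman fixed-point equation. The sketch below is not the paper's construction; it is a minimal hedged illustration, with a made-up 2-state, 2-action chain, of solving that equation by value iteration.

```python
import numpy as np

# Hypothetical 2-state, 2-action controlled Markov chain (illustrative
# numbers only, not taken from the paper). P[a] is the transition
# matrix under action a; c[a, s] is the per-stage cost of action a in
# state s.
P = np.array([
    [[0.9, 0.1],   # action 0
     [0.2, 0.8]],
    [[0.5, 0.5],   # action 1
     [0.6, 0.4]],
])
c = np.array([
    [1.0, 2.0],
    [0.5, 3.0],
])
gamma = 0.95  # discount factor

# Value iteration: discrete-time analogue of the Hamilton-Jacobi
# (dynamic programming) equation for a perfectly observed chain,
#   V(s) = min_a [ c(a, s) + gamma * sum_{s'} P(s' | s, a) V(s') ].
V = np.zeros(2)
for _ in range(1000):
    Q = c + gamma * P @ V          # Q[a, s]: cost-to-go of action a in s
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

policy = Q.argmin(axis=0)          # minimizing action in each state
print("V =", V, "policy =", policy)
```

Under noisy observations, as in the paper, the state is not available to the controller; the analogous equation is then posed on the conditional distribution (the filter state) rather than on the state itself.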
| Original language | English |
| --- | --- |
| Pages (from-to) | 179-186 |
| Number of pages | 8 |
| Journal | IEEE Transactions on Automatic Control |
| Volume | 22 |
| Issue number | 2 |
| DOIs | |
| State | Published - Apr 1977 |
| Externally published | Yes |