A Stochastic Optimal Control Strategy for Partially Observable Nonlinear Systems
Abstract
A stochastic optimal control strategy for partially observable nonlinear systems is proposed. The optimal control force consists of two parts. The first part is determined by the conditions under which the stochastic optimal control problem of a partially observable nonlinear system is converted into that of a completely observable linear system. The second part is determined by solving the dynamic programming equation obtained by applying the stochastic averaging method and the stochastic dynamic programming principle to the completely observable linear control system. For controlled quasi-Hamiltonian systems, the response of the optimally controlled system is predicted by solving the averaged Fokker-Planck-Kolmogorov equation associated with the optimally controlled, completely observable linear system, together with the Riccati equation governing the estimation error of the system states.
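The abstract's final step refers to a Riccati equation for the estimation-error covariance of the system states. As a minimal illustrative sketch (not the paper's method), the filter Riccati equation dP/dt = AP + PAᵀ + Q − PCᵀR⁻¹CP of a Kalman-Bucy estimator can be integrated numerically; all matrices below are hypothetical scalar examples chosen for demonstration.

```python
import numpy as np

def riccati_filter_covariance(A, C, Q, R, P0, dt=1e-3, t_end=10.0):
    """Integrate the filter Riccati ODE
        dP/dt = A P + P A^T + Q - P C^T R^{-1} C P
    (estimation-error covariance of a linear filter) by forward Euler."""
    P = P0.copy()
    R_inv = np.linalg.inv(R)
    for _ in range(int(t_end / dt)):
        dP = A @ P + P @ A.T + Q - P @ C.T @ R_inv @ C @ P
        P = P + dt * dP
    return P

# Hypothetical scalar system: dx = -x dt + dW, observation dy = x dt + dV.
A = np.array([[-1.0]])
C = np.array([[1.0]])
Q = np.array([[1.0]])  # process-noise intensity
R = np.array([[1.0]])  # observation-noise intensity
P = riccati_filter_covariance(A, C, Q, R, P0=np.eye(1))
# Steady state solves -2P + 1 - P^2 = 0, i.e. P = sqrt(2) - 1.
print(round(P[0, 0], 4))  # → 0.4142
```

The converged value of P is the stationary estimation-error covariance; in the separation-based strategy the abstract describes, such a covariance enters the prediction of the controlled system's response.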