Partially Observed Differential Games, Infinite Dimensional HJI Equations, and Nonlinear H∞ Control
Author: James, Matthew R.
Journal: SIAM Journal on Control and Optimization, Vol. 34, No. 4, pp. 1342-1364
Date: July 1996
This paper presents new results for partially observed nonlinear differential games. Using the concept of the information state, we solve this problem in terms of an infinite-dimensional partial differential equation, which turns out to be the Hamilton-Jacobi-Isaacs (HJI) equation for partially observed differential games. We give definitions of smooth and viscosity solutions and prove that the value function is a viscosity solution of the HJI equation. We prove a verification theorem, which implies that the optimal controls are separated in the sense that they depend on the observations only through the information state. This constitutes a separation principle for partially observed differential games. We also present new results concerning the certainty equivalence principle under standard assumptions. Our results are applied to a nonlinear output feedback H∞ robust control problem.
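For orientation, the following is a rough sketch of the standard information-state formulation that this line of work builds on; it is not taken from the abstract itself. The plant data f, g, h, the running cost L, the attenuation level γ, and the symbols p_t, W, F are all assumed notation here.

```latex
% Sketch only: assumed plant  \dot{x} = f(x,u) + g(x)w,  y = h(x) + v,
% running cost L(x,u), disturbance-attenuation level \gamma > 0.

% Information state p_t(x): worst-case accumulated cost consistent with the
% observation record up to time t, propagated forward by a first-order PDE in x:
\[
  \frac{\partial p_t}{\partial t}(x)
  = -\nabla_x p_t(x)\, f(x,u_t)
    + \frac{1}{4\gamma^2}\,\bigl|\nabla_x p_t(x)\, g(x)\bigr|^2
    + L(x,u_t) - \gamma^2\,\bigl|y_t - h(x)\bigr|^2
  \;=:\; F(p_t, u_t, y_t)(x).
\]

% Infinite-dimensional HJI equation for a value function W(p,t) defined on the
% space of information states (smooth-solution form, with a zero terminal cost):
\[
  \frac{\partial W}{\partial t}(p,t)
  + \inf_{u}\,\sup_{y}\; \nabla_p W(p,t)\bigl[F(p,u,y)\bigr] = 0,
  \qquad W(p,T) = \sup_x p(x).
\]

% Separation: an optimal control is a function of the information state alone,
% u_t^* = \underline{u}^*(p_t), so it uses the observations only through p_t.
\]
```

Under the certainty-equivalence hypothesis, this separated control would reduce to a state-feedback law evaluated at a maximizer of p_t(x) + V(x), with V a state-feedback value function; this too is the assumed standard form rather than a statement quoted from the paper.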