Dynamic Programming Principle and Hamilton-Jacobi Equation for Optimal Control Problems with Uncertainty
Keywords:
Dynamic Programming, Hamilton-Jacobi Equation, Optimal Control, Uncertainty, Hilbert Space

Abstract
This work establishes that, under suitable conditions, the value function of Mayer’s problem in a Hilbert space is the unique lower semicontinuous solution of the Hamilton-Jacobi-Bellman (HJB) equation. By investigating a parametrized Riemann–Stieltjes problem, we obtain compactness of its set of trajectories, which, combined with a characterization of the lower semicontinuity of the associated value function, yields the existence of optimal controls. Then, using the differential-inclusion approach together with the preceding results, we prove uniqueness of the solution to the HJB equation.
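For orientation, a standard finite-horizon formulation of Mayer’s problem and its HJB equation can be sketched as follows; the symbols $f$, $U$, and $\varphi$ below are generic and illustrative, not taken from the paper itself:

```latex
% Mayer problem on a Hilbert space H over the horizon [t_0, T]:
% minimize the terminal cost \varphi(x(T)) over trajectories of the control system
\begin{aligned}
  x'(t) &= f(t, x(t), u(t)), \qquad u(t) \in U, \qquad x(t_0) = x_0 \in H,\\
  V(t_0, x_0) &:= \inf_{u(\cdot)} \varphi(x(T)).
\end{aligned}
% The associated Hamilton-Jacobi-Bellman equation, with terminal condition:
\begin{aligned}
  \partial_t V(t,x) + \inf_{u \in U}\,
    \bigl\langle \nabla_x V(t,x),\, f(t,x,u) \bigr\rangle &= 0,\\
  V(T, x) &= \varphi(x).
\end{aligned}
```

In the setting of the abstract, $V$ is only lower semicontinuous, so the equation is understood in a suitable generalized (nonsmooth) sense rather than classically.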