Hamilton-Jacobi-Bellman equations for the optimal control of dynamical systems with delay
- Series: PDE Seminar
- Time: Tuesday, January 8, 2013 - 15:05 for 1 hour (actually 50 minutes)
- Location: Skiles 006
- Speaker: Fausto Gozzi – LUISS University, Rome, Italy
In this talk we first present some applied examples (coming from Economics and Finance) of Optimal Control Problems for Dynamical Systems with Delay (deterministic and stochastic). To treat such problems with the so-called Dynamic Programming Approach, one has to study a class of infinite-dimensional HJB equations to which the existing theory does not apply, due to their specific features: the presence of state constraints, the presence of first-order differential operators in the state equation, and the possible unboundedness of the control operator. We will present some results on the existence of regular solutions of such equations and on the existence of optimal controls in feedback form.
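To fix ideas, the following is a minimal, purely formal sketch of the standard reformulation behind this approach. The scalar delay equation, the Hilbert space M^2, the operators \mathcal{A} and \mathcal{B}, the running gain \ell, and the discount rate \rho below are illustrative assumptions, not the speaker's exact setting.

```latex
% Sketch (illustrative, not the speaker's exact setting): a controlled scalar
% equation with a single discrete delay $d>0$,
%   $x'(t) = a_0\,x(t) + a_1\,x(t-d) + b\,u(t)$,
% is recast as an evolution equation in the Hilbert space
% $M^2 := \mathbb{R}\times L^2(-d,0;\mathbb{R})$ with state
% $X(t) = \bigl(x(t),\, x(t+\cdot)\big|_{[-d,0)}\bigr)$:
\[
  \frac{d}{dt}X(t) \;=\; \mathcal{A}\,X(t) + \mathcal{B}\,u(t),
  \qquad X(0) = X_0 \in M^2 .
\]
% Here $\mathcal{A}$ contains a first-order differential operator (the shift
% along the memory segment) and $\mathcal{B}$ may be unbounded. For a discounted
% infinite-horizon functional with running gain $\ell$ and discount rate
% $\rho>0$, dynamic programming leads formally to the stationary HJB equation
% for the value function $V$ on $M^2$:
\[
  \rho\, V(X) \;=\; \sup_{u \in U}
  \Bigl\{ \bigl\langle \mathcal{A}X + \mathcal{B}u,\; DV(X) \bigr\rangle_{M^2}
          + \ell(X,u) \Bigr\}.
\]
```

The features listed in the abstract (state constraints, the first-order operator inside \mathcal{A}, the possible unboundedness of \mathcal{B}) are exactly what places such equations outside the standard infinite-dimensional HJB theory.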