Fully discrete schemes for monotone optimal control problems
In this article we study a finite horizon optimal control problem with
monotone controls. We consider the associated Hamilton-Jacobi-Bellman (HJB)
equation which characterizes the value function.
We study the fully discretized problem obtained by using the finite element
method to approximate the state space. The resulting problem is
equivalent to solving a finite sequence of stopping-time problems.
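To illustrate the kind of subproblem involved, the following is a minimal sketch (not the paper's scheme) of solving a single discrete stopping-time problem on a 1-D grid by fixed-point iteration. The running cost `f`, stopping cost `psi`, discount factor `beta`, and the averaging transition are all hypothetical placeholders chosen for the example.

```python
import numpy as np

def solve_stopping_time(f, psi, beta=0.95, tol=1e-10, max_iter=10000):
    """Value iteration for a discrete stopping-time (obstacle) problem:
    v = min(psi, f + beta * P v), with P averaging the two neighboring
    states (periodic boundary via np.roll). Illustrative only."""
    v = np.zeros(len(f))
    for _ in range(max_iter):
        # "continue": pay the running cost f and move to the neighbor average
        cont = f + beta * 0.5 * (np.roll(v, 1) + np.roll(v, -1))
        # at each state, take the cheaper of stopping (psi) and continuing
        v_new = np.minimum(psi, cont)
        if np.max(np.abs(v_new - v)) < tol:
            break
        v = v_new
    return v

# Hypothetical data on an 11-point grid of [0, 1]
x = np.linspace(0.0, 1.0, 11)
f = np.ones_like(x)      # constant running cost
psi = 2.0 + x            # state-dependent stopping cost
v = solve_stopping_time(f, psi)
```

Since the iteration map is a contraction with modulus `beta`, the fixed point is reached up to `tol`; the computed `v` never exceeds the obstacle `psi`.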
The convergence orders of these approximations are proved; they depend in
general on the H\"older constant of the value function. A special choice of
the relations between the discretization parameters yields a convergence
order that is valid without semiconcavity hypotheses on the problem's data.
We also present some numerical implementations in an example.