
    Stochastic Verification Theorem of Forward-Backward Controlled Systems for Viscosity Solutions

    In this paper, we investigate a controlled system described by forward-backward stochastic differential equations, with the control entering the drift, the diffusion and the generator of the BSDE. A new verification theorem is derived within the framework of viscosity solutions, without involving any derivatives of the value function. It is worth pointing out that this theorem has wider applicability than the restrictive classical verification theorems. As a related problem, optimal stochastic feedback controls for the forward-backward system are discussed as well.
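
    For orientation, a generic controlled forward-backward system of the kind described (a sketch only, not necessarily the authors' exact formulation; the coefficients $b$, $\sigma$, $g$, $\Phi$ and the admissible control set $\mathcal{U}$ are placeholders) reads
    \[
    \begin{aligned}
    dX_t &= b(t, X_t, u_t)\,dt + \sigma(t, X_t, u_t)\,dW_t, & X_0 &= x,\\
    dY_t &= -g(t, X_t, Y_t, Z_t, u_t)\,dt + Z_t\,dW_t, & Y_T &= \Phi(X_T),
    \end{aligned}
    \]
    with value function $V(x) = \sup_{u \in \mathcal{U}} Y_0^{u}$ (or an infimum, depending on the sign convention). A verification theorem then gives sufficient conditions under which a candidate feedback control attains this value.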

    Verification Theorems for Stochastic Optimal Control Problems via a Time-Dependent Fukushima-Dirichlet Decomposition

    This paper presents a method for proving verification theorems for stochastic optimal control of finite-dimensional diffusion processes without control in the diffusion term. The value function is assumed to be continuous in time and once differentiable in the space variable ($C^{0,1}$), instead of once differentiable in time and twice in space ($C^{1,2}$) as in the classical results. The results are obtained using a time-dependent Fukushima-Dirichlet decomposition proved in a companion paper by the same authors using stochastic calculus via regularization. Applications, examples and comparisons with other similar results are also given.
    Comment: 34 pages. To appear in Stochastic Processes and their Applications.
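
    As a rough sketch of the setting (generic notation, not taken from the paper), the state equation has an uncontrolled diffusion coefficient and a standard cost functional,
    \[
    dX_s = b(s, X_s, u_s)\,ds + \sigma(s, X_s)\,dW_s,
    \qquad
    J(t,x;u) = \mathbb{E}\Big[\int_t^T \ell(s, X_s, u_s)\,ds + h(X_T)\Big],
    \]
    and a verification theorem states that a function $v \in C^{0,1}$ solving the associated Hamilton-Jacobi-Bellman equation in a suitable weak sense, with $v(T,\cdot) = h$, dominates $J$ and coincides with the value function along an optimal control. The time-dependent Fukushima-Dirichlet decomposition plays the role of Itô's formula for $v(s, X_s)$ when $v$ is not $C^{1,2}$.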

    Controlled diffusion processes

    This article gives an overview of developments in controlled diffusion processes, emphasizing key results on the existence of optimal controls and their characterization via dynamic programming for a variety of cost criteria and structural assumptions. The stochastic maximum principle and control under partial observations (equivalently, control of nonlinear filters) are also discussed. Several other related topics are briefly sketched.
    Comment: Published at http://dx.doi.org/10.1214/154957805100000131 in Probability Surveys (http://www.i-journals.org/ps/) by the Institute of Mathematical Statistics (http://www.imstat.org).
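
    For concreteness, the basic object surveyed is a controlled diffusion (generic notation, as a sketch of the standard setup rather than the survey's own)
    \[
    dX_t = b(X_t, u_t)\,dt + \sigma(X_t, u_t)\,dW_t,
    \]
    with, for example, a discounted cost $J(x,u) = \mathbb{E}_x\int_0^\infty e^{-\alpha t} c(X_t, u_t)\,dt$. Dynamic programming characterizes the value function $V(x) = \inf_u J(x,u)$ as a solution of the Hamilton-Jacobi-Bellman equation
    \[
    \alpha V(x) = \min_{u} \Big[\, c(x,u) + b(x,u)\cdot \nabla V(x) + \tfrac12 \operatorname{tr}\big(\sigma\sigma^{\top}(x,u)\,\nabla^2 V(x)\big) \Big].
    \]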

    Stochastic Target Games and Dynamic Programming via Regularized Viscosity Solutions

    We study a class of stochastic target games in which one player tries to find a strategy such that the state process almost surely reaches a given target, no matter which action is chosen by the opponent. Our main result is a geometric dynamic programming principle which allows us to characterize the value function as the viscosity solution of a nonlinear partial differential equation. Because abstract measurable selection arguments cannot be used in this context, the main obstacle is the construction of measurable almost-optimal strategies. We propose a novel approach in which smooth supersolutions are used to define almost-optimal strategies of Markovian type, in the spirit of verification arguments for classical solutions of Hamilton-Jacobi-Bellman equations. The smooth supersolutions are constructed by an extension of Krylov's method of shaken coefficients. We apply our results to a problem of option pricing under model uncertainty with different interest rates for borrowing and lending.
    Comment: To appear in MO
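
    In generic notation (not necessarily the authors'), the value function of a stochastic target game of this type can be written as
    \[
    v(t,x) = \inf\big\{\, y \in \mathbb{R} : \exists\, \mathfrak{u}[\cdot] \text{ such that } Z^{t,x,y,\mathfrak{u}[\nu],\nu}_T \in G \ \text{a.s. for every adversarial control } \nu \,\big\},
    \]
    where $Z^{t,x,y,\mathfrak{u},\nu}$ denotes the controlled state started from $(x,y)$ at time $t$, $\mathfrak{u}[\cdot]$ is a non-anticipating strategy responding to the opponent's control $\nu$, and $G$ is the target set. The geometric dynamic programming principle propagates this reachability condition backward in time and yields the PDE characterization of $v$.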