    Data-Driven Chance Constrained Optimization under Wasserstein Ambiguity Sets

    We present a data-driven approach for distributionally robust chance constrained optimization problems (DRCCPs). We consider the case where the decision maker has access to a finite number of samples or realizations of the uncertainty. The chance constraint is then required to hold for all distributions that are close to the empirical distribution constructed from the samples (where the distance between two distributions is defined via the Wasserstein metric). We first reformulate DRCCPs under data-driven Wasserstein ambiguity sets and a general class of constraint functions. When the feasibility set of the chance constraint program is replaced by its convex inner approximation, we present a convex reformulation of the program and show its tractability when the constraint function is affine in both the decision variable and the uncertainty. For constraint functions concave in the uncertainty, we show that a cutting-surface algorithm converges to an approximate solution of the convex inner approximation of DRCCPs. Finally, for constraint functions convex in the uncertainty, we compare the feasibility set with other sample-based approaches for chance constrained programs. Comment: A shorter version is submitted to the American Control Conference, 201
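    As a concrete reference point (not code from the paper), the ambiguity set described above is a Wasserstein ball centred at the empirical distribution of the observed samples. A minimal one-dimensional sketch, using `scipy.stats.wasserstein_distance` and hypothetical helper names:

    ```python
    import numpy as np
    from scipy.stats import wasserstein_distance

    def empirical_wasserstein_ball(samples, radius):
        """Return a membership test for the 1-Wasserstein ball of the given
        radius centred at the empirical distribution of `samples` (1-D case)."""
        samples = np.asarray(samples, dtype=float)

        def contains(other_samples):
            # distance between the two empirical distributions
            d = wasserstein_distance(samples, np.asarray(other_samples, dtype=float))
            return d <= radius

        return contains

    in_ball = empirical_wasserstein_ball([0.0, 1.0, 2.0], radius=0.5)
    # The empirical distribution is trivially in its own ball (distance 0),
    # while a distribution shifted by 10 lies far outside a radius-0.5 ball.
    ```

    In the paper's setting, shrinking the radius as the sample size grows recovers the original (non-robust) chance constraint in the limit.
    
    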

    Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserstein Ambiguity Sets

    We study stochastic optimization problems with chance and risk constraints, where in the latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the distributionally robust versions of these problems, where the constraints are required to hold for a family of distributions constructed from the observed realizations of the uncertainty via the Wasserstein distance. Our main results establish that if the samples are drawn independently from an underlying distribution and the problems satisfy suitable technical assumptions, then the optimal value and optimizers of the distributionally robust versions of these problems converge to the respective quantities of the original problems, as the sample size increases.
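    For reference (not from the paper), the sample-based CVaR appearing in these constraints can be estimated as the average of the worst (1 − α) fraction of observed losses. A minimal sketch with a hypothetical helper name:

    ```python
    import numpy as np

    def empirical_cvar(losses, alpha):
        """Empirical CVaR at level alpha: mean of the worst (1 - alpha)
        fraction of the loss samples."""
        losses = np.sort(np.asarray(losses, dtype=float))[::-1]  # descending
        k = max(1, int(np.ceil((1.0 - alpha) * losses.size)))    # tail size
        return losses[:k].mean()

    # With losses 1..10 and alpha = 0.8, the worst 20% are {10, 9},
    # so the empirical CVaR is 9.5.
    ```

    The consistency results above say, roughly, that as the sample size grows, constraints built from such estimates (robustified over the Wasserstein ball) converge to their population counterparts.
    
    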

    Wasserstein distributionally robust risk-constrained iterative MPC for motion planning: computationally efficient approximations

    This paper considers a risk-constrained motion planning problem and solves it by combining iterative model predictive control (MPC) with data-driven distributionally robust (DR) risk-constrained optimization. In the iterative MPC, at each iteration, safe states visited and stored in previous iterations are imposed as terminal constraints. Furthermore, samples collected during an iteration are used in subsequent iterations to tune the ambiguity set of the DR constraints employed in the MPC. In this method, the MPC problem becomes computationally burdensome as the number of iterations grows. To overcome this challenge, the emphasis of this paper is on reducing the real-time computational effort using two approximations. The first clusters the data at the beginning of each iteration and modifies the ambiguity set for the MPC scheme so that the safety guarantees still hold. The second determines DR-safe regions at the start of each iteration and constrains the state in the MPC scheme to these safe sets. We analyze the computational tractability of both approximations and present a simulation example of path planning in the presence of a randomly moving obstacle. Comment: 8 pages, 6 figures, Proceedings of the IEEE Conference on Decision and Control, Singapore, 202
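    To illustrate the flavour of the first approximation (not the paper's exact algorithm), clustering replaces the raw sample set with a small weighted set of centres before the ambiguity set is built. A generic sketch using plain Lloyd's k-means, with hypothetical names:

    ```python
    import numpy as np

    def cluster_samples(samples, k, iters=50, seed=0):
        """Plain Lloyd's k-means. Returns cluster centres and per-cluster
        weights; the weighted centres act as a reduced sample set from
        which a (suitably enlarged) ambiguity set can be constructed."""
        rng = np.random.default_rng(seed)
        pts = np.asarray(samples, dtype=float)
        # initialise centres at k distinct sample points
        centres = pts[rng.choice(pts.shape[0], size=k, replace=False)]
        for _ in range(iters):
            # assign each sample to its nearest centre
            dists = np.linalg.norm(pts[:, None, :] - centres[None, :, :], axis=2)
            labels = np.argmin(dists, axis=1)
            # move each centre to the mean of its assigned samples
            for j in range(k):
                if np.any(labels == j):
                    centres[j] = pts[labels == j].mean(axis=0)
        weights = np.bincount(labels, minlength=k) / pts.shape[0]
        return centres, weights

    # Two well-separated pairs of samples collapse to two centres,
    # each carrying half the probability mass.
    centres, weights = cluster_samples([[0, 0], [0.1, 0], [5, 5], [5.1, 5]], k=2)
    ```

    The trade-off is fewer constraint-generating points per MPC solve, at the cost of enlarging the ambiguity-set radius to keep the safety guarantee valid.
    
    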