3,702 research outputs found

    Distributionally Robust Optimization: A Review

    The concepts of risk aversion, chance-constrained optimization, and robust optimization have developed significantly over the last decade. The statistical learning community has also witnessed rapid theoretical and applied growth by relying on these concepts. A modeling framework called distributionally robust optimization (DRO) has recently received significant attention in both the operations research and statistical learning communities. This paper surveys the main concepts of and contributions to DRO, and its relationships with robust optimization, risk aversion, chance-constrained optimization, and function regularization.
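    The worst-case idea at the heart of DRO can be sketched in a few lines. The following toy example is not from the survey; the demand support, candidate distributions, and cost coefficients are all hypothetical. It minimizes the worst-case expected newsvendor cost over a finite ambiguity set of demand distributions.

```python
import numpy as np

# Illustrative DRO sketch: minimize the worst-case expected cost over an
# ambiguity set -- here just a finite set of candidate discrete demand
# distributions (all numbers are hypothetical).
demand = np.array([10, 20, 30, 40])                    # demand support
ambiguity_set = [
    np.array([0.25, 0.25, 0.25, 0.25]),                # uniform
    np.array([0.10, 0.40, 0.40, 0.10]),                # peaked
    np.array([0.40, 0.10, 0.10, 0.40]),                # bimodal
]

def newsvendor_cost(q, d, c_over=1.0, c_under=3.0):
    """Overage/underage cost for order quantity q and demand d."""
    return c_over * np.maximum(q - d, 0) + c_under * np.maximum(d - q, 0)

def worst_case_expected_cost(q):
    """sup over the ambiguity set of E_P[cost(q, demand)]."""
    return max(float(p @ newsvendor_cost(q, demand)) for p in ambiguity_set)

# Brute-force search over candidate order quantities.
candidates = np.arange(10, 41)
q_star = min(candidates, key=worst_case_expected_cost)
```

    A full-scale DRO model would replace the finite ambiguity set with, e.g., a moment-based or Wasserstein set and the grid search with a convex reformulation, as the survey discusses.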

    Theory and Applications of Robust Optimization

    In this paper we survey the primary research, both theoretical and applied, in the area of Robust Optimization (RO). Our focus is on the computational attractiveness of RO approaches, as well as the modeling power and broad applicability of the methodology. In addition to surveying prominent theoretical results of RO, we also present some recent results linking RO to adaptable models for multi-stage decision-making problems. Finally, we highlight applications of RO across a wide spectrum of domains, including finance, statistics, learning, and various areas of engineering.

    Data-Driven Chance-Constrained Programs over Wasserstein Balls

    We provide an exact deterministic reformulation for data-driven, chance-constrained programs over Wasserstein balls. For individual chance constraints as well as joint chance constraints with right-hand-side uncertainty, our reformulation amounts to a mixed-integer conic program. In the special case of a Wasserstein ball with the 1-norm or the ∞-norm, the cone is the nonnegative orthant, and the chance-constrained program can be reformulated as a mixed-integer linear program. Our reformulation compares favorably to several state-of-the-art data-driven optimization schemes in our numerical experiments.
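    The effect of a Wasserstein ball on a chance constraint can be illustrated crudely with one scalar constraint. The sketch below is not the paper's mixed-integer reformulation; the data are synthetic, and the "robustification" simply shifts every sample adversarially by the ball radius, a deliberately simplistic 1-norm worst case.

```python
import numpy as np

# Hedged sketch of a data-driven individual chance constraint
# P[xi <= x] >= 1 - eps, evaluated (i) on the empirical distribution and
# (ii) robustified by moving each sample up by a Wasserstein-style
# radius theta (purely illustrative, not the exact reformulation).
rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=1.0, size=1000)   # synthetic data
eps, theta = 0.05, 0.2

def empirical_ok(x):
    return np.mean(samples <= x) >= 1 - eps

def robust_ok(x):
    # Worst case within radius theta shifts every sample adversarially.
    return np.mean(samples + theta <= x) >= 1 - eps

x_emp = np.quantile(samples, 1 - eps)   # smallest x meeting the empirical test
```

    The robust constraint is satisfied only at `x_emp + theta`, i.e., the Wasserstein ball buys distributional protection at the price of a tighter feasible set, which is the trade-off the exact conic reformulation quantifies.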

    Multi-Stage Decision Rules for Power Generation & Storage Investments with Performance Guarantees

    We develop multi-stage linear decision rules (LDRs) for dynamic power system generation and energy storage investment planning under uncertainty and propose their chance-constrained optimization with performance guarantees. First, the optimized LDRs guarantee operational and carbon-policy feasibility of the resulting dynamic investment plan even when the planning uncertainty distribution is ambiguous. Second, the optimized LDRs internalize the tolerance of the system planner towards the stochasticity (variance) of uncertain investment outcomes. They can eventually produce a quasi-deterministic investment plan, which is insensitive to uncertainty (as in deterministic planning) but robust to its realizations (as in stochastic planning). Finally, we certify the performance of the optimized LDRs with a bound on their sub-optimality due to their linear functional form. Using this bound, we guarantee that the preference for LDRs over less restrictive -- yet poorly scalable -- scenario-based optimization does not lead to financial losses exceeding this bound. We use a testbed of the U.S. Southeast power system to reveal the trade-offs between the cost, stochasticity, and feasibility of LDR-based investments. We also conclude that the LDR sub-optimality depends on the amount of uncertainty and the tightness of chance constraints on operational, investment, and policy variables.
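    A linear decision rule makes each stage's decision an affine function of the uncertainty observed so far. The sketch below is hypothetical (coefficients, demand model, and constraint are invented, not the paper's model); it shows the non-anticipative structure and an empirical check of a chance constraint across sampled scenarios.

```python
import numpy as np

# Illustrative linear decision rule (LDR): the stage-t investment is an
# affine function of the uncertainty observed so far, x_t = b_t + A_t @ xi_{1:t}.
rng = np.random.default_rng(1)
T = 3
xi = rng.uniform(0.8, 1.2, size=(5000, T))     # demand-growth scenarios

b = np.array([10.0, 5.0, 5.0])                 # intercepts per stage (hypothetical)
A = np.tril(np.full((T, T), 4.0))              # lower-triangular: only past xi enter

def ldr_decisions(xi_path):
    """Per-stage investment under the LDR; non-anticipative by construction."""
    return b + A @ xi_path

# Empirical chance-constraint check: cumulative capacity must cover
# cumulative demand in at least 95% of scenarios.
demand = 12.0 * np.cumsum(xi, axis=1)
capacity = np.cumsum(np.array([ldr_decisions(p) for p in xi]), axis=1)
coverage = np.mean(np.all(capacity >= demand, axis=1))
```

    Restricting decisions to this affine form is what makes multi-stage planning tractable, and the gap between the best LDR and the best unrestricted policy is exactly the sub-optimality the paper bounds.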

    Convex Nonlinear and Integer Programming Approaches for Distributionally Robust Optimization of Complex Systems

    The primary focus of the dissertation is to develop distributionally robust optimization (DRO) models and related solution approaches for decision making in energy and healthcare service systems with uncertainties, which often involve nonlinear constraints and discrete decision variables. Without assuming specific distributions, DRO techniques solve for solutions against the worst-case distribution of system uncertainties. In the DRO framework, we consider both risk-neutral (e.g., expectation) and risk-averse (e.g., chance constraint and Conditional Value-at-Risk (CVaR)) measures. The aim is twofold: i) developing efficient solution algorithms for DRO models with integer and/or binary variables and sometimes nonlinear structures, and ii) revealing managerial insights from DRO models for specific applications. We mainly focus on DRO models of power system operations, appointment scheduling, and resource allocation in healthcare. Specifically, we first study stochastic optimal power flow (OPF), where (uncertain) renewable integration and load control are implemented to balance supply and (uncertain) demand in power grids. We propose a chance-constrained OPF (CC-OPF) model and investigate its DRO variant, which is reformulated as a semidefinite programming (SDP) problem. We compare the DRO model with two benchmark models in the IEEE 9-bus, 39-bus, and 118-bus systems with different flow congestion levels. The DRO approach yields a higher probability of satisfying the chance constraints and shorter solution times. It also better utilizes reserves at both generators and loads when the system has congested flows. Then we consider appointment scheduling under random service durations with a given (fixed) appointment arrival order. We propose a DRO formulation and derive a conservative SDP reformulation. Furthermore, we study a scheduling variant under random no-shows of appointments and derive tractable reformulations for certain beliefs of no-show patterns.
    One preceding problem of appointment scheduling in healthcare service operations is the surgery block allocation problem, which assigns surgeries to operating rooms. We derive an equivalent 0-1 SDP reformulation and a less conservative 0-1 second-order cone programming (SOCP) reformulation for its DRO model. Finally, we study distributionally robust chance-constrained binary programs (DCBPs) for limiting the probability of undesirable events under mean-covariance information. We reformulate DCBPs as equivalent 0-1 SOCP formulations under two moment-based ambiguity sets. We further exploit the submodularity of the 0-1 SOCP reformulations under diagonal and non-diagonal matrices. We derive extended polymatroid inequalities via submodularity and lifting, which are incorporated into a branch-and-cut algorithm for efficiently solving DCBPs. We demonstrate the computational efficacy and solution performance with diverse instances of a chance-constrained bin packing problem.
    PhD dissertation, Industrial & Operations Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/149946/1/zyiling_1.pd
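    A standard ingredient behind such moment-based SOCP reformulations is the Chebyshev-style bound: a chance constraint P[a·ξ ≤ b] ≥ 1 − ε required over all distributions with mean μ and covariance Σ is equivalent to the second-order cone constraint a·μ + √((1−ε)/ε)·√(aᵀΣa) ≤ b. The numbers below are illustrative, not from the dissertation.

```python
import numpy as np

# Distributionally robust chance constraint under mean-covariance ambiguity:
#   P[a @ xi <= b] >= 1 - eps  for all distributions with mean mu, cov Sigma
# is equivalent to
#   a @ mu + sqrt((1 - eps)/eps) * sqrt(a @ Sigma @ a) <= b.
mu = np.array([1.0, 2.0])                      # illustrative moments
Sigma = np.array([[0.5, 0.1], [0.1, 0.3]])
a = np.array([1.0, 1.0])
eps = 0.05

std = np.sqrt(a @ Sigma @ a)
b_robust = a @ mu + np.sqrt((1 - eps) / eps) * std   # DRO requirement on b
b_gauss = a @ mu + 1.6449 * std                      # Gaussian-only quantile, for contrast
```

    The DRO factor √(0.95/0.05) ≈ 4.36 versus the Gaussian 1.645 shows the conservatism bought by distributional robustness; with binary variables this cone constraint becomes the 0-1 SOCP the dissertation attacks via submodularity.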

    STREAM WATER QUALITY MANAGEMENT: A STOCHASTIC MIXED-INTEGER PROGRAMMING MODEL

    Water quality management under the watershed approach of Total Maximum Daily Load (TMDL) programs requires that water quality standards be maintained throughout the year. The main purpose of this research was to develop a methodology that incorporates inter-temporal variations in stream conditions through statistical distributions of pollution loading variables. This was demonstrated through a cost-minimization mixed-integer linear programming (MIP) model that maintains the spatial integrity of the watershed problem. Traditional approaches for addressing variability in stream conditions either rest on assumptions that are unlikely to be satisfied or handle the problem inadequately when distributions are not normal. The MIP model solves for the location and the maximum capacity of treatment plants to be built throughout the watershed, providing the optimal level of treatment throughout the year. The proposed methodology involves estimating the parameters of the distribution of pollution loading variables from simulated data and using those parameters to re-generate a suitable number of random observations in the optimization process such that the new data preserve the same distribution parameters. The objective of the empirical model was to minimize the costs of implementing pH TMDLs for a watershed by determining the level of treatment required to attain water quality standards under stochastic stream conditions. The output of the model was the total minimum cost for treatment and the spatial pattern of least-cost treatment technologies. To minimize costs, the model utilized a spatial network of streams in the watershed, which provides opportunities for cost reduction through trading of pollution among sources and/or least-cost treatment. The results were used to estimate the costs attributable to inter-temporal variations and the costs of different settings for the margin of safety.
    The methodology was tested with water quality data for the Paint Creek watershed in West Virginia. The stochastic model included nine streams in the optimal solution. An estimate of inter-temporal variations in stream conditions was calculated by comparing total costs under the stochastic model and a deterministic version estimated with mean values of the loading variables. The deterministic model was observed to underestimate total treatment cost by about 45 percent relative to the 97th-percentile stochastic model. Estimates of different margins of safety were calculated by comparing total costs for the 99.9th-percentile treatment (instead of an idealistic absolute treatment) with those of the 95th- to 99th-percentile treatments. The differential costs represent the savings due to knowledge of the statistical distribution of pollution and an explicit margin of safety. Results indicate that treatment costs are about 7 percent lower when the level of assurance is reduced from 99.9 to 99 percent and 21 percent lower when 95 percent assurance is selected. The application of the methodology, however, is not limited to the estimation of TMDL implementation costs. For example, it could be utilized to estimate the costs of anti-degradation policies for water quality management and to address other watershed management issues.
    Resource/Energy Economics and Policy
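    The underestimation effect reported above has a simple intuition when loadings are skewed: sizing treatment to the mean loading leaves it short on a large fraction of days. The sketch below uses synthetic lognormal loadings in hypothetical units, not the Paint Creek data.

```python
import numpy as np

# Compare treatment capacity sized to the mean loading (deterministic)
# with capacity sized to the 97th percentile (stochastic). Loadings are
# synthetic and right-skewed, as pollution loadings often are.
rng = np.random.default_rng(2)
loading = rng.lognormal(mean=2.0, sigma=0.8, size=10_000)  # hypothetical tons/day

capacity_det = loading.mean()                 # deterministic sizing
capacity_p97 = np.quantile(loading, 0.97)     # stochastic, 97th percentile

# Fraction of days each design fails to treat the full loading.
fail_det = np.mean(loading > capacity_det)
fail_p97 = np.mean(loading > capacity_p97)
```

    With a skewed distribution, the mean-sized plant violates the standard on roughly a third of days, while the percentile-sized plant fails on about 3 percent, which is the explicit margin-of-safety trade-off the model prices out.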