    Constrained Optimization of Rank-One Functions with Indicator Variables

    Optimization problems involving minimization of a rank-one convex function over constraints modeling restrictions on the support of the decision variables emerge in various machine learning applications. These problems are often modeled with indicator variables for identifying the support of the continuous variables. In this paper, we investigate compact extended formulations for such problems through perspective reformulation techniques. In contrast to the majority of previous work, which relies on support function arguments and disjunctive programming techniques to provide convex hull results, we propose a constructive approach that exploits a hidden conic structure induced by perspective functions. To this end, we first establish a convex hull result for a general conic mixed-binary set in which each conic constraint involves a linear function of independent continuous variables and a set of binary variables. We then demonstrate that extended representations of sets associated with epigraphs of rank-one convex functions over constraints modeling indicator relations naturally admit such a conic representation. This enables us to systematically give perspective formulations for the convex hull descriptions of these sets with nonlinear separable or non-separable objective functions, sign constraints on continuous variables, and combinatorial constraints on indicator variables. We illustrate the efficacy of our results on sparse nonnegative logistic regression problems.
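
    A minimal illustration of the perspective machinery this abstract builds on (our own notation, a single indicator, and none of the paper's general setting): for a rank-one quadratic with epigraph variable t and an indicator z that forces the continuous variables to vanish when it is off, the perspective strengthening replaces the epigraph constraint by a rotated second-order cone constraint,

    \[
    t \ge (a^\top x)^2, \;\; x\,(1-z) = 0, \;\; z \in \{0,1\}
    \quad\leadsto\quad
    t\,z \ge (a^\top x)^2, \;\; 0 \le z \le 1.
    \]

    The strengthened constraint forces a^\top x = 0 when z = 0, reduces to the original epigraph constraint at z = 1, and is conic representable; constraints of this shape are the kind of hidden conic structure the constructive approach above works with.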

    Perspective reformulations of the CTA problem with L_2 distances

    Any institution that disseminates data in aggregated form has the duty to ensure that individual confidential information is not disclosed, either by not releasing data or by perturbing the released data, while maintaining data utility. Controlled tabular adjustment (CTA) is a promising technique of the second type, where a protected table that is close to the original one in some chosen distance is constructed. The choice of the specific distance entails a trade-off: while the Euclidean distance has been shown (and is confirmed here) to produce tables with greater “utility”, it gives rise to Mixed Integer Quadratic Problems (MIQPs) with pairs of linked semi-continuous variables that are more difficult to solve than the Mixed Integer Linear Problems corresponding to linear norms. We provide a novel analysis of Perspective Reformulations (PRs) for this special structure; in particular, we devise a Projected PR (P2R) which is piecewise-conic but simplifies to a (nonseparable) MIQP when the instance is symmetric. We then compare different formulations of the CTA problem, showing that the ones based on P2R most often obtain better computational results.
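
    For orientation, a standard perspective reformulation sketch for a single semi-continuous variable with a quadratic cost (our notation, not the paper's CTA model): if y is either 0 or lies in [l, u] with l > 0, controlled by a binary z, then the term c y^2 contributed to the L_2 objective can be replaced by its perspective,

    \[
    c\,\frac{y^2}{z} \quad\text{subject to}\quad l\,z \le y \le u\,z, \;\; z \in \{0,1\},
    \]

    whose continuous relaxation (0 \le z \le 1) is tighter than that of c y^2 under the same linking constraints. The Projected PR analyzed in the paper builds on this idea for the pairs of linked semi-continuous variables that arise in CTA.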

    Certification of Real Inequalities -- Templates and Sums of Squares

    We consider the problem of certifying lower bounds for real-valued multivariate transcendental functions. The functions we are dealing with are nonlinear and involve semialgebraic operations as well as some transcendental functions like cos, arctan, exp, etc. Our general framework is to use different approximation methods to relax the original problem into polynomial optimization problems, which we solve by sparse sums of squares relaxations. In particular, we combine the ideas of the maxplus estimators (originally introduced in optimal control) and of the linear templates (originally introduced in static analysis by abstract interpretation). The nonlinear templates control the complexity of the semialgebraic relaxations at the price of coarsening the maxplus approximations. In that way, we arrive at a new, template-based, certified global optimization method, which exploits both the precision of sums of squares relaxations and the scalability of abstraction methods. We analyze the performance of the method on problems from the global optimization literature, as well as medium-size inequalities arising from the Flyspeck project.
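
    A minimal sketch of the maxplus lower estimators mentioned above (our notation; the generic construction, not the paper's specific bounds): if a univariate function f is c-semiconvex, i.e. x \mapsto f(x) + (c/2) x^2 is convex, then for any finite set of sample points a_1, ..., a_k,

    \[
    f(x) \;\ge\; \max_{1 \le i \le k} \Bigl( f(a_i) + f'(a_i)\,(x - a_i) - \tfrac{c}{2}\,(x - a_i)^2 \Bigr).
    \]

    Replacing a transcendental component such as arctan by this max of concave parabolas yields a semialgebraic relaxation that sparse sums of squares relaxations can certify; the templates then limit how many such basis functions enter the relaxation, trading precision for tractability.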

    Distributionally Robust Optimization: A Review

    The concepts of risk-aversion, chance-constrained optimization, and robust optimization have developed significantly over the last decade. The statistical learning community has also witnessed rapid theoretical and applied growth by relying on these concepts. A modeling framework, called distributionally robust optimization (DRO), has recently received significant attention in both the operations research and statistical learning communities. This paper surveys the main concepts of and contributions to DRO, and its relationships with robust optimization, risk-aversion, chance-constrained optimization, and function regularization.
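
    For context, the generic formulation behind the acronym (standard notation, not specific to this survey): a distributionally robust program hedges against every distribution in an ambiguity set \mathcal{P},

    \[
    \min_{x \in X} \;\sup_{\mathbb{P} \in \mathcal{P}} \; \mathbb{E}_{\mathbb{P}}\bigl[\ell(x,\xi)\bigr],
    \]

    where \ell is the loss incurred by decision x under the random parameter \xi. A singleton \mathcal{P} recovers stochastic programming, while an ambiguity set containing all distributions on a given support recovers classical robust optimization, which is one of the relationships the review traces.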