
    Strong-weak Stackelberg Problems in Finite Dimensional Spaces

    We are concerned with two-level optimization problems called strong-weak Stackelberg problems, generalizing the class of Stackelberg problems in the strong and weak sense. In order to handle the fact that the considered two-level optimization problems may fail to have a solution under mild assumptions, we consider a regularization involving ε-approximate optimal solutions in the lower-level problems. We prove the existence of optimal solutions for such regularized problems and present some approximation results when the parameter ε goes to zero. Finally, as an example, we consider an optimization problem associated with a best bound given in [2] for a system of nondifferentiable convex inequalities.
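    As a rough illustration of the regularization described in this abstract (notation assumed here rather than quoted from the paper: leader variable $x$ with cost $F$, follower variable $y$ with cost $f$, and a mixing weight $\lambda$), the ε-approximate lower-level solution set and one way to interpolate between the strong and weak problems can be written as

    \[
      M_\varepsilon(x) = \Big\{\, y \in Y : f(x,y) \le \inf_{z \in Y} f(x,z) + \varepsilon \,\Big\},
    \]
    \[
      \min_{x \in X}\; \Big[\, \lambda \sup_{y \in M_\varepsilon(x)} F(x,y) + (1-\lambda) \inf_{y \in M_\varepsilon(x)} F(x,y) \,\Big], \qquad \lambda \in [0,1],
    \]

    where $\lambda = 1$ gives the weak (pessimistic) problem and $\lambda = 0$ the strong (optimistic) one. This is only a sketch of the general idea, not the paper's exact formulation.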

    Co-evolutionary Hybrid Bi-level Optimization

    Multi-level optimization stems from the need to tackle complex problems involving multiple decision makers. Two-level optimization, referred to as ``bi-level optimization'', occurs when two decision makers each control only part of the decision variables but impact each other (e.g., objective value, feasibility). Bi-level problems are sequential by nature and can be represented as nested optimization problems in which one problem (the ``upper level'') is constrained by another one (the ``lower level''). The nested structure is a real obstacle that can be highly time-consuming when the lower level is $\mathcal{NP}$-hard. Consequently, classical nested optimization should be avoided. Some surrogate-based approaches have been proposed to approximate the lower-level objective value function (or variables) in order to reduce the number of times the lower level is globally optimized. Unfortunately, such a methodology is not applicable to large-scale and combinatorial bi-level problems. After a deep study of theoretical properties and a survey of existing applications that are bi-level by nature, problems which can benefit from a bi-level reformulation are investigated.

    A first contribution of this work has been to propose a novel bi-level clustering approach. Extending the well-known ``uncapacitated k-median problem'', it has been shown that clustering can be easily modeled as a two-level optimization problem using decomposition techniques. The resulting two-level problem is then turned into a bi-level problem offering the possibility to combine distance metrics in a hierarchical manner. The novel bi-level clustering problem has a very interesting property that enables us to tackle it with classical nested approaches: its lower-level problem can be solved in polynomial time. In cooperation with the Luxembourg Centre for Systems Biomedicine (LCSB), this new clustering model has been applied to real datasets such as disease maps (e.g., Parkinson's, Alzheimer's). Using a novel hybrid and parallel genetic algorithm as the optimization approach, the results obtained after a campaign of experiments produce new knowledge compared to classical clustering techniques that combine distance metrics in the usual manner.

    The previous bi-level clustering model has the advantage that its lower level can be solved in polynomial time, although the global problem is by definition $\mathcal{NP}$-hard. Therefore, further investigations have been undertaken to tackle more general bi-level problems in which the lower-level problem does not present any specific advantageous properties. Since the lower-level problem can be very expensive to solve, the focus has been turned to surrogate-based approaches and hyper-parameter optimization techniques with the aim of approximating the lower-level problem and reducing the number of global lower-level optimizations. Adapting the well-known Bayesian optimization algorithm to solve general bi-level problems, the number of expensive lower-level optimizations has been dramatically reduced while obtaining very accurate solutions. The resulting solutions and the number of spared lower-level optimizations have been compared to the results of the bi-level evolutionary algorithm based on quadratic approximations (BLEAQ) after a campaign of experiments on official bi-level benchmarks. Although both approaches are very accurate, the bi-level Bayesian version required fewer lower-level objective function calls.
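    To make the cost of the nested scheme described above concrete, the following toy Python sketch runs one full lower-level optimization for every upper-level candidate; all names (solve_lower_level, upper_objective, the quadratic toy objectives) are illustrative placeholders and are not taken from the thesis.

import random

def solve_lower_level(x):
    """Globally optimize the follower's problem for a fixed upper-level x.
    A toy 1-D quadratic grid search stands in for a possibly NP-hard problem."""
    grid = [i / 100 for i in range(-500, 501)]
    return min(grid, key=lambda y: (y - x) ** 2 + x * y)

def upper_objective(x, y):
    """Leader's cost, which depends on the follower's reaction y."""
    return x ** 2 + (y - 1) ** 2

def nested_bilevel_search(n_candidates=200, seed=0):
    """Classical nested scheme: one full lower-level solve per upper-level trial."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_candidates):
        x = rng.uniform(-2.0, 2.0)      # upper-level trial point
        y = solve_lower_level(x)        # expensive inner optimization
        val = upper_objective(x, y)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

if __name__ == "__main__":
    print(nested_bilevel_search())

    The surrogate-based and Bayesian strategies described in the abstract aim precisely at replacing most of these inner calls with a cheap approximation.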
    Surrogate-based approaches are restricted to small-scale and continuous bi-level problems, although many real applications are combinatorial by nature. As with continuous problems, a study has been performed to apply machine learning strategies. Instead of approximating the lower-level solution value, new approximation algorithms for the discrete/combinatorial case have been designed. Using the principle employed in GP hyper-heuristics, heuristics are trained to efficiently tackle the $\mathcal{NP}$-hard lower level of bi-level problems. This automatic generation of heuristics makes it possible to break the nested structure into two separate phases: \emph{training lower-level heuristics} and \emph{solving the upper-level problem with the new heuristics}. On this occasion, a second modeling contribution has been introduced through a novel large-scale and mixed-integer bi-level problem dealing with pricing in the cloud, i.e., the Bi-level Cloud Pricing Optimization Problem (BCPOP). After a series of experiments that consisted of training heuristics on various lower-level instances of the BCPOP and using them to tackle the bi-level problem itself, the obtained results are compared to the ``cooperative coevolutionary algorithm for bi-level optimization'' (COBRA).

    Although training heuristics makes it possible to \emph{break the nested structure}, a two-phase optimization is still required. Therefore, the emphasis has been put on training heuristics while optimizing the upper-level problem using competitive co-evolution. Instead of adopting the classical decomposition scheme as done by COBRA, which suffers from the strong epistatic links between lower-level and upper-level variables, co-evolving the solution and the means to get to it can cope with these epistatic-link issues. The ``CARBON'' algorithm developed in this thesis is a competitive and hybrid co-evolutionary algorithm designed for this purpose. In order to validate the potential of CARBON, numerical experiments have been designed and the results have been compared to state-of-the-art algorithms. These results demonstrate that ``CARBON'' makes it possible to address nested optimization efficiently.
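    As a purely illustrative sketch of the two-phase idea (train a cheap lower-level heuristic offline, then reuse it inside the upper-level search), the following toy Python code uses a trivial linear fit in place of GP hyper-heuristic training; every name and objective here is an assumption and not the thesis's BCPOP or CARBON implementation.

import random

def exact_lower_level(x):
    """Expensive reference solver, used only during the training phase."""
    grid = [i / 100 for i in range(-500, 501)]
    return min(grid, key=lambda y: (y - x) ** 2 + x * y)

def train_heuristic(training_xs):
    """Fit a trivial linear rule y ~ a*x + b to the exact reactions.
    Stands in for training heuristics on lower-level instances."""
    pairs = [(x, exact_lower_level(x)) for x in training_xs]
    xm = sum(x for x, _ in pairs) / len(pairs)
    ym = sum(y for _, y in pairs) / len(pairs)
    a = sum((x - xm) * (y - ym) for x, y in pairs) / sum((x - xm) ** 2 for x, _ in pairs)
    b = ym - a * xm
    return lambda x: a * x + b           # cheap approximate follower reaction

def upper_objective(x, y):
    """Leader's cost for a toy continuous example."""
    return x ** 2 + (y - 1) ** 2

if __name__ == "__main__":
    heuristic = train_heuristic([-2 + 0.5 * i for i in range(9)])    # phase 1: training
    rng = random.Random(0)
    best = min((rng.uniform(-2, 2) for _ in range(500)),             # phase 2: upper-level search
               key=lambda x: upper_objective(x, heuristic(x)))
    print(best, upper_objective(best, heuristic(best)))

    Once trained, the heuristic makes each upper-level evaluation cheap, which is the point of breaking the nesting; a genuine implementation would train on real lower-level instances (e.g., of the BCPOP) rather than a toy quadratic.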

    Approximate Chance-Constrained Unit Commitment Under Wind Energy Penetration

    We study a multi-period unit commitment problem under wind energy penetration, in which the load balance is enforced with a predefined confidence level across the whole system and over the planning horizon. Since, except for special cases, chance-constrained problems are non-convex, we analyze two relaxations of the load balance based upon robust optimization ideas and estimated quantiles of the marginal distributions of the net load processes. The approximation proposals are benchmarked against the well-known scenario approximation. Under the scenario approach, we also analyze a simple decomposition strategy to find a lower bound of the approximate problem when the latter becomes intractable due to the size of the set of scenarios. The reliability of the obtained solutions, as well as their runtimes, is examined on three widespread test systems.
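    To fix ideas, a joint (system-wide, horizon-wide) chance constraint on the load balance of the kind discussed above can be written, with notation assumed here rather than taken from the paper ($p_{g,t}$: dispatch of unit $g$ in period $t$, $w_t$: wind output, $d_t$: load, $1-\epsilon$: confidence level), as

    \[
      \mathbb{P}\Big( \sum_{g \in \mathcal{G}} p_{g,t} + w_t \,\ge\, d_t, \quad t = 1, \dots, T \Big) \;\ge\; 1 - \epsilon,
    \]

    which the paper approximates via robust sets, quantile estimates of the net load, or sampled scenarios; the exact symbols above are illustrative.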

    Limitations of semidefinite programs for separable states and entangled games

    Semidefinite programs (SDPs) are a framework for exact or approximate optimization that has widespread application in quantum information theory. We introduce a new method for using reductions to construct integrality gaps for SDPs. These are based on new limitations on the sum-of-squares (SoS) hierarchy in approximating two particularly important sets in quantum information theory, where previously no $\omega(1)$-round integrality gaps were known: the set of separable (i.e., unentangled) states, or equivalently, the $2 \rightarrow 4$ norm of a matrix, and the set of quantum correlations, i.e., conditional probability distributions achievable with local measurements on a shared entangled state. In both cases no-go theorems were previously known based on computational assumptions such as the Exponential Time Hypothesis (ETH), which asserts that 3-SAT requires exponential time to solve. Our unconditional results achieve the same parameters as all of these previous results (for separable states) or as some of the previous results (for quantum correlations). In some cases we can make use of the framework of Lee-Raghavendra-Steurer (LRS) to establish integrality gaps for any SDP, not only the SoS hierarchy. Our hardness result on separable states also yields a dimension lower bound for approximate disentanglers, answering a question of Watrous and Aaronson et al. These results can be viewed as limitations on the monogamy principle, the PPT test, the ability of Tsirelson-type bounds to restrict quantum correlations, as well as the SDP hierarchies of Doherty-Parrilo-Spedalieri, Navascues-Pironio-Acin and Berta-Fawzi-Scholz.
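    For reference, the two objects named above have standard definitions (the notation here is the usual one, assumed rather than quoted from the paper): the set of separable states on $\mathbb{C}^d \otimes \mathbb{C}^d$ and the $2 \rightarrow 4$ norm of a matrix $A$ are

    \[
      \mathrm{Sep}(d) \;=\; \mathrm{conv}\big\{\, \rho \otimes \sigma \;:\; \rho, \sigma \ \text{density matrices on } \mathbb{C}^d \,\big\},
      \qquad
      \|A\|_{2 \to 4} \;=\; \max_{x \neq 0} \frac{\|Ax\|_4}{\|x\|_2}.
    \]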