
    Lower Bounds for the Average and Smoothed Number of Pareto Optima

    Smoothed analysis of multiobjective 0-1 linear optimization has drawn considerable attention recently. The number of Pareto-optimal solutions (i.e., solutions with the property that no other solution is at least as good in all coordinates and better in at least one) for multiobjective optimization problems is the central object of study. In this paper, we prove several lower bounds for the expected number of Pareto optima. Our basic result is a lower bound of $\Omega_d(n^{d-1})$ for optimization problems with $d$ objectives and $n$ variables under fairly general conditions on the distributions of the linear objectives. Our proof relates the problem of lower-bounding the number of Pareto optima to results in geometry connected to arrangements of hyperplanes. We use our basic result to derive: (1) to our knowledge, the first lower bound for natural multiobjective optimization problems; we illustrate this for the maximum spanning tree problem with randomly chosen edge weights, and our technique is sufficiently flexible to yield such lower bounds for other standard objective functions studied in this setting (such as multiobjective shortest path, TSP tour, and matching); and (2) a smoothed lower bound of $\min\{\Omega_d(n^{d-1.5}\,\phi^{(d-\log d)(1-\Theta(1/\phi))}),\ 2^{\Theta(n)}\}$ for a version of the 0-1 knapsack problem with $d$ profits under $\phi$-semirandom distributions. This improves the recent lower bound of Brunsch and Röglin.
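    The definition above is easy to test computationally. As a minimal, self-contained sketch (the Gaussian objectives, brute-force enumeration, and Monte Carlo averaging are illustrative assumptions, not the paper's construction), the following Python snippet estimates the expected number of Pareto optima over all 0-1 solutions:

        import itertools
        import numpy as np

        def pareto_count(values):
            # A row is Pareto-optimal (maximization) if no other row is >= in
            # every coordinate and > in at least one.
            ge = np.all(values[None, :, :] >= values[:, None, :], axis=2)
            gt = np.any(values[None, :, :] > values[:, None, :], axis=2)
            dominated = np.any(ge & gt, axis=1)
            return int(np.count_nonzero(~dominated))

        def expected_pareto_optima(n, d, trials=20, seed=0):
            # Monte Carlo estimate of E[#Pareto optima] for d independent
            # Gaussian linear objectives over all 2^n binary solutions.
            rng = np.random.default_rng(seed)
            solutions = np.array(list(itertools.product([0, 1], repeat=n)))
            counts = []
            for _ in range(trials):
                W = rng.standard_normal((n, d))   # one linear objective per column
                counts.append(pareto_count(solutions @ W))
            return float(np.mean(counts))

        print(expected_pareto_optima(n=10, d=2))

    Varying n and d in such an experiment lets one compare the empirical growth of the count against the $\Omega_d(n^{d-1})$ bound stated above.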

    Smoothed Analysis of Selected Optimization Problems and Algorithms

    Optimization problems arise in almost every field of economics, engineering, and science. Many of these problems are well understood in theory, and sophisticated algorithms exist to solve them efficiently in practice. Unfortunately, in many cases the theoretically most efficient algorithms perform poorly in practice. On the other hand, some algorithms are much faster than theory predicts. This discrepancy is a consequence of the pessimism inherent in the framework of worst-case analysis, the predominant analysis concept in theoretical computer science. We study selected optimization problems and algorithms in the framework of smoothed analysis in order to narrow the gap between theory and practice. In smoothed analysis, an adversary specifies the input, which is subsequently slightly perturbed at random.

    As one example we consider the successive shortest path algorithm for the minimum-cost flow problem. While in the worst case the successive shortest path algorithm takes exponentially many steps to compute a minimum-cost flow, we show that its running time is polynomial in the smoothed setting.

    Another problem studied in this thesis is makespan minimization for scheduling with related machines. It seems unlikely that fast algorithms exist to solve this problem exactly, which is why we consider three approximation algorithms: the jump algorithm, the lex-jump algorithm, and the list scheduling algorithm (sketched below). In the worst case, the approximation guarantees of these algorithms depend on the number of machines. We show that there is no such dependence in smoothed analysis.

    We also apply smoothed analysis to multicriteria optimization problems. In particular, we consider integer optimization problems with several linear objectives that have to be minimized simultaneously. We derive a polynomial upper bound for the size of the set of Pareto-optimal solutions, contrasting with the exponential worst-case lower bound.

    As the icing on the cake, we find that the insights gained from our smoothed analysis of the running time of the successive shortest path algorithm lead to the design of a randomized algorithm for finding short paths between two given vertices of a polyhedron. We see this result as an indication that, in the future, smoothed analysis might also lead to the development of fast algorithms.
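    To make the scheduling heuristics concrete, here is a minimal Python sketch of list scheduling on related machines followed by a jump-style local search; the job order, machine speeds, and the exact jump neighborhood are illustrative assumptions and may differ from the thesis's definitions:

        def list_schedule(jobs, speeds):
            # Greedy list scheduling on related machines: assign each job
            # (a processing requirement) in the given order to the machine
            # on which it would finish earliest.
            loads = [0.0] * len(speeds)
            assign = [[] for _ in speeds]
            for p in jobs:
                m = min(range(len(speeds)), key=lambda i: loads[i] + p / speeds[i])
                loads[m] += p / speeds[m]
                assign[m].append(p)
            return assign, loads

        def jump_improve(assign, loads, speeds):
            # Jump-style local search: repeatedly move a job off a most-loaded
            # machine whenever it can land on another machine strictly below
            # the current makespan. Terminates because the sorted load vector
            # decreases lexicographically with every move.
            while True:
                src = max(range(len(loads)), key=loads.__getitem__)
                move = next(((p, dst)
                             for p in assign[src]
                             for dst in range(len(loads))
                             if dst != src and loads[dst] + p / speeds[dst] < loads[src]),
                            None)
                if move is None:
                    return assign, loads
                p, dst = move
                assign[src].remove(p)
                loads[src] -= p / speeds[src]
                assign[dst].append(p)
                loads[dst] += p / speeds[dst]

        jobs = [7.0, 3.0, 5.0, 2.0, 8.0, 4.0]     # processing requirements
        speeds = [1.0, 2.0]                       # machine speeds
        assign, loads = jump_improve(*list_schedule(jobs, speeds), speeds)
        print(max(loads))                         # makespan at the local optimum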

    The smoothed number of Pareto-optimal solutions in bicriteria integer optimization


    Optimization of dispersive coefficients in the homogenization of the wave equation in periodic structures

    We study dispersive effects of wave propagation in periodic media, which can be modelled by adding a fourth-order term in the homogenized equation. The corresponding fourth-order dispersive tensor is called the Burnett tensor, and we numerically optimize its values in order to minimize or maximize dispersion. More precisely, we consider the case of a two-phase composite medium with an 8-fold symmetry assumption on the periodicity cell in two space dimensions. We obtain upper and lower bounds for the dispersive properties, along with optimal microgeometries.
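    Schematically, and under standard two-scale asymptotics (the notation below is an assumption of this sketch, not taken from the paper, and signs and normalizations of the Burnett tensor vary across references), the dispersive correction enters the homogenized wave equation as a fourth-order term:

        % v: homogenized field; A*: classical homogenized (second-order) tensor;
        % D*: fourth-order Burnett tensor; eps: period of the microstructure.
        \[
          \partial_{tt} v \;-\; \operatorname{div}\!\left(A^{*}\nabla v\right)
          \;-\; \varepsilon^{2} \sum_{i,j,k,l} D^{*}_{ijkl}\,
          \partial^{4}_{ijkl} v \;=\; f .
        \]

    Minimizing or maximizing dispersion then amounts to optimizing the entries of $D^{*}$ over admissible microgeometries.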

    Dynamically adaptive networks for integrating optimal pressure management and self-cleaning controls

    This paper investigates the problem of integrating optimal pressure management and self-cleaning controls in dynamically adaptive water distribution networks. We review existing single-objective valve placement and control problems for minimizing average zone pressure (AZP) and maximizing self-cleaning capacity (SCC). Since AZP and SCC are conflicting objectives, we formulate a bi-objective design-for-control problem in which the locations and operational settings of pressure control and automatic flushing valves are jointly optimized. We approximate Pareto fronts using the weighted sum scalarization method, which uses a previously developed convex heuristic to solve the sequence of parametrized single-objective problems. The resulting Pareto fronts suggest that significant improvements in SCC can be achieved for minimal trade-offs in AZP performance. Moreover, we demonstrate that a hierarchical design strategy is capable of yielding good-quality solutions for both objectives. This hierarchical design first places pressure control valves for the primary AZP objective, followed by automatic flushing valves placed to augment SCC conditions. In addition, we investigate an adaptive control scheme for dynamically transitioning between AZP and SCC controls. We demonstrate these control challenges on case networks with both interconnected and branched topologies.
    Comment: 26 pages, 7 figures; published in Annual Reviews in Control
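    The weighted sum scalarization step is generic and easy to sketch. In the following Python snippet, solve, the proxy objectives, and the weight grid are hypothetical stand-ins (in particular, solve replaces the paper's convex heuristic) used only to illustrate how sweeping the weight traces an approximate Pareto front when one objective is minimized and the other maximized:

        import numpy as np

        def weighted_sum_front(solve, f_min, f_max, weights=np.linspace(0.0, 1.0, 21)):
            # For each weight w, solve(w) is assumed to return a design x that
            # minimizes the scalarized objective w*f_min(x) - (1-w)*f_max(x),
            # where f_min (e.g. AZP) is minimized and f_max (e.g. SCC) maximized.
            front = sorted({(f_min(x), f_max(x)) for x in (solve(w) for w in weights)})
            # Keep only non-dominated points: lower f_min and higher f_max are better.
            return [p for p in front
                    if not any(q[0] <= p[0] and q[1] >= p[1] and q != p for q in front)]

        # Toy stand-in with an analytic solver over a scalar design x in [0, 1]:
        f_min = lambda x: x ** 2              # pressure-like proxy, to be minimized
        f_max = lambda x: x                   # self-cleaning-like proxy, to be maximized
        solve = lambda w: 1.0 if w == 0 else min(1.0, (1.0 - w) / (2.0 * w))
        print(weighted_sum_front(solve, f_min, f_max))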