
    Lower Bounds for the Average and Smoothed Number of Pareto Optima

    Smoothed analysis of multiobjective 0-1 linear optimization has drawn considerable attention recently. The number of Pareto-optimal solutions (i.e., solutions with the property that no other solution is at least as good in all coordinates and strictly better in at least one) is the central object of study for multiobjective optimization problems. In this paper, we prove several lower bounds for the expected number of Pareto optima. Our basic result is a lower bound of $\Omega_d(n^{d-1})$ for optimization problems with $d$ objectives and $n$ variables under fairly general conditions on the distributions of the linear objectives. Our proof relates the problem of lower bounding the number of Pareto optima to results in geometry connected to arrangements of hyperplanes. We use our basic result to derive (1) to our knowledge, the first lower bounds for natural multiobjective optimization problems; we illustrate this for the maximum spanning tree problem with randomly chosen edge weights, and our technique is sufficiently flexible to yield such lower bounds for other standard objective functions studied in this setting (such as multiobjective shortest path, TSP tour, and matching); (2) a smoothed lower bound of $\min\{\Omega_d(n^{d-1.5}\,\phi^{(d-\log d)(1-\Theta(1/\phi))}),\, 2^{\Theta(n)}\}$ for the 0-1 knapsack problem with $d$ profits under $\phi$-semirandom distributions. This improves the recent lower bound of Brunsch and Röglin.
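
    For intuition only, here is a minimal sketch of the dominance relation defined in the abstract, applied to a small set of objective vectors (maximization). The candidate points and the helper names `is_dominated` and `pareto_optima` are illustrative and not taken from the paper; the naive pairwise filter runs in O(n^2 d) time.

    ```python
    # Naive Pareto filter: keep exactly the solutions not dominated by any other.
    # y dominates x if y is at least as good in every objective and strictly
    # better in at least one (all objectives are maximized here).

    def is_dominated(x, y):
        """True if y dominates x."""
        return all(yi >= xi for xi, yi in zip(x, y)) and any(yi > xi for xi, yi in zip(x, y))

    def pareto_optima(solutions):
        """Return the Pareto-optimal subset of a list of objective vectors."""
        return [x for x in solutions
                if not any(is_dominated(x, y) for y in solutions if y is not x)]

    # Two objectives, four candidate solutions; (2, 1) is dominated by (3, 1) and (2, 2).
    points = [(3, 1), (2, 2), (1, 3), (2, 1)]
    print(pareto_optima(points))  # [(3, 1), (2, 2), (1, 3)]
    ```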

    Multivariate Convex Approximation and Least-Norm Convex Data-Smoothing

    The main content of this paper is two-fold. First, we present a method to approximate multivariate convex functions by piecewise linear upper and lower bounds. We consider a method that is based on function evaluations only. However, to use this method, the data have to be convex. Unfortunately, even if the underlying function is convex, this is not always the case due to (numerical) errors. Therefore, secondly, we present a multivariate data-smoothing method that smooths nonconvex data. We consider both the case in which we have only function evaluations and the case in which we also have derivative information. Furthermore, we show that our methods are polynomial-time methods. We illustrate the methodology by applying it to some examples. Keywords: approximation theory; convexity; data-smoothing
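
    The paper treats the multivariate case; the one-dimensional sketch below only conveys the idea of sandwiching a convex function between piecewise linear bounds using function evaluations alone. For a convex function, the chord over an interval lies above the function, while chords over neighbouring intervals, extended into that interval, lie below it. The sample points and the helper `convex_bounds` are assumptions for illustration, not the paper's algorithm.

    ```python
    # One-dimensional illustration of piecewise linear upper/lower bounds for a
    # convex function, built from function evaluations only (no derivatives).
    import bisect

    def convex_bounds(xs, fs, x):
        """Upper/lower bounds at x for a convex f sampled at the sorted points xs."""
        i = bisect.bisect_right(xs, x) - 1            # interval [xs[i], xs[i+1]] containing x
        i = max(0, min(i, len(xs) - 2))
        slope = lambda a, b: (fs[b] - fs[a]) / (xs[b] - xs[a])

        # Upper bound: the chord over [xs[i], xs[i+1]] lies above a convex f.
        upper = fs[i] + slope(i, i + 1) * (x - xs[i])

        # Lower bound: chords over the neighbouring intervals, extended into
        # [xs[i], xs[i+1]], lie below a convex f; take the larger of the two.
        lowers = []
        if i - 1 >= 0:
            lowers.append(fs[i] + slope(i - 1, i) * (x - xs[i]))
        if i + 2 < len(xs):
            lowers.append(fs[i + 1] + slope(i + 1, i + 2) * (x - xs[i + 1]))
        lower = max(lowers) if lowers else float("-inf")
        return lower, upper

    xs = [0.0, 1.0, 2.0, 3.0]
    fs = [x * x for x in xs]                           # f(x) = x^2 is convex
    print(convex_bounds(xs, fs, 1.5))                  # (1.5, 2.5) brackets 1.5^2 = 2.25
    ```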

    Towards the Evolution of Novel Vertical-Axis Wind Turbines

    Renewable and sustainable energy is one of the most important challenges currently facing mankind. Wind has made an increasing contribution to the world's energy supply mix, but still remains a long way from reaching its full potential. In this paper, we investigate the use of artificial evolution to design vertical-axis wind turbine prototypes that are physically instantiated and evaluated under approximated wind-tunnel conditions. An artificial neural network is used as a surrogate model to assist learning and is found to reduce the number of fabrications required to reach a higher aerodynamic efficiency, resulting in an important cost reduction. Unlike in other approaches, such as computational fluid dynamics simulations, no mathematical formulations are used and no model assumptions are made. Comment: 14 pages, 11 figures
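
    A hedged sketch of a surrogate-assisted evolutionary loop in the spirit described above: candidates are ranked by a neural-network surrogate, and only the most promising one is "fabricated" and measured each generation. The genome encoding, the mutation operator, and `measure_efficiency` (a stand-in for the approximated wind-tunnel evaluation) are illustrative assumptions, not the paper's setup.

    ```python
    # Surrogate-assisted (1+lambda)-style evolution: the surrogate screens many
    # candidates so that only one physical evaluation is spent per generation.
    import random
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def measure_efficiency(genome):                    # placeholder for a physical test
        return -np.sum((np.asarray(genome) - 0.6) ** 2) + random.gauss(0, 0.01)

    def mutate(genome, sigma=0.1):
        return [min(1.0, max(0.0, g + random.gauss(0, sigma))) for g in genome]

    dims, archive_x, archive_y = 5, [], []
    parent = [random.random() for _ in range(dims)]
    surrogate = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000)

    for generation in range(20):
        candidates = [mutate(parent) for _ in range(30)]
        if len(archive_x) >= 5:                        # enough data: rank by the surrogate
            surrogate.fit(np.array(archive_x), np.array(archive_y))
            best = candidates[int(np.argmax(surrogate.predict(np.array(candidates))))]
        else:                                          # cold start: pick at random
            best = random.choice(candidates)
        fitness = measure_efficiency(best)             # the single "fabrication" this generation
        archive_x.append(best); archive_y.append(fitness)
        if fitness >= max(archive_y):                  # keep the best measured design as parent
            parent = best

    print("best measured efficiency:", max(archive_y))
    ```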

    Experimental identification and optimization of the concrete block vibropressing process

    An Instron 8802 material testing machine was used for experiments with raw concrete vibropressing. The experiments were conducted according to a Mean Square Error Latin hypercube design. The influence of pressing force, force amplitude, and frequency on the pressing process was investigated. The registered displacement and force curves were smoothed and approximated with one- to three-parameter functions. The dependence of these parameters on the constant component of the pressing force, the force oscillation amplitude, and the frequency was determined by means of nonparametric kriging approximations. The approximations were validated with additional physical experiments; the estimated relative prediction error was 15%. The developed approximate models were applied to multiobjective optimization of the vibropressing process. The optimization criteria were compacting rate, consumed energy, and pressing cycle length. Pareto frontier surfaces were constructed and analyzed.
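
    A minimal sketch of the kriging (Gaussian process) metamodelling step: fit a nonparametric surrogate to a handful of design points and predict a response, with uncertainty, at unseen process settings. The inputs and the synthetic "compacting rate" response below are stand-ins, not the paper's measured data or its specific kriging implementation.

    ```python
    # Kriging surrogate over a small experimental design, using scikit-learn's
    # Gaussian process regressor as the nonparametric approximation.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Design points: (pressing force, amplitude, frequency), scaled to [0, 1].
    X = np.random.default_rng(0).uniform(size=(15, 3))
    y = 1.0 - np.sum((X - 0.5) ** 2, axis=1)           # synthetic response

    kriging = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    kriging.fit(X, y)

    x_new = np.array([[0.4, 0.6, 0.5]])
    mean, std = kriging.predict(x_new, return_std=True)
    print(f"predicted response {mean[0]:.3f} +/- {std[0]:.3f}")
    ```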

    Towards explaining the speed of k-means

    The k-means method is a popular algorithm for clustering, known for its speed in practice. This stands in contrast to its exponential worst-case running time. To explain the speed of the k-means method, a smoothed analysis has been conducted. We sketch this smoothed analysis and a generalization to Bregman divergences.
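
    For reference, a compact sketch of Lloyd's iteration for k-means, the procedure whose practical speed the smoothed analysis tries to explain. The data, the value of k, and the initialization are arbitrary choices for illustration.

    ```python
    # Lloyd's algorithm: alternate nearest-center assignment and centroid updates
    # until no center moves.
    import numpy as np

    def kmeans(points, k, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), k, replace=False)]
        for _ in range(iters):
            # Assignment step: each point goes to its nearest center.
            labels = np.argmin(((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
            # Update step: each center moves to the mean of its cluster.
            new_centers = np.array([points[labels == j].mean(axis=0) if np.any(labels == j)
                                    else centers[j] for j in range(k)])
            if np.allclose(new_centers, centers):      # converged: no center moved
                break
            centers = new_centers
        return centers, labels

    pts = np.vstack([np.random.default_rng(1).normal(loc=m, size=(50, 2)) for m in (0.0, 5.0)])
    centers, labels = kmeans(pts, k=2)
    print(centers)
    ```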