
    COCO: Performance Assessment

    We present an any-time performance assessment for benchmarking numerical optimization algorithms in a black-box scenario, applied within the COCO benchmarking platform. The performance assessment is based on runtimes, measured in number of objective function evaluations, to reach one or several quality indicator target values. We argue that runtime is the only available measure with a generic, meaningful, and quantitative interpretation. We discuss the choice of the target values, runlength-based targets, and the aggregation of results by using simulated restarts, averages, and empirical distribution functions.
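The simulated-restart aggregation described above can be sketched as follows. This is an illustrative reconstruction, not the COCO implementation, and the `runs` data layout is a hypothetical simplification: a simulated runtime is obtained by drawing trials with replacement, summing the budgets of unsuccessful draws until a successful one appears, and the empirical distribution function is then taken over many such simulated runtimes.

```python
import random

def simulated_runtime(runs, rng):
    """Draw one runtime with simulated restarts.

    runs: list of (evaluations, succeeded) pairs from independent trials.
    Unsuccessful draws contribute their full budget, then we 'restart'
    by drawing again, until a successful trial is drawn."""
    total = 0
    while True:
        evals, ok = rng.choice(runs)
        total += evals
        if ok:
            return total

def runtime_ecdf(samples, budgets):
    """Fraction of simulated runtimes reaching the target within each budget."""
    n = len(samples)
    return [sum(1 for s in samples if s <= b) / n for b in budgets]

rng = random.Random(0)
# Hypothetical trial data: two successes, two failures with a 500-eval budget.
trials = [(120, True), (500, False), (80, True), (500, False)]
sims = [simulated_runtime(trials, rng) for _ in range(2000)]
curve = runtime_ecdf(sims, [100, 500, 1000, 5000])
```

Because the empirical distribution is over function evaluations rather than wall-clock time, curves from different machines and implementations remain directly comparable.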

    A new code for orbit analysis and Schwarzschild modelling of triaxial stellar systems

    We review the methods used to study the orbital structure and chaotic properties of various galactic models and to construct self-consistent equilibrium solutions by Schwarzschild's orbit superposition technique. These methods are implemented in a new publicly available software tool, SMILE, which is intended to be a convenient and interactive instrument for studying a variety of 2D and 3D models, including arbitrary potentials represented by a basis-set expansion, a spherical-harmonic expansion with coefficients being smooth functions of radius (splines), or a set of fixed point masses. We also propose two new variants of Schwarzschild modelling, in which the density of each orbit is represented by the coefficients of the basis-set or spline spherical-harmonic expansion, and the orbit weights are assigned in such a way as to reproduce the coefficients of the underlying density model. We explore the accuracy of these general-purpose potential expansions and show that they may be efficiently used to approximate a wide range of analytic density models and serve as smooth representations of discrete particle sets (e.g. snapshots from an N-body simulation), for instance, for the purpose of orbit analysis of the snapshot. For the variants of Schwarzschild modelling, we use two test cases - a triaxial Dehnen model containing a central black hole, and a model re-created from an N-body snapshot obtained by a cold collapse. These tests demonstrate that all modelling approaches are capable of creating equilibrium models.
    Comment: MNRAS, 24 pages, 18 figures. Software is available at http://td.lpi.ru/~eugvas/smile
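The coefficient-matching variants of Schwarzschild modelling amount to solving a non-negative least-squares problem for the orbit weights. A minimal sketch under stated assumptions, not the SMILE code: each orbit's density contribution is assumed to be already reduced to a vector of expansion coefficients, and the matrix `A` and target below are synthetic placeholders built so that the demo is realisable.

```python
import numpy as np
from scipy.optimize import nnls

# Toy stand-in for the orbit library: column k holds the density-expansion
# coefficients contributed by orbit k (12 coefficients, 50 orbits).
rng = np.random.default_rng(1)
n_coeffs, n_orbits = 12, 50
A = rng.random((n_coeffs, n_orbits))

# Build a target coefficient vector that is realisable with non-negative
# weights, so the fit below can succeed exactly (demo only).
true_w = rng.random(n_orbits)
target = A @ true_w

# Assign non-negative orbit weights so the superposition reproduces the
# coefficients of the underlying density model.
weights, residual = nnls(A, target)
```

The non-negativity constraint is what makes this a physically meaningful superposition: each orbit can only add mass, never subtract it.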

    Indicator-based evolutionary level set approximation: mixed mutation strategy and extended analysis.

    The aim of evolutionary level set approximation is to find a finite representation of a level set of a given black box function. The problem of level set approximation plays a vital role in solving problems, for instance in fault detection in water distribution systems, engineering design, parameter identification in gene regulatory networks, and in drug discovery. The goal is to create algorithms that quickly converge to feasible solutions and then achieve a good coverage of the level set. The population-based search scheme of evolutionary algorithms makes this type of algorithm well suited to target such problems. In this paper, the focus is on continuous black box functions; we propose a challenging benchmark for this problem domain and dual mutation strategies that balance global exploration with local refinement. Moreover, the article investigates the role of different indicators for measuring the coverage of the level set approximation. The results are promising and show that even for difficult problems in moderate dimension the proposed evolutionary level set approximation algorithm (ELSA) can serve as a versatile and robust meta-heuristic.
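A minimal sketch of the idea, not the authors' ELSA implementation: infeasible points are scored by their violation of the level-set threshold, feasible points by their distance to the nearest other population member (a crude stand-in for the paper's coverage indicators), and mutation mixes an occasional large global step with a small local step.

```python
import math
import random

def elsa_sketch(f, threshold, dim, bounds, pop_size=30, iters=2000, seed=0):
    """Toy evolutionary level set approximation: approximate
    {x : f(x) <= threshold} with a well-spread finite point set."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]

    def score(x, population):
        v = f(x)
        if v > threshold:
            return -(v - threshold)   # infeasible: reduce the violation
        # feasible: reward spacing, so the set gets covered, not clustered
        return min(math.dist(x, o) for o in population if o is not x)

    for _ in range(iters):
        parent = rng.choice(pop)
        # mixed mutation strategy: rare large global step, frequent local step
        sigma = (hi - lo) * (0.2 if rng.random() < 0.3 else 0.02)
        child = [min(hi, max(lo, c + rng.gauss(0.0, sigma))) for c in parent]
        worst = min(range(pop_size), key=lambda i: score(pop[i], pop))
        if score(child, pop) > score(pop[worst], pop):
            pop[worst] = child
    return pop
```

Under this scoring, any feasible point outranks any infeasible one, so the population first converges into the level set and only then competes on coverage, mirroring the two-phase goal stated above.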

    Correcting the radar rainfall forcing of a hydrological model with data assimilation: application to flood forecasting in the Lez Catchment in Southern France

    The present study explores the application of a data assimilation (DA) procedure to correct the radar rainfall inputs of an event-based, distributed, parsimonious hydrological model. An extended Kalman filter algorithm was built on top of a rainfall-runoff model in order to assimilate discharge observations at the catchment outlet. This work focuses primarily on the uncertainty in the rainfall data and considers this as the principal source of error in the simulated discharges, neglecting simplifications in the hydrological model structure and poor knowledge of catchment physics. The study site is the 114 km² Lez catchment near Montpellier, France. This catchment is subject to heavy orographic rainfall and characterised by a karstic geology, leading to flash flooding events. The hydrological model uses a derived version of the SCS method, combined with a Lag and Route transfer function. Because the radar rainfall input to the model depends on geographical features and cloud structures, it is particularly uncertain and results in significant errors in the simulated discharges. This study seeks to demonstrate that a simple DA algorithm is capable of rendering radar rainfall suitable for hydrological forecasting. To test this hypothesis, the DA analysis was applied to estimate a constant hyetograph correction to each of 19 flood events. The analysis was carried out in two different modes: by assimilating observations at all available time steps, referred to here as reanalysis mode, and by using only observations up to 3 h before the flood peak to mimic an operational environment, referred to as pseudo-forecast mode. In reanalysis mode, the resulting correction of the radar rainfall data was then compared to the mean field bias (MFB), a corrective coefficient determined using rain gauge measurements.
It was shown that the radar rainfall corrected using DA leads to improved discharge simulations and Nash-Sutcliffe efficiency criteria compared to the MFB correction. In pseudo-forecast mode, the reduction of the uncertainty in the rainfall data leads to a reduction of the error in the simulated discharge, but uncertainty from the model parameterisation diminishes data assimilation efficiency. While the DA algorithm used in this study is effective in correcting uncertain radar rainfall, model uncertainty remains an important challenge for flood forecasting within the Lez catchment.
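The idea of estimating a constant rainfall correction from discharge observations can be illustrated with a deliberately simplified scalar Kalman update. This is not the paper's extended Kalman filter on a full rainfall-runoff model: here the hydrological response is collapsed to a hypothetical linear map `q = alpha * h`, where `alpha` is the multiplicative hyetograph correction being estimated.

```python
def assimilate_rain_correction(h, q_obs, alpha0=1.0, p0=1.0, r=0.1):
    """Scalar Kalman-style estimate of a constant multiplicative rainfall
    correction alpha, assuming discharge responds linearly as q = alpha * h.

    h:     modelled discharge response to the uncorrected rainfall
    q_obs: observed discharge at the catchment outlet
    p0, r: prior variance of alpha and observation-error variance"""
    alpha, p = alpha0, p0
    for h_t, q_t in zip(h, q_obs):
        s = h_t * p * h_t + r             # innovation variance
        k = p * h_t / s                   # Kalman gain
        alpha += k * (q_t - alpha * h_t)  # correct alpha with the innovation
        p *= 1.0 - k * h_t                # shrink the error variance
    return alpha

# Synthetic demo: observations generated with a true correction of 1.4.
h = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
q_obs = [1.4 * x for x in h]              # noise-free, for illustration
alpha = assimilate_rain_correction(h, q_obs)
```

Restricting assimilation to observations before the flood peak, as in the pseudo-forecast mode above, simply amounts to truncating `h` and `q_obs`, at the cost of a less constrained estimate.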