3 research outputs found

    GNBG: A Generalized and Configurable Benchmark Generator for Continuous Numerical Optimization

    As optimization challenges continue to evolve, so too must our tools and understanding. To effectively assess, validate, and compare optimization algorithms, it is crucial to use a benchmark test suite that encompasses a diverse range of problem instances with various characteristics. Traditional benchmark suites often consist of numerous fixed test functions, making it challenging to align these with specific research objectives, such as the systematic evaluation of algorithms under controllable conditions. This paper introduces the Generalized Numerical Benchmark Generator (GNBG) for single-objective, box-constrained, continuous numerical optimization. Unlike existing approaches that rely on multiple baseline functions and transformations, GNBG utilizes a single, parametric, and configurable baseline function. This design allows for control over various problem characteristics. Researchers using GNBG can generate instances that cover a broad array of morphological features: from unimodal to highly multimodal landscapes, with varied local optima patterns and symmetric to highly asymmetric structures. The generated problems can also vary in separability, variable interaction structure, dimensionality, conditioning, and basin shape. These customizable features enable the systematic evaluation and comparison of optimization algorithms, allowing researchers to probe their strengths and weaknesses under diverse and controllable conditions.
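
    The abstract does not spell out GNBG's actual baseline function, so the sketch below is only a hypothetical illustration of the general idea: a single parametric test function whose parameters dial conditioning, multimodality, and asymmetry. The function, parameter names, and their effects are assumptions made for illustration, not GNBG's formulation.

```python
import numpy as np

def configurable_test_function(x, cond=1.0, modality=0.0, asymmetry=0.0):
    """Hypothetical configurable baseline (NOT the actual GNBG formulation).

    cond      -- controls the conditioning of the quadratic core
    modality  -- amplitude of an oscillatory term that adds local optima
    asymmetry -- scales the negative half-space to skew the landscape
    """
    x = np.asarray(x, dtype=float)
    d = x.size
    # Ill-conditioning: coordinate i is weighted by cond**(i / (d - 1)).
    weights = cond ** (np.arange(d) / max(d - 1, 1))
    # Asymmetry: stretch coordinates on the negative side.
    z = np.where(x < 0, (1.0 + asymmetry) * x, x)
    core = np.sum(weights * z ** 2)
    # Multimodality: superimpose cosine ripples on the unimodal core.
    ripples = modality * np.sum(1.0 - np.cos(2.0 * np.pi * z))
    return core + ripples

# A smooth unimodal instance vs. an ill-conditioned, multimodal, asymmetric one.
print(configurable_test_function([0.5, -0.3]))
print(configurable_test_function([0.5, -0.3], cond=1e4, modality=5.0, asymmetry=0.5))
```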

    A robust Gauss-Newton algorithm for the optimization of hydrological models: benchmarking against industry-standard algorithms

    Optimization of model parameters is a ubiquitous task in hydrological and environmental modeling. Currently, the environmental modeling community tends to favor evolutionary techniques over classical Newton-type methods, in light of the geometrically problematic features of objective functions, such as multiple optima and general nonsmoothness. The companion paper (Qin et al., 2018, https://doi.org/10.1029/2017WR022488) introduced the robust Gauss-Newton (RGN) algorithm, an enhanced version of the standard Gauss-Newton algorithm that employs several heuristics to improve its explorative abilities and perform robustly even for problematic objective functions. This paper focuses on benchmarking the RGN algorithm against three optimization algorithms generally accepted as "best practice" in the hydrological community, namely, the Levenberg-Marquardt algorithm, the shuffled complex evolution (SCE) search (with 2 and 10 complexes), and the dynamically dimensioned search (DDS). The empirical case studies include four conceptual hydrological models and three catchments. Empirical results indicate that, on average, RGN is 2–3 times more efficient than SCE (2 complexes), achieving comparable robustness at lower cost; 7–9 times more efficient than SCE (10 complexes), trading some speed to more than compensate for somewhat lower robustness; 5–7 times more efficient than Levenberg-Marquardt, achieving higher robustness at a moderate additional cost; and 12–26 times more efficient than DDS in terms of robustness per fixed cost. A detailed analysis of performance in terms of reliability and cost is provided. Overall, the RGN algorithm is an attractive option for the calibration of hydrological models, and we recommend further investigation of its benefits for broader types of optimization problems.
    Youwei Qin, Dmitri Kavetski, George Kuczera
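
    The robustness heuristics that distinguish RGN are detailed in the companion paper and are not reproduced here. As a point of reference only, the sketch below shows a plain (non-robust) Gauss-Newton iteration for least-squares model calibration with a finite-difference Jacobian; the function and variable names are illustrative assumptions.

```python
import numpy as np

def gauss_newton(residuals, theta0, n_iter=20, fd_step=1e-6):
    """Plain (non-robust) Gauss-Newton for minimizing 0.5 * ||r(theta)||^2.

    residuals -- function mapping parameters theta to a residual vector r(theta)
    theta0    -- initial parameter guess
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        r = residuals(theta)
        # Forward-difference Jacobian of the residual vector.
        J = np.empty((r.size, theta.size))
        for j in range(theta.size):
            step = np.zeros_like(theta)
            step[j] = fd_step
            J[:, j] = (residuals(theta + step) - r) / fd_step
        # Gauss-Newton step: solve the linear least-squares problem J * d = -r.
        d, *_ = np.linalg.lstsq(J, -r, rcond=None)
        theta = theta + d
    return theta

# Toy calibration: fit y = a * exp(b * t) to synthetic data.
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)
fit = gauss_newton(lambda p: p[0] * np.exp(p[1] * t) - y, theta0=[1.0, 0.0])
print(fit)  # approximately [2.0, -1.5]
```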

    Global optimization using q-gradients

    The q-gradient vector is a generalization of the gradient vector based on the q-derivative. We present two global optimization methods that do not require ordinary derivatives: a q-analog of the Steepest Descent method, called the q-G method, and a q-analog of the Conjugate Gradient method, called the q-CG method. Both q-G and q-CG reduce to their classical versions when q equals 1. The methods are implemented so that the search gradually shifts from global exploration at the beginning to almost purely local search at the end. Moreover, Gaussian perturbations are used in some iterations to guarantee convergence of the methods to the global minimum in a probabilistic sense. We compare q-G and q-CG with their classical versions and with other methods, including CMA-ES, a variant of Controlled Random Search, and an interior point method that uses finite-difference derivatives, on 27 well-known test problems. In general, the q-G and q-CG methods are very promising and competitive, especially when applied to multimodal problems.
    Gouvea, Erica J. C.; Soterroni, Aline C.; Scarabello, Marluce C.; Ramos, Fernando M. (Laboratory of Computing and Applied Mathematics, National Institute for Space Research (INPE), Sao Jose dos Campos, SP, Brazil); Gouvea, Erica J. C. (Exact Sciences Institute, Universidade de Taubaté (Unitau)); Regis, Rommel G. (Department of Mathematics, Saint Joseph's University, Philadelphia, PA 19131, USA). Supported by CAPES and CNPq.
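
    For readers unfamiliar with the q-derivative, the sketch below illustrates the first-order q-derivative (Jackson derivative) and a single descent step along the negative q-gradient. The use of a single scalar q, the finite-difference fallback when q = 1 or a coordinate is 0, and the fixed step size are simplifying assumptions; the paper's full q-G method (its schedule for q and its Gaussian perturbations) is not reproduced here.

```python
import numpy as np

def q_partial(f, x, i, q):
    """First-order q-derivative of f along coordinate i at point x.

    D_q f(x) = (f(q*x) - f(x)) / ((q - 1) * x), which recovers the ordinary
    partial derivative as q -> 1. When q == 1 or x[i] == 0, fall back to a
    small finite-difference step.
    """
    xi = x[i]
    if q == 1.0 or xi == 0.0:
        h = 1e-8
        xh = x.copy(); xh[i] = xi + h
        return (f(xh) - f(x)) / h
    xq = x.copy(); xq[i] = q * xi
    return (f(xq) - f(x)) / ((q - 1.0) * xi)

def q_gradient(f, x, q):
    """Vector of first-order q-partial derivatives."""
    return np.array([q_partial(f, x, i, q) for i in range(x.size)])

def q_g_step(f, x, q, lr=0.1):
    """One steepest-descent-like step along the negative q-gradient
    (an illustrative q-G-style update, not the paper's full algorithm)."""
    return x - lr * q_gradient(f, x, q)

# Rosenbrock test point: q far from 1 uses secant-like slopes over a wider
# span of the landscape, while q near 1 approaches the ordinary gradient.
f = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
x = np.array([-1.2, 1.0])
for q in (0.5, 0.99, 1.0):
    print(q, q_g_step(f, x, q))
```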