
    Evolutionary Computation in High Energy Physics

    Evolutionary Computation is a branch of computer science with which High Energy Physics has traditionally had few connections. Nevertheless, its methods have been investigated in this field, mainly for data analysis tasks. Because these methods and studies remain little known in the high energy physics community, we were motivated to prepare this lecture. The lecture presents a general overview of the main types of algorithms based on Evolutionary Computation, as well as a review of their applications in High Energy Physics.
    Comment: Lecture presented at the 2006 Inverted CERN School of Computing; to be published in the school proceedings (CERN Yellow Report)

    Genetic algorithm dynamics on a rugged landscape

    The genetic algorithm is an optimization procedure motivated by biological evolution and has been successfully applied to optimization problems in many areas. A statistical mechanics model for its dynamics is proposed, based on the parent-child fitness correlation of the genetic operators, which makes it applicable to general fitness landscapes. It is compared to a recent model based on a maximum entropy ansatz. Finally, it is applied to modeling the dynamics of a genetic algorithm on the rugged fitness landscape of the NK model.
    Comment: 10 pages RevTeX, 4 figures PostScript
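    For readers unfamiliar with the NK model named above: it is a standard tunably rugged test landscape, where each of n loci contributes a fitness value depending on its own bit and k others. A minimal Python sketch follows; the adjacent-neighbour epistasis pattern and uniform random contribution tables are common conventions, not details taken from this paper.

```python
import random

def make_nk_fitness(n, k, seed=0):
    """Random NK-model landscape: each of the n loci contributes a value
    that depends on its own bit and the bits of its k neighbours."""
    rng = random.Random(seed)
    # One random lookup table per locus, indexed by the (k+1)-bit context.
    tables = [[rng.random() for _ in range(2 ** (k + 1))] for _ in range(n)]

    def fitness(genome):
        total = 0.0
        for i in range(n):
            idx = 0
            for j in range(k + 1):  # adjacent neighbours, wrapping around
                idx = (idx << 1) | genome[(i + j) % n]
            total += tables[i][idx]
        return total / n

    return fitness

rng = random.Random(1)
f = make_nk_fitness(n=20, k=3)
print(f([rng.randint(0, 1) for _ in range(20)]))
```

    Larger k makes the landscape more rugged (more local optima), which is what stresses the genetic algorithm's dynamics in the study above.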

    A Study in function optimization with the breeder genetic algorithm

    Optimization is concerned with finding the global optima (hence the name) of problems that can be cast as a function of several variables, possibly subject to constraints. Among search methods, Evolutionary Algorithms have been shown to be adaptable and general tools that have often outperformed traditional ad hoc methods. The Breeder Genetic Algorithm (BGA) combines a direct representation with conceptual simplicity. This work contains a general description of the algorithm and a detailed study on a collection of function optimization tasks. The results show that the BGA is a powerful and reliable search algorithm. The main discussion concerns the choice of genetic operators and their parameters, among which the family of Extended Intermediate Recombination (EIR) operators is shown to stand out. In addition, a simple method to dynamically adjust the operator is outlined and found to further improve the algorithm's already excellent overall performance.
    Postprint (published version)
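    EIR is a standard BGA recombination operator for real-valued genomes: each offspring gene is sampled on the line through the two parent genes, extended by a fraction δ beyond both endpoints. A minimal sketch of its usual definition follows; δ = 0.25 is the value commonly quoted in the BGA literature, not necessarily the one tuned in this study.

```python
import random

def extended_intermediate_recombination(x, y, delta=0.25, rng=random):
    """EIR: sample each offspring gene on the line through the two
    parent genes, extended by a fraction delta beyond both endpoints."""
    return [xi + rng.uniform(-delta, 1.0 + delta) * (yi - xi)
            for xi, yi in zip(x, y)]

print(extended_intermediate_recombination([0.0, 1.0, 2.0], [1.0, 0.5, -1.0]))
```

    The extension beyond the parents (delta > 0) is what lets the offspring escape the shrinking convex hull of the population, a known weakness of plain intermediate recombination.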

    Annealing schedule from population dynamics

    We introduce a dynamical annealing schedule for population-based optimization algorithms with mutation. On the basis of a statistical mechanics formulation of the population dynamics, the mutation rate adapts to the value that maximizes the expected reward at each time step. The mutation rate is thereby eliminated as a free parameter of the algorithm.
    Comment: 6 pages RevTeX, 4 figures PostScript; to be published in Phys. Rev.
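    The abstract does not reproduce the paper's analytic schedule; purely to illustrate the idea, the hypothetical sketch below estimates, each generation, which of a few candidate mutation rates gives the largest expected fitness gain (on OneMax, a stand-in objective) and applies that rate to the whole population.

```python
import random

def mutate(genome, rate, rng):
    return [1 - g if rng.random() < rate else g for g in genome]

def onemax(genome):
    return sum(genome)

def adaptive_mutation_step(population, rates, rng):
    """Pick the candidate mutation rate with the highest sampled expected
    fitness gain this generation (a crude empirical stand-in for the
    paper's analytic rule), then apply it to the whole population."""
    def expected_gain(rate):
        gains = [onemax(mutate(g, rate, rng)) - onemax(g) for g in population]
        return sum(gains) / len(gains)

    best_rate = max(rates, key=expected_gain)
    return best_rate, [mutate(g, best_rate, rng) for g in population]

rng = random.Random(1)
pop = [[rng.randint(0, 1) for _ in range(50)] for _ in range(20)]
rate, pop = adaptive_mutation_step(pop, rates=[0.01, 0.05, 0.1, 0.2], rng=rng)
print("chosen mutation rate:", rate)
```

    As the population climbs toward the optimum, smaller rates win the comparison, so the schedule "anneals" the mutation rate downward without any hand-set cooling parameter.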

    Optimal Parameter Choices Through Self-Adjustment: Applying the 1/5-th Rule in Discrete Settings

    While evolutionary algorithms are known to be very successful for a broad range of applications, the algorithm designer is often left with many algorithmic choices, for example the population size, the mutation rates, and the crossover rates of the algorithm. These parameters are known to have a crucial influence on the optimization time and thus need to be chosen carefully, a task that often requires substantial effort. Moreover, the optimal parameters can change during the optimization process. It is therefore of great interest to design mechanisms that dynamically choose best-possible parameters. An example of such an update mechanism is the one-fifth success rule for step-size adaptation in evolution strategies. While in continuous domains this principle is well understood, also from a mathematical point of view, no comparable theory is available for problems in discrete domains. In this work we show that the one-fifth success rule can be effective also in discrete settings. We consider the (1+(λ,λ)) GA proposed in [Doerr/Doerr/Ebel: From black-box complexity to designing new genetic algorithms, TCS 2015]. We prove that if its population size is chosen according to the one-fifth success rule, then the expected optimization time on OneMax is linear. This is better than what any static population size λ can achieve and is asymptotically optimal also among all adaptive parameter choices.
    Comment: This is the full version of a paper that is to appear at GECCO 2015
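    The self-adjusting choice of λ studied here is usually stated as: shrink λ by a constant factor F after an iteration that improves the fitness, and grow it by F^(1/4) otherwise, so that λ stays put when roughly one in five iterations succeeds. A minimal sketch, with an illustrative F = 1.5 (the factor is an assumption, not a value quoted in this abstract):

```python
def update_lambda(lam, improved, n, F=1.5):
    """One-fifth success rule for the population size of the
    (1+(lambda,lambda)) GA: shrink lambda on success, grow it by the
    fourth root of F on failure, so a ~1/5 success rate keeps lambda
    stationary. lambda is capped at the problem size n.
    """
    if improved:
        return max(lam / F, 1.0)
    return min(lam * F ** 0.25, float(n))

# Toy trace: one success followed by four failures leaves lambda unchanged.
lam = 8.0
lam = update_lambda(lam, improved=True, n=100)
for _ in range(4):
    lam = update_lambda(lam, improved=False, n=100)
print(lam)  # back to ~8.0
```

    Because one success cancels exactly four failures, a one-in-five success rate is the fixed point, which is where the rule gets its name.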

    Integrated Design of Superconducting Magnets with the CERN Field Computation Program ROXIE

    The program package ROXIE has been developed at CERN for the field computation of superconducting accelerator magnets and is used as an approach towards the integrated design of such magnets. It is also an example of fruitful international collaboration in software development. The integrated design of magnets includes feature-based geometry generation, conceptual design using genetic optimization algorithms, optimization of the iron yoke (both in 2d and 3d) using deterministic methods, end-spacer design, and inverse field calculation. The paper describes version 8.0 of ROXIE, which comprises, amongst others, an automatic mesh generator, a hysteresis model for the magnetization in superconducting filaments, the BEM-FEM coupling method for the 3d field calculation, a routine for the calculation of the peak temperature during a quench, and neural network approximations of the objective function for the speed-up of optimization algorithms. New results of the magnet design work for the LHC are given as examples.
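    The "neural network approximations of the objective function" mentioned above refer to the general surrogate-model pattern: fit a cheap regressor to designs already evaluated with the expensive field computation, and use it to pre-screen new candidates. The Python sketch below illustrates that pattern with scikit-learn's MLPRegressor and a toy objective; all names and values are illustrative, and none of this reflects ROXIE's actual implementation.

```python
import random
from sklearn.neural_network import MLPRegressor

def expensive_objective(x):
    # Stand-in for a costly field computation; a toy quadratic bowl.
    return sum((xi - 0.3) ** 2 for xi in x)

rng = random.Random(0)
evaluated = [[rng.random() for _ in range(4)] for _ in range(50)]
scores = [expensive_objective(x) for x in evaluated]

# Fit the cheap surrogate on the designs evaluated so far.
surrogate = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000)
surrogate.fit(evaluated, scores)

# Rank a large candidate pool with the surrogate, then spend the
# expensive evaluations only on the most promising few.
candidates = [[rng.random() for _ in range(4)] for _ in range(1000)]
predicted = surrogate.predict(candidates)
shortlist = sorted(zip(predicted, candidates), key=lambda t: t[0])[:5]
best = min((expensive_objective(x), x) for _, x in shortlist)
print("best exactly-evaluated objective:", best[0])
```

    In an optimization loop, the shortlist's exact evaluations would be fed back into the training set, so the surrogate sharpens precisely where the search is heading.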