
    Fast Genetic Algorithms

    For genetic algorithms using a bit-string representation of length $n$, the general recommendation is to take $1/n$ as mutation rate. In this work, we discuss whether this is really justified for multimodal functions. Taking jump functions and the $(1+1)$ evolutionary algorithm as the simplest example, we observe that larger mutation rates give significantly better runtimes. For the $\jump_{m,n}$ function, any mutation rate between $2/n$ and $m/n$ leads to a speed-up at least exponential in $m$ compared to the standard choice. The asymptotically best runtime, obtained from using the mutation rate $m/n$ and leading to a speed-up super-exponential in $m$, is very sensitive to small changes of the mutation rate. Any deviation by a small $(1 \pm \epsilon)$ factor leads to a slow-down exponential in $m$. Consequently, any fixed mutation rate gives strongly sub-optimal results for most jump functions. Building on this observation, we propose to use a random mutation rate $\alpha/n$, where $\alpha$ is chosen from a power-law distribution. We prove that the $(1+1)$ EA with this heavy-tailed mutation rate optimizes any $\jump_{m,n}$ function in a time that is only a small polynomial (in $m$) factor above the one stemming from the optimal rate for this $m$. Our heavy-tailed mutation operator yields similar speed-ups (over the best known performance guarantees) for the vertex cover problem in bipartite graphs and the matching problem in general graphs. Following the example of fast simulated annealing, fast evolution strategies, and fast evolutionary programming, we propose to call genetic algorithms using a heavy-tailed mutation operator \emph{fast genetic algorithms}.
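
    As an illustration of the heavy-tailed mutation operator described above, the following Python sketch draws a mutation strength $\alpha$ from a power-law distribution and applies standard bit mutation with rate $\alpha/n$ inside a (1+1) EA on a jump function. It is a minimal sketch, not the authors' implementation; the power-law exponent $\beta = 1.5$, the cap $\alpha \le n/2$, and the evaluation budget are illustrative assumptions.

```python
import random

def power_law_alpha(n, beta=1.5):
    """Sample alpha from {1, ..., n//2} with P(alpha) proportional to alpha**(-beta).
    The exponent beta = 1.5 and the cap n//2 are assumptions of this sketch."""
    upper = max(1, n // 2)
    weights = [k ** (-beta) for k in range(1, upper + 1)]
    return random.choices(range(1, upper + 1), weights=weights)[0]

def heavy_tailed_mutation(bits, beta=1.5):
    """Standard bit mutation, but with a freshly drawn heavy-tailed rate alpha/n."""
    n = len(bits)
    rate = power_law_alpha(n, beta) / n
    return [b ^ (random.random() < rate) for b in bits]

def jump(bits, m):
    """Jump_{m,n}: OneMax plus m, except for a deceptive gap of width m below the optimum."""
    n, ones = len(bits), sum(bits)
    return m + ones if ones == n or ones <= n - m else n - ones

def one_plus_one_ea(n=30, m=3, beta=1.5, budget=300_000):
    """(1+1) EA with heavy-tailed mutation on Jump_{m,n}; the budget is an assumption."""
    x = [random.randint(0, 1) for _ in range(n)]
    for _ in range(budget):
        y = heavy_tailed_mutation(x, beta)
        if jump(y, m) >= jump(x, m):
            x = y
        if sum(x) == n:
            break
    return x
```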

    Genetic algorithms: a pragmatic, non-parametric approach to exploratory analysis of questionnaires in educational research

    Data from a survey to determine student attitudes to their courses are used as an example to show how genetic algorithms can be used in the analysis of questionnaire data. Genetic algorithms provide a means of generating logical rules which predict one variable in a data set by relating it to others. This paper explains the principle underlying genetic algorithms and gives a non-mathematical description of the means by which rules are generated. A commercially available computer program is used to apply genetic algorithms to the survey data. The results are discussed.
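
    The rule-generation idea can be sketched in a few lines of Python. The snippet below evolves simple IF-THEN rules (conjunctions of minimum-answer conditions on questionnaire items) that predict a target response; the toy data, the question names, and the GA parameters are all invented for illustration and are not taken from the paper or the commercial program it uses.

```python
import random

def make_toy_data(rows=200, questions=6, seed=0):
    """Invented questionnaire: Likert-style answers (1-5) plus a target variable."""
    rng = random.Random(seed)
    data = []
    for _ in range(rows):
        answers = {f"q{i}": rng.randint(1, 5) for i in range(questions)}
        # hypothetical ground truth: satisfaction is high when q1 and q3 are high
        answers["satisfied"] = int(answers["q1"] >= 4 and answers["q3"] >= 4)
        data.append(answers)
    return data

def rule_accuracy(rule, data, target="satisfied"):
    """A rule maps questions to minimum answers; it predicts target=1 when all
    conditions hold.  Fitness is plain classification accuracy."""
    correct = sum(int(all(row[q] >= v for q, v in rule.items())) == row[target]
                  for row in data)
    return correct / len(data)

def evolve_rule(data, questions, gens=100, pop_size=30, seed=1):
    rng = random.Random(seed)
    def random_rule():
        return {q: rng.randint(1, 5) for q in rng.sample(questions, k=rng.randint(1, 3))}
    def mutate(rule):
        child = dict(rule)
        q = rng.choice(questions)
        if q in child and rng.random() < 0.5:
            del child[q]                      # drop a condition
        else:
            child[q] = rng.randint(1, 5)      # add or adjust a condition
        return child or random_rule()
    pop = [random_rule() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda r: rule_accuracy(r, data), reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(rng.choice(parents)) for _ in parents]
    return max(pop, key=lambda r: rule_accuracy(r, data))

data = make_toy_data()
best = evolve_rule(data, [f"q{i}" for i in range(6)])
print(best, rule_accuracy(best, data))
```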

    Incremental multiple objective genetic algorithms

    This paper presents a new genetic algorithm approach to multi-objective optimization problems: Incremental Multiple Objective Genetic Algorithms (IMOGA). Different from conventional MOGA methods, it takes each objective into consideration incrementally. The whole evolution is divided into as many phases as the number of objectives, and one more objective is considered in each phase. Each phase is composed of two stages: first, an independent population is evolved to optimize one specific objective; second, the better-performing individuals from the evolved single-objective population and the multi-objective population evolved in the last phase are joined together by the operation of integration. The resulting population then becomes an initial multi-objective population, to which a multi-objective evolution based on the incremented objective set is applied. The experimental results show that, in most problems, the performance of IMOGA is better than that of three other MOGAs: NSGA-II, SPEA, and PAES. IMOGA can find more solutions during the same time span, and the quality of its solutions is better.
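
    The incremental scheme lends itself to a compact sketch. The Python code below runs one phase per objective: it evolves an independent single-objective population, joins it with the multi-objective population from the previous phase, and then applies multi-objective evolution on the enlarged objective set. The toy bi-objective problem (leading ones / trailing zeros), the selection rules, and the population sizes are simplifying assumptions, not the operators evaluated in the paper.

```python
import random

# Two toy objectives on bit strings, used only to make the sketch executable.
def leading_ones(x):
    return next((i for i, b in enumerate(x) if b == 0), len(x))

def trailing_zeros(x):
    return next((i for i, b in enumerate(reversed(x)) if b == 1), len(x))

def mutate(x, rng):
    return [b ^ (rng.random() < 1.0 / len(x)) for b in x]

def non_dominated(pop, objs):
    """Keep individuals not dominated with respect to the considered objectives."""
    scores = [tuple(f(p) for f in objs) for p in pop]
    return [pop[i] for i, s in enumerate(scores)
            if not any(all(t[j] >= s[j] for j in range(len(s))) and t != s
                       for t in scores)]

def imoga(objs, n=20, pop_size=20, gens=100, seed=0):
    """One phase per objective, each phase made of two stages as described above."""
    rng = random.Random(seed)
    multi_pop, considered = [], []
    for obj in objs:
        # Stage 1: independent evolution for the newly added objective alone.
        single_pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
        for _ in range(gens):
            children = [mutate(p, rng) for p in single_pop]
            single_pop = sorted(single_pop + children, key=obj, reverse=True)[:pop_size]
        considered.append(obj)
        # Stage 2: integration -- join both populations into an initial
        # multi-objective population for the enlarged objective set.
        multi_pop = single_pop + multi_pop
        # Multi-objective evolution on the incremented objective set.
        for _ in range(gens):
            children = [mutate(p, rng) for p in multi_pop]
            multi_pop = non_dominated(multi_pop + children, considered)[: 2 * pop_size]
    return multi_pop

front = imoga([leading_ones, trailing_zeros])
print(sorted((leading_ones(p), trailing_zeros(p)) for p in front))
```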

    Memory-based immigrants for genetic algorithms in dynamic environments

    Investigating and enhancing the performance of genetic algorithms in dynamic environments have attracted growing interest from the genetic algorithms community in recent years. This trend reflects the fact that many real-world problems are actually dynamic, which poses a serious challenge to traditional genetic algorithms. Several approaches have been incorporated into genetic algorithms for dynamic optimization problems. Among these approaches, random immigrants and memory schemes have been shown to be beneficial in many dynamic problems. This paper proposes a hybrid memory and random immigrants scheme for genetic algorithms in dynamic environments. In the hybrid scheme, the best solution in memory is retrieved and acts as the base to create random immigrants that replace the worst individuals in the population. In this way, diversity can not only be maintained but maintained more efficiently, helping the genetic algorithm adapt to the changing environment. The experimental results based on a series of systematically constructed dynamic problems show that the proposed memory-based immigrants scheme efficiently improves the performance of genetic algorithms in dynamic environments.
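
    A single generation of the memory-based immigrants step can be sketched as follows. The best entry in the memory (re-evaluated under the current fitness function) seeds mutated immigrants that replace the worst individuals. This is a hedged sketch of the idea in the abstract; the immigrant ratio, the mutation probability, and how the memory itself is maintained are assumptions, not values from the paper.

```python
import random

def memory_based_immigrants_step(pop, memory, fitness, rng,
                                 immigrant_ratio=0.2, p_mut=0.01):
    """One generation of the hybrid scheme: the best memory entry, re-evaluated in
    the current environment, seeds mutated immigrants that replace the worst
    individuals.  Parameter values here are illustrative assumptions."""
    n_imm = max(1, int(immigrant_ratio * len(pop)))
    # Retrieve the stored solution that performs best in the current environment.
    base = max(memory, key=fitness) if memory else max(pop, key=fitness)
    # Create immigrants by mutating the retrieved base solution.
    immigrants = [[b ^ (rng.random() < p_mut) for b in base] for _ in range(n_imm)]
    # Replace the worst individuals in the population with the immigrants.
    survivors = sorted(pop, key=fitness, reverse=True)[: len(pop) - n_imm]
    return survivors + immigrants
```

    In a full dynamic GA this step would sit alongside ordinary selection, crossover, and mutation, with the memory refreshed at intervals using the current best individual.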

    A Novel Genetic Algorithm using Helper Objectives for the 0-1 Knapsack Problem

    The 0-1 knapsack problem is a well-known combinatorial optimisation problem. Approximation algorithms have been designed for solving it and they return provably good solutions within polynomial time. On the other hand, genetic algorithms are well suited to the knapsack problem and find reasonably good solutions quickly. A naturally arising question is whether genetic algorithms are able to find solutions as good as those of approximation algorithms. This paper presents a novel multi-objective optimisation genetic algorithm for solving the 0-1 knapsack problem. Experimental results show that the new algorithm outperforms its rivals: the greedy algorithm, the mixed strategy genetic algorithm, and the combination of the greedy algorithm with the mixed strategy genetic algorithm.
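
    The abstract does not spell out the helper objectives, but a common way to recast the 0-1 knapsack problem in multi-objective terms is to treat profit and capacity violation as separate objectives. The sketch below is only that generic idea in a (1+1)-style EA, with the violation handled lexicographically; it is not the algorithm proposed in the paper, and all parameter values and the toy instance are assumptions.

```python
import random

def evaluate(x, profits, weights, capacity):
    """Bi-objective view of the 0-1 knapsack: maximise profit, minimise the
    capacity violation used here as a helper objective."""
    profit = sum(p for p, bit in zip(profits, x) if bit)
    weight = sum(w for w, bit in zip(weights, x) if bit)
    return profit, max(0, weight - capacity)

def better(a, b, profits, weights, capacity):
    """Prefer smaller violation first, then larger profit."""
    pa, va = evaluate(a, profits, weights, capacity)
    pb, vb = evaluate(b, profits, weights, capacity)
    return (va, -pa) < (vb, -pb)

def knapsack_ga(profits, weights, capacity, gens=2000, seed=0):
    rng = random.Random(seed)
    n = len(profits)
    x = [0] * n                                         # start from the empty knapsack
    for _ in range(gens):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]   # standard bit mutation
        if better(y, x, profits, weights, capacity):
            x = y
    return x

# Hypothetical toy instance.
profits = [10, 5, 15, 7, 6, 18, 3]
weights = [2, 3, 5, 7, 1, 4, 1]
print(knapsack_ga(profits, weights, capacity=10))
```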

    Higher-Order Quantum-Inspired Genetic Algorithms

    This paper presents a theory and an empirical evaluation of Higher-Order Quantum-Inspired Genetic Algorithms. The fundamental notions of the theory are introduced, and a novel Order-2 Quantum-Inspired Genetic Algorithm (QIGA2) is presented. In contrast to existing QIGA algorithms, which represent quantum genes as independent qubits, higher-order QIGAs use quantum registers to represent strings of genes, which allows relations between genes to be modelled using quantum phenomena. A performance comparison is conducted on a benchmark of 20 deceptive combinatorial optimization problems. The results show that using higher quantum orders is beneficial for genetic algorithm efficiency, and that the new QIGA2 algorithm outperforms the original QIGA algorithm, which was tuned in a highly compute-intensive metaoptimization process.
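
    To make the representational difference concrete, the sketch below samples candidate solutions from an order-1 representation (independent qubits, one amplitude pair per gene) and from an order-2 representation (one 2-qubit register, i.e. four amplitudes, per pair of genes). It only illustrates the sampling step; the rotation-gate updates and the actual QIGA2 procedure are not reproduced, and the example register is a made-up, maximally correlated state.

```python
import math
import random

def sample_order1(qubits, rng):
    """Order-1: each gene is an independent qubit (alpha, beta) with
    alpha**2 + beta**2 = 1; gene i is observed as 1 with probability beta**2."""
    return [int(rng.random() < beta ** 2) for _, beta in qubits]

def sample_order2(registers, rng):
    """Order-2: each pair of genes shares a 2-qubit register, i.e. four amplitudes
    over the basis states 00, 01, 10, 11, so the two genes can be correlated."""
    bits = []
    for amps in registers:                       # amps = (a00, a01, a10, a11)
        probs = [a * a for a in amps]            # Born rule: squared amplitudes
        state = rng.choices([(0, 0), (0, 1), (1, 0), (1, 1)], weights=probs)[0]
        bits.extend(state)
    return bits

# Example: two perfectly correlated genes, impossible with independent qubits.
rng = random.Random(0)
bell_like = [(1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2))]   # only 00 or 11
print(sample_order2(bell_like, rng))
```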

    Genetic algorithms for satellite scheduling problems

    Recently there has been growing interest in the mission operations scheduling problem. The problem, in a variety of formulations, arises in the management of satellite and space missions, which require the efficient allocation of user requests to enable communication between operations teams and spacecraft systems. Not only large space agencies, such as ESA (European Space Agency) and NASA, but also smaller research institutions and universities can nowadays establish their own satellite missions, and thus need intelligent systems to automate the allocation of ground station services to space missions. In this paper, we present several relevant formulations of satellite scheduling viewed as a family of problems and identify various forms of optimization objectives. The main complexities, due to the highly constrained nature of the problem, window accessibility and visibility, and multiple conflicting objectives, are examined. Then, we discuss the resolution of the problem through different heuristic methods. In particular, we focus on the ground station scheduling version, for which we present computational results obtained with Genetic Algorithms using the STK simulation toolkit.
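
    As a hedged illustration of how a genetic algorithm can encode a ground station scheduling instance, the sketch below assigns each communication request to one of its visibility windows and scores a candidate by the number of non-conflicting assignments. The instance format, the objective, and the GA parameters are generic assumptions; they are not the formulations or the STK-derived data used in the paper.

```python
import random

def conflicts(a, b):
    """Two assignments clash if they use the same ground station at overlapping times."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

def fitness(assignment, windows):
    """Count scheduled, pairwise non-conflicting requests (a generic objective;
    the paper examines several objective formulations)."""
    chosen = [windows[r][g] for r, g in enumerate(assignment) if g is not None]
    return sum(1 for i, a in enumerate(chosen)
               if not any(conflicts(a, b) for b in chosen[:i]))

def ga_schedule(windows, pop_size=30, gens=200, seed=0):
    """Simple generational GA: a chromosome picks one visibility window per request."""
    rng = random.Random(seed)
    def random_ind():
        return [rng.randrange(len(w)) if w else None for w in windows]
    def mutate(ind):
        child = list(ind)
        r = rng.randrange(len(child))
        if windows[r]:
            child[r] = rng.randrange(len(windows[r]))
        return child
    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: fitness(ind, windows), reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(rng.choice(parents)) for _ in parents]
    return max(pop, key=lambda ind: fitness(ind, windows))

# Hypothetical toy instance: 3 requests, 2 ground stations, (station, start, end) windows.
toy = [[("GS1", 0, 30), ("GS2", 10, 40)],
       [("GS1", 20, 50)],
       [("GS2", 35, 60)]]
print(ga_schedule(toy))
```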