27 research outputs found

    Exploiting Gradient Information in Continuous Iterated Density Estimation Evolutionary Algorithms

    For continuous optimization problems, evolutionary algorithms (EAs) that build and use probabilistic models have obtained promising results.

    Bringing IDEAs into practice: Optimization in a Minimally Invasive Vascular Intervention Simulation System

    For real-valued continuous optimization problems, various evolutionary algorithms (EAs) have obtained promising results on benchmark problems. The Iterated Density-Estimation Evolutionary Algorithm (IDEA) is one such algorithm. However, little is known about the practical benefits of these algorithms, even though AI techniques are often favored in practice because of their general applicability and good performance on complicated real-world problems. In this paper we focus on one specific medical application that imposes many optimization tasks: the simulation of minimally invasive vascular interventions. We compare a hybrid IDEA with the conjugate gradients algorithm and a problem-specific optimization algorithm, and show that although the conjugate gradients algorithm already yields highly useful results, IDEAs further improve on them in terms of scalability, demonstrating that IDEAs can indeed be useful in practice.

    Multi-Objective Mixture-based Iterated Density Estimation Evolutionary Algorithms

    We propose an algorithm for multi-objective optimization using a mixture-based iterated density estimation evolutionary algorithm (MIDEA). The MIDEA algorithm is a probabilistic model-building evolutionary algorithm that constructs at each generation a mixture of factorized probability distributions.
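    The generation step described above can be sketched roughly as follows. This is an illustrative, simplified rendering, not the paper's actual algorithm: it uses truncation selection, a plain k-means clustering pass to form the mixture components, and a univariately factorized (diagonal) Gaussian per cluster; all function names and parameters (`k`, `tau`, the k-means iteration count) are assumptions for the sketch.

    ```python
    import numpy as np

    def midea_generation(pop, fitness, k=3, tau=0.5, rng=None):
        """One simplified MIDEA-style generation (minimization):
        truncation selection, k-means clustering, then a factorized
        Gaussian fitted per cluster and resampled."""
        rng = np.random.default_rng() if rng is None else rng
        n, d = pop.shape
        # Truncation selection: keep the best tau fraction.
        sel = pop[np.argsort(fitness)[: max(k, int(tau * n))]]
        # Plain k-means to form the mixture components.
        centers = sel[rng.choice(len(sel), k, replace=False)]
        for _ in range(10):
            labels = np.argmin(((sel[:, None] - centers) ** 2).sum(-1), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = sel[labels == j].mean(axis=0)
        # Fit a univariately factorized (diagonal) Gaussian per cluster
        # and sample in proportion to cluster size.
        out = []
        for j in range(k):
            cj = sel[labels == j]
            if len(cj) == 0:
                continue
            mu, sigma = cj.mean(axis=0), cj.std(axis=0) + 1e-12
            m = int(round(n * len(cj) / len(sel)))
            out.append(rng.normal(mu, sigma, size=(m, d)))
        new = np.vstack(out)
        return new[:n] if len(new) >= n else np.vstack([new, sel[: n - len(new)]])
    ```

    A full multi-objective MIDEA would additionally use a Pareto-based selection criterion rather than scalar truncation; the mixture structure itself is what lets distinct components track different regions of the Pareto front.
    
    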

    New IDEAs and More ICE by Learning and Using Unconditional Permutation Factorizations

    Solving permutation optimization problems is an important and open research question.

    Advancing Continuous IDEAs with Mixture Distributions and Factorization Selection Metrics

    Evolutionary optimization based on probabilistic models has so far been limited to the use of factorizations in the case of continuous representations. Furthermore, a maximum-complexity parameter n was previously required when constructing factorizations, to prevent unnecessary complexity from being introduced into the factorization. In this paper, we advance these techniques by using clustering and the EM algorithm to allow for mixture distributions.
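    The idea of replacing a hand-set maximum-complexity parameter with a selection metric can be illustrated with a penalized-likelihood score. The sketch below uses a standard BIC-style criterion to choose between a univariately factorized Gaussian and a full-covariance Gaussian; the paper's actual metric may differ, and the function names are invented for the example.

    ```python
    import numpy as np

    def bic_factorized(data):
        """BIC of a univariately factorized Gaussian (diagonal covariance)."""
        n, d = data.shape
        mu, var = data.mean(axis=0), data.var(axis=0) + 1e-12
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (data - mu) ** 2 / var)
        k = 2 * d  # d means + d variances
        return -2 * ll + k * np.log(n)

    def bic_full(data):
        """BIC of a full-covariance multivariate Gaussian."""
        n, d = data.shape
        mu = data.mean(axis=0)
        cov = np.cov(data, rowvar=False) + 1e-9 * np.eye(d)
        _, logdet = np.linalg.slogdet(cov)
        diff = data - mu
        ll = -0.5 * (n * (d * np.log(2 * np.pi) + logdet)
                     + np.einsum('ij,jk,ik->', diff, np.linalg.inv(cov), diff))
        k = d + d * (d + 1) // 2  # means + free covariance entries
        return -2 * ll + k * np.log(n)

    def select_factorization(data):
        """Pick the structure with the lower (better) BIC score."""
        return 'factorized' if bic_factorized(data) <= bic_full(data) else 'full'
    ```

    On independently sampled variables the penalty term dominates and the cheaper factorized model wins; when strong correlations are present, the likelihood gain of the full covariance outweighs its extra parameters. The same scoring idea extends to choosing among partial factorizations of intermediate complexity.
    
    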