2,045 research outputs found

    Negatively Correlated Search

    Evolutionary Algorithms (EAs) have been shown to be powerful tools for complex optimization problems, which are ubiquitous in both communication and big data analytics. This paper presents a new EA, Negatively Correlated Search (NCS), which maintains multiple individual search processes in parallel and models their search behaviors as probability distributions. NCS explicitly promotes negatively correlated search behaviors by encouraging differences among these probability distributions. In this way, the individual search processes share information and cooperate with each other to explore diverse regions of the search space, which makes NCS a promising method for non-convex optimization. The cooperation scheme of NCS can also be regarded as a novel diversity preservation scheme that, unlike existing schemes, directly promotes diversity at the level of search behaviors rather than merely maintaining diversity among candidate solutions. Empirical studies showed that NCS is competitive with well-established search methods, achieving the best overall performance on 20 multimodal (non-convex) continuous optimization problems. The advantages of NCS over state-of-the-art approaches are also demonstrated with a case study on the synthesis of unequally spaced linear antenna arrays.
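
    The core mechanism described above — several Gaussian search processes run in parallel, with a step accepted only when its quality is good relative to how far it stays from the other processes — can be illustrated with a minimal sketch. The distance proxy, acceptance rule, and parameter values below are simplifying assumptions for illustration (the paper works with distances between the search distributions themselves), not the authors' exact formulation.

```python
import numpy as np

def sphere(x):
    """Toy objective (minimization)."""
    return float(np.sum(x ** 2))

def ncs_sketch(f, dim=10, n_proc=5, iters=200, sigma=0.3, lam=1.0, seed=0):
    """Simplified negatively-correlated search: each process takes a Gaussian
    step; a step is kept when its fitness, traded off against how close it
    lands to the other processes, remains favourable."""
    rng = np.random.default_rng(seed)
    xs = rng.uniform(-5, 5, size=(n_proc, dim))   # current solutions, one per process
    fs = np.array([f(x) for x in xs])
    for _ in range(iters):
        cand = xs + rng.normal(0.0, sigma, size=xs.shape)   # Gaussian proposals
        for i in range(n_proc):
            fi = f(cand[i])
            # crude diversity proxy: distance of the proposal to the nearest
            # other process (larger = search behaviors more "negatively correlated")
            others = np.delete(xs, i, axis=0)
            div = np.min(np.linalg.norm(others - cand[i], axis=1))
            # accept when the fitness/diversity trade-off is favourable
            if fi / (div + 1e-12) < lam * fs[i]:
                xs[i], fs[i] = cand[i], fi
    best = int(np.argmin(fs))
    return xs[best], fs[best]

if __name__ == "__main__":
    x_best, f_best = ncs_sketch(sphere)
    print("best value:", f_best)
```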

    Perfectionism Search Algorithm (PSA): An Efficient Meta-Heuristic Optimization Approach

    This paper proposes a novel population-based meta-heuristic optimization algorithm, called the Perfectionism Search Algorithm (PSA), which is based on the psychological aspects of perfectionism. PSA takes inspiration from one of the most popular models of perfectionism, proposed by Hewitt and Flett. During each iteration, new solutions are generated by mimicking different types and aspects of perfectionistic behavior. To provide a complete perspective on the performance of PSA, the proposed algorithm is tested on a selection of 35 nonlinear benchmark functions from the literature. The generated solutions were also compared with those of 11 well-known meta-heuristics that have been applied to many complex and practical engineering optimization problems. The obtained results confirm the high performance of the proposed algorithm in comparison with these well-known algorithms.

    Improving the Diversity of PSO for an Engineering Inverse Problem using Adaptive Inertia Weight

    Particle swarm optimization (PSO) is a stochastic search algorithm inspired by the behavior of schools of fish and flocks of birds. It is popular due to its easy implementation and fast convergence. However, PSO is known to become trapped in local optima on complex, higher-dimensional optimization problems. To address premature convergence in PSO, this paper presents a novel adaptive inertia weight strategy and modifies the velocity update equation with a new Sbest term. To maintain the diversity of the population, a radius r is introduced to disperse clustered particles. To validate the effectiveness of the proposed algorithm, various test functions and typical engineering applications are employed; the experimental results show that, with the proposed adaptive parameter, the performance of PSO improves on these complex, high-dimensional problems.
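
    The abstract does not spell out the Sbest term or the dispersion radius, so the sketch below shows only the standard global-best PSO loop with an inertia weight that decays over iterations (one common adaptive scheme); the line where the paper's Sbest term would be added is marked in a comment. Function names and parameter values are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def rastrigin(x):
    """Toy multimodal objective (minimization)."""
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

def pso_adaptive_w(f, dim=10, n=30, iters=300, w_max=0.9, w_min=0.4,
                   c1=2.0, c2=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.12, 5.12, size=(n, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(xi) for xi in x])
    g = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    for t in range(iters):
        # adaptive (here simply linearly decreasing) inertia weight
        w = w_max - (w_max - w_min) * t / iters
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # standard velocity update; the paper's extra Sbest term and
        # radius-based dispersion would be inserted here
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.array([f(xi) for xi in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = int(np.argmin(pbest_f))
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    return gbest, gbest_f

if __name__ == "__main__":
    _, best = pso_adaptive_w(rastrigin)
    print("best value:", best)
```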

    Free Search Towards Multidimensional Optimisation Problems

    The article presents experimental results from a novel heuristic algorithm for real-valued search and optimisation called Free Search (FS). The aim is to clarify the ability of this method to return optimal solutions for multidimensional search spaces that are currently resistant to other search techniques.

    A hybrid swarm-based algorithm for single-objective optimization problems involving high-cost analyses

    In many technical fields, single-objective optimization procedures in continuous domains involve expensive numerical simulations. In this context, an improvement of the Artificial Bee Colony (ABC) algorithm, called the Artificial super-Bee enhanced Colony (AsBeC), is presented. AsBeC is designed to provide fast convergence, high solution accuracy, and robust performance over a wide range of problems. It implements enhancements of the ABC structure and hybridizations with interpolation strategies, the latter inspired by the quadratic trust-region approach for local investigation and by an efficient global optimizer for separable problems. Each modification and their combined effects are studied with appropriate metrics on a numerical benchmark, which is also used to compare AsBeC with several effective ABC variants and other derivative-free algorithms. In addition, the presented algorithm is validated on two recent benchmarks adopted for competitions at international conferences. Results show remarkable competitiveness and robustness for AsBeC. (Comment: 19 pages, 4 figures, Springer Swarm Intelligence)
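
    As context for the enhancements described above, the sketch below shows the baseline ABC loop (employed, onlooker, and scout phases) that AsBeC builds on; the super-bee mechanics and the interpolation/trust-region hybridizations from the paper are not reproduced. Names and parameter values are illustrative assumptions.

```python
import numpy as np

def sphere(x):
    """Toy objective (minimization)."""
    return float(np.sum(x ** 2))

def abc_sketch(f, dim=10, n_food=20, iters=300, limit=30,
               lb=-5.0, ub=5.0, seed=0):
    """Baseline Artificial Bee Colony loop (no AsBeC enhancements)."""
    rng = np.random.default_rng(seed)
    foods = rng.uniform(lb, ub, size=(n_food, dim))
    fit = np.array([f(s) for s in foods])
    trials = np.zeros(n_food, dtype=int)

    def neighbour(i):
        # perturb one dimension towards/away from a randomly chosen partner
        k = rng.integers(n_food - 1)
        k = k if k < i else k + 1
        j = rng.integers(dim)
        v = foods[i].copy()
        v[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        return np.clip(v, lb, ub)

    def greedy(i, v):
        # greedy replacement; count stagnation for the scout phase
        fv = f(v)
        if fv < fit[i]:
            foods[i], fit[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                              # employed bees
            greedy(i, neighbour(i))
        probs = fit.max() - fit + 1e-12                      # favour better sources
        probs /= probs.sum()
        for i in rng.choice(n_food, size=n_food, p=probs):   # onlooker bees
            greedy(i, neighbour(i))
        worn = int(np.argmax(trials))                        # scout bee
        if trials[worn] > limit:
            foods[worn] = rng.uniform(lb, ub, size=dim)
            fit[worn] = f(foods[worn])
            trials[worn] = 0
    best = int(np.argmin(fit))
    return foods[best], fit[best]

if __name__ == "__main__":
    _, best = abc_sketch(sphere)
    print("best value:", best)
```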