229 research outputs found

    An overview of population-based algorithms for multi-objective optimisation

    In this work we present an overview of the most prominent population-based algorithms and the methodologies used to extend them to multiple-objective problems. Although not exact in the mathematical sense, population-based multi-objective optimisation techniques have long been recognised as immensely valuable and versatile for real-world applications. These techniques are usually employed when exact optimisation methods are not easily applicable or when, due to sheer complexity, such methods would be prohibitively costly. Another advantage is that, since a population of decision vectors is considered in each generation, these algorithms are implicitly parallelisable and can generate an approximation of the entire Pareto front at each iteration. A critique of their capabilities is also provided.
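
    As a concrete illustration of the Pareto-front approximation such population-based methods maintain, the short Python sketch below filters a population of objective vectors down to its non-dominated set (all objectives are assumed to be minimised; the function names are illustrative and not taken from any specific algorithm surveyed).

    from typing import List, Sequence

    def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
        """True if objective vector `a` Pareto-dominates `b` (all objectives minimised)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated(population: List[Sequence[float]]) -> List[Sequence[float]]:
        """Return the non-dominated subset of a population of objective vectors."""
        front = []
        for candidate in population:
            if not any(dominates(other, candidate) for other in population if other is not candidate):
                front.append(candidate)
        return front

    # Example: three bi-objective vectors; (1, 3) and (2, 2) are mutually non-dominated.
    print(non_dominated([(1.0, 3.0), (2.0, 2.0), (3.0, 3.0)]))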

    A novel clustering methodology based on modularity optimisation for detecting authorship affinities in Shakespearean era plays

    © 2016 Naeni et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into a community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in information theory. Moreover, we use graph-theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset containing the frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method in identifying high-quality clusters that reflect the commonalities in the literary style of the plays.
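
    For illustration, the Jensen-Shannon distance used as the dissimilarity measure can be computed from normalised word-frequency vectors as in the following sketch (plain NumPy with base-2 logarithms so the distance lies in [0, 1]; the toy vectors are assumptions, not data from the study).

    import numpy as np

    def jensen_shannon_distance(p: np.ndarray, q: np.ndarray) -> float:
        """Jensen-Shannon distance (square root of the JS divergence, base-2 logs)
        between two discrete probability distributions."""
        p = p / p.sum()
        q = q / q.sum()
        m = 0.5 * (p + q)

        def kl(a, b):
            mask = a > 0
            return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

        js_divergence = 0.5 * kl(p, m) + 0.5 * kl(q, m)
        return float(np.sqrt(js_divergence))

    # Example: word-frequency vectors for two toy plays over a shared vocabulary.
    play_a = np.array([10.0, 5.0, 0.0, 1.0])
    play_b = np.array([8.0, 6.0, 2.0, 0.0])
    print(jensen_shannon_distance(play_a, play_b))  # value in [0, 1]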

    Resource allocation technique for powerline network using a modified shuffled frog-leaping algorithm

    Resource allocation (RA) techniques should be made efficient and optimised in order to enhance the QoS (power and bit, capacity, scalability) of high-speed networking data applications. This research attempts to further increase that efficiency towards near-optimal performance. The RA problem involves efficiently assigning subcarriers, power, and bits to each user. Several studies conducted by the Federal Communications Commission have shown that conventional RA approaches are becoming insufficient for the rapid demand in networking, resulting in spectrum underutilisation, low capacity, slow convergence, poor bit-error-rate performance, channel-feedback delay, and weak scalability, while their computational complexity makes real-time solutions intractable. This is mainly due to sophisticated, restrictive constraints, multiple objectives, unfairness, channel noise, and the unrealistic assumption that perfect channel state information is available. The main goal of this work is to develop a conceptual framework and a mathematical model for resource allocation using the Shuffled Frog-Leaping Algorithm (SFLA). Thus, a modified SFLA is introduced and integrated into an Orthogonal Frequency Division Multiplexing (OFDM) system. The SFLA generates a random population of solutions (power, bit), and the fitness of each solution is calculated and improved for each subcarrier and user. The solution is numerically validated and verified by simulation over a powerline channel. The system performance was compared with similar research works in terms of capacity, scalability, allocated rate/power, and convergence. The allocated resources are consistently optimised and the obtained capacity is consistently higher compared with root-finding, linear, and hybrid evolutionary algorithms. The proposed algorithm offers the fastest convergence, requiring 75 iterations to reach within 0.001% error of the global optimum compared with 92 for the conventional techniques. Finally, joint allocation models for selecting optimal resource values are introduced; adaptive power and bit allocators for the OFDM powerline system, using the modified SFLA together with TLBO and PSO, are proposed.
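
    The core leaping step that SFLA variants build on can be sketched as follows; this is a generic, illustrative SFLA memeplex update on a toy objective, not the authors' modified, powerline-specific algorithm, and all function and parameter names are assumed.

    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x: np.ndarray) -> float:
        """Toy fitness (minimisation); the RA objective would replace this in practice."""
        return float(np.sum(x ** 2))

    def sfla_memeplex_step(memeplex: np.ndarray, fitness, lower, upper) -> np.ndarray:
        """One leaping step: the worst frog jumps toward the best frog in its memeplex."""
        scores = np.array([fitness(f) for f in memeplex])
        best, worst = memeplex[scores.argmin()], scores.argmax()
        step = rng.random(memeplex.shape[1]) * (best - memeplex[worst])
        candidate = np.clip(memeplex[worst] + step, lower, upper)
        if fitness(candidate) < scores[worst]:
            memeplex[worst] = candidate          # accept the improving move
        else:
            memeplex[worst] = rng.uniform(lower, upper, memeplex.shape[1])  # random reset
        return memeplex

    # Example: one memeplex of 5 frogs in a 3-dimensional search space.
    memeplex = rng.uniform(-5.0, 5.0, size=(5, 3))
    for _ in range(50):
        memeplex = sfla_memeplex_step(memeplex, sphere, -5.0, 5.0)
    print(min(sphere(f) for f in memeplex))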

    A Comprehensive Survey on Particle Swarm Optimization Algorithm and Its Applications

    Particle swarm optimization (PSO) is a heuristic global optimization method, originally proposed by Kennedy and Eberhart in 1995. It is now one of the most commonly used optimization techniques. This survey presents a comprehensive investigation of PSO. On the one hand, we review advances in PSO, including its modifications (quantum-behaved PSO, bare-bones PSO, chaotic PSO, and fuzzy PSO), population topologies (fully connected, von Neumann, ring, star, random, etc.), hybridizations (with genetic algorithms, simulated annealing, Tabu search, artificial immune systems, ant colony algorithms, artificial bee colony, differential evolution, harmony search, and biogeography-based optimization), extensions (to multiobjective, constrained, discrete, and binary optimization), theoretical analysis (parameter selection and tuning, and convergence analysis), and parallel implementations (multicore, multiprocessor, GPU, and cloud computing). On the other hand, we survey applications of PSO in the following fields: electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology. It is hoped that this survey will be beneficial for researchers studying PSO algorithms.
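
    For reference, the canonical global-best PSO update that the surveyed variants extend can be written in a few lines; the sketch below uses the common inertia-weight formulation with typical default parameters, which are illustrative rather than recommendations from the survey.

    import numpy as np

    rng = np.random.default_rng(1)

    def pso_minimise(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
                     lower=-5.0, upper=5.0):
        """Minimise f with a basic global-best PSO (inertia-weight variant)."""
        x = rng.uniform(lower, upper, (n_particles, dim))         # positions
        v = np.zeros_like(x)                                      # velocities
        pbest, pbest_val = x.copy(), np.array([f(p) for p in x])  # personal bests
        gbest = pbest[pbest_val.argmin()].copy()                  # global best
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lower, upper)
            vals = np.array([f(p) for p in x])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    # Example: minimise the sphere function in 5 dimensions.
    print(pso_minimise(lambda p: float(np.sum(p ** 2)), dim=5))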

    Satisfying flexible due dates in fuzzy job shop by means of hybrid evolutionary algorithms

    This paper tackles the job shop scheduling problem with fuzzy sets modelling uncertain durations and flexible due dates. The objective is to achieve a high service level by maximising due-date satisfaction, considering two different overall satisfaction measures as objective functions. We show how these functions model different attitudes in the framework of fuzzy multicriteria decision making, and we define a measure of solution robustness based on an existing a-posteriori semantics of fuzzy schedules to further assess the quality of the obtained solutions. As a solving method, we improve a memetic algorithm from the literature by incorporating a new heuristic mechanism to guide the search through plateaus of the fitness landscape. We assess the performance of the resulting algorithm with an extensive experimental study, including a parametric analysis and a study of the algorithm's components and the synergy between them. We provide results on a set of existing and new benchmark instances for the fuzzy job shop with flexible due dates that show the competitiveness of our method. This research has been supported by the Spanish Government under research grant TIN2016-79190-R.
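
    As a rough illustration of due-date satisfaction, the sketch below computes the degree to which a crisp completion time meets a flexible due date modelled by a linearly decreasing membership function between a preferred date and a latest acceptable date; the paper's actual satisfaction measures may be defined differently, so this is only an assumed, simplified form.

    def due_date_satisfaction(completion: float, ideal: float, deadline: float) -> float:
        """Degree in [0, 1] to which a completion time meets a flexible due date
        modelled as a linearly decreasing membership between `ideal` and `deadline`."""
        if completion <= ideal:
            return 1.0                     # finished on or before the preferred date
        if completion >= deadline:
            return 0.0                     # finished after the latest acceptable date
        return (deadline - completion) / (deadline - ideal)

    # Example: a job finishing at t=12 with preferred date 10 and deadline 20.
    print(due_date_satisfaction(12.0, 10.0, 20.0))  # 0.8

    Aggregating such degrees over all jobs, for instance by taking their minimum or their average, then yields overall satisfaction measures in the spirit of the two objective functions considered.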

    Evolutionary Computation 2020

    Intelligent optimization is based on the mechanisms of computational intelligence: refining a suitable feature model, designing an effective optimization algorithm, and then obtaining an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools for ensuring global optimization quality, fast optimization efficiency, and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, the whale optimization algorithm, the differential evolution algorithm, and particle swarm optimization. Studies in this arena have also resulted in breakthroughs in solving complex problems, including the green shop scheduling problem, the severe nonlinear problem in one-dimensional geodesic electromagnetic inversion, error and bug finding in software, the 0-1 knapsack problem, the traveling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvements and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.

    Preventing premature convergence and proving the optimality in evolutionary algorithms

    http://ea2013.inria.fr//proceedings.pdf
    Evolutionary Algorithms (EA) usually carry out an efficient exploration of the search space, but they often get trapped in local minima and do not prove the optimality of the solution. Interval-based techniques, on the other hand, yield a numerical proof of optimality of the solution. However, they may fail to converge within a reasonable time due to their inability to quickly compute a good approximation of the global minimum and to their exponential complexity. The contribution of this paper is a hybrid algorithm called Charibde in which a particular EA, Differential Evolution, cooperates with a Branch and Bound algorithm endowed with interval propagation techniques. It prevents premature convergence toward local optima and outperforms both existing deterministic and stochastic approaches. We demonstrate its efficiency on a benchmark of highly multimodal problems, for which we provide previously unknown global minima and certification of optimality.
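
    The evolutionary side of Charibde builds on Differential Evolution; the sketch below shows one generation of the classic DE/rand/1/bin scheme on a toy objective (standard DE only, not the cooperative interval branch-and-bound mechanism, and all names and parameter values are illustrative).

    import numpy as np

    rng = np.random.default_rng(2)

    def de_generation(pop: np.ndarray, f, F=0.8, CR=0.9, lower=-5.0, upper=5.0) -> np.ndarray:
        """One DE/rand/1/bin generation: mutate, crossover, and greedily select."""
        n, dim = pop.shape
        new_pop = pop.copy()
        for i in range(n):
            a, b, c = pop[rng.choice([j for j in range(n) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lower, upper)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True          # ensure at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            if f(trial) <= f(pop[i]):                # greedy one-to-one replacement
                new_pop[i] = trial
        return new_pop

    # Example: evolve 20 individuals on the sphere function for 100 generations.
    sphere = lambda x: float(np.sum(x ** 2))
    pop = rng.uniform(-5.0, 5.0, (20, 5))
    for _ in range(100):
        pop = de_generation(pop, sphere)
    print(min(sphere(x) for x in pop))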