
    A multi-start biased-randomized algorithm for the capacitated dispersion problem

    The capacitated dispersion problem is a variant of the maximum diversity problem in which a set of elements in a network must be determined. These elements might represent, for instance, facilities in a logistics network or transmission devices in a telecommunication network. Usually, each element is considered to be limited in its servicing capacity. Hence, given a set of possible locations, the capacitated dispersion problem consists of selecting a subset that maximizes the minimum distance between any pair of elements while reaching an aggregated servicing capacity. Since servicing capacity constraints are very common in real-world problems, the capacitated dispersion problem is often a more realistic model than the traditional maximum diversity problem. Because the capacitated dispersion problem is NP-hard, heuristic-based algorithms are needed to obtain high-quality solutions in reasonable computational times whenever large-sized instances are considered. Accordingly, this work proposes a multi-start biased-randomized algorithm to efficiently solve the capacitated dispersion problem. A series of computational experiments is conducted employing small-, medium-, and large-sized instances. Our results are compared with the best-known solutions reported in the literature, some of which have been proven to be optimal. The proposed approach is shown to be highly competitive, as it achieves either optimal or near-optimal solutions and outperforms the non-optimal best-known solutions in many cases. Finally, a sensitivity analysis considering different levels of the minimum aggregate capacity is performed to complete our study.
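    To illustrate the kind of procedure the abstract describes, the following Python sketch shows a multi-start, biased-randomized greedy construction for the capacitated dispersion problem. The geometric bias parameter, the seeding rule, and all function names are illustrative assumptions rather than the authors' exact algorithm.

        import math
        import random

        def biased_pick(sorted_candidates, beta=0.3):
            # Pick from a best-first sorted list with a geometric (biased-randomized) distribution.
            idx = int(math.log(1.0 - random.random()) / math.log(1.0 - beta))
            return sorted_candidates[idx % len(sorted_candidates)]

        def construct(dist, cap, B, beta=0.3):
            # One biased-randomized greedy construction: dist is a distance matrix,
            # cap the node capacities, and B the required aggregated capacity.
            n = len(cap)
            pairs = sorted(((dist[i][j], i, j) for i in range(n) for j in range(i + 1, n)), reverse=True)
            _, i, j = biased_pick(pairs, beta)        # seed with one of the farthest-apart pairs
            selected = {i, j}
            while sum(cap[k] for k in selected) < B:
                # Rank outside nodes by the minimum distance they would keep to the selected set.
                cands = sorted(((min(dist[k][s] for s in selected), k)
                                for k in range(n) if k not in selected), reverse=True)
                _, k = biased_pick(cands, beta)
                selected.add(k)
            obj = min(dist[a][b] for a in selected for b in selected if a < b)
            return selected, obj

        def multi_start(dist, cap, B, starts=100, beta=0.3):
            # Repeat the randomized construction and keep the best dispersion value found.
            best = None
            for _ in range(starts):
                sol = construct(dist, cap, B, beta)
                if best is None or sol[1] > best[1]:
                    best = sol
            return best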

    A Virtual-Force Based Swarm Algorithm for Balanced Circular Bin Packing Problems

    Balanced circular bin packing problems consist of positioning a given number of weighted circles so as to minimize the radius of a circular container while satisfying equilibrium constraints. These problems are NP-hard, highly constrained, and high-dimensional. This paper describes a swarm algorithm based on a virtual-force system for solving balanced circular bin packing problems. In the proposed approach, a system of forces is applied to each component, allowing the constraints to be taken into account and the objective function to be minimized using the fundamental principle of dynamics. The proposed algorithm is experimented with and validated on benchmarks of various balanced circular bin packing problems with up to 300 circles. The reported results allow us to assess the effectiveness of the proposed approach compared to existing results from the literature.
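    As a rough illustration of such a virtual-force system, the sketch below applies a repulsion force between overlapping circles, an attraction toward the container centre, and a correction of the weighted centre of gravity, then displaces each circle according to the resulting force. The force laws, coefficients, and names are assumptions for illustration and do not reproduce the paper's exact formulation.

        import math

        def force_step(circles, dt=0.05, k_rep=1.0, k_att=0.05, k_bal=0.1):
            # One virtual-force update over a list of circles given as dicts with
            # keys x, y (centre), r (radius) and w (weight).
            W = sum(c['w'] for c in circles)
            gx = sum(c['w'] * c['x'] for c in circles) / W    # weighted centre of gravity
            gy = sum(c['w'] * c['y'] for c in circles) / W
            for i, ci in enumerate(circles):
                fx = -k_att * ci['x'] - k_bal * gx            # pull toward the container centre,
                fy = -k_att * ci['y'] - k_bal * gy            # plus a correction of the imbalance
                for j, cj in enumerate(circles):
                    if i == j:
                        continue
                    dx, dy = ci['x'] - cj['x'], ci['y'] - cj['y']
                    d = math.hypot(dx, dy) or 1e-9
                    overlap = ci['r'] + cj['r'] - d
                    if overlap > 0:                           # repulsion only when circles overlap
                        fx += k_rep * overlap * dx / d
                        fy += k_rep * overlap * dy / d
                ci['x'] += dt * fx / ci['w']                  # displacement from F = m a over one step
                ci['y'] += dt * fy / ci['w']
            # current container radius needed to enclose all circles
            return max(math.hypot(c['x'], c['y']) + c['r'] for c in circles)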

    Traveling Salesman Problem

    The idea behind the TSP was conceived by the Austrian mathematician Karl Menger in the mid-1930s, when he invited the research community to consider a problem from everyday life from a mathematical point of view. A traveling salesman has to visit each of a list of m cities exactly once and then return to the home city. He knows the cost of traveling from any city i to any other city j. Which tour of least possible cost can the salesman take? This book considers the problem of finding algorithmic techniques that lead to good or optimal solutions for the TSP (or for some closely related problems). The TSP is a very attractive problem for the research community because it arises as a natural subproblem in many applications concerning everyday life. Indeed, any application in which an optimal ordering of a number of items has to be chosen, such that the total cost of a solution is obtained by adding up the costs arising from pairs of successive items, can be modelled as a TSP instance. Thus, studying the TSP can never be considered abstract research with no practical importance.
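    A minimal illustration of the problem statement above: given a cost matrix over m cities, an exact tour of least cost can be found by enumerating all tours that start and end at the home city, which is only feasible for small m. The instance data below are made up for the example.

        from itertools import permutations

        def tsp_brute_force(cost):
            # Exact minimum-cost tour for a small instance; cost[i][j] is the cost
            # of traveling from city i to city j.
            m = len(cost)
            best_tour, best_cost = None, float('inf')
            for perm in permutations(range(1, m)):          # fix city 0 as the home city
                tour = (0,) + perm + (0,)
                c = sum(cost[tour[k]][tour[k + 1]] for k in range(m))
                if c < best_cost:
                    best_tour, best_cost = tour, c
            return best_tour, best_cost

        # Example: 4 cities with symmetric costs
        cost = [[0, 2, 9, 10],
                [2, 0, 6, 4],
                [9, 6, 0, 8],
                [10, 4, 8, 0]]
        print(tsp_brute_force(cost))    # ((0, 1, 3, 2, 0), 23)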

    Traveling Salesman Problem

    This book is a collection of current research on the application of evolutionary algorithms and other optimization algorithms to solving the TSP. It brings together researchers with applications in Artificial Immune Systems, Genetic Algorithms, Neural Networks and Differential Evolution Algorithms. Hybrid systems, such as Fuzzy Maps, Chaotic Maps and Parallelized TSP, are also presented. Most importantly, this book presents both theoretical and practical applications of the TSP, which will make it a vital tool for researchers and graduate students in the fields of applied Mathematics, Computing Science and Engineering.
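    As one concrete example of the evolutionary operators such contributions rely on, the sketch below implements order crossover (OX), a standard genetic-algorithm recombination operator for tours encoded as permutations of city indices. It is a generic illustration, not a reproduction of any specific chapter of the book.

        import random

        def order_crossover(p1, p2):
            # Order crossover (OX): copy a slice from the first parent, then fill the
            # remaining positions with the missing cities in the order of the second parent.
            n = len(p1)
            a, b = sorted(random.sample(range(n), 2))
            child = [None] * n
            child[a:b] = p1[a:b]
            fill = [c for c in p2 if c not in child]
            for i in range(n):
                if child[i] is None:
                    child[i] = fill.pop(0)
            return child

        print(order_crossover([0, 1, 2, 3, 4, 5], [5, 3, 1, 0, 2, 4]))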

    Particle Swarm Optimization

    Particle swarm optimization (PSO) is a population-based stochastic optimization technique inspired by the social behavior of bird flocking or fish schooling. PSO shares many similarities with evolutionary computation techniques such as Genetic Algorithms (GAs). The system is initialized with a population of random solutions and searches for optima by updating generations. However, unlike GAs, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles. This book presents the contributions of top researchers in this field and will serve as a valuable tool for professionals in this interdisciplinary field.
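    The velocity-and-position update described above can be summarized in a few lines. The sketch below is a minimal PSO minimizer; the inertia weight and acceleration coefficients are common textbook defaults, not values prescribed by the book.

        import random

        def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            # Each particle follows its own best position and the swarm's best position;
            # no crossover or mutation operators are used.
            lo, hi = bounds
            pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]
            pbest_val = [f(p) for p in pos]
            g = min(range(n_particles), key=lambda i: pbest_val[i])
            gbest, gbest_val = pbest[g][:], pbest_val[g]
            for _ in range(iters):
                for i in range(n_particles):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                                     + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                        pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                    val = f(pos[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = pos[i][:], val
                        if val < gbest_val:
                            gbest, gbest_val = pos[i][:], val
            return gbest, gbest_val

        # Example: minimize the sphere function in 5 dimensions
        print(pso(lambda x: sum(v * v for v in x), dim=5, bounds=(-10, 10)))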

    A Google-inspired error-correcting graph matching algorithm

    Graphs and graph algorithms are applied in many different areas, including civil engineering, telecommunications, bio-informatics and software engineering. While exact graph matching is grounded in a consolidated theory and has well-known results, approximate graph matching is still an open research subject. This paper presents an error-tolerant approximate graph matching algorithm based on tabu search using the Google-like PageRank algorithm. We report preliminary results obtained on two graph data benchmarks. The first one is the TC-15 database [14], a graph database from the University of Naples, Italy. These graphs are limited to exact matching. The second one is a novel data set of large graphs generated by randomly mutating TC-15 graphs in order to evaluate the performance of our algorithm. Such a mutation approach allows us to gain insight not only into running time but also into matching accuracy.
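    The PageRank component of such an approach can be sketched as a simple power iteration over an adjacency-list graph; node scores of this kind can then be used to order candidate vertex correspondences. The coupling with tabu search and the error model used in the paper are not reproduced here.

        def pagerank(adj, damping=0.85, iters=100):
            # Power-iteration PageRank on a directed graph given as {node: [neighbours]}.
            nodes = list(adj)
            n = len(nodes)
            rank = {v: 1.0 / n for v in nodes}
            for _ in range(iters):
                new = {v: (1.0 - damping) / n for v in nodes}
                for v in nodes:
                    out = adj[v]
                    if not out:                      # dangling node: spread its rank uniformly
                        for u in nodes:
                            new[u] += damping * rank[v] / n
                    else:
                        for u in out:
                            new[u] += damping * rank[v] / len(out)
                rank = new
            return rank

        print(pagerank({'a': ['b', 'c'], 'b': ['c'], 'c': ['a']}))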

    A unified metaheuristic and system-theoretic framework for petroleum reservoir management

    With the phenomenal rise in world population as well as robust economic growth in China, India and other emerging economies, the global demand for energy continues to grow in monumental proportions. Owing to its wide end-use capabilities, petroleum is without doubt the world’s number one energy resource. The present demand for oil and credible future forecasts, which indicate that demand is expected to increase in the coming decades, make it imperative that the E&P industry devise means to improve the present low recovery factor of hydrocarbon reservoirs. Efficiently tailored model-based optimization, estimation and control techniques within the ambit of a closed-loop reservoir management framework can play a significant role in achieving this objective. In this thesis, some fundamental reservoir engineering problems such as field development planning, production scheduling and control are formulated as different optimization problems. In this regard, field development optimization identifies the well placements that best maximize hydrocarbon recovery, production optimization identifies reservoir well settings that maximize total oil recovery or asset value, and, finally, a predictive controller algorithm computes corrected well controls that minimize the difference between actual outputs and a simulated (or optimal) reference trajectory. We employ either deterministic or metaheuristic optimization algorithms, such that the choice of algorithm is based purely on the peculiarity of the underlying optimization problem. Altogether, we present a unified metaheuristic and system-theoretic framework for petroleum reservoir management. The proposed framework is essentially a closed-loop reservoir management approach with four key elements, namely: a new metaheuristic technique for field development optimization, a gradient-based adjoint formulation for well rate control, an effective predictive control strategy for tracking the gradient-based optimal production trajectory, and an efficient model-updating (or history matching) procedure in which well production data are used to systematically recalibrate reservoir model parameters in order to minimize the mismatch between actual and simulated measurements. Central to all of these problems is the use of white-box reservoir models, which are employed in the well placement optimization and production settings optimization. However, a simple data-driven black-box model, which results from the linearization of an identified nonlinear model, is employed in the predictive controller algorithm. The benefits and efficiency of our approach are demonstrated through the maximization of the NPV of waterflooded reservoir models that are subject to production and geological uncertainty. Our procedure provides an improvement in the NPV, and, importantly, the predictive control algorithm ensures that this improved NPV is attainable as nearly as possible in practice.
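    The NPV objective used in such production optimization problems typically discounts, per control step, the oil revenue minus the costs of produced and injected water. The sketch below is an illustrative version of that calculation; the prices, costs, discounting convention, and function names are assumptions, not values taken from the thesis.

        def npv(oil_prod, water_prod, water_inj, dt,
                oil_price=80.0, water_prod_cost=6.0, water_inj_cost=2.0, discount=0.10):
            # Net present value of a waterflooding schedule. Each argument is a list of
            # per-step volumes; dt is the control-step length in years.
            total, t = 0.0, 0.0
            for qo, qwp, qwi in zip(oil_prod, water_prod, water_inj):
                t += dt
                cash = qo * oil_price - qwp * water_prod_cost - qwi * water_inj_cost
                total += cash / (1.0 + discount) ** t     # discount cash flow to present value
            return total

        # Example: three yearly control steps with declining oil and rising water production
        print(npv(oil_prod=[1.0e5, 8.0e4, 6.0e4],
                  water_prod=[2.0e4, 5.0e4, 8.0e4],
                  water_inj=[1.2e5, 1.2e5, 1.2e5],
                  dt=1.0))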

    Advances in Evolutionary Algorithms

    With the recent trends towards massive data sets and significant computational power, combined with advances in evolutionary algorithms, evolutionary computation is becoming much more relevant to practice. The aim of this book is to present recent improvements, innovative ideas and concepts in a part of the huge field of evolutionary algorithms.

    Unconstrained Learning Machines

    With the use of information technology in industries, a new need has arisen for analyzing large-scale data sets and automating data analysis that was once performed by human intuition and simple analog processing machines. The new generation of computer programs now has to outperform its predecessors in detecting complex and non-trivial patterns buried in data warehouses. Improved Machine Learning (ML) techniques such as Neural Networks (NNs) and Support Vector Machines (SVMs) have shown remarkable performance on supervised learning problems over the past couple of decades (e.g. anomaly detection, classification and identification, interpolation and extrapolation, etc.). Nevertheless, many such techniques have ill-conditioned structures which lack adaptability for processing exotic data or very large amounts of data. Some techniques cannot even process data in an on-line fashion. Furthermore, as the processing power of computers increases, there is a pressing need for ML algorithms to perform supervised learning tasks in less time than previously required over even larger sets of data, which means that the time and memory complexities of these algorithms must be improved. The aim of this research is to construct an improved type of SVM-like algorithm for tasks such as nonlinear classification and interpolation that is more scalable, error-tolerant and accurate. Additionally, this family of algorithms must be able to compute solutions in a controlled time, preferably small with respect to modern computational technologies. These new algorithms should also be versatile enough to have useful applications in engineering, meteorology or quality control. This dissertation introduces a family of SVM-based algorithms named Unconstrained Learning Machines (ULMs) which attempt to solve the robustness, scalability and timing issues of traditional supervised learning algorithms. ULMs are not based on geometrical analogies (e.g. SVMs) or on the replication of biological models (e.g. NNs). Their construction is strictly based on statistical considerations taken from the recently developed statistical learning theory. Like SVMs, ULMs use kernel methods extensively in order to process exotic and/or non-numerical objects stored in databases and to search for hidden patterns in data with tailored measures of similarity. ULMs are applied to a variety of problems in manufacturing engineering and in meteorology. The robust nonlinear nonparametric interpolation abilities of ULMs allow for the representation of sub-millimetric deformations on the surface of manufactured parts, the selection of conforming objects, and the diagnosis and modeling of manufacturing processes. ULMs also play a role in assimilating the system states of computational weather models, removing the intrinsic noise without any knowledge of the underlying mathematical models and helping to establish more accurate forecasts.
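    The abstract does not give the ULM formulation itself, so the sketch below uses kernel ridge regression as a stand-in for an unconstrained, SVM-like kernel learner for nonlinear interpolation: the fit reduces to solving a single linear system rather than a constrained quadratic program. All names, parameter values, and the example data are illustrative assumptions.

        import numpy as np

        def rbf_kernel(A, B, gamma=0.5):
            # Gaussian RBF kernel matrix between two sets of row vectors.
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        def fit_unconstrained_kernel_machine(X, y, lam=1e-3, gamma=0.5):
            # Kernel least-squares fit: solve (K + lam I) alpha = y, then predict with
            # weighted kernel evaluations against the training points.
            K = rbf_kernel(X, X, gamma)
            alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
            return lambda Xnew: rbf_kernel(Xnew, X, gamma) @ alpha

        # Example: nonlinear interpolation of a noisy sine curve
        X = np.linspace(0, 6, 40).reshape(-1, 1)
        y = np.sin(X).ravel() + 0.05 * np.random.randn(40)
        predict = fit_unconstrained_kernel_machine(X, y)
        print(predict(np.array([[1.5], [3.0]])))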

    Planning and Scheduling Optimization

    Although planning and scheduling optimization has been explored in the literature for many years, it remains a hot topic in current scientific research. Changing market trends, globalization, technical and technological progress, and sustainability considerations make it necessary to deal with new optimization challenges in modern manufacturing, engineering, and healthcare systems. This book provides an overview of recent advances in different areas connected with operations research models and other applications of intelligent computing techniques used for planning and scheduling optimization. The wide range of theoretical and practical research findings reported in this book confirms that the planning and scheduling problem is a complex issue that is present in different industrial sectors and organizations and opens promising and dynamic perspectives for research and development.