2,040 research outputs found

    Facility layout problem: Bibliometric and benchmarking analysis

    The facility layout problem concerns the location of departments within a facility area, with the aim of determining the most effective configuration. Research based on different approaches has been published over the last six decades and, to prove the effectiveness of the results obtained, several benchmark instances have been developed. This paper presents a general overview of the extant literature on facility layout problems in order to identify the main research trends and propose future research questions. First, a bibliometric analysis is presented to give the reader an overview of the literature. Then, the papers referring to the main instances reported in the literature were clustered in order to create a database that can serve as a useful benchmarking tool for researchers approaching this kind of problem.
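    For reference, such benchmark instances are classically formalized as quadratic assignment problems; the following standard formulation (a common reference model, not one taken from this paper) assigns n departments to n locations so as to minimize the total flow-weighted distance:

        \min_{\pi \in S_n} \; \sum_{i=1}^{n} \sum_{j=1}^{n} f_{ij} \, d_{\pi(i)\pi(j)}

    where f_{ij} is the material flow between departments i and j, d_{kl} is the distance between locations k and l, and the permutation \pi assigns each department to a location.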

    A comprehensive survey on cultural algorithms

    Peer reviewed postprint.

    Agent-based modelling and Swarm Intelligence in systems engineering

    The aim of this doctoral thesis is to evaluate the usefulness of agent-based modelling techniques, Swarm Intelligence optimization algorithms, and parallel programming on graphics cards in the field of systems engineering and automatic control. A literature review was carried out and a development framework for the agent-based modelling technique was built. This technique was used to model an activated sludge reactor (part of the wastewater treatment process). A complementary notation for describing agent-based models from a systems engineering point of view was developed. An agent-based optimization algorithm following the Swarm Intelligence philosophy was also presented. Parallelization techniques on graphics cards were applied to reduce the simulation times of models and algorithms. It is therefore a thesis integrating several technologies. Departamento de Ingeniería de Sistemas y Automática.
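    As a rough illustration of the agent-based modelling technique the thesis builds on, a minimal sketch in Python follows; the Agent class, its update rule, and all constants are hypothetical and not taken from the thesis.

        import random

        class Agent:
            """A minimal agent holding a scalar state."""
            def __init__(self, state):
                self.state = state

            def step(self, neighbors):
                # Toy interaction rule: drift toward the neighborhood average.
                if neighbors:
                    avg = sum(n.state for n in neighbors) / len(neighbors)
                    self.state += 0.1 * (avg - self.state)

        # Build a population and run rounds of sequential updates.
        agents = [Agent(random.uniform(0, 1)) for _ in range(50)]
        for _ in range(100):
            for a in agents:
                a.step([n for n in agents if n is not a])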

    Intervention in the social population space of Cultural Algorithm

    Cultural Algorithms (CA) offer a better way to simulate social and culture-driven agents by introducing the notion of culture into the artificial population. When it comes to mimicking intelligent social beings such as humans, the search for a better fit or global optimum becomes multi-dimensional because of the complexity produced by the relevant system parameters and intricate social behaviour. In this research an extended CA framework is presented. The architecture provides extensions to the basic CA framework. The major extension is a mechanism for influencing selected individuals in the population space by means of an existing social network, and consequently altering the cultural belief favourably. Another extension was made in the population space by introducing the concept of a social network: the agents in the population are placed into one (or more) networks through which they can communicate and propagate knowledge. Identification and exploitation of such networks is necessary since it may lead to a quicker shift of the cultural norm.
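    A minimal sketch of the basic CA loop that such a framework extends (with a ring neighborhood standing in for the social network; all names and constants are illustrative, not taken from this work):

        import random

        def fitness(x):
            return -sum(v * v for v in x)  # toy objective; optimum at the origin

        POP, DIM, GENS = 30, 5, 200
        pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
        belief = [(-5.0, 5.0)] * DIM  # normative knowledge: per-dimension bounds

        for _ in range(GENS):
            # Acceptance: the best individuals update the belief space.
            elite = sorted(pop, key=fitness, reverse=True)[:5]
            belief = [(min(e[d] for e in elite), max(e[d] for e in elite))
                      for d in range(DIM)]
            # Influence: sample offspring inside the normative bounds and pass
            # them to a neighbor on a ring network standing in for the fabric.
            for i in range(POP):
                child = [random.uniform(lo, hi) for lo, hi in belief]
                j = (i + 1) % POP
                if fitness(child) > fitness(pop[j]):
                    pop[j] = child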

    Cultural particle swarm optimization


    Assessment of voltage stability based on power transfer stability index using computational intelligence models

    In this paper, the importance of voltage stability, a major problem in electric power systems, is explained. Estimating voltage stability is made a priority so as to keep the power system stable and prevent it from reaching voltage collapse. The power transfer stability index (PTSI) is used as a predictor in the power system network to detect voltage instability on weakened buses and to obtain a voltage assessment of the network. Two hybrid algorithms are developed: a cultural algorithm combined with a neural network (CA-NN) and particle swarm optimization combined with a neural network (PSO-NN). The developed algorithms are compared with the actual PTSI values from the Newton-Raphson (NR) method and are applied to the 24-bus Iraqi power system. The actual PTSI values, obtained from the NR algorithm with the bus voltage magnitude Vi, angle δi, and demands Pd and Qd as inputs, serve as the targets. The results identify a weak bus approaching voltage collapse, and all results were approximately the same, with only a slight difference from the actual values; the classical methods are shown to be slower and less accurate than the hybrid algorithms. This demonstrates the validity and effectiveness of the CA-NN and PSO-NN algorithms for voltage stability assessment. MATLAB was used to obtain most of the results.
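    A hedged sketch of the PSO-NN idea, i.e. fitting a small neural network to stability-index targets by optimizing its weights with a plain global-best particle swarm; the network shape, the random stand-in data, and all constants are illustrative assumptions, not the paper's setup.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(100, 4))  # stand-ins for Vi, delta_i, Pd, Qd per bus
        y = rng.uniform(size=100)       # stand-in targets (the paper uses NR-based PTSI values)

        def predict(w, X):
            W1 = w[:32].reshape(4, 8)   # one hidden layer with 8 tanh units
            W2 = w[32:40]
            return np.tanh(X @ W1) @ W2

        def mse(w):
            return np.mean((predict(w, X) - y) ** 2)

        # Global-best PSO over the 40 network weights.
        n, dim = 20, 40
        pos = rng.normal(size=(n, dim)); vel = np.zeros((n, dim))
        pbest = pos.copy(); pbest_f = np.array([mse(p) for p in pos])
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(200):
            r1, r2 = rng.uniform(size=(2, n, dim))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = pos + vel
            f = np.array([mse(p) for p in pos])
            better = f < pbest_f
            pbest[better], pbest_f[better] = pos[better], f[better]
            gbest = pbest[pbest_f.argmin()].copy()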

    Application of nature-inspired optimization algorithms to improve the production efficiency of small and medium-sized bakeries

    Increasing production efficiency through schedule optimization is one of the most influential topics in operations research contributing to the decision-making process. It is the concept of allocating tasks among available resources, within the constraints of a manufacturing facility, in order to minimize costs. It is carried out by a model that resembles real-world task distribution, with variables and relevant constraints, in order to complete a planned production. In addition to a model, an optimizer is required to assist in evaluating and improving the task allocation procedure in order to maximize overall production efficiency. The entire procedure is usually carried out on a computer, where these two distinct segments combine to form a solution framework for production planning and support decision-making in various manufacturing industries. Small and medium-sized bakeries lack access to cutting-edge tools, and most of their production schedules are based on personal experience. This makes a significant difference in production costs compared to large bakeries, as evidenced by the latter's market dominance. In this study, a hybrid no-wait flow shop model is proposed to produce a production schedule based on actual data, featuring the constraints of the production environment in small and medium-sized bakeries. Several single-objective and multi-objective nature-inspired optimization algorithms were implemented to find efficient production schedules. While makespan is the most widely used quality criterion of production efficiency because it dominates production costs, high oven idle time in bakeries also wastes energy. Combining these quality criteria allows for additional cost reduction through energy savings as well as shorter production times. Therefore, to obtain an efficient production plan, makespan and oven idle time were both included in the optimization objectives. To find the optimal production plan for an existing production line, particle swarm optimization, simulated annealing, and the Nawaz-Enscore-Ham algorithm were used. The weighting-factor method was used to combine the two objectives into a single objective. The classical optimization algorithms were found to be good enough at finding optimal schedules in a reasonable amount of time, reducing makespan by 29 % and oven idle time by 8 % on one of the analyzed production datasets. Nonetheless, the algorithms' convergence was found to be poor, with a low probability of obtaining the best or nearly the best result. In contrast, a modified particle swarm optimization (MPSO) proposed in this study demonstrated significantly improved convergence, with a higher probability of obtaining better results. To obtain trade-offs between the two objectives, state-of-the-art multi-objective optimization algorithms were implemented: the non-dominated sorting genetic algorithm (NSGA-II), the strength Pareto evolutionary algorithm, generalized differential evolution, improved multi-objective particle swarm optimization (OMOPSO), and speed-constrained multi-objective particle swarm optimization (SMPSO). The optimization algorithms provided efficient production plans with up to a 12 % reduction in makespan and a 26 % reduction in oven idle time based on data from different production days. The performance comparison revealed significant differences between these multi-objective optimization algorithms, with NSGA-II performing best and OMOPSO and SMPSO performing worst.
    Proofing is a key processing stage that contributes to the quality of the final product by developing the flavor and fluffy texture of the bread. However, the duration of proofing is uncertain due to the complex interaction of multiple parameters: yeast condition, temperature in the proofing chamber, and the chemical composition of the flour. Because of this uncertainty, a production plan optimized for the shortest makespan can be significantly inefficient. The computational results show that the schedules with the shortest and nearly shortest makespan suffer a significant (up to 18 %) increase in makespan when the proofing time deviates from its expected duration. In this thesis, a method for developing resilient production plans that take uncertain proofing times into account is proposed, so that even if the deviation in proofing time is extreme, the fluctuation in makespan is minimal. The experimental results on a production dataset revealed a proactive production plan that is only 5 minutes longer than the shortest makespan, yet fluctuates by only 21 minutes when the proofing time varies from -10 % to +10 % of its nominal value. This study proposes a common framework for small and medium-sized bakeries to improve their production efficiency in three steps: collecting production data, simulating production plans with the hybrid no-wait flow shop model, and running the optimization algorithm. The study suggests using MPSO for the single-objective optimization problem and NSGA-II for the multi-objective optimization problem. Based on real bakery production data, the results revealed that existing plans were significantly inefficient and could be optimized in a reasonable computational time using a robust optimization algorithm. Implementing such a framework in small and medium-sized bakery operations could help achieve an efficient and resilient production system.
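    A minimal sketch of how a job sequence could be evaluated under such a model: a generic no-wait flow-shop makespan, an oven-idle-time measure on the last stage, and the weighting-factor combination of the two objectives. Processing times and the weight are illustrative placeholders, not the thesis data.

        def start_times(seq, p):
            """No-wait start times: each job is delayed just enough that it
            never waits between stages behind its predecessor."""
            m = len(p[0])
            starts, t = [], 0.0
            for i, j in enumerate(seq):
                if i:
                    prev = seq[i - 1]
                    t += max(sum(p[prev][:k + 1]) - sum(p[j][:k]) for k in range(m))
                starts.append(t)
            return starts

        def makespan(seq, p):
            return start_times(seq, p)[-1] + sum(p[seq[-1]])

        def oven_idle(seq, p):
            """Idle time on the last stage (the oven) between first and last bake."""
            m = len(p[0])
            iv = [(s + sum(p[j][:m - 1]), s + sum(p[j]))
                  for s, j in zip(start_times(seq, p), seq)]
            return (iv[-1][1] - iv[0][0]) - sum(b - a for a, b in iv)

        p = [[10, 25, 40], [15, 20, 35], [12, 30, 30]]  # 3 jobs x 3 stages, oven last
        seq = [0, 1, 2]
        w = 0.7  # weighting factor combining the two objectives
        score = w * makespan(seq, p) + (1 - w) * oven_idle(seq, p)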

    A multi-cycled sequential memetic computing approach for constrained optimisation

    In this paper, we propose a multi-cycled sequential memetic computing structure for constrained optimisation. The structure is composed of multiple evolutionary cycles. In each cycle, an evolutionary algorithm is treated as an operator and connected with a local optimiser. This structure enables useful knowledge to be learned from previous cycles and transferred to facilitate search in later cycles. Specifically, we propose to apply an estimation of distribution algorithm (EDA) to explore the search space until convergence in each cycle. A local optimiser, DONLP2, is then applied to improve the best solution found by the EDA. A new cycle starts after the local improvement if the computation budget has not been exceeded. In the developed EDA, an adaptive fully-factorized multivariate probability model is proposed. A learning mechanism, implemented as a guided mutation operator, is adopted to learn useful knowledge from previous cycles. The developed algorithm was experimentally studied on the benchmark problems of the CEC 2006 and CEC 2010 competitions. The experimental studies show that the developed probability model exhibits excellent exploration capability and that the learning mechanism can significantly improve search efficiency under certain conditions. A comparison against some well-known algorithms shows the superiority of the developed algorithm in terms of consumed fitness evaluations and solution quality.
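    A stripped-down sketch of the multi-cycled structure: each cycle runs a fully-factorized Gaussian EDA to convergence, then polishes the cycle's best solution with a local optimiser. A simple coordinate search stands in here for DONLP2, the constraint handling and guided mutation are omitted, and all constants are illustrative.

        import random, statistics

        def f(x):  # toy unconstrained objective standing in for a CEC benchmark
            return sum(v * v for v in x)

        DIM, POP, BUDGET = 5, 40, 10000
        evals, best = 0, None
        while evals < BUDGET:
            # One cycle: univariate Gaussian EDA until the model contracts.
            mu = [random.uniform(-5, 5) for _ in range(DIM)]
            sigma = [2.0] * DIM
            for _ in range(50):
                pop = [[random.gauss(mu[d], sigma[d]) for d in range(DIM)]
                       for _ in range(POP)]
                evals += POP
                elite = sorted(pop, key=f)[:10]
                mu = [statistics.mean(e[d] for e in elite) for d in range(DIM)]
                sigma = [max(statistics.stdev([e[d] for e in elite]), 1e-3)
                         for d in range(DIM)]
            # Local optimiser polishes the cycle's best solution.
            x, step = min(pop, key=f), 0.1
            for _ in range(100):
                for d in range(DIM):
                    for delta in (step, -step):
                        y = list(x); y[d] += delta; evals += 1
                        if f(y) < f(x): x = y
                step *= 0.9
            if best is None or f(x) < f(best):
                best = x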

    Evolutionary Computation 2020

    Intelligent optimization is based on the mechanisms of computational intelligence: refining a suitable feature model, designing an effective optimization algorithm, and then obtaining an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools for ensuring global optimization quality, fast optimization efficiency, and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, the whale optimization algorithm, the differential evolution algorithm, and particle swarm optimization. Studies in this arena have also resulted in breakthroughs in solving complex problems, including the green shop scheduling problem, the severely nonlinear problem of one-dimensional geodesic electromagnetic inversion, error and bug finding in software, the 0-1 knapsack problem, the traveling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvements and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.

    Improving Robustness in Social Fabric-based Cultural Algorithms

    In this thesis, we propose two new approaches which aim at improving robustness in social fabric-based Cultural Algorithms. Robustness is one of the most significant issues when designing evolutionary algorithms: these algorithms should be capable of adapting themselves to various search landscapes. In the first proposed approach, we utilize the dynamics of social interactions to solve complex and multi-modal problems. In the Cultural Algorithms literature, the social fabric has been suggested as a new method of using social phenomena to improve the search process of CAs. In this research, we introduce Irregular Neighborhood Restructuring as a new adaptive method that allows individuals to rearrange their neighborhoods to avoid local optima or stagnation during the search process. In the second approach, we apply the concept of the confidence interval from inferential statistics to improve the performance of the knowledge sources in the belief space. This approach aims at improving the robustness and accuracy of the normative knowledge source, making it more stable against sudden changes in the values of incoming solutions. The IEEE CEC 2015 benchmark optimization functions, a set of 15 multi-modal and hybrid functions used as a standard benchmark for evaluating optimization algorithms, are used to evaluate our proposed methods against the standard versions of CA and the social fabric. We observed that both proposed approaches produce promising results on the majority of the benchmark functions. Finally, we state that our proposed strategies enhance the robustness of social fabric-based CAs against challenges such as multi-modality, copious local optima, and diverse landscapes.
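    A small sketch of the confidence-interval idea for the normative knowledge source: instead of raw min/max bounds over the accepted individuals, a t-based interval around their mean is used, so a single outlier cannot stretch the bounds. The function name, critical value, and example data are illustrative only.

        import statistics

        def normative_bounds(values, t_crit=2.571):
            """Confidence-interval bounds for one dimension of the normative
            knowledge (t_crit ~ 95 % two-sided t value for 5 degrees of freedom)."""
            mean = statistics.mean(values)
            sem = statistics.stdev(values) / len(values) ** 0.5
            return mean - t_crit * sem, mean + t_crit * sem

        elite = [1.9, 2.1, 2.0, 2.3, 1.8, 7.5]  # 7.5 is a sudden outlier
        print(normative_bounds(elite))  # upper bound stays below the 7.5 outlier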