
    Seeking multiple solutions: an updated survey on niching methods and their applications

    Multi-Modal Optimization (MMO), which aims to locate multiple optimal (or near-optimal) solutions in a single simulation run, has practical relevance to problem solving across many fields. Population-based meta-heuristics have been shown to be particularly effective in solving MMO problems when equipped with specifically designed diversity-preserving mechanisms, commonly known as niching methods. This paper provides an updated survey on niching methods. The paper first revisits the fundamental concepts of niching and its most representative schemes, then reviews the most recent developments in niching methods, including novel and hybrid methods, performance measures, and benchmarks for their assessment. Furthermore, the paper surveys previous attempts at leveraging the capabilities of niching to facilitate various optimization tasks (e.g., multi-objective and dynamic optimization) and machine learning tasks (e.g., clustering, feature selection, and learning ensembles). A list of successful applications of niching methods to real-world problems is presented to demonstrate the capability of niching methods to provide solutions that are difficult for other optimization methods to offer. The significant practical value of niching methods is clearly exemplified through these applications. Finally, the paper poses challenges and research questions on niching that are yet to be appropriately addressed. Providing answers to these questions is crucial before we can bring more of the fruitful benefits of niching to real-world problem solving.
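
    For illustration, the sketch below shows one classic niching mechanism mentioned in surveys of this kind, fitness sharing, which derates the fitness of individuals that crowd the same region of the search space. It is a minimal, hedged example: the sharing radius `sigma_share`, the triangular sharing function, and the toy data are illustrative assumptions, not details taken from this survey.

```python
import numpy as np

def shared_fitness(population, fitness, sigma_share=0.1, alpha=1.0):
    """Derate raw fitness by the crowding of each individual's niche (maximization)."""
    # Pairwise Euclidean distances between individuals
    diff = population[:, None, :] - population[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)

    # Triangular sharing function: 1 at distance 0, 0 beyond sigma_share
    sh = np.where(dist < sigma_share, 1.0 - (dist / sigma_share) ** alpha, 0.0)

    # Niche count: how crowded each individual's neighbourhood is
    niche_count = sh.sum(axis=1)
    return fitness / niche_count

# Toy usage: three individuals crowd one optimum, one sits alone near another;
# sharing keeps the isolated niche competitive despite its lower raw fitness.
pop = np.array([[0.10], [0.11], [0.12], [0.90]])
fit = np.array([1.0, 1.0, 1.0, 0.8])
print(shared_fitness(pop, fit, sigma_share=0.05))
```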

    Drone Swarms in Adversarial Environment

    Drones are unmanned aerial vehicles (UAVs) operated remotely with the help of cameras, GPS, and on-device SD cards. They are used in many applications, both civilian and military. Drone swarms, in contrast, are fleets of drones that work together to achieve a specific goal through swarm-intelligence approaches. They provide several advantages over a single drone, such as better coverage, accuracy, increased safety, and improved flexibility. However, the deployment of such swarms in an adversarial environment poses significant challenges. This work provides an overview of the current state of research on drone swarms in adversarial environments, including algorithms for the swarm formation of robotic attack drones, with their strengths and weaknesses, as well as the attack strategies used by attackers. It also outlines common adversarial counter-attack methods for disrupting drone attacks, consisting of the detection and destruction of drone swarms along with their drawbacks, a counter-UAV defense system, and the splitting of large-scale drone swarms into unconnected clusters. After identifying several challenges, an optimized algorithm is proposed to split large-scale drone swarms more efficiently.
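
    As a hedged illustration of the "splitting into unconnected clusters" idea mentioned above (not the algorithm proposed in this work), the sketch below partitions drone positions into groups that share no communication links, using a hypothetical link range `comm_range` and a simple breadth-first search over the communication graph.

```python
import numpy as np

def split_into_unconnected_clusters(positions, comm_range=50.0):
    """Group drones into connected components of the communication graph.

    positions  : (n, 2) array of drone coordinates
    comm_range : maximum link distance; drones farther apart have no link
    """
    n = len(positions)
    dist = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    adjacency = dist <= comm_range

    clusters, unvisited = [], set(range(n))
    while unvisited:
        # Breadth-first search from an arbitrary unvisited drone
        seed = unvisited.pop()
        component, frontier = {seed}, [seed]
        while frontier:
            node = frontier.pop()
            neighbours = {j for j in unvisited if adjacency[node, j]}
            unvisited -= neighbours
            component |= neighbours
            frontier.extend(neighbours)
        clusters.append(sorted(component))
    return clusters

# Toy usage: two groups of drones far apart form two unconnected clusters
pts = np.array([[0, 0], [10, 5], [20, 0], [500, 500], [510, 495]])
print(split_into_unconnected_clusters(pts, comm_range=50.0))
```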

    Towards Optimized K Means Clustering using Nature-inspired Algorithms for Software Bug Prediction

    In today's software development environment, the need to deliver quality software products has undoubtedly remained the greatest challenge. As a result, early software bug prediction during the development phase is critical for lowering maintenance costs and improving overall software performance. Clustering is a well-known unsupervised method for data classification and for discovering related patterns hidden in a dataset.
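
    A minimal sketch of the underlying idea: clustering software modules by code metrics with plain K-means (scikit-learn's implementation). The metric names, the two-cluster split, and the "higher mean complexity means bug-prone" heuristic are illustrative assumptions, not details from the paper, which further optimizes K-means with nature-inspired algorithms.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical module metrics: [lines of code, cyclomatic complexity, past changes]
metrics = np.array([
    [120,  4,  2],
    [900, 35, 18],
    [150,  6,  1],
    [1100, 40, 25],
    [200,  8,  3],
])

# Standardize features so no single metric dominates the distance measure
X = StandardScaler().fit_transform(metrics)

# Two clusters as a crude "likely buggy" vs. "likely clean" split
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

# Flag the cluster with the higher mean complexity as bug-prone
bug_prone = int(np.argmax([metrics[labels == c, 1].mean() for c in range(2)]))
print("bug-prone modules:", np.where(labels == bug_prone)[0])
```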

    Modeling and Optimal Design of Machining-Induced Residual Stresses in Aluminium Alloys Using a Fast Hierarchical Multiobjective Optimization Algorithm

    The residual stresses induced during shaping and machining play an important role in determining the integrity and durability of metal components. An important issue in producing safety-critical components is finding the machining parameters that create compressive surface stresses or minimise tensile surface stresses. In this paper, a systematic data-driven fuzzy modelling methodology is proposed, which allows constructing transparent fuzzy models that consider both the accuracy and the interpretability of fuzzy systems. The new method employs a hierarchical optimisation structure to improve the modelling efficiency, in which two learning mechanisms cooperate: NSGA-II is used to improve the model’s structure, while the gradient descent method is used to optimise the numerical parameters. This hybrid approach is then successfully applied to the prediction of machining-induced residual stresses in aerospace aluminium alloys. Based on the developed reliable prediction models, NSGA-II is further applied to the multi-objective optimal design of aluminium alloys in a ‘reverse-engineering’ fashion. It is revealed that the optimal machining regimes that minimise the residual stress and the machining cost simultaneously can be successfully located.
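
    A much-simplified, hedged sketch of the hierarchical idea described above: an outer search chooses a model structure (here just the number of basis terms of a toy polynomial model, standing in for fuzzy rules, and enumerated exhaustively where the paper uses NSGA-II), while an inner gradient-descent loop tunes the numerical parameters. The data and the accuracy/complexity trade-off weight are invented for illustration; this is not the authors' fuzzy system.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = np.sin(2 * x) + 0.05 * rng.normal(size=x.size)     # toy "residual stress" data

def fit_parameters(n_terms, epochs=500, lr=0.05):
    """Inner loop: gradient descent on the coefficients of an n_terms polynomial."""
    phi = np.vstack([x ** k for k in range(n_terms)]).T  # design matrix
    w = np.zeros(n_terms)
    for _ in range(epochs):
        err = phi @ w - y
        w -= lr * phi.T @ err / len(x)                   # gradient step on the MSE
    return w, np.mean((phi @ w - y) ** 2)

# Outer loop: structure search balancing accuracy against model complexity
best = None
for n_terms in range(1, 8):
    _, mse = fit_parameters(n_terms)
    score = mse + 0.002 * n_terms        # crude accuracy/interpretability trade-off
    if best is None or score < best[1]:
        best = (n_terms, score)
print("selected structure (number of terms):", best[0])
```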

    Intelligent Processing in Wireless Communications Using Particle Swarm Based Methods

    There are many optimization needs in the research and design of wireless communication systems. Many of these optimization problems are Nondeterministic Polynomial (NP) hard and have not been solved well. Many other, non-NP-hard optimization problems are combinatorial and do not have satisfying solutions either. This dissertation presents a series of Particle Swarm Optimization (PSO) based search and optimization algorithms that solve open research and design problems in wireless communications; these problems were previously either avoided or only solved approximately. PSO is a bottom-up approach to optimization. It imposes no conditions on the underlying problem. Its simple formulation makes it easy to implement, apply, extend, and hybridize. The algorithm uses simple operators, such as adders and multipliers, to travel through the search space, and the process requires just five simple steps. PSO is also easy to control because it has a limited number of parameters and is less sensitive to them than other swarm intelligence algorithms. It does not depend on initial points and converges very fast. Four types of PSO-based approaches are proposed, targeting four different kinds of problems in wireless communications. First, binary PSO and continuous PSO are used together to find optimal compositions of Gaussian derivative pulses to form several UWB pulses that not only comply with the FCC spectrum mask but also best exploit the available spectrum and power. Second, three different PSO-based algorithms are developed to solve the NLOS/LOS channel differentiation, NLOS range error mitigation, and multilateration problems, respectively. Third, a PSO-based search method is proposed to find optimal orthogonal code sets that reduce inter-carrier interference effects in a frequency-redundant OFDM system. Fourth, a PSO-based phase optimization technique is proposed to reduce the PAPR of a frequency-redundant OFDM system. The PSO-based approaches are compared with other canonical solutions to these communication problems and show superior performance in many respects, as confirmed by the analysis and simulation results provided. Open questions and future work for the dissertation are proposed to serve as a guide for future research efforts.
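
    A minimal sketch of the canonical PSO velocity and position updates alluded to above, applied to a toy objective. The inertia and acceleration coefficients are common textbook defaults, and none of the dissertation's specific problem formulations (UWB pulse design, NLOS mitigation, OFDM code search, PAPR reduction) are reproduced here.

```python
import numpy as np

def pso(objective, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical particle swarm optimization (minimization)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))          # positions
    v = np.zeros((n_particles, dim))                    # velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(objective, 1, x)    # personal best values
    gbest = pbest[np.argmin(pbest_val)].copy()          # global best position

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v                                       # position update
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Toy usage: minimize the sphere function
print(pso(lambda z: float(np.sum(z ** 2))))
```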

    Application of nature-inspired optimization algorithms to improve the production efficiency of small and medium-sized bakeries

    Increasing production efficiency through schedule optimization is one of the most influential topics in operations research that contribute to the decision-making process. It is the concept of allocating tasks among available resources within the constraints of a manufacturing facility in order to minimize costs. It is carried out with a model that mimics the real-world task distribution through variables and relevant constraints in order to complete a planned production. In addition to a model, an optimizer is required to assist in evaluating and improving the task allocation procedure in order to maximize overall production efficiency. The entire procedure is usually carried out on a computer, where these two distinct segments combine to form a solution framework for production planning and support decision-making in various manufacturing industries. Small and medium-sized bakeries lack access to cutting-edge tools, and most of their production schedules are based on personal experience. This makes a significant difference in production costs when compared to large bakeries, as evidenced by their market dominance. In this study, a hybrid no-wait flow shop model is proposed to produce a production schedule based on actual data, featuring the constraints of the production environment in small and medium-sized bakeries. Several single-objective and multi-objective nature-inspired optimization algorithms were implemented to find efficient production schedules. While makespan is the most widely used quality criterion of production efficiency because it dominates production costs, high oven idle time in bakeries also wastes energy. Combining these quality criteria allows for additional cost reduction through energy savings as well as shorter production time. Therefore, to obtain an efficient production plan, makespan and oven idle time were included in the objectives of optimization. To find the optimal production plan for an existing production line, particle swarm optimization, simulated annealing, and the Nawaz-Enscore-Ham algorithm were used. The weighting-factor method was used to combine the two objectives into a single objective. The classical optimization algorithms were found to be good enough at finding optimal schedules in a reasonable amount of time, reducing makespan by 29 % and oven idle time by 8 % for one of the analyzed production datasets. Nonetheless, the algorithms' convergence was found to be poor, with a low probability of obtaining the best or nearly the best result. In contrast, a modified particle swarm optimization (MPSO) proposed in this study demonstrated significant improvement in convergence, with a higher probability of obtaining better results. To obtain trade-offs between the two objectives, state-of-the-art multi-objective optimization algorithms were implemented: the non-dominated sorting genetic algorithm (NSGA-II), the strength Pareto evolutionary algorithm, generalized differential evolution, improved multi-objective particle swarm optimization (OMOPSO), and speed-constrained multi-objective particle swarm optimization (SMPSO). The optimization algorithms provided efficient production planning with up to a 12 % reduction in makespan and a 26 % reduction in oven idle time based on data from different production days. The performance comparison revealed a significant difference between these multi-objective optimization algorithms, with NSGA-II performing best and OMOPSO and SMPSO performing worst.
Proofing is a key processing stage that contributes to the quality of the final product by developing flavor and a fluffy texture in bread. However, the duration of proofing is uncertain due to the complex interaction of multiple parameters: yeast condition, temperature in the proofing chamber, and chemical composition of the flour. Due to this uncertainty, a production plan optimized for the shortest makespan can be significantly inefficient. The computational results show that the schedules with the shortest and nearly shortest makespan suffer a significant (up to 18 %) increase in makespan when the proofing time deviates from its expected duration. In this thesis, a method for developing resilient production planning that takes uncertain proofing time into account is proposed, so that even if the deviation in proofing time is extreme, the fluctuation in makespan is minimal. The experimental results with a production dataset revealed a proactive production plan whose makespan is only 5 minutes longer than the shortest makespan but fluctuates by only 21 minutes when the proofing time varies from -10 % to +10 % of its actual value. This study proposed a common framework for small and medium-sized bakeries to improve their production efficiency in three steps: collecting production data, simulating production planning with the hybrid no-wait flow shop model, and running the optimization algorithm. The study suggests using MPSO for single-objective optimization problems and NSGA-II for multi-objective optimization problems. Based on real bakery production data, the results revealed that existing plans were significantly inefficient and could be optimized in a reasonable computational time using a robust optimization algorithm. Implementing such a framework in small and medium-sized bakery manufacturing operations could help to achieve an efficient and resilient production system.
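
    As a hedged illustration of the weighted-sum combination of makespan and oven idle time described above, the sketch below evaluates a job permutation on a plain permutation flow shop. This is a simplification: the thesis uses a hybrid no-wait flow shop model, and the processing times, stage names, and weights here are invented for illustration, with the oven taken to be the last stage.

```python
import numpy as np

# Hypothetical processing times (minutes): rows = products, columns = stages
# (mixing, proofing, oven); not taken from the thesis data
proc = np.array([
    [15, 40, 25],
    [10, 35, 30],
    [20, 45, 20],
    [12, 30, 35],
])

def evaluate(permutation, w_makespan=0.7, w_idle=0.3):
    """Weighted-sum fitness of a job order: makespan plus oven idle time."""
    p = proc[list(permutation)]
    n_jobs, n_machines = p.shape
    C = np.zeros((n_jobs, n_machines))                  # completion times
    for i in range(n_jobs):
        for j in range(n_machines):
            prev_job = C[i - 1, j] if i > 0 else 0.0
            prev_machine = C[i, j - 1] if j > 0 else 0.0
            C[i, j] = max(prev_job, prev_machine) + p[i, j]
    makespan = C[-1, -1]
    oven = n_machines - 1
    oven_busy = p[:, oven].sum()
    oven_span = C[-1, oven] - (C[0, oven] - p[0, oven])  # first start to last finish
    oven_idle = oven_span - oven_busy
    return w_makespan * makespan + w_idle * oven_idle, makespan, oven_idle

# Toy usage: compare two candidate production orders
print(evaluate([0, 1, 2, 3]))
print(evaluate([3, 1, 0, 2]))
```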

    Photonic Structures Optimization Using Highly Data-Efficient Deep Learning: Application To Nanofin And Annular Groove Phase Masks

    Metasurfaces offer a flexible framework for the manipulation of light properties in the realm of thin-film optics. Specifically, the polarization of light can be effectively controlled through the use of thin phase plates. This study introduces a surrogate optimization framework for these devices. The framework is applied to develop two kinds of vortex phase masks (VPMs) tailored for application in astronomical high-contrast imaging. Computational intelligence techniques are exploited to optimize the geometric features of these devices. The large design space and computational limitations necessitate the use of surrogate models such as partial least squares Kriging, radial basis functions, or neural networks. However, we demonstrate the inadequacy of these methods in modeling the performance of VPMs. To address their shortcomings, a data-efficient evolutionary optimization setup using a deep neural network as a highly accurate and efficient surrogate model is proposed. The optimization process employs a robust particle swarm evolutionary optimization scheme, which operates on explicit geometric parameters of the photonic device. Through this approach, optimal designs are developed for two design candidates. In the most complex case, evolutionary optimization enables the optimization of a design that would otherwise be impractical, as it would require too many simulations. In both cases, the surrogate model improves the reliability and efficiency of the procedure, effectively reducing the required number of simulations by up to 75% compared to conventional optimization techniques.
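
    A minimal sketch of the surrogate-assisted loop described above. Scikit-learn's MLPRegressor stands in for the paper's deep-network surrogate, a cheap placeholder `expensive_simulation` stands in for the electromagnetic solver, and random search stands in for the paper's particle swarm step to keep the sketch short; all of these are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_simulation(x):
    """Placeholder for the costly solver (toy 2-D objective, minimization)."""
    return np.sin(3 * x[0]) * np.cos(2 * x[1]) + 0.1 * np.sum(x ** 2)

# 1. Evaluate a small initial design set with the "expensive" simulator
X = rng.uniform(-2, 2, (40, 2))
y = np.array([expensive_simulation(x) for x in X])

for it in range(5):
    # 2. Fit the surrogate on all expensive evaluations gathered so far
    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                             random_state=0).fit(X, y)

    # 3. Cheap inner search on the surrogate (random search instead of PSO)
    candidates = rng.uniform(-2, 2, (5000, 2))
    best = candidates[np.argmin(surrogate.predict(candidates))]

    # 4. Verify the promising design with one real simulation and add it to the data
    X = np.vstack([X, best])
    y = np.append(y, expensive_simulation(best))

print("best verified design:", X[np.argmin(y)], "objective:", y.min())
```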

    Design and Analysis of Enhanced LEACH based Energy Routing Protocol for Wireless Sensor Network

    In recent times, wireless sensor networks (WSNs) have attracted a lot of attention because of their extensive use in a variety of fields, such as industrial automation, healthcare, and environmental monitoring. Energy efficiency is a major problem for WSNs, since sensor nodes frequently run on batteries and have limited energy available. Effective routing techniques are essential for extending the life of the network and guaranteeing dependable data transfer. This work focuses on the performance analysis and numerical modeling of a new routing strategy that incorporates machine learning approaches to improve WSN energy efficiency. The suggested routing algorithm optimizes energy consumption and overall network performance by adjusting its decisions in real time in response to environmental and network conditions. We assess the performance of this machine learning-based routing protocol using large-scale numerical simulations, contrasting it with conventional routing protocols and highlighting its possible advantages in terms of energy efficiency and dependable data delivery. Our simulations cover a variety of scenarios, taking into account different network topologies, traffic patterns, and environmental factors. We evaluate several metrics, including energy consumption, network lifetime, packet delivery ratio, and end-to-end delay, in order to offer a thorough assessment of the efficacy of the machine learning-based routing protocol. The outcomes show how energy-efficient the protocol is, guaranteeing long-lasting sensor nodes and reliable data transfer while adapting to changing network conditions. The results of this study highlight how machine learning approaches can fundamentally change how routing protocols are designed and optimized in energy-constrained wireless sensor networks. This research helps to construct sustainable and dependable WSNs by enhancing energy efficiency and network performance, which makes it easier to deploy sensor networks in critical applications.
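
    For context on the LEACH baseline referenced in the title, the sketch below shows the classic probabilistic cluster-head election threshold used by LEACH. It is the standard textbook formulation, not the enhanced, machine-learning-based protocol analyzed in this work, and the node bookkeeping and parameter values are illustrative assumptions.

```python
import random

def leach_elect_cluster_heads(nodes, round_no, p=0.05, seed=None):
    """Classic LEACH cluster-head election for one round.

    nodes    : dict mapping node id -> round in which it last served as cluster
               head (None if it has not served in the current epoch)
    round_no : current round number
    p        : desired fraction of cluster heads per round
    """
    rng = random.Random(seed)
    epoch = int(1 / p)
    # T(n) = p / (1 - p * (r mod 1/p)) for nodes still eligible in this epoch
    threshold = p / (1 - p * (round_no % epoch))
    heads = []
    for node_id, last_served in nodes.items():
        # Nodes that already served as cluster head in this epoch are not eligible
        if last_served is not None and round_no - last_served < epoch:
            continue
        if rng.random() < threshold:
            heads.append(node_id)
            nodes[node_id] = round_no
    return heads

# Toy usage: 100 nodes, none has served yet; roughly p * 100 heads are expected
network = {i: None for i in range(100)}
print("round 0 cluster heads:", leach_elect_cluster_heads(network, 0, p=0.05, seed=1))
```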

    The political economy of financial repression in transition economies

    Financial systems in developing countries tend to be "restricted" or "repressed" through burdensome reserve requirements, interest-rate ceilings, foreign-exchange regulations, rules about the composition of bank balance sheets, or heavy taxation of the financial sector. Why are governments drawn to regulate financial markets to the point of financial repression? To address this question, the authors explore preliminary evidence from the post-Communist economies of Eastern Europe and the former Soviet Union, where financial regulations have rarely been examined systematically. They find that the public-finance framework has limited ability to explain financial repression in the transition economies, given the peculiar financial lineage of the socialist state. The weak distinction between "public" and "private" spheres of finance in transition economies means that the deficit often conveys little information about the governments' real fiscal activities. It is more fruitful to examine how political institutions, by shaping the incentives politicians face, affect financial policy. Their findings suggest that post-Communist governments may adopt repressive financial controls - not to finance deficits more cheaply than would be the case under financial liberalization, but to maintain the authority and ensure the survival of those in power. In countries where pre-reform elites are plentiful in legislative bodies, where interparty competition is low, and where government parties are well represented in parliaments, elites have been able to perpetuate a system of implicit subsidies by "softening up" the financial sector - especially commercial banks - to ensure the continued flow of cheap credit to specific borrowers. The main beneficiaries of these policies - large formerly state-owned industries with tight financial links to the largest commercial banks - are thus able to convert their well-established claims on public resources into preferential access to credit lines. In other words, financial repression in transition economies may simply serve to solidify main-bank, main-firm relations. These results would lend support to the claim of smaller, cash-starved Eastern European entrepreneurs that the commercial banks have "taken over the role of the old planning ministries."

    K-means online-learning routing protocol (K-MORP) for unmanned aerial vehicles (UAV) adhoc networks

    Unmanned Aerial Vehicles (UAVs) have become a hot topic due to their flexible architecture, adopted in many wireless technologies. In UAV ad hoc networks, traditional routing protocols with a fixed topology are ineffective due to dynamic mobility and unstable paths. The mobility patterns of UAVs therefore challenge efficient and reliable routing in UAV networks. Traditional routing algorithms are often based on assumptions of static nodes and predetermined network topologies, which are not suitable for the dynamic and unpredictable nature of UAV mobility. To address this problem, this paper introduces a K-means online-learning routing protocol (K-MORP) scheme employing a Markov mobility model for UAV ad hoc networks. Initially, the proposed method utilizes a 3D Gauss-Markov mobility model to accurately estimate UAV positions, while K-means online learning is adopted for dynamic clustering and load balancing. Designed for real-time data processing, K-MORP is well suited for UAV ad hoc networks, quickly adapting to changes in the network environment such as UAV mobility, interference, and signal degradation to ensure efficient data transmission and communication. This is achieved while reducing the overall communication overhead and increasing the packet delivery ratio (PDR). In the routing phase, the proposed scheme employs inter-cluster forwarding nodes to transmit messages among different clusters. Extensive simulations demonstrate the performance of the proposed K-MORP, showing a 38% better PDR compared to OLSR and over 50% less end-to-end (E2E) delay compared to typical K-means. Furthermore, the proposed K-MORP exhibited an average throughput of 955 kbps, showing a substantial improvement in network performance. The results underscore that the proposed K-MORP outperforms existing techniques in terms of PDR, E2E delay, and throughput.
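
    A hedged sketch of the online (sequential) K-means update that underlies the clustering step described above: each newly observed UAV position nudges its nearest centroid with a decreasing learning rate. It does not reproduce the full K-MORP protocol, its Gauss-Markov mobility model, or the inter-cluster forwarding logic; the position stream and parameters are invented for illustration.

```python
import numpy as np

class OnlineKMeans:
    """Sequential K-means: update cluster centroids one observation at a time."""

    def __init__(self, k, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.centroids = rng.uniform(0, 1000, (k, dim))   # initial centroid guesses
        self.counts = np.zeros(k, dtype=int)

    def update(self, position):
        """Assign one UAV position to the nearest centroid and move that centroid."""
        position = np.asarray(position, dtype=float)
        nearest = int(np.argmin(np.linalg.norm(self.centroids - position, axis=1)))
        self.counts[nearest] += 1
        eta = 1.0 / self.counts[nearest]                  # decreasing learning rate
        self.centroids[nearest] += eta * (position - self.centroids[nearest])
        return nearest                                    # cluster id for routing

# Toy usage: a stream of noisy UAV positions around two waypoints
rng = np.random.default_rng(1)
clusterer = OnlineKMeans(k=2, dim=3)
for _ in range(200):
    base = np.array([100, 100, 50]) if rng.random() < 0.5 else np.array([800, 900, 60])
    clusterer.update(base + rng.normal(0, 10, 3))
print(clusterer.centroids.round(1))
```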
