
    Intelligent Autonomous Decision-Making and Cooperative Control Technology of High-Speed Vehicle Swarms

    This book is a reprint of the Special Issue "Intelligent Autonomous Decision-Making and Cooperative Control Technology of High-Speed Vehicle Swarms", which was published in Applied Sciences

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning, among others. This is partly because the classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, by contrast, guides the course of low-level heuristics to search beyond the local optimality that limits traditional computational methods. This topic series has collected quality papers proposing cutting-edge methodologies and innovative applications that drive the advances of AMC
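
    As a rough illustration of the guidance idea above, the sketch below uses simulated annealing, one classic metaheuristic, to steer a low-level swap move past local optima on an invented single-machine scheduling instance; it is a minimal example under stated assumptions, not a method from any paper in the series.

```python
import math
import random

# Toy single-machine scheduling instance (processing time, due date, weight).
# All data below are invented purely for illustration.
JOBS = [(4, 9, 2), (3, 6, 1), (7, 12, 3), (2, 4, 2), (5, 15, 1)]

def weighted_tardiness(order):
    """Total weighted tardiness of a job sequence (lower is better)."""
    t, cost = 0, 0
    for j in order:
        p, d, w = JOBS[j]
        t += p
        cost += w * max(0, t - d)
    return cost

def simulated_annealing(iters=5000, t0=10.0, cooling=0.999):
    """Generic metaheuristic loop: occasionally accept worsening swaps to escape local optima."""
    current = list(range(len(JOBS)))
    random.shuffle(current)
    best, temp = list(current), t0
    for _ in range(iters):
        i, k = random.sample(range(len(JOBS)), 2)
        candidate = list(current)
        candidate[i], candidate[k] = candidate[k], candidate[i]  # low-level swap move
        delta = weighted_tardiness(candidate) - weighted_tardiness(current)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if weighted_tardiness(current) < weighted_tardiness(best):
                best = list(current)
        temp *= cooling
    return best, weighted_tardiness(best)

if __name__ == "__main__":
    order, cost = simulated_annealing()
    print("best order:", order, "weighted tardiness:", cost)
```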

    Planning and Scheduling Optimization

    Although planning and scheduling optimization has been explored in the literature for many years, it remains a hot topic in current scientific research. Changing market trends, globalization, technical and technological progress, and sustainability considerations make it necessary to deal with new optimization challenges in modern manufacturing, engineering, and healthcare systems. This book provides an overview of recent advances in different areas connected with operations research models and other applications of intelligent computing techniques used for planning and scheduling optimization. The wide range of theoretical and practical research findings reported in this book confirms that the planning and scheduling problem is a complex issue that is present in different industrial sectors and organizations and opens promising and dynamic perspectives for research and development

    Co-evolutionary Hybrid Bi-level Optimization

    Multi-level optimization stems from the need to tackle complex problems involving multiple decision makers. Two-level optimization, referred to as "bi-level optimization", occurs when two decision makers each control only part of the decision variables but impact each other (e.g., objective value, feasibility). Bi-level problems are sequential by nature and can be represented as nested optimization problems in which one problem (the "upper level") is constrained by another one (the "lower level"). The nested structure is a real obstacle that can be highly time-consuming when the lower level is NP-hard. Consequently, classical nested optimization should be avoided. Some surrogate-based approaches have been proposed to approximate the lower-level objective value function (or variables) in order to reduce the number of times the lower level is globally optimized. Unfortunately, such a methodology is not applicable to large-scale and combinatorial bi-level problems. After a deep study of theoretical properties and a survey of existing applications that are bi-level by nature, problems that can benefit from a bi-level reformulation are investigated. A first contribution of this work has been to propose a novel bi-level clustering approach. Extending the well-known "uncapacitated k-median problem", it has been shown that clustering can be easily modeled as a two-level optimization problem using decomposition techniques. The resulting two-level problem is then turned into a bi-level problem offering the possibility to combine distance metrics in a hierarchical manner. The novel bi-level clustering problem has a very interesting property that enables us to tackle it with classical nested approaches: its lower-level problem can be solved in polynomial time. In cooperation with the Luxembourg Centre for Systems Biomedicine (LCSB), this new clustering model has been applied to real datasets such as disease maps (e.g., Parkinson's, Alzheimer's). Using a novel hybrid and parallel genetic algorithm as the optimization approach, the results obtained after a campaign of experiments produce new knowledge compared to classical clustering techniques that combine distance metrics in the usual manner. The previous bi-level clustering model has the advantage that its lower level can be solved in polynomial time although the global problem is by definition NP-hard. Therefore, further investigations have been undertaken to tackle more general bi-level problems in which the lower-level problem does not present any such advantageous property. Since the lower-level problem can be very expensive to solve, the focus has been turned to surrogate-based approaches and hyper-parameter optimization techniques with the aim of approximating the lower-level problem and reducing the number of global lower-level optimizations. By adapting the well-known Bayesian optimization algorithm to solve general bi-level problems, the number of expensive lower-level optimizations has been dramatically reduced while very accurate solutions are obtained. The resulting solutions and the number of spared lower-level optimizations have been compared to the results of the bi-level evolutionary algorithm based on quadratic approximations (BLEAQ) after a campaign of experiments on official bi-level benchmarks. Although both approaches are very accurate, the bi-level Bayesian version required fewer lower-level objective function calls.
    Surrogate-based approaches are restricted to small-scale and continuous bi-level problems, although many real applications are combinatorial by nature. As for continuous problems, a study has been performed to apply machine learning strategies. Instead of approximating the lower-level solution value, new approximation algorithms for the discrete/combinatorial case have been designed. Using the principle employed in GP hyper-heuristics, heuristics are trained to tackle the NP-hard lower level of bi-level problems efficiently. This automatic generation of heuristics makes it possible to break the nested structure into two separate phases: training lower-level heuristics and solving the upper-level problem with the new heuristics. On this occasion, a second modeling contribution has been introduced through a novel large-scale and mixed-integer bi-level problem dealing with pricing in the cloud, i.e., the Bi-level Cloud Pricing Optimization Problem (BCPOP). After a series of experiments that consisted of training heuristics on various lower-level instances of the BCPOP and using them to tackle the bi-level problem itself, the obtained results are compared to those of the "cooperative coevolutionary algorithm for bi-level optimization" (COBRA). Although training heuristics makes it possible to break the nested structure, a two-phase optimization is still required. Therefore, the emphasis has been put on training heuristics while optimizing the upper-level problem using competitive co-evolution. Instead of adopting the classical decomposition scheme used by COBRA, which suffers from the strong epistatic links between lower-level and upper-level variables, co-evolving the solution and the means to obtain it can cope with these epistatic-link issues. The "CARBON" algorithm developed in this thesis is a competitive and hybrid co-evolutionary algorithm designed for this purpose. In order to validate the potential of CARBON, numerical experiments have been designed and results have been compared to state-of-the-art algorithms. These results demonstrate that "CARBON" makes it possible to address nested optimization efficiently
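
    To make the nested structure concrete, here is a minimal sketch of a classical nested solve on an invented pricing-style toy: every upper-level (leader) decision triggers a full lower-level (follower) optimization, which is exactly the expense the surrogate and heuristic-training approaches above try to avoid. The instance and all numbers are hypothetical; this is not the BCPOP model.

```python
# A toy nested (classical) bi-level solve: for each upper-level decision, the
# lower level is optimized to optimality and the upper objective is evaluated
# at the follower's best response. Data are invented for illustration only.

def lower_level_best_response(price, demands=(1, 2, 3, 4, 5), budget=10.0):
    """Follower picks the demand level minimizing its own cost given the leader's price."""
    feasible = [q for q in demands if price * q <= budget]
    if not feasible:
        return 0
    # Follower trades off purchase cost against a made-up utility of 3 per unit.
    return min(feasible, key=lambda q: price * q - 3.0 * q)

def solve_bilevel(prices=(0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5)):
    """Leader maximizes revenue, anticipating the follower's optimal reaction."""
    best = None
    for p in prices:                       # upper-level search
        q = lower_level_best_response(p)   # nested lower-level optimization
        revenue = p * q
        if best is None or revenue > best[2]:
            best = (p, q, revenue)
    return best

if __name__ == "__main__":
    price, quantity, revenue = solve_bilevel()
    print(f"leader price={price}, follower response={quantity}, revenue={revenue}")
```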

    Optimization Methods Applied to Power Systems II

    Electrical power systems are complex networks comprising the electrical components that deliver the electricity generated in conventional and renewable power plants, through distribution systems, to final consumers (businesses and homes). In practice, power system management requires solving a variety of design, operation, and control problems. Bearing in mind that computers are used to solve these complex optimization problems, this book includes recent contributions to this field covering a large variety of problems. More specifically, the book includes contributions on topics such as controllers for the frequency response of microgrids, post-contingency overflow analysis, line overloads after line and generation contingencies, power quality disturbances, earthing system touch voltages, security-constrained optimal power flow, voltage regulation planning, intermittent generation in power systems, location of partial discharge sources in gas-insulated switchgear, electric vehicle charging stations, optimal power flow with photovoltaic generation, hydroelectric plant location selection, cold-thermal-electric integrated energy systems, high-efficiency resonant devices for microwave power generation, security-constrained unit commitment, and economic dispatch problems
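
    As a small, self-contained example of one of the listed problems, the sketch below solves a toy economic dispatch with quadratic cost curves using the classic equal-incremental-cost (lambda iteration) rule; the generator data are invented and the sketch is not taken from the book.

```python
# Minimal economic dispatch sketch: quadratic generator cost curves
# C_i(P) = a_i + b_i*P + c_i*P^2, dispatched so that marginal costs are equal.
# All generator data below are invented.

GENS = [  # (b_i, c_i, P_min, P_max); the constant a_i does not affect the dispatch
    (20.0, 0.050, 10.0, 100.0),
    (25.0, 0.010, 10.0, 150.0),
    (30.0, 0.020, 10.0, 80.0),
]

def output_at_lambda(lmbda):
    """Each unit's output where marginal cost b + 2cP equals lambda, clipped to its limits."""
    outputs, total = [], 0.0
    for b, c, pmin, pmax in GENS:
        p = min(max((lmbda - b) / (2.0 * c), pmin), pmax)
        outputs.append(p)
        total += p
    return outputs, total

def economic_dispatch(demand, tol=1e-6):
    """Bisect on lambda until total generation matches the demand."""
    lo, hi = 0.0, 200.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        _, total = output_at_lambda(mid)
        if total < demand:
            lo = mid
        else:
            hi = mid
    return output_at_lambda(0.5 * (lo + hi))[0]

if __name__ == "__main__":
    print(economic_dispatch(demand=250.0))  # MW per unit
```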

    Modelling and Optimizing Supply Chain Integrated Production Scheduling Problems

    Globalization and advanced information technologies (e.g., the Internet of Things) have considerably impacted supply chains (SCs) by persistently forcing original equipment manufacturers (OEMs) to switch production strategies from make-to-stock (MTS) to make-to-order (MTO) to survive in competition. Generally, an OEM follows the MTS strategy for products with steady demand. In contrast, the MTO strategy exists under a pull system with irregular demand, in which the received customer orders are scheduled and launched into production. In comparison to MTS, MTO faces the primary challenges of ensuring timely delivery at the lowest possible cost, satisfying the demands of high customization, and guaranteeing the accessibility of raw materials throughout the production process. These challenges are increasing substantially as industrial production becomes more flexible, diversified, and customized. Besides, making production scheduling decisions independently of the other stages of these SCs often yields sub-optimal results, creating substantial challenges to fulfilling demands in a timely and cost-effective manner. Since adequately managing these challenges asynchronously is difficult, constructing optimization models that integrate SC decisions, such as customer requirements, the supply portfolio (supplier selection and order allocation), delivery batching decisions, and the inventory portfolio (inventory replenishment, consumption, and availability), with shop floor scheduling under deterministic and dynamic environments is essential to fulfilling customer expectations at the least possible cost. These optimization models are computationally intractable. Consequently, designing algorithms that schedule or reschedule promptly is also highly challenging for these time-sensitive, operationally integrated optimization models. Thus, this thesis focuses on modelling and optimizing SC-integrated production scheduling problems, named SC scheduling problems (SCSPs). The objective of optimizing job shop scheduling problems (JSSPs) is to ensure that the requisite resources are accessible when required and that their utilization is maximally efficient. Although numerous algorithms have been devised, they can sometimes become computationally exorbitant and yield sub-optimal outcomes, rendering production systems inefficient. This can be due to a variety of causes, such as an imbalance in population quality over generations, recurrent generation and evaluation of identical schedules, and permitting an under-performing method to conduct the evolutionary process. Consequently, this study designs two methods, a sequential approach (Chapter 2) and a multi-method approach (Chapter 3), to address the aforementioned issues and to obtain competitive results in finding optimal or near-optimal solutions for JSSPs in a single-objective setting. The devised algorithms for JSSPs optimize workflows for each job by accurately mapping between related resources, generating better results than existing algorithms. Production scheduling cannot be accomplished precisely without considering supply and delivery decisions and customer requirements simultaneously. Thus, a few recent studies have operationally integrated SCs to accurately predict process insights for executing, monitoring, and controlling the planned production. However, these studies are limited to simple shop-floor configurations and offer little flexibility to address the MTO-based SC challenges.
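
    For readers unfamiliar with JSSPs, the following minimal sketch shows the kind of schedule evaluation that such algorithms repeat many times: a job-repetition sequence is decoded into start times and its makespan is computed. The 3-job, 3-machine instance is invented, and the sketch is an illustration rather than the approach of Chapters 2 and 3.

```python
# Minimal job shop sketch: decode a job-repetition chromosome into a schedule
# and compute its makespan. The 3-job x 3-machine instance below is invented.

JOBS = [  # each job: list of (machine, processing_time) in technological order
    [(0, 3), (1, 2), (2, 2)],
    [(0, 2), (2, 1), (1, 4)],
    [(1, 4), (2, 3), (0, 1)],
]

def makespan(chromosome):
    """Chromosome lists job ids; the k-th occurrence of job j schedules its k-th operation."""
    next_op = [0] * len(JOBS)       # next operation index per job
    job_ready = [0] * len(JOBS)     # completion time of each job's last operation
    machine_ready = [0] * 3         # completion time of each machine's last operation
    for j in chromosome:
        machine, duration = JOBS[j][next_op[j]]
        start = max(job_ready[j], machine_ready[machine])
        job_ready[j] = machine_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)

if __name__ == "__main__":
    # Each job id appears once per operation; a metaheuristic would search over such sequences.
    print(makespan([0, 1, 2, 0, 2, 1, 1, 2, 0]))
```
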
    Thus, this study formulates a bi-objective optimization model that integrates the supply portfolio into a flexible job shop scheduling environment with a customer-imposed delivery window to cost-effectively meet customized and on-time delivery requirements (Chapter 4). Compared to the job shop, which is limited to sequence flexibility only, the flexible job shop is deemed advantageous due to its capacity to provide increased scheduling flexibility (both process and sequence flexibility). To optimize the model, the performance of the multi-objective particle swarm optimization algorithm has been enhanced, with the results providing decision-makers with an increased degree of flexibility, a larger number of Pareto solutions, more varied and consistent frontiers, and reasonable computation times for MTO-based SCs. Environmental sustainability is in the spotlight owing to increasing environmental awareness and follow-up regulations. Consequently, the related factors strongly regulate the supply portfolio for sustainable development, which has remained unexplored in the SCSP because those criteria are primarily qualitative (e.g., green production, green product design, corporate social responsibility, and waste disposal systems). These absences may lead to an unacceptable supply portfolio. Thus, this study overcomes the problem by integrating VIKORSORT into the proposed solution methodology of the extended SCSP. In addition, forming delivery batches of heterogeneous customer orders is challenging, as one order can lead to another being delayed. Therefore, the previous optimization model is extended by integrating supply, manufacturing, and delivery batching decisions and concurrently optimizing them in response to heterogeneous customer requirements with time window constraints, considering both economic and environmental sustainability for the supply portfolio (Chapter 5). Since the proposed optimization model is an extension of the flexible job shop, it can be classified as a non-deterministic polynomial-time (NP)-hard problem, which cannot be solved by conventional optimization techniques, particularly for larger instances. Therefore, a reinforcement learning-based hyper-heuristic (HH) has been designed, in which four solution-updating heuristics are intelligently guided to deliver the best possible results compared to existing algorithms. The optimization model furnishes a set of comprehensive schedules that integrate the supply portfolio, the production portfolio (work-center/machine assignment and customer order sequencing), and batching decisions. This provides numerous meaningful managerial insights and operational flexibility prior to the execution phase. Recently, SCs have been experiencing unprecedented and massive disruptions caused by an abrupt outbreak, making it difficult for OEMs to recover from a disrupted demand-supply equilibrium. Hence, this study proposes a multi-portfolio (supply, production, and inventory portfolios) approach for a proactive-reactive scheme, which concerns the SCSP with complex multi-level products while simultaneously including unpredictably dynamic supply, demand, and shop floor disruptions (Chapter 6). This study considers fabrication and assembly in a multi-level product structure. To effectively address this time-sensitive model based on real-time data, a Q-learning-based multi-operator differential evolution algorithm within an HH framework has been designed to address disruptive events and generate a timely rescheduling plan.
The numerical results and analyses demonstrate the proposed model's capability to effectively address single and multiple disruptions, thus providing significant managerial insights and ensuring SC resilience
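
    The reinforcement learning-based hyper-heuristic idea mentioned above can be sketched in a few lines: a Q-learning agent chooses among low-level solution-updating heuristics based on whether the previous move improved the incumbent. The operators, state definition, reward, and toy objective below are simplified placeholders, not the thesis design.

```python
import random

# Minimal Q-learning heuristic-selection loop in the spirit of a hyper-heuristic.
# Two toy low-level operators perturb a permutation; the agent learns which one
# to apply depending on whether the previous move improved the solution.

def op_swap(x):
    """Low-level heuristic 1: swap two random positions."""
    y = list(x)
    i, j = random.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def op_segment_shuffle(x):
    """Low-level heuristic 2: shuffle a random 3-element segment."""
    y = list(x)
    i = random.randrange(len(y) - 2)
    seg = y[i:i + 3]
    random.shuffle(seg)
    y[i:i + 3] = seg
    return y

def cost(x):
    """Toy objective: number of adjacent out-of-order pairs (0 when sorted)."""
    return sum(1 for a, b in zip(x, x[1:]) if a > b)

def hyper_heuristic(episodes=2000, alpha=0.1, gamma=0.9, eps=0.2):
    operators = [op_swap, op_segment_shuffle]
    q = [[0.0] * len(operators) for _ in range(2)]   # state 0: last move improved, state 1: it did not
    x, state = random.sample(range(10), 10), 1
    for _ in range(episodes):
        if random.random() < eps:
            a = random.randrange(len(operators))
        else:
            a = max(range(len(operators)), key=lambda k: q[state][k])
        y = operators[a](x)
        reward = cost(x) - cost(y)                   # positive if the operator improved the solution
        next_state = 0 if reward > 0 else 1
        q[state][a] += alpha * (reward + gamma * max(q[next_state]) - q[state][a])
        if reward >= 0:
            x = y
        state = next_state
    return x, cost(x)

if __name__ == "__main__":
    print(hyper_heuristic())
```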

    Hybrid machine learning approaches for scene understanding: From segmentation and recognition to image parsing

    We address the problem of semantic scene understanding through studies on object segmentation/recognition and on scene labeling methods. We propose new techniques for joint recognition, segmentation, and pose estimation of infrared (IR) targets. The problem is formulated in a probabilistic level set framework where a shape-constrained generative model provides a multi-class and multi-view shape prior and where the shape model involves a couplet of view and identity manifolds (CVIM). A level set energy function is then iteratively optimized under the shape constraints provided by the CVIM. Since both the view and identity variables are expressed explicitly in the objective function, this approach naturally accomplishes recognition, segmentation, and pose estimation as joint products of the optimization process. For realistic target chips, we solve the resulting multi-modal optimization problem by adopting a particle swarm optimization (PSO) algorithm and then improve the computational efficiency by implementing a gradient-boosted PSO (GB-PSO). Evaluation was performed using the Military Sensing Information Analysis Center (SENSIAC) ATR database, and experimental results show that both PSO algorithms reduce the cost of shape matching during CVIM-based shape inference. In particular, GB-PSO outperforms other recent ATR algorithms, which require intensive shape matching, either explicitly (with pre-segmentation) or implicitly (without pre-segmentation). On the other hand, in situations where target boundaries are not clearly observed and object shapes are not reliably detected, we explored sparse representation classification (SRC) methods for ATR applications and developed a fusion technique that combines the traditional SRC and a group-constrained SRC algorithm regulated by a sparsity concentration index for improved classification accuracy on the Comanche dataset. Moreover, we present a compact rare-class-oriented scene labeling framework (RCSL) with a global-scene-assisted rare-class retrieval process, where the retrieved subset is expanded by choosing scene-regulated rare-class patches. A complementary rare-class-balanced CNN is learned to alleviate the imbalanced data distribution problem at lower cost. A superpixel-based re-segmentation was implemented to produce more perceptually meaningful object boundaries. Quantitative results demonstrate the promising performance of the proposed framework in both pixel and class accuracy for scene labeling on the SIFTflow dataset, especially for rare class objects
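
    A bare-bones particle swarm optimization loop of the kind used above for shape/pose parameter search is sketched below; the 2-D quadratic objective is only a stand-in for the CVIM-based shape-matching energy, and none of the constants come from the paper.

```python
import random

# Minimal PSO loop. The placeholder "energy" stands in for a shape-matching cost
# evaluated over pose/identity parameters; in practice it would be far more expensive.

def energy(params):
    """Placeholder matching energy: distance to a made-up optimum at (1.0, -2.0)."""
    x, y = params
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

def pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    dim = 2
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in pos]               # each particle's best position so far
    gbest = min(pbest, key=energy)               # swarm's best position so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if energy(pos[i]) < energy(pbest[i]):
                pbest[i] = list(pos[i])
                if energy(pbest[i]) < energy(gbest):
                    gbest = list(pbest[i])
    return gbest, energy(gbest)

if __name__ == "__main__":
    print(pso())
```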