
    Phased mission system design optimisation using genetic algorithms

    A phased mission system is a system whose operation is divided into consecutive, non-overlapping phases. Ensuring the safety of such a system is important, since its failure can have both life-threatening and financial consequences. The focus of this paper is to develop an optimisation method that constructs an optimal design for a phased mission system, with the aim of minimising its unreliability while ensuring optimal usage of the available resources throughout all phases. The optimisation is formulated as a constrained single-objective problem: the failure probability of the overall mission is the objective function, and the constraints enforce the optimal use of resources. The implemented method employs Fault Tree Analysis to represent system performance and Binary Decision Diagrams to quantify each phase's failure probability. A single-objective Genetic Algorithm has been chosen as the optimisation technique. An Unmanned Aerial Vehicle mission has been selected to demonstrate the method's application. The results and the influence of modifications to the optimisation algorithm are discussed.
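
    A minimal sketch of the kind of constrained single-objective GA described above, in Python. The phase-unreliability model, penalty weight, and resource budget below are illustrative assumptions; the paper itself quantifies each phase with Fault Tree Analysis and Binary Decision Diagrams rather than the toy parallel-redundancy model used here.

```python
# Minimal sketch of a constrained single-objective GA for phased-mission design.
# The phase-unreliability model is an illustrative placeholder; the paper
# quantifies each phase with Fault Tree Analysis and Binary Decision Diagrams.
import random

N_PHASES, N_COMP = 3, 4          # phases, component slots per phase (assumed)
P_FAIL = 0.05                    # single-component failure probability (assumed)
BUDGET = 18                      # total resource units available (assumed)

def mission_unreliability(x):
    # x[i][j] = redundancy level of component j in phase i (1..3)
    rel = 1.0
    for phase in x:
        phase_rel = 1.0
        for k in phase:          # parallel redundancy: fails only if all copies fail
            phase_rel *= 1.0 - P_FAIL ** k
        rel *= phase_rel
    return 1.0 - rel

def fitness(x):
    cost = sum(sum(p) for p in x)
    penalty = 10.0 * max(0, cost - BUDGET)   # penalise resource-budget violation
    return mission_unreliability(x) + penalty

def random_design():
    return [[random.randint(1, 3) for _ in range(N_COMP)] for _ in range(N_PHASES)]

def mutate(x):
    y = [row[:] for row in x]
    y[random.randrange(N_PHASES)][random.randrange(N_COMP)] = random.randint(1, 3)
    return y

def crossover(a, b):
    cut = random.randrange(1, N_PHASES)      # one-point crossover on phases
    return a[:cut] + b[cut:]

pop = [random_design() for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness)                    # minimise penalised unreliability
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]
best = min(pop, key=fitness)
print("best design:", best, "unreliability:", round(mission_unreliability(best), 6))
```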

    Genetic algorithms for condition-based maintenance optimization under uncertainty

    This paper proposes and compares different techniques for maintenance optimization based on Genetic Algorithms (GAs) when the parameters of the maintenance model are affected by uncertainty and the fitness values are represented by Cumulative Distribution Functions (CDFs). The main issues addressed are the development of a method to rank the uncertain fitness values and the definition of a novel Pareto dominance concept. The GA-based methods are applied to a practical case study concerning the setting of a condition-based maintenance policy on the degrading nozzles of a gas turbine operated in an energy production plant.
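
    The paper's exact ranking method and Pareto dominance definition are not reproduced here, but a sketch of one common building block for comparing uncertain fitness values, first-order stochastic dominance between empirical CDFs of Monte Carlo cost samples, may help fix ideas (all sample distributions below are assumptions).

```python
# Sketch of ranking uncertain fitness values via empirical CDFs: first-order
# stochastic dominance on Monte Carlo samples. This is one plausible building
# block for such a comparison, not necessarily the paper's method.
import numpy as np

def ecdf(samples, grid):
    """Empirical CDF of `samples` evaluated on `grid`."""
    s = np.sort(samples)
    return np.searchsorted(s, grid, side="right") / len(s)

def dominates(cost_a, cost_b, n_grid=200):
    """True if A's cost CDF lies everywhere at or above B's (A stochastically
    smaller, hence preferable when minimising) and strictly above somewhere."""
    grid = np.linspace(min(cost_a.min(), cost_b.min()),
                       max(cost_a.max(), cost_b.max()), n_grid)
    fa, fb = ecdf(cost_a, grid), ecdf(cost_b, grid)
    return np.all(fa >= fb) and np.any(fa > fb)

rng = np.random.default_rng(0)
a = rng.normal(10.0, 1.0, 1000)   # maintenance cost samples, policy A (assumed)
b = rng.normal(12.0, 1.0, 1000)   # maintenance cost samples, policy B (assumed)
print("A dominates B:", dominates(a, b))   # expected: True
```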

    Role of Evolutionary Algorithms in Construction Projects Scheduling

    As the number of stakeholders and their objectives grows, construction projects are increasingly affected by demands that raise the complexity of scheduling problems, and research in the field of Multi-Objective Optimization (MOO) has grown accordingly. Through their population-based search methodologies, Evolutionary Algorithms have drawn attention for their efficiency in addressing scheduling problems involving two or three objectives. Genetic Algorithms (GA) in particular have been used in most construction optimization problems, owing to their ability to provide near-optimal Pareto solutions in a reasonable amount of time for almost all objectives. However, when optimizing more than three objectives, the efficiency of such algorithms degrades, and trade-offs among conflicting objectives must be made to obtain an optimal Pareto frontier. To address this, this paper provides a comparative analysis of four evolutionary algorithms (Genetic Algorithms, Memetic Algorithms, Particle Swarm Optimization, and Ant Colony Optimization) in the field of construction scheduling optimization; gaps are identified, and recommendations are proposed for future research.
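
    As a point of reference for all four algorithm families, the Pareto-dominance test they share can be sketched as follows (the schedule objective values are made up for illustration).

```python
# Sketch of the Pareto-dominance test underlying the compared algorithm
# families; the example schedules and their objective values are invented.
from typing import List, Tuple

def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
    """a dominates b when a is no worse in every objective and strictly better
    in at least one (all objectives minimised: duration, cost, peak crew)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points: List[Tuple[float, ...]]) -> List[Tuple[float, ...]]:
    return [p for p in points if not any(dominates(q, p) for q in points)]

# candidate schedules scored as (duration_days, cost_k$, peak_crew):
schedules = [(120, 900, 35), (110, 980, 30), (130, 850, 40), (125, 950, 36)]
print(pareto_front(schedules))   # (125, 950, 36) is dominated by (120, 900, 35)
```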

    Maintenance optimization in industry 4.0

    This work reviews maintenance optimization from different and complementary points of view. Specifically, we systematically analyze the knowledge, information and data that can be exploited for maintenance optimization within the Industry 4.0 paradigm. The possible objectives of the optimization are then critically discussed, together with the maintenance features to be optimized, such as maintenance periods and degradation thresholds. The main challenges and trends of maintenance optimization are then highlighted, and the need is identified for methods that do not require a priori selection of a predefined maintenance strategy, are able to deal with large amounts of heterogeneous data collected from different sources, can properly treat all the uncertainties affecting the behavior of the systems and the environment, and can jointly consider multiple optimization objectives, including emerging ones related to sustainability and resilience.
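
    As a generic illustration of optimizing one such maintenance feature, not taken from the review itself, the following sketch minimizes the long-run cost rate of a classic age-replacement policy over the preventive replacement period T, with assumed Weibull failure parameters and costs.

```python
# Textbook-style illustration (not from the review) of optimising a single
# maintenance feature: the preventive replacement period T under an
# age-replacement policy with Weibull-distributed failures (assumed values).
import math

BETA, ETA = 2.5, 1000.0      # Weibull shape/scale, hours (assumed)
C_P, C_F = 1.0, 10.0         # preventive vs corrective replacement cost (assumed)

def reliability(t):
    return math.exp(-((t / ETA) ** BETA))

def cost_rate(T, steps=2000):
    # expected cost per unit time = [c_p*R(T) + c_f*(1-R(T))] / E[min(life, T)],
    # with E[min(life, T)] = integral of R(t) dt on [0, T] (trapezoidal rule)
    dt = T / steps
    expected_cycle = sum(0.5 * (reliability(i * dt) + reliability((i + 1) * dt)) * dt
                         for i in range(steps))
    return (C_P * reliability(T) + C_F * (1.0 - reliability(T))) / expected_cycle

best_T = min((T for T in range(50, 2001, 10)), key=cost_rate)
print(f"optimal replacement period ~{best_T} h, cost rate {cost_rate(best_T):.5f}/h")
```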

    Genetic Algorithm-based Wrapper Approach for Grouping Condition Monitoring Signal of Nuclear Power Plant Components

    Equipment condition monitoring in nuclear power plants requires optimally grouping the usually very large number of signals and developing a separate condition monitoring model for each identified group. In this paper we propose an approach to optimally group the signals. We use a Genetic Algorithm (GA) for the optimization of the groups; the decision variables of the optimization problem relate to the composition of the groups (i.e., which signals they contain), and the objective function (fitness) driving the search for the optimal grouping is constructed from quantitative indicators of the performance of the condition monitoring models themselves: in this sense, the GA search engine is a wrapper around the condition monitoring models. A real case study is considered, concerning the condition monitoring of the Reactor Coolant Pump (RCP) of a Pressurized Water Reactor (PWR). The optimization results are evaluated with respect to the accuracy and robustness of the monitored signal estimates. The condition monitoring models built on the groups found by the proposed approach outperform the model that uses all available signals, while performing similarly to models built on groups based on signal correlation. The latter, however, do not guarantee the robustness of the reconstruction under abnormal conditions and require a priori fixing of group characteristics, such as the desired minimum correlation value within a group.
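
    A sketch of the wrapper idea follows. The chromosome assigns each signal to a group; in the paper the fitness of a grouping is obtained from the performance of the condition monitoring models built on it, whereas here, to keep the sketch self-contained and runnable, the model score is replaced by a cheap stand-in (mean intra-group correlation).

```python
# Sketch of a GA wrapper for signal grouping. The real fitness trains and
# scores one condition-monitoring model per group; the stand-in below (mean
# absolute intra-group correlation) only keeps the sketch runnable.
import random
import numpy as np

rng = np.random.default_rng(1)
N_SIGNALS, N_GROUPS = 12, 3
data = rng.normal(size=(500, N_SIGNALS)).cumsum(axis=0)   # fake monitoring signals

def fitness(chrom):
    score = 0.0
    for g in range(N_GROUPS):
        idx = [i for i, gi in enumerate(chrom) if gi == g]
        if len(idx) < 2:
            return -1e9                        # discourage empty/singleton groups
        c = np.corrcoef(data[:, idx].T)
        score += np.abs(c[np.triu_indices(len(idx), k=1)]).mean()
    return score                               # wrapper would use model accuracy

def mutate(chrom):
    c = chrom[:]
    c[random.randrange(N_SIGNALS)] = random.randrange(N_GROUPS)
    return c

pop = [[random.randrange(N_GROUPS) for _ in range(N_SIGNALS)] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)        # maximise the grouping score
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
print("best grouping:", max(pop, key=fitness))
```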

    FPGA acceleration of sequence analysis tools in bioinformatics

    Thesis (Ph.D.)--Boston University. With advances in biotechnology and computing power, biological data are being produced at an exceptional rate. The purpose of this study is to analyze the application of FPGAs to accelerate high-impact production biosequence analysis tools. Compared with the alternatives, FPGAs offer huge compute power, lower power consumption, and reasonable flexibility. BLAST has become the de facto standard in bioinformatic approximate string matching, so its acceleration is of fundamental importance. It is a complex, highly optimized system consisting of tens of thousands of lines of code and a large number of heuristics. Our idea is to emulate the main phases of its algorithm on the FPGA. Utilizing our FPGA engine, we quickly reduce the database to a small fraction of its size, and then use the original code to process the query. Using a standard FPGA-based system, we achieved a 12x speedup over a highly optimized multithreaded reference code. Multiple Sequence Alignment (MSA), the extension of pairwise sequence alignment to multiple sequences, is critical to solving many biological problems. Previous attempts to accelerate Clustal-W, the most commonly used MSA code, have directly mapped a portion of the code to the FPGA. We use a new approach: we apply prefiltering of the kind commonly used in BLAST to perform the initial all-pairs alignments. This results in a speedup of 80x to 190x over the CPU code (8 cores). The quality is comparable to the original according to a commonly used benchmark suite evaluated with respect to multiple distance metrics. The challenge in FPGA-based acceleration is finding a suitable application mapping; unfortunately, many software heuristics do not map well, so other methods must be applied. One is restructuring: an entirely new algorithm is applied. Another is to analyze application utilization and develop accuracy/performance trade-offs. Using our prefiltering approach and novel FPGA programming models, we have achieved significant speedups over the reference programs, applying approximation, seeding, and filtering to this end. The bulk of this study introduces the pros and cons of these acceleration models for biosequence analysis tools.
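
    The prefiltering idea carries over to software directly: discard database sequences that share no exact k-mer seed with the query, so the expensive aligner only sees the surviving fraction. A toy sketch follows (sequences and k are chosen for illustration; the actual engine streams this logic in FPGA hardware).

```python
# Sketch of BLAST-style seed-and-filter prefiltering: keep only database
# sequences sharing at least one exact k-mer ("seed") with the query, so the
# full aligner processes a small candidate fraction. Software version of the
# logic the thesis implements in hardware; sequences and k are made up.
def kmers(seq, k=11):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def prefilter(query, database, k=11):
    qk = kmers(query, k)
    return [s for s in database if kmers(s, k) & qk]   # keep seed-sharing sequences

db = ["ACGTACGTACGTAGCTAGCTAAGG",
      "TTTTTTTTTTTTTTTTTTTTTTTT",
      "GGGACGTACGTACGTTTACGATCG"]
hits = prefilter("ACGTACGTACGT", db)
print(len(hits), "of", len(db), "sequences pass the prefilter")   # 2 of 3
```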

    Multi-objective System Design Optimization via PPA and a Fuzzy Method

    System design must reconcile various targets and resources, such as reliability, availability, maintainability, cost, weight, volume, and configuration. This paper addresses the multi-objective availability and cost optimization of parallel–series systems by means of the multi-objective strawberry algorithm, also known as the Plant Propagation Algorithm (PPA), and a fuzzy method. This is the first implementation of this optimization algorithm in the literature for this kind of problem to generate the Pareto front. The fuzzy method helps the decision maker select the best compromise solution. A numerical case study involving 10 subsystems demonstrates the applicability of the proposed approach.
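
    The fuzzy selection step can be sketched as follows, assuming the standard linear membership function often used for picking a compromise solution from a Pareto front; the (availability, cost) points are invented for illustration and are not the paper's results.

```python
# Sketch of fuzzy compromise selection over a Pareto front, in the spirit of
# the fuzzy method mentioned above. The linear membership function is a
# standard assumption; the (availability, cost) points are made up.
points = [(0.990, 120.0), (0.995, 180.0), (0.999, 320.0)]  # maximise A, minimise C

def memberships(front):
    avail = [p[0] for p in front]
    cost = [p[1] for p in front]
    mus = []
    for a, c in front:
        mu_a = (a - min(avail)) / (max(avail) - min(avail))   # higher is better
        mu_c = (max(cost) - c) / (max(cost) - min(cost))      # lower is better
        mus.append(min(mu_a, mu_c))       # pessimistic aggregation of satisfaction
    return mus

mus = memberships(points)
best = points[mus.index(max(mus))]
print("best compromise (availability, cost):", best)   # middle point wins here
```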

    Reliability optimization of hardware components and system's topology during the early design phase

    To master the complexity of modern vehicles, Original Equipment Manufacturers (OEMs) attempt to integrate as many functions as possible into the given Electronic Control Units (ECUs), sensors, and actuators without degrading safety and comfort functionalities. Furthermore, scalability, versatility, and performance of products are key to the success of electronic development in new vehicles. Various functional and non-functional requirements must be fulfilled during the development of such complex systems. Choosing the hardware design structure and determining the hardware characteristics are the initial steps of the early design phase. Conventional methods for the selection of hardware components and topologies are mostly function-driven and largely lack versatility and scalability. Given the innovative and complex trend of mechatronic product development, new approaches to hardware decisions are needed that support designers as customer demands change and grow. One of the most important customer requirements for a complex system is reliability. The need for more reliable system designs drives up the cost of design and influences other system characteristics such as weight, power consumption, and size. Design goals such as reliability and cost potentially impose conflicting requirements on the technical and economic performance of a system design. Hence, visualizing and evaluating conflicting design preferences and choosing an optimal design early are among the most critical issues during the design stage. Many multi-objective optimization approaches have been proposed to tackle this challenge. This dissertation proposes an efficient reliability optimization framework that aids designers in determining the optimal hardware topology, with an optimal set of components, under known technical and financial restrictions. The framework describes the hardware structure of a complex system by a System Reliability Matrix (SRM) and the failure rate vector of the involved hardware components. The reliability characteristics of components and the redundancy policy can be varied automatically via the SRM and its corresponding failure rate vector in order to determine optimal solutions. The proposed methodology ultimately identifies the most efficient system architecture (topology) and ascertains the unknown reliability characteristics of hardware components under consideration of financial and technical constraints. Numerical deterministic search methods and genetic algorithms are applied to optimize the defined objective function under multiple constraints (reliability, cost, weight, size, etc.) and to determine the reliability characteristics of components. A general enumerative algorithm generates all design architectures (topologies) and filters the feasible ones based on the given constraints, such as budget.
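
    A sketch of the SRM idea, under an assumed encoding (each row is a parallel redundancy block placed in series; a 1 marks a participating component), with made-up failure rates:

```python
# Sketch of evaluating a series-parallel topology from a structure matrix and
# a failure-rate vector, loosely mirroring the SRM described above. The matrix
# encoding is an assumption: row = redundancy block in series, entry 1 =
# component participates in that parallel block. Failure rates are invented.
import math

SRM = [
    [1, 1, 0, 0],   # block 1: components 0 and 1 in parallel
    [0, 0, 1, 0],   # block 2: component 2 alone
    [0, 0, 0, 1],   # block 3: component 3 alone
]
lam = [2e-5, 3e-5, 1e-5, 4e-5]     # component failure rates (1/h), assumed

def system_reliability(t):
    r_sys = 1.0
    for row in SRM:
        q_block = 1.0                       # block fails only if all members fail
        for j, used in enumerate(row):
            if used:
                q_block *= 1.0 - math.exp(-lam[j] * t)
        r_sys *= 1.0 - q_block              # blocks are in series
    return r_sys

print(f"R(10,000 h) = {system_reliability(1e4):.6f}")
```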

    Statistical assessment on Non-cooperative Target Recognition using the Neyman-Pearson statistical test

    Electromagnetic simulations of an X-target were performed in order to obtain its Radar Cross Section (RCS) for several positions and frequencies, using CST MWS©. A 1:5 scale model of the proposed aircraft was created in CATIA© V5 R19 and imported directly into the CST MWS© environment. Simulations in the X-band were run with a variable mesh size, owing to the considerable wavelength variation. The aim is to evaluate the performance of the Neyman-Pearson (NP) simple hypothesis test by analyzing its Receiver Operating Characteristics (ROCs) for two different radar detection scenarios for recognition purposes: a Radar Absorbent Material (RAM) coated model and a Perfect Electric Conductor (PEC) model. In parallel, the radar range equation is used to estimate the maximum detection range for the simulated RAM-coated cases, in order to compare their shielding effectiveness (SE) and its consequent impact on recognition. The specifications of the F-16's AN/APG-68(V)9 airborne radar were used to compute these ranges and to simulate a hostile airborne interception in a Non-Cooperative Target Recognition (NCTR) environment. The statistical results showed weak recognition performance using the Neyman-Pearson test. Nevertheless, good RCS reductions were obtained for most of the simulated positions, translating into a 50.9% maximum detection range gain for the PAniCo RAM coating, in agreement with experimental results from the reviewed literature. The best SE was verified for the PAniCo and CFC-Fe RAMs.
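
    The Neyman-Pearson test and its ROC can be sketched for two simple hypotheses as follows. Real RCS statistics are replaced here by two assumed Gaussian returns of equal variance, for which the likelihood-ratio test reduces to thresholding the received amplitude; none of the numbers correspond to the thesis' simulations.

```python
# Sketch of a Neyman-Pearson test and its ROC for two simple hypotheses.
# Gaussian returns with equal variance are assumed, so the likelihood-ratio
# test reduces to an amplitude threshold. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)
h0 = rng.normal(0.0, 1.0, 5000)     # H0: RAM-coated target, low return (assumed)
h1 = rng.normal(1.5, 1.0, 5000)     # H1: PEC target, higher return (assumed)

thresholds = np.linspace(-4, 6, 101)
pfa = [(h0 > t).mean() for t in thresholds]   # false-alarm probability
pd = [(h1 > t).mean() for t in thresholds]    # detection probability (ROC points)

# NP criterion: maximise Pd subject to Pfa <= alpha
alpha = 0.05
feasible = [i for i, p in enumerate(pfa) if p <= alpha]
i_star = max(feasible, key=lambda i: pd[i])
print(f"threshold {thresholds[i_star]:.2f}: "
      f"Pfa={pfa[i_star]:.3f}, Pd={pd[i_star]:.3f}")
```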

    Robust multi-objective optimization of safety barriers performance parameters for NaTech scenarios risk assessment and management

    Safety barriers should be designed to bring the largest benefit, in terms of mitigating the consequences of accidental scenarios, at the most reasonable cost. In this paper, we formulate the identification of the optimal performance parameters of the barriers, which must allow mitigating the consequences of Natural Technological (NaTech) accidental scenarios at reasonable cost, as a Multi-Objective Optimization (MOO) problem. The MOO problem is solved for a case study from the literature, consisting of a chemical facility composed of three tanks filled with flammable substances and equipped with six safety barriers (active, passive and procedural), exposed to NaTech scenarios triggered by either severe floods or earthquakes. The performance of the barriers is evaluated by a phenomenological dynamic model that mimics the realistic response of the system. The uncertainty in the relevant model parameters (i.e., the response times of the active and procedural barriers and the effectiveness of the barriers) is accounted for in the optimization, to provide robust solutions. The results for this case study suggest that the NaTech risk is optimally managed by improving the performance of four of the six barriers (three active and one passive). Practical guidelines are provided for retrofitting the safety barrier design.
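
    The robust ingredient of the optimization can be sketched as follows: each candidate barrier configuration is scored by Monte Carlo sampling of its uncertain parameters, and a pessimistic statistic of the residual consequence is traded off against cost. The consequence model and all numbers below are placeholders, not the paper's phenomenological model.

```python
# Sketch of robust evaluation for barrier configurations: Monte Carlo over
# uncertain barrier effectiveness and response time, scored by a pessimistic
# statistic (mean + k*std of residual consequence). The consequence model and
# all parameter values are made-up placeholders.
import numpy as np

rng = np.random.default_rng(3)

def residual_consequence(effectiveness, response_time):
    # placeholder: later response and lower effectiveness leave more consequence
    return (1.0 - effectiveness) * (1.0 + 0.1 * response_time)

def robust_score(mean_eff, mean_rt, n=2000, k=1.0):
    eff = rng.beta(8, 2, n) * mean_eff            # uncertain effectiveness
    rt = rng.lognormal(np.log(mean_rt), 0.3, n)   # uncertain response time (min)
    c = residual_consequence(eff, rt)
    return c.mean() + k * c.std()                 # pessimistic robust objective

designs = {"2 barriers": (0.70, 8.0, 150.0),   # (effectiveness, resp. time, cost k$)
           "4 barriers": (0.90, 5.0, 320.0),
           "6 barriers": (0.95, 4.0, 610.0)}
for name, (e, t, cost) in designs.items():
    print(f"{name}: robust consequence {robust_score(e, t):.3f}, cost {cost} k$")
```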