
    Optimal uncertainty quantification for legacy data observations of Lipschitz functions

    We consider the problem of providing optimal uncertainty quantification (UQ), and hence rigorous certification, for partially-observed functions. We present a UQ framework within which the observations may be small or large in number, and need not carry information about the probability distribution of the system in operation. The UQ objectives are posed as optimization problems, the solutions of which are optimal bounds on the quantities of interest; we consider two typical settings, namely parameter sensitivities (McDiarmid diameters) and output deviation (or failure) probabilities. The solutions of these optimization problems depend non-trivially (even non-monotonically and discontinuously) upon the specified legacy data. Furthermore, the extreme values are often determined by only a few members of the data set; in our principal physically-motivated example, the bounds are determined by just 2 of 32 data points, and the remainder carry no information and could be neglected without changing the final answer. We propose an analogue of the simplex algorithm from linear programming that uses these observations to offer efficient and rigorous UQ for high-dimensional systems with high-cardinality legacy data. These findings suggest natural methods for selecting optimal (maximally informative) next experiments.
    Comment: 38 page
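    As a rough illustration (not the paper's simplex-style algorithm), the pointwise bounds on a scalar function implied by a known Lipschitz constant `L` and legacy observations `(x_i, y_i)` can be sketched as follows; all names here are assumptions for the sketch:

```python
import numpy as np

def lipschitz_bounds(x, x_obs, y_obs, L):
    """Pointwise bounds at x for any function with Lipschitz constant L
    that interpolates the legacy observations (x_obs[i], y_obs[i])."""
    d = np.abs(x - x_obs)          # distances from x to each observation
    lower = np.max(y_obs - L * d)  # tightest lower envelope over the data
    upper = np.min(y_obs + L * d)  # tightest upper envelope over the data
    return lower, upper

# Two legacy observations; bounds at x = 0.5 with L = 1
lo, hi = lipschitz_bounds(0.5, np.array([0.0, 1.0]), np.array([0.0, 1.0]), 1.0)
# here the envelopes meet: lo = hi = 0.5
```

    Note that each envelope is attained at a single observation, which mirrors the abstract's point that only a few data points determine the extreme values.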

    Diagnosis of dyslexia with low quality data with genetic fuzzy systems

    For diagnosing dyslexia in early childhood, children have to solve non-writing-based graphical tests. Human experts score these tests and decide, on the basis of these marks, whether the children require further consideration. Applying artificial intelligence techniques to automate this scoring is a complex task with multiple sources of uncertainty. On the one hand, there are conflicts, as different experts can assign different scores to the same set of answers. On the other hand, the experts are sometimes not completely confident in their decisions and hesitate between different scores. The problem is aggravated because certain symptoms are compatible with more than one disorder. In case of doubt, the experts assign an interval-valued score to the test and ask for further information about the child before making a diagnosis. Exploiting the information in uncertain datasets has recently been acknowledged as a new challenge in genetic fuzzy systems. In this paper we propose using a genetic cooperative-competitive algorithm for designing a linguistically understandable, rule-based classifier that can tackle this problem. This algorithm is part of a web-based, automated pre-screening application that can be used by parents to detect those symptoms that advise taking a child to a psychologist for an individual examination.

    Genetic algorithms for condition-based maintenance optimization under uncertainty

    This paper proposes and compares different techniques for maintenance optimization based on Genetic Algorithms (GAs), for the case in which the parameters of the maintenance model are affected by uncertainty and the fitness values are represented by Cumulative Distribution Functions (CDFs). The main issues addressed in tackling this problem are the development of a method to rank the uncertain fitness values and the definition of a novel Pareto dominance concept. The GA-based methods are applied to a practical case study concerning the setting of a condition-based maintenance policy for the degrading nozzles of a gas turbine operated in an energy production plant.
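    One simple way to rank CDF-valued fitness values is first-order stochastic dominance over empirical samples; this is a generic sketch under that assumption, not the ranking method the paper actually develops:

```python
import numpy as np

def fosd(a, b):
    """First-order stochastic dominance for minimization: sample set `a`
    dominates `b` if a's empirical CDF lies at or above b's everywhere
    (i.e. a's fitness values tend to be smaller), strictly somewhere."""
    grid = np.union1d(a, b)                                  # common evaluation points
    Fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return bool(np.all(Fa >= Fb) and np.any(Fa > Fb))

a = np.array([1.0, 2.0, 3.0])   # sampled maintenance costs of policy A
b = np.array([2.0, 3.0, 4.0])   # policy B is shifted toward higher cost
# fosd(a, b) -> True; fosd(b, a) -> False
```

    Stochastic dominance is only a partial order, which is why an uncertainty-aware Pareto dominance concept (as in the paper) is needed when many pairs of fitness CDFs are incomparable.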

    Optimization-based decision-making models for disaster recovery and reconstruction planning of transportation networks

    The purpose of this study is to analyze optimization-based decision-making models for the problem of Disaster Recovery Planning of Transportation Networks (DRPTN). In the past three decades, seminal optimization problems have been structured and solved for the critical and sensitive problem of DRPTN. The extent of our knowledge on the practicality of the methods and the performance of their results is, however, limited. To evaluate the applicability of these context-sensitive models in real-world situations, there is a need to examine the conceptual and technical structure behind the existing body of work. To this end, this paper performs a systematic search targeting DRPTN publications. Thereafter, we review the identified literature based on the four phases of the optimization-based decision-making modeling process: problem definition, problem formulation, problem-solving, and model validation. Then, through content analysis and descriptive statistics, we investigate the methodology of studies within each of these phases. Eventually, we identify and discuss four areas for research improvement: (1) developing conceptual or systematic decision support for the selection of decision attributes and problem structuring, (2) integrating recovery problems with traffic management models, (3) avoiding uncertainty arising from the choice of solving algorithm, and (4) reducing subjectivity in the validation process of disaster recovery models. Finally, we provide suggestions as well as possible directions for future research.
    TU Berlin, Open-Access-Mittel - 202

    Comparing Solutions under Uncertainty in Multiobjective Optimization

    For various reasons, the solutions in real-world optimization problems cannot always be evaluated exactly and are sometimes represented by approximated values with confidence intervals. To address this issue, the comparison of solutions has to be done differently than for exactly evaluated solutions. In this paper, we define new relations under uncertainty between solutions in multiobjective optimization that are represented by approximated values and confidence intervals. The new relations extend the Pareto dominance relations, can handle constraints, and can be used to compare solutions both with and without the confidence interval. We also show that by including confidence intervals in the comparisons, the possibility of incorrect comparisons due to inaccurate approximations is reduced. Without considering confidence intervals, the comparison of inaccurately approximated solutions can result in promising solutions being rejected and worse ones preserved. The effect of the new relations on the comparison of solutions in a multiobjective optimization algorithm is also demonstrated.
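    A minimal sketch of one conservative interval-based dominance check for minimization, assuming each objective is given as a `(lower, upper)` confidence interval; the relations defined in the paper are more general than this:

```python
def dominates_interval(a, b):
    """Interval Pareto dominance for minimization: solution `a` dominates
    `b` only when, in every objective, a's entire interval lies at or
    below b's interval, strictly below in at least one objective."""
    better_or_equal = all(hi_a <= lo_b for (_, hi_a), (lo_b, _) in zip(a, b))
    strictly_better = any(hi_a < lo_b for (_, hi_a), (lo_b, _) in zip(a, b))
    return better_or_equal and strictly_better

a = [(1.0, 1.2), (2.0, 2.1)]   # (lower, upper) interval per objective
b = [(1.5, 1.7), (2.2, 2.5)]
# dominates_interval(a, b) -> True; overlapping intervals give False
```

    When intervals overlap, this check declares the pair incomparable rather than guessing from midpoints, which is exactly how confidence intervals reduce incorrect comparisons caused by inaccurate approximations.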

    APPROXIMATION ASSISTED MULTIOBJECTIVE AND COLLABORATIVE ROBUST OPTIMIZATION UNDER INTERVAL UNCERTAINTY

    Optimization of engineering systems under uncertainty often involves problems that have multiple objectives, constraints, and subsystems. The main goal in these problems is to obtain solutions that are optimum and relatively insensitive to uncertainty. Such solutions are called robust optimum solutions. Two classes of such problems are considered in this dissertation. The first class involves Multi-Objective Robust Optimization (MORO) problems under interval uncertainty. In this class, an entire system optimization problem, which has multiple nonlinear objectives and constraints, is solved by a multiobjective optimizer at one level, while the robustness of trial alternatives generated by the optimizer is evaluated at the other level. This bi-level (or nested) MORO approach can become computationally prohibitive as the size of the problem grows. To address this difficulty, a new and improved MORO approach under interval uncertainty is developed. Unlike the previously reported bi-level MORO methods, the improved MORO performs robustness evaluation only for optimum solutions and uses this information to iteratively shrink the feasible domain and find the location of robust optimum solutions. Compared to the previous bi-level approach, the improved MORO significantly reduces the number of function calls needed to arrive at the solutions. To further reduce the computational cost, the improved MORO is combined with an online approximation approach. This new approach is called Approximation-Assisted MORO, or AA-MORO. The second class involves Multiobjective collaborative Robust Optimization (McRO) problems. In this class, an entire system optimization problem is decomposed hierarchically, along user-defined domain-specific boundaries, into a system optimization problem and several subsystem optimization subproblems. The dissertation presents a new Approximation-Assisted McRO (AA-McRO) approach under interval uncertainty.
AA-McRO uses a single-objective optimization problem to coordinate all system and subsystem optimization problems in a Collaborative Optimization (CO) framework. The approach converts the consistency constraints of CO into penalty terms which are integrated into the subsystem objective functions. In this way, AA-McRO is able to explore the design space and obtain optimum design solutions more efficiently than a previously reported McRO. Both the AA-MORO and AA-McRO approaches are demonstrated on a variety of numerical and engineering optimization examples. It is found that the solutions from both approaches compare well with those of previously reported approaches while requiring significantly lower computational cost. Finally, AA-MORO has been used in the development of a decision support system for a refinery case study, in order to facilitate the integration of engineering and business decisions using an agent-based approach.
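    The conversion of CO consistency constraints into penalty terms can be sketched generically; the quadratic form, the weight `w`, and the function names below are assumptions for illustration, not the AA-McRO formulation:

```python
def subsystem_objective(z_sub, z_sys_target, f_sub, w=100.0):
    """Subsystem objective augmented with a quadratic penalty that pulls
    the subsystem's copies of the shared variables (z_sub) toward the
    system-level targets (z_sys_target), replacing the hard consistency
    constraint z_sub == z_sys_target of classical CO."""
    penalty = w * sum((zs - zt) ** 2 for zs, zt in zip(z_sub, z_sys_target))
    return f_sub(z_sub) + penalty

# Consistent shared variables incur no penalty; any mismatch is penalized
# in proportion to w, so the coordinator can trade off local optimality
# against system-level consistency.
consistent = subsystem_objective([1.0, 2.0], [1.0, 2.0], sum)          # -> 3.0
inconsistent = subsystem_objective([1.0, 2.0], [1.0, 3.0], sum, w=10.0)  # -> 13.0
```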

    Multi objective evolutionary optimization in uncertain environments

    Master's thesis (Master of Engineering)

    Dependable Embedded Systems

    This Open Access book introduces readers to many new techniques for enhancing and optimizing reliability in embedded systems, which have emerged particularly within the last five years. The book introduces the most prominent reliability concerns from today's point of view and roughly recapitulates the progress in the community so far. Unlike other books that focus on a single abstraction level, such as the circuit level or the system level alone, this book deals with reliability challenges across different levels, starting from the physical level all the way up to the system level (cross-layer approaches). The book aims to demonstrate how new hardware/software co-design solutions can effectively mitigate reliability degradation such as transistor aging, processor variation, temperature effects, and soft errors. It provides readers with the latest insights into novel, cross-layer methods and models with respect to the dependability of embedded systems; describes cross-layer approaches that can leverage reliability through techniques that are proactively designed with respect to techniques at other layers; and explains run-time adaptation and concepts/means of self-organization, in order to achieve error resiliency in complex, future many-core systems.