Robust design optimisation of dynamical space systems
In this paper we present a novel approach to the optimisation of complex systems affected by epistemic uncertainty when both the system and the uncertainty evolve dynamically with time. We propose a new modelling approach that uses Evidence Theory to capture epistemic uncertainty.
We consider a system whose characteristics change over its operational life (failure rate, performance degradation, functional degradation, etc.). The goal is to obtain a resilient design: one that is robust with respect to performance variability and reliable against possible partial failures of one or more components.
We propose to enhance the Evidence Network Model (ENM) with time-dependent reliability functions and decompose the problem into subproblems of smaller complexity. Through this decomposition uncertainty quantification of complex systems becomes affordable for a range of real-world applications.
The method is here applied to a simple resource allocation problem where the goal is to optimally position subsystems within a spacecraft [1]
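As a concrete illustration of how Evidence Theory quantifies epistemic uncertainty, the sketch below computes belief and plausibility bounds for a threshold on an uncertain quantity from interval-valued focal elements. The focal elements, masses, and the mass-budget framing are hypothetical illustration values, not taken from the paper.

```python
# Minimal Evidence Theory sketch: Bel and Pl of "value <= threshold" from
# interval focal elements with basic probability masses summing to 1.
# All numbers below are hypothetical, for illustration only.

def belief_plausibility(focal_elements, threshold):
    """Bel counts focal elements entirely inside (-inf, threshold];
    Pl counts those that merely intersect it."""
    bel = sum(m for (lo, hi), m in focal_elements if hi <= threshold)
    pl = sum(m for (lo, hi), m in focal_elements if lo <= threshold)
    return bel, pl

# Hypothetical expert-elicited intervals on a subsystem mass budget [kg]
fe = [((4.0, 5.0), 0.5), ((4.5, 6.0), 0.25), ((5.5, 7.0), 0.25)]
bel, pl = belief_plausibility(fe, 6.0)
print(bel, pl)  # Bel = 0.75, Pl = 1.0
```

Belief and plausibility bracket the unknown probability of meeting the threshold, which is what makes the representation suitable for design margins under imprecision.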
Evaluation of elicitation methods to quantify Bayes linear models
The Bayes linear methodology allows decision makers to express their subjective beliefs and adjust these beliefs as observations are made. It is similar in spirit to probabilistic Bayesian approaches, but differs as it uses expectation as its primitive. While substantial work has been carried out in Bayes linear analysis, both in terms of theory development and application, there is little published material on the elicitation of structured expert judgement to quantify models. This paper investigates different methods that could be used by analysts when creating an elicitation process. The theoretical underpinnings of the elicitation methods developed are explored and an evaluation of their use is presented. This work was motivated by, and is a precursor to, an industrial application of Bayes linear modelling of the reliability of defence systems. An illustrative example demonstrates how the methods can be used in practice
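For readers unfamiliar with the expectation-as-primitive view, the core operation is the Bayes linear adjusted expectation, E_D(X) = E(X) + Cov(X, D) Var(D)^{-1} (d - E(D)), which updates prior moments linearly in the observed data. A minimal NumPy sketch follows; the prior moments and observations are hypothetical illustration values, not drawn from the paper's defence application.

```python
# Minimal Bayes linear adjustment sketch (hypothetical prior specification).
import numpy as np

def adjust(e_x, e_d, cov_xd, var_d, d_obs):
    """Adjusted expectation E_D(X) = E(X) + Cov(X,D) Var(D)^-1 (d - E(D))."""
    return e_x + cov_xd @ np.linalg.solve(var_d, d_obs - e_d)

# One quantity of interest X, two observable quantities D1, D2
e_x = np.array([10.0])                         # prior expectation of X
e_d = np.array([8.0, 12.0])                    # prior expectations of D
cov_xd = np.array([[2.0, 1.0]])                # Cov(X, D)
var_d = np.array([[4.0, 0.5], [0.5, 3.0]])     # Var(D)
d_obs = np.array([9.0, 11.0])                  # observed data

e_adj = adjust(e_x, e_d, cov_xd, var_d, d_obs)
print(e_adj)
```

Note that only first- and second-order moments are required, which is exactly why elicitation effort concentrates on expectations, variances, and covariances rather than full distributions.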
Reliability Assessment of Legacy Safety-Critical Systems Upgraded with Fault-Tolerant Off-the-Shelf Software
This paper presents a new way of applying Bayesian assessment to systems which consist of many components. Full Bayesian inference with such systems is problematic because it is computationally hard and, far more seriously, one needs to specify a multivariate prior distribution with many counterintuitive dependencies between the probabilities of component failures. The approach taken here is one of decomposition: the system is decomposed into partial views, of the system or parts thereof, with different degrees of detail, and a mechanism is then applied to propagate the knowledge obtained with the more refined views back to the coarser views (recalibration of coarse models). The paper describes the recalibration technique and then evaluates the accuracy of recalibrated models numerically on contrived examples using two techniques, the u-plot and prequential likelihood, developed by others for software reliability growth models. The results indicate that the recalibrated predictions are often more accurate than the predictions obtained with the less detailed models, although this is not guaranteed. The techniques used to assess the accuracy of the predictions are precise enough for one to be able to choose the model giving the most accurate prediction
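The u-plot check mentioned above can be sketched briefly: if the one-step-ahead predictive distributions F_i are well calibrated, the probability integral transforms u_i = F_i(t_i) at the observed outcomes should look uniform on [0, 1], and the Kolmogorov distance between their empirical CDF and the uniform CDF measures the miscalibration. The exponential predictive model and the observations below are hypothetical, for illustration only.

```python
# Minimal u-plot sketch: Kolmogorov distance of the u_i from uniformity.
# Predictive model and data are hypothetical illustration values.
import math

def u_plot_distance(us):
    """Kolmogorov distance between the empirical CDF of us and U(0,1)."""
    us = sorted(us)
    n = len(us)
    d = 0.0
    for i, u in enumerate(us):
        # compare the uniform CDF (= u) against the empirical CDF step at u
        d = max(d, abs((i + 1) / n - u), abs(u - i / n))
    return d

# Hypothetical predictions F_i(t) = 1 - exp(-t / mean_i) vs observed times t_i
means = [10.0, 9.0, 8.0, 7.5]
obs = [12.0, 3.0, 9.0, 6.0]
us = [1 - math.exp(-t / m) for m, t in zip(means, obs)]
print(u_plot_distance(us))
```

A smaller distance indicates better-calibrated predictions; comparing this distance across models is one way to choose between a coarse model and its recalibrated version.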
Space Systems Resilience Engineering and Global System Reliability Optimisation Under Imprecision and Epistemic Uncertainty
The paper introduces the concept of design for resilience in the context of space systems engineering and proposes a method to account for imprecision and epistemic uncertainty. Resilience can be seen as the ability of a system to adjust its functioning prior to, during, or following changes and disturbances, so that it can sustain required operations under both expected and unexpected conditions. Mathematically speaking, this translates into the attribute of a dynamical (or time-dependent) system to be simultaneously robust and reliable. However, the quantification of robustness and reliability in the early stages of the design of a space system is generally affected by uncertainty that is epistemic in nature. As the design evolves from Phase A down to Phase E, the level of epistemic uncertainty is expected to decrease, but a level of variability can still exist in the expected operational conditions and system requirements. The paper proposes a representation of a complex space system using so-called Evidence Network Models (ENM): an undirected network (unlike Bayesian network models) of interconnected nodes, where each node represents a subsystem with associated epistemic uncertainty on system performance and failure probability. Once the reliability and the uncertainty on the performance of the spacecraft are quantified, a design optimisation process is applied to improve resilience and performance. The method is finally applied to an example of the preliminary design of a small satellite in Low Earth Orbit (LEO). The spacecraft is divided into five subsystems: AOCS, TTC, OBDH, Power and Payload. The payload is a simple camera acquiring images at scheduled times. The assumption is that each component has multiple functionalities, and both the performance of the component and the reliability associated with each functionality are affected by a level of imprecision. The overall performance indicator is the sum of the performance indicators of all the components
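Because the abstract states that the overall performance indicator is the sum of the subsystem indicators, worst- and best-case bounds under epistemic intervals decompose node by node, which is the kind of complexity reduction an ENM-style decomposition exploits. The sketch below illustrates this; the subsystem names follow the abstract, but the interval values are hypothetical.

```python
# Minimal sketch: bounds on a separable performance indicator under epistemic
# intervals. Subsystem names follow the abstract; interval values are hypothetical.

subsystems = {
    "AOCS":    (0.70, 0.90),
    "TTC":     (0.60, 0.85),
    "OBDH":    (0.75, 0.95),
    "Power":   (0.80, 0.98),
    "Payload": (0.65, 0.92),
}

# Since the system indicator is a sum, the extreme cases are attained by
# taking each subsystem's interval endpoint independently.
worst = sum(lo for lo, hi in subsystems.values())
best = sum(hi for lo, hi in subsystems.values())
print(worst, best)  # lower and upper bounds on the overall indicator
```

For non-separable couplings the per-node argument no longer holds exactly, which is where the network structure of the ENM and its decomposition into smaller subproblems come in.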
Preliminary space mission design under uncertainty
This paper proposes a way to model uncertainties and to introduce them explicitly in the design process of a preliminary space mission. Traditionally, a system margin approach is used to take them into account. In this paper, Evidence Theory is proposed to crystallise the inherent uncertainties. The design process is then formulated as an optimisation under uncertainty (OUU). Three techniques are proposed to solve the OUU problem: (a) an evolutionary multi-objective approach, (b) a step technique consisting of maximising the belief for different levels of performance, and (c) a clustering method that first identifies feasible regions. The three methods are applied to the BepiColombo mission and their effectiveness at solving the OUU problem is compared
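Technique (b) relies on evaluating, for a fixed design, the belief that the performance meets each level; the step technique then maximises that belief level by level over the design variables. The sketch below shows only the inner evaluation, the belief curve over performance levels; the focal elements are hypothetical illustration values.

```python
# Minimal sketch of the objective in technique (b): Bel(f <= nu) for a sweep
# of performance levels nu, from interval focal elements with masses summing
# to 1. The focal elements and levels are hypothetical.

def belief_curve(focal_elements, levels):
    """Bel(f <= nu) for each nu: total mass of intervals wholly below nu."""
    return [sum(m for (lo, hi), m in focal_elements if hi <= nu)
            for nu in levels]

fe = [((1.0, 2.0), 0.5), ((1.5, 3.0), 0.25), ((2.5, 4.0), 0.25)]
levels = [2.0, 3.0, 4.0]
print(belief_curve(fe, levels))  # [0.5, 0.75, 1.0]
```

The curve is a step function that rises as the performance requirement is relaxed; the step technique trades requirement level against achievable belief instead of running a full multi-objective search.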
Uncertainty analysis and sensitivity analysis for multidisciplinary systems design
The objective of this research is to quantify the impact of both aleatory and epistemic uncertainties on the performance of multidisciplinary systems. Aleatory uncertainty arises from inherent randomness, while epistemic uncertainty comes from a lack of knowledge. Although intensive research has been conducted on aleatory uncertainty, few studies on epistemic uncertainty have been reported. In this work, the two types of uncertainty are analyzed. Aleatory uncertainty is modeled by probability distributions while epistemic uncertainty is modeled by intervals. Probabilistic analysis (PA) and interval analysis (IA) are integrated to capture the effect of the two types of uncertainty. The First Order Reliability Method is employed for PA while nonlinear optimization is used for IA. The unified uncertainty analysis, which consists of PA and IA, is employed to develop new sensitivity analysis methods for mixtures of the two types of uncertainty. The methods are able to quantify the contribution of each input variable with either epistemic or aleatory uncertainty. The analysis results can then support better decisions on how to effectively mitigate the effect of uncertainty. The other major contribution of this research is the extension of the unified uncertainty analysis to reliability analysis for multidisciplinary systems --Abstract, page iv
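The unified idea can be illustrated on a toy limit state: an aleatory variable is handled probabilistically while an epistemic variable is swept over its interval, so the failure probability becomes an interval rather than a single number. For the linear limit state below, the First Order Reliability Method is exact and reduces to a normal CDF evaluation, and the outer interval analysis is a (here trivially monotone) optimisation over the epistemic variable. All numbers are hypothetical illustration values.

```python
# Minimal unified-uncertainty sketch: FORM-style probability for the aleatory
# variable, interval bounds over the epistemic variable. Limit state:
# g = c - x - y, failure when g < 0, with x ~ N(mu, sigma) and y in [y_lo, y_hi].
# All parameter values are hypothetical.
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def failure_prob(y, mu=10.0, sigma=1.5, c=16.0):
    """P[g < 0] for fixed epistemic y; FORM is exact for this linear g."""
    beta = (c - y - mu) / sigma        # reliability index
    return phi(-beta)

y_lo, y_hi = 1.0, 3.0                  # epistemic interval for y
p_lo = failure_prob(y_lo)              # p is monotone in y, so the bounds
p_hi = failure_prob(y_hi)              # are attained at the interval endpoints
print(p_lo, p_hi)                      # interval of failure probabilities
```

For nonlinear limit states or non-monotone dependence on the epistemic variables, the endpoint shortcut no longer applies and the outer step becomes the nonlinear optimization the abstract refers to.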