
    The belief noisy-or model applied to network reliability analysis

    One difficulty faced in knowledge engineering for Bayesian Networks (BNs) is the quantification step, where the Conditional Probability Tables (CPTs) are determined. The number of parameters in a CPT increases exponentially with the number of parent variables. The most common solution is the application of so-called canonical gates. The Noisy-OR (NOR) gate, which takes advantage of the independence of causal interactions, provides a logarithmic reduction in the number of parameters required to specify a CPT. In this paper, an extension of the NOR model based on the theory of belief functions, named Belief Noisy-OR (BNOR), is proposed. BNOR is capable of dealing with both the aleatory and epistemic uncertainty of the network. Compared with NOR, richer information, which is of great value for decision making, can be obtained when the available knowledge is uncertain. In particular, when there is no epistemic uncertainty, BNOR reduces to NOR. Additionally, different structures of BNOR are presented in order to meet the various needs of engineers. The application of the BNOR model to the reliability evaluation of networked systems demonstrates its effectiveness.
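    For context, the classical NOR gate that BNOR extends fills an entire CPT from one "link" probability per parent, which is the parameter reduction the abstract refers to. A minimal Python sketch of the classical Noisy-OR (not the BNOR extension; names such as link_probs are illustrative, not from the paper):

        # Classical Noisy-OR: P(Y=1 | parents) = 1 - prod over active
        # parents of (1 - p_i). n link probabilities fill a 2**n-row CPT.
        from itertools import product

        def noisy_or(link_probs, parent_states):
            """P(Y = 1 | parents), where link_probs[i] is the probability
            that parent i alone activates Y when it is present."""
            p_inactive = 1.0
            for p, x in zip(link_probs, parent_states):
                if x:  # only active parents contribute
                    p_inactive *= (1.0 - p)
            return 1.0 - p_inactive

        link_probs = [0.9, 0.7, 0.5]   # one parameter per parent
        for states in product([0, 1], repeat=len(link_probs)):
            print(states, round(noisy_or(link_probs, states), 3))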

    Applying Bayes linear methods to support reliability procurement decisions

    Bayesian methods are common in reliability and risk assessment; however, such methods often demand a large amount of specification and can be computationally intensive. Because of this, many practitioners are unable to take advantage of the benefits of a Bayesian approach. The Bayes linear methodology is similar in spirit to a Bayesian approach but offers an alternative way of making inferences. Bayes linear methods are based on the use of expected values rather than probabilities, and updating is carried out by linear adjustment rather than by Bayes' theorem. The foundations of the method are very strong, based as they are on the work of de Finetti and developed further by Goldstein. A Bayes linear model requires less specification than a corresponding probability model, so for a given amount of model-building effort one can model a more complex situation more quickly. The Bayes linear methodology has the potential to allow us to build 'broad-brush' models that enable us, for example, to explore different test setups or analysis methods and assess the benefits they can give. The output of a Bayes linear model is viewed as an approximation to 'traditional' probabilistic models. The methodology has been applied to support reliability decision making within a current United Kingdom Ministry of Defence (MOD) procurement project. The reliability decision maker had to assess different contractor bids and the reliability merit of each bid. Currently the MOD assesses reliability programmes subjectively using expert knowledge; for a number of reasons, a quantitative method of assessment is desirable in some projects. The Bayes linear methodology was used to support the decision maker in quantifying his assessment of the reliability of each contractor's bid and determining the effectiveness of each contractor's reliability programme. From this, the decision maker was able to communicate to the project leader and contractors why a specific contractor was chosen. The methodology has been used in other MOD projects and is considered by those within the MOD to be a useful tool to support decision making. The paper will introduce the Bayes linear methodology and briefly discuss some of the philosophical implications of adopting it within the context of a reliability programme analysis. It will briefly introduce the reliability domain and the reasons why the Bayes linear methodology is believed to offer support to decision makers. An in-depth analysis of the problem will then be given, documenting the steps taken in the project and how future decision makers can apply the methodology. A brief summary of possible future work for those interested in the Bayes linear methodology will conclude the paper.
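    For context, the "linear adjustment" mentioned above is the standard Bayes linear update of an expectation and variance given observed data. A minimal numpy sketch with toy numbers (all values are illustrative assumptions, not from the project described):

        # Bayes linear adjustment:
        #   E_D(B)   = E(B) + Cov(B,D) Var(D)^{-1} (D - E(D))
        #   Var_D(B) = Var(B) - Cov(B,D) Var(D)^{-1} Cov(D,B)
        import numpy as np

        E_B = np.array([0.8])            # prior expectation of quantity B
        E_D = np.array([10.0, 12.0])     # prior expectation of data D
        var_B = np.array([[0.04]])
        var_D = np.array([[4.0, 1.0],
                          [1.0, 3.0]])
        cov_BD = np.array([[0.3, 0.2]])

        d = np.array([9.0, 14.0])        # observed data

        adj_E = E_B + cov_BD @ np.linalg.solve(var_D, d - E_D)
        adj_var = var_B - cov_BD @ np.linalg.solve(var_D, cov_BD.T)
        print(adj_E, adj_var)            # adjusted expectation and variance

    Note that only expectations, variances and covariances are specified; no full probability distributions are needed, which is the reduced-specification point made in the abstract.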

    Robust Mission Design Through Evidence Theory and Multi-Agent Collaborative Search

    Full text link
    In this paper, the preliminary design of a space mission is approached by introducing uncertainties on the design parameters and formulating the resulting reliable design problem as a multiobjective optimisation problem. Uncertainties are modelled through evidence theory, and the belief, or credibility, in the successful achievement of mission goals is maximised along with the reliability of constraint satisfaction. The multiobjective optimisation problem is solved through a novel algorithm based on the collaboration of a population of agents in search of the set of highly reliable solutions. Two typical problems in mission analysis are used to illustrate the proposed methodology.
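    For context, the belief objective in such a formulation sums the masses of the focal elements on which the mission goal holds in the worst case. A minimal Python sketch under simplifying assumptions (interval-box focal elements, worst case checked at box corners, which is exact only for goal functions monotone in each uncertain component; all names are illustrative, not the paper's algorithm):

        import itertools

        def belief_of_goal(f, design, boxes, masses, threshold):
            """Sum the masses of boxes on which f(design, u) <= threshold
            holds at every corner (the assumed worst case)."""
            bel = 0.0
            for box, m in zip(boxes, masses):
                corners = itertools.product(*box)
                if all(f(design, u) <= threshold for u in corners):
                    bel += m
            return bel

        # toy goal function and focal elements (hypothetical)
        f = lambda d, u: (d - u[0])**2 + 0.5 * u[1]
        boxes = [((0.0, 0.5), (0.0, 0.2)), ((0.5, 1.0), (0.2, 0.6))]
        masses = [0.6, 0.4]
        print(belief_of_goal(f, 0.3, boxes, masses, threshold=0.5))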

    Approximated Computation of Belief Functions for Robust Design Optimization

    This paper presents some ideas to reduce the computational cost of evidence-based robust design optimization. Evidence Theory crystallizes both the aleatory and epistemic uncertainties in the design parameters, providing two quantitative measures, Belief and Plausibility, of the credibility of the computed value of the design budgets. The paper proposes some techniques to compute an approximation of Belief and Plausibility at a cost that is a fraction of the one required for an accurate calculation of the two values. Some simple test cases will show how the proposed techniques scale with the dimension of the problem. Finally, a simple example of spacecraft system design is presented.
    Comment: AIAA-2012-1932, 14th AIAA Non-Deterministic Approaches Conference, 23-26 April 2012, Sheraton Waikiki, Honolulu, Hawaii
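    For context, Belief and Plausibility of a feasible design can be read off a basic probability assignment over interval boxes: a box counts toward Belief if it lies entirely in the feasible region and toward Plausibility if it merely intersects it, so the exact computation needs a worst-case and a best-case check per focal element. A minimal Python sketch of a cheap sampling-style approximation (an assumption for illustration, not the paper's technique):

        import random

        def bel_pl(g, boxes, masses, n_samples=50):
            """Approximate Belief/Plausibility that g(u) <= 0 holds,
            given interval boxes with masses (a BPA)."""
            bel = pl = 0.0
            for box, m in zip(boxes, masses):
                samples = [tuple(random.uniform(lo, hi) for lo, hi in box)
                           for _ in range(n_samples)]
                feas = [g(u) <= 0 for u in samples]
                if all(feas):  # box (approximately) entirely feasible
                    bel += m
                if any(feas):  # box intersects the feasible region
                    pl += m
            return bel, pl

        g = lambda u: u[0] + u[1] - 1.0   # hypothetical constraint
        boxes = [((0.0, 0.4), (0.0, 0.4)), ((0.4, 1.0), (0.4, 1.0))]
        print(bel_pl(g, boxes, [0.5, 0.5]))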

    Metacognition as Evidence for Evidentialism

    Metacognition is the monitoring and controlling of cognitive processes. I examine the role of metacognition in ‘ordinary retrieval cases’, cases in which it is intuitive that via recollection the subject has a justified belief. Drawing on psychological research on metacognition, I argue that evidentialism has a unique, accurate prediction in each ordinary retrieval case: the subject has evidence for the proposition she justifiedly believes. But, I argue, process reliabilism has no unique, accurate predictions in these cases. I conclude that ordinary retrieval cases better support evidentialism than process reliabilism. This conclusion challenges several common assumptions. One is that non-evidentialism alone allows for a naturalized epistemology, i.e., an epistemology that is fully in accordance with scientific research and methodology. Another is that process reliabilism fares much better than evidentialism in the epistemology of memory.

    A method of classification for multisource data in remote sensing based on interval-valued probabilities

    An axiomatic approach to interval-valued (IV) probabilities is presented, in which an IV probability is defined by a pair of set-theoretic functions that satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated and call for more intelligent decision-making strategies. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller, more manageable pieces based on global statistical correlation information. By this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
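    For context, the combination of multiple bodies of evidence emphasized above generalizes Dempster's rule of combination. A minimal Python sketch of the classical rule over discrete ground-cover classes (the source names and masses are hypothetical, and the paper's interval-valued machinery is richer than this point-valued simplification):

        def dempster_combine(m1, m2):
            """Combine two mass functions keyed by frozensets of classes."""
            combined, conflict = {}, 0.0
            for a, v1 in m1.items():
                for b, v2 in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + v1 * v2
                    else:
                        conflict += v1 * v2   # mass assigned to disjoint sets
            k = 1.0 - conflict                # normalisation constant
            return {s: v / k for s, v in combined.items()}

        # evidence from two hypothetical sources (e.g. MSS and SAR)
        m_mss = {frozenset({'forest'}): 0.6, frozenset({'forest', 'water'}): 0.4}
        m_sar = {frozenset({'water'}): 0.3, frozenset({'forest', 'water'}): 0.7}
        print(dempster_combine(m_mss, m_sar))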