
    Computing the decomposable entropy of belief-function graphical models

    In 2018, Jiroušek and Shenoy proposed a definition of entropy for Dempster-Shafer (D-S) belief functions called decomposable entropy (d-entropy). This paper provides an algorithm for computing the d-entropy of directed graphical D-S belief function models. We illustrate the algorithm using Almond's Captain's Problem example. For undirected graphical belief-function models in which the set of belief functions is non-informative, the belief functions are distinct. We illustrate this using Haenni-Lehmann's Communication Network problem: as the joint belief function for this model is quasi-consonant, it follows from a property of d-entropy that its d-entropy is zero, and no algorithm is required. For a class of undirected graphical models, we provide an algorithm for computing the d-entropy. Finally, the d-entropy coincides with Shannon's entropy for the probability mass function of a single random variable and for a large multi-dimensional probability distribution expressed as a directed acyclic graph model called a Bayesian network. We illustrate this using Lauritzen-Spiegelhalter's Chest Clinic example represented as a belief-function directed graphical model.
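
    As a rough illustration of the quantity involved (not the paper's graphical-model algorithm), the sketch below computes the d-entropy of a single joint BPA from its commonality function Q. The expression used, H_d(m) = sum over nonempty a of (-1)^|a| * Q(a) * log2 Q(a), the log base, and the toy frame are our assumptions, not taken from the abstract.

```python
from itertools import combinations
from math import log2

def nonempty_subsets(frame):
    """All nonempty subsets of the frame of discernment."""
    return [frozenset(c) for r in range(1, len(frame) + 1)
            for c in combinations(frame, r)]

def commonality(m, frame):
    """Commonality function: Q(a) = sum of m(b) over all supersets b of a."""
    return {a: sum(v for b, v in m.items() if a <= b)
            for a in nonempty_subsets(frame)}

def d_entropy(m, frame):
    """Decomposable entropy of a BPA m (assumed form, log base 2):
    H_d(m) = sum over nonempty a of (-1)^|a| * Q(a) * log2(Q(a)),
    with the convention 0 * log 0 = 0."""
    return sum((-1) ** len(a) * q * log2(q)
               for a, q in commonality(m, frame).items() if q > 0)

frame = {"x", "y"}
bayesian = {frozenset({"x"}): 0.5, frozenset({"y"}): 0.5}
vacuous = {frozenset({"x", "y"}): 1.0}
print(d_entropy(bayesian, frame))  # 1.0 -- coincides with Shannon entropy
print(d_entropy(vacuous, frame))   # 0.0 -- quasi-consonant BPAs (such as the
                                   # vacuous one) have zero d-entropy
```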

    On the rational scope of probabilistic rule-based inference systems

    Belief updating schemes in artificial intelligence may be viewed as three-dimensional languages, consisting of a syntax (e.g., probabilities or certainty factors), a calculus (e.g., Bayesian or CF combination rules), and a semantics (i.e., cognitive interpretations of competing formalisms). This paper studies the rational scope of those languages on syntax and calculus grounds. In particular, the paper presents an endomorphism theorem which highlights the limitations imposed by the conditional independence assumptions implicit in the CF calculus. Implications of the theorem for the relationship between the CF and Bayesian languages and the Dempster-Shafer theory of evidence are presented. The paper concludes with a discussion of some implications for rule-based knowledge engineering in uncertain domains.
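
    For readers unfamiliar with the CF calculus the abstract analyzes, the sketch below shows the standard EMYCIN parallel-combination rule for certainty factors. The abstract does not spell the rule out, so this common formulation is an assumption.

```python
def combine_cf(x, y):
    """EMYCIN-style parallel combination of two certainty factors in [-1, 1].
    The rule is commutative and associative, so evidence can be folded in one
    rule at a time -- the modularity that rests on the implicit conditional
    independence assumptions the abstract refers to."""
    if x >= 0 and y >= 0:            # both pieces of evidence confirming
        return x + y - x * y
    if x <= 0 and y <= 0:            # both disconfirming
        return x + y + x * y
    return (x + y) / (1 - min(abs(x), abs(y)))  # mixed evidence

print(combine_cf(0.6, 0.4))   # 0.76
print(combine_cf(0.6, -0.4))  # 0.3333...
```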

    Required mathematical properties and behaviors of uncertainty measures on belief intervals

    The Dempster–Shafer theory of evidence (DST) has been widely used to handle uncertainty-based information. It is based on the concept of the basic probability assignment (BPA). Belief intervals are easier to manage than a BPA for representing uncertainty-based information, and for this reason several recently proposed uncertainty measures for DST are based on belief intervals. In this study, we examine the crucial mathematical properties and behavioral requirements that must be verified by every uncertainty measure on belief intervals, building on the study previously carried out for uncertainty measures on BPAs. Furthermore, we analyze which of these properties are satisfied by each of the uncertainty measures on belief intervals proposed so far. This comparative analysis shows that, among these measures, the maximum of entropy on the belief intervals is the most suitable one for practical applications, since it is the only one that satisfies all the required mathematical properties and behaviors.
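
    As a sketch of the measure the analysis favors (our construction, not code from the paper): the belief interval of a singleton x is [Bel({x}), Pl({x})], and the measure is the maximum Shannon entropy over probability distributions consistent with those bounds. A generic SLSQP solver stands in here for whatever algorithm the authors use.

```python
import numpy as np
from scipy.optimize import minimize

def singleton_intervals(m, frame):
    """Belief interval [Bel({x}), Pl({x})] for each singleton x:
    Bel({x}) = m({x}); Pl({x}) = sum of m(b) over focal sets b containing x."""
    return {x: (m.get(frozenset({x}), 0.0),
                sum(v for b, v in m.items() if x in b))
            for x in frame}

def max_entropy(intervals):
    """Maximize Shannon entropy over PMFs p with lo_x <= p(x) <= hi_x and
    sum p = 1, via a generic constrained solver (not the paper's algorithm)."""
    lo, hi = map(np.array, zip(*intervals.values()))
    neg_h = lambda p: float(np.sum(p * np.log2(np.clip(p, 1e-12, 1.0))))
    res = minimize(neg_h, np.clip(np.full(len(lo), 1.0 / len(lo)), lo, hi),
                   bounds=list(zip(lo, hi)),
                   constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
    return -res.fun

frame = {"a", "b", "c"}
m = {frozenset({"a"}): 0.5, frozenset({"a", "b", "c"}): 0.5}
iv = singleton_intervals(m, frame)  # a: [0.5, 1.0]; b and c: [0.0, 0.5]
print(max_entropy(iv))              # approx 1.5 bits, at p(a)=0.5, p(b)=p(c)=0.25
```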

    Uncertainty management in assessment of FMEA expert based on negation information and belief entropy

    Failure mode and effects analysis (FMEA) is a commonly adopted approach in engineering failure analysis, wherein the risk priority number (RPN) is used to rank failure modes. However, the assessments made by FMEA experts are fraught with uncertainty. To deal with this issue, we propose a new uncertainty management approach for expert assessments based on negation information and belief entropy in the Dempster–Shafer evidence theory framework. First, the assessments of FMEA experts are modeled as basic probability assignments (BPAs) in evidence theory. Next, the negation of the BPA is calculated to extract more valuable information from a new perspective on uncertain information. Then, using belief entropy, the degree of uncertainty of the negation information is measured to represent the uncertainty of the different risk factors in the RPN. Finally, a new RPN value for each failure mode is calculated for ranking the FMEA items in risk analysis. The rationality and effectiveness of the proposed method are verified through its application to a risk analysis of an aircraft turbine rotor blade.
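
    A minimal sketch of the two ingredients, under loudly labeled assumptions: the negation of a BPA is taken in one published form, neg_m(A) = (1 - m(A)) / (2^n - 2) over the nonempty subsets of an n-element frame, and "belief entropy" is read as Deng entropy. The paper may use different variants, and the final RPN weighting step is omitted.

```python
from itertools import combinations
from math import log2

def nonempty_subsets(frame):
    """All nonempty subsets of the frame of discernment."""
    return [frozenset(c) for r in range(1, len(frame) + 1)
            for c in combinations(sorted(frame), r)]

def negation(m, frame):
    """Negation of a BPA (assumed form): each nonempty subset A receives
    (1 - m(A)) / (2^n - 2), so the negated masses again sum to one."""
    n = len(frame)
    return {a: (1.0 - m.get(a, 0.0)) / (2 ** n - 2)
            for a in nonempty_subsets(frame)}

def deng_entropy(m):
    """Deng (belief) entropy: -sum m(A) * log2(m(A) / (2^|A| - 1))."""
    return -sum(v * log2(v / (2 ** len(a) - 1)) for a, v in m.items() if v > 0)

# Hypothetical expert assessment of one risk factor on the frame {low, high}.
frame = {"low", "high"}
m = {frozenset({"high"}): 0.7, frozenset({"low", "high"}): 0.3}
neg_m = negation(m, frame)          # {low}: 0.5, {high}: 0.15, {low,high}: 0.35
print(deng_entropy(neg_m))          # uncertainty of the negated assessment
```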

    Possibility expectation and its decision making algorithm

    The fuzzy integral has been shown to be an effective tool for the aggregation of evidence in decision making. Of primary importance in the development of a fuzzy-integral pattern recognition algorithm is the choice (construction) of the measure that embodies the importance of subsets of sources of evidence. Sugeno fuzzy measures have received the most attention due to the recursive nature of the construction of the measure on nested sequences of subsets. Possibility measures admit an even simpler generation scheme, but usually require that one of the sources of information possess complete credibility. In real applications, such normalization may not be possible, or even desirable. In this report, both the theory and a decision-making algorithm for a variation of the fuzzy integral are presented. This integral is based on a possibility measure for which the measure of the universe is not required to be unity. A training algorithm for the possibility densities in a pattern recognition application is also presented, with results demonstrated on the shuttle-earth-space training and testing images.
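
    A sketch of the core computation, on our reading and with hypothetical names: for a possibility measure generated by densities pi, the Sugeno fuzzy integral of a support function h collapses to max over x of min(h(x), pi(x)), and nothing in that formula forces max pi = 1, matching the report's relaxation of the normalization requirement.

```python
def possibility_measure(density, subset):
    """g(A) = max of the possibility densities over A; g of the whole universe
    is max(density.values()), which need not be 1."""
    return max(density[x] for x in subset)

def possibility_expectation(h, density):
    """Sugeno fuzzy integral of h w.r.t. the possibility measure generated by
    'density'. For possibility measures, the usual sorted-threshold form of
    the integral reduces to a single max-min sweep over the sources."""
    return max(min(h[x], density[x]) for x in h)

# Hypothetical: three sources, their support h for one class, and their
# (unnormalized) credibilities pi.
h = {"mss": 0.8, "sar": 0.6, "terrain": 0.9}
pi = {"mss": 0.7, "sar": 0.9, "terrain": 0.4}     # max is 0.9, not 1
print(possibility_measure(pi, pi.keys()))         # g(universe) = 0.9
print(possibility_expectation(h, pi))             # 0.7
```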

    A method of classification for multisource data in remote sensing based on interval-valued probabilities

    An axiomatic approach to interval-valued (IV) probabilities is presented, in which an IV probability is defined by a pair of set-theoretic functions that satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for representing and combining evidential information, they make the decision process rather complicated and call for more intelligent decision strategies. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case, a set of multiple sources is obtained by dividing the dimensionally huge data into smaller, more manageable pieces based on global statistical correlation information. Through this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
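
    The abstract's IV probabilities generalize the belief/plausibility pair of D-S theory. As a concrete point of reference (standard D-S machinery, not the paper's axiomatic system), the sketch below combines two bodies of evidence with Dempster's rule and reads off the interval [Bel(A), Pl(A)]; the sources and masses are hypothetical.

```python
def dempster_combine(m1, m2):
    """Dempster's rule: intersect focal elements, pool the mass products, and
    renormalize by 1 - K, where K is the mass falling on empty intersections."""
    pooled, conflict = {}, 0.0
    for a, v in m1.items():
        for b, w in m2.items():
            c = a & b
            if c:
                pooled[c] = pooled.get(c, 0.0) + v * w
            else:
                conflict += v * w
    return {a: v / (1.0 - conflict) for a, v in pooled.items()}

def interval(m, a):
    """IV probability of event a as [Bel(a), Pl(a)]."""
    bel = sum(v for b, v in m.items() if b <= a)   # focal sets inside a
    pl = sum(v for b, v in m.items() if b & a)     # focal sets overlapping a
    return bel, pl

# Hypothetical evidence from two sources (e.g., MSS and SAR) about a pixel's
# ground cover on the frame {water, forest, urban}.
W, F, U = "water", "forest", "urban"
m_mss = {frozenset({W}): 0.6, frozenset({W, F, U}): 0.4}
m_sar = {frozenset({W, F}): 0.7, frozenset({W, F, U}): 0.3}
m12 = dempster_combine(m_mss, m_sar)
print(interval(m12, frozenset({W})))  # (0.6, 1.0): belief interval for 'water'
```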