173 research outputs found

    Applications of Belief Functions in Business Decisions: A Review

    In this paper, we review recent applications of the Dempster-Shafer theory (DST) of belief functions to auditing and business decision-making. We show how DST can map uncertainties in these application domains better than the Bayesian theory of probabilities. We review the applications in auditing around three practical problems that challenge the effective application of DST, namely hierarchical evidence, versatile evidence, and statistical evidence. We review the applications in other business decisions in two loose categories: judgment under ambiguity and business model combination. Finally, we show how the theory of linear belief functions, a new extension of DST, can provide an alternative solution to a wide range of business problems.
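    As a rough illustration of the kind of uncertainty representation discussed above, here is a minimal Python sketch of a mass function and Dempster's rule of combination on a two-element, audit-style frame; the frame {fair, misstated}, the masses and the helper names are invented and are not taken from the paper.

```python
# A minimal sketch of Dempster's rule of combination on a tiny audit-style
# frame {fair, misstated}; all names and numbers are illustrative.

from itertools import product

def combine(m1, m2):
    """Combine two mass functions (dicts frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass that falls on the empty set
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def belief(m, hypothesis):
    """Bel(H) = total mass committed to subsets of H."""
    return sum(v for s, v in m.items() if s <= hypothesis)

frame = frozenset({"fair", "misstated"})
fair = frozenset({"fair"})

# Each piece of evidence supports "fair" only partially; the rest stays
# uncommitted on the whole frame, something a single point-valued
# probability cannot express directly.
m_analytics = {fair: 0.6, frame: 0.4}
m_controls  = {fair: 0.5, frame: 0.5}

m = combine(m_analytics, m_controls)
print(belief(m, fair))      # support for "fair" after combination (0.8)
print(m.get(frame, 0.0))    # mass still uncommitted (0.2)
```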

    An Introduction to Evidential Reasoning for Decision Making under Uncertainty: Bayesian and Belief Functions Perspectives

    The main purpose of this article is to introduce the evidential reasoning approach, a research methodology, for decision making under uncertainty. The Bayesian framework and the Dempster-Shafer theory of belief functions are used to model uncertainties in the decision problem. We first introduce the basics of the DS theory and then discuss the evidential reasoning approach and related concepts. Next, we demonstrate how specific decision models can be developed from the basic evidential diagrams under the two frameworks. Interestingly, it is considerably more efficient to develop Bayesian models of decision problems using the evidential reasoning approach than using the ladder diagram approach common in the auditing literature. In addition, we compare the decision models developed in this paper with similar models developed in the literature.
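    The observation that Bayesian models drop out of the same evidential machinery can be illustrated with a tiny self-contained sketch; the hypotheses, numbers and the bayes_update helper below are hypothetical, and the sketch relies only on the standard fact that Dempster's rule applied to singleton-only mass functions reduces to normalised products, i.e. ordinary Bayesian updating.

```python
# A small illustrative sketch (hypothetical names and numbers): when every
# mass is placed on singletons, belief-function combination collapses to
# ordinary Bayesian updating.

def bayes_update(prior, likelihood):
    """Elementwise product followed by normalisation over the same hypotheses."""
    post = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(post.values())
    return {h: v / z for h, v in post.items()}

prior      = {"material_error": 0.3, "no_error": 0.7}
likelihood = {"material_error": 0.8, "no_error": 0.2}   # strength of new evidence

print(bayes_update(prior, likelihood))
# {'material_error': 0.631..., 'no_error': 0.368...}
# Combining two singleton-only mass functions with Dempster's rule yields the
# same normalised products, which is why Bayesian decision models arise as a
# special case of the evidential approach.
```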

    A mathematical theory of evidence for G.L.S. Shackle

    Evidence theory is a branch of mathematics concerned with combining empirical evidence in an individual's mind in order to construct a coherent picture of reality. Designed to deal with unexpected empirical evidence suggesting new possibilities, evidence theory has much in common with Shackle's idea of decision-making as a creative act. This essay investigates this connection in detail, pointing to the usefulness of evidence theory for formalising and extending Shackle's decision theory. To frame the issues properly, evidence theory is compared not only with Shackle's ideas but also with additive and sub-additive probability theories. Furthermore, the presentation of evidence theory does not refer only to the original version but also takes account of its most recent developments.

    The Combination of Paradoxical, Uncertain, and Imprecise Sources of Information based on DSmT and Neutro-Fuzzy Inference

    The management and combination of uncertain, imprecise, fuzzy, and even paradoxical or highly conflicting sources of information has always been, and still remains today, of primary importance for the development of reliable modern information systems involving artificial reasoning. In this chapter, we present a survey of our recent theory of plausible and paradoxical reasoning, known in the literature as Dezert-Smarandache Theory (DSmT), developed for dealing with imprecise, uncertain and paradoxical sources of information. We focus our presentation on the foundations of DSmT and on two important new rules of combination, rather than on browsing the specific applications of DSmT available in the literature. Several simple examples are given throughout the presentation to show the efficiency and generality of this new approach. The last part of this chapter presents neutrosophic logic, neutro-fuzzy inference and their connection with DSmT. Fuzzy logic and neutrosophic logic are useful tools in decision-making after fusing information using the DSm hybrid rule of combination of masses.
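    To make the problem of highly conflicting sources concrete, here is a small Python sketch of Zadeh's well-known counterexample to the classical Dempster rule; the numbers and names are illustrative, and DSmT's own hybrid combination rule, which redistributes the conflicting mass differently, is not reproduced here.

```python
# Zadeh's classic example of near-total conflict between two sources; the
# counterintuitive outcome of the classical Dempster rule is the kind of
# situation alternative combination rules are designed to handle.

from itertools import product

def dempster(m1, m2):
    out, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {s: v / (1.0 - conflict) for s, v in out.items()}, conflict

A, B, C = (frozenset({d}) for d in "ABC")   # three competing hypotheses
m1 = {A: 0.99, B: 0.01}                     # source 1 strongly favours A
m2 = {C: 0.99, B: 0.01}                     # source 2 strongly favours C

combined, k = dempster(m1, m2)
print(k)             # ~0.9999: the two sources almost totally disagree
print(combined[B])   # ~1.0: all remaining mass lands on the hypothesis
                     # both sources considered least likely
```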

    Application of knowledge-driven spatial modelling approaches and uncertainty management to a study of Rift Valley fever in Africa

    BACKGROUND: Few studies have investigated the uncertainties surrounding the scientific community's knowledge of the geographical distribution of major animal diseases. This is particularly relevant to Rift Valley fever (RVF), a zoonotic disease causing destructive outbreaks in livestock and man, as the geographical range of the disease is widening to involve previously unaffected regions. In the current study we investigate the application of methods developed in the decision sciences, namely multiple criteria decision making using weighted linear combination and ordered weighted averages, and Dempster-Shafer theory, implemented within the geographical information system IDRISI, to obtain a greater understanding of uncertainty related to the geographical distribution of RVF. The focus is on presenting alternative methods for situations where extensive field data are not available and traditional, model-based approaches to disease mapping cannot be conducted.

    RESULTS: Using a compensatory multiple criteria decision making model based on weighted linear combination, most of sub-Saharan Africa was estimated to be suitable for endemic circulation of RVF. In contrast, areas where rivers and lakes traverse semi-arid regions, such as those bordering the Sahara, were highly suitable for RVF epidemics, while wet, tropical areas of central Africa had low suitability. Using a moderately non-compensatory model based on ordered weighted averages, the areas considered suitable for endemic and epidemic RVF were more restricted. Varying the relative weights of the different factors in the models did not greatly affect suitability estimates, but variations in model structure had a large impact. Our Dempster-Shafer analysis supported the belief that a range of semi-arid areas were suitable for RVF epidemics and the plausibility that many other areas of the continent were suitable. Areas with high levels of uncertainty included the Ethiopian Highlands, southwest Kenya and parts of West Africa.

    CONCLUSION: We have demonstrated the potential of methods developed in the decision sciences to improve our understanding of the uncertainties surrounding the geographical distribution of animal diseases, particularly where information is sparse, and we encourage wider application of the decision science methodology in the field of animal health.
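    The contrast between the two aggregation schemes named in the abstract can be sketched in a few lines of Python; the factor names, standardised scores and weights below are hypothetical, and IDRISI's OWA module additionally folds factor weights into the ordering step, which this simplified sketch omits.

```python
# A minimal sketch contrasting a fully compensatory weighted linear
# combination with an ordered weighted average whose order weights limit
# how much a strong factor can offset a weak one.

def wlc(values, weights):
    """Weighted linear combination: fully compensatory, a high score on one
    factor can offset a low score on another."""
    return sum(v * w for v, w in zip(values, weights))

def owa(values, order_weights):
    """Ordered weighted average: order weights are applied to the values
    sorted from highest to lowest; shifting weight toward the lowest-ranked
    value makes the aggregation less compensatory (more AND-like)."""
    return sum(v * w for v, w in zip(sorted(values, reverse=True), order_weights))

# Hypothetical standardised suitability scores in [0, 1] for one location,
# for three illustrative factors: rainfall, distance to surface water,
# livestock density.
values = [0.2, 0.9, 0.8]

print(wlc(values, [0.4, 0.3, 0.3]))          # 0.59: the weak rainfall score is offset
print(owa(values, [0.2, 0.3, 0.5]))          # 0.52: the worst factor counts the most
```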

    A methodology for the selection of a paradigm of reasoning under uncertainty in expert system development

    The aim of this thesis is to develop a methodology for selecting a paradigm of reasoning under uncertainty for the expert system developer. This is important because practical guidance on how to select a paradigm of reasoning under uncertainty is not generally available. The thesis explores the role of uncertainty in an expert system and considers the process of reasoning under uncertainty. The possible sources of uncertainty are investigated and prove to be crucial to some aspects of the methodology. A variety of Uncertainty Management Techniques (UMTs) are considered, including numeric, symbolic and hybrid methods; considerably more information is found in the literature on numeric methods than on the latter two. Methods that have been proposed for comparing UMTs are studied, and comparisons reported in the literature are summarised; again, these concentrate on numeric methods, since more literature is available. The requirements of a methodology for the selection of a UMT are considered, and a manual approach to the selection process is developed. The possibility of extending the boundaries of knowledge stored in the expert system by including meta-data describing the handling of uncertainty is then considered, followed by suggestions taken from the literature for automating the selection process. Finally, consideration is given to whether the objectives of the research have been met, and recommendations are made for the next stage in researching a methodology for the selection of a paradigm of reasoning under uncertainty in expert system development.

    Trial by Traditional Probability, Relative Plausibility, or Belief Function?

    It is almost incredible that no one has ever formulated an adequate model for applying the standard of proof. What does the law call for? The usual formulation is that the factfinder must roughly test the finding on a scale of likelihood. So, the finding in a civil case must at least be more likely than not or, for the theoretically adventuresome, more than fifty percent probable. Yet everyone concedes that this formulation captures neither how human factfinders actually work nor, more surprisingly, how theory tells us that factfinders should work. An emerging notion that the factfinder should compare the plaintiff's story to the defendant's story might be a step forward, but this relative plausibility conjecture has its problems. I contend instead that the mathematical theory of belief functions provides an alternative without those problems, and that the law in fact conforms to this theory. Under it, the standards of proof reveal themselves as instructions for the factfinder to compare the affirmative belief in the finding to any belief in its contradiction, but only after setting aside the range of belief that imperfect evidence leaves uncommitted. Accordingly, rather than requiring a civil case's elements to exceed fifty percent or comparing best stories, belief functions focus on whether the perhaps smallish imprecise belief exceeds its smallish imprecise contradiction. Belief functions extend easily to the other standards of proof. Moreover, belief functions nicely clarify the workings of burdens of persuasion and production.
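    A toy numerical comparison may help make the proposed test concrete; the three numbers below are invented and are not drawn from the article.

```python
# A toy sketch of the comparison described above: weigh the belief committed
# to the claim against the belief committed to its contradiction, leaving
# aside the mass that imperfect evidence leaves uncommitted to either.

claim, contradiction, uncommitted = 0.35, 0.25, 0.40   # hypothetical values
assert abs(claim + contradiction + uncommitted - 1.0) < 1e-9

more_probable_than_not  = claim > 0.5            # traditional p > 0.5 test fails
preponderance_by_belief = claim > contradiction  # belief-function comparison succeeds

print(more_probable_than_not, preponderance_by_belief)   # False True
```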

    I believe it's possible it might be so....: Exploiting Lexical Clues for the Automatic Generation of Evidentiality Weights for Information Extracted from English Text

    Information formulated in natural language is being created at an incredible pace, far more quickly than we can make sense of it. Computer algorithms for various kinds of text analytics have therefore been developed to try to find nuggets of new, pertinent and useful information. However, information extracted from text is not always credible or reliable; often buried in sentences are lexical and grammatical structures that indicate the uncertainty of the proposition. Such clues include hedges such as modal adverbs and adjectives, as well as hearsay markers, indicators of inference or belief ("mindsay"), and verb forms identifying future actions which may not take place. In this thesis, we demonstrate how these lexical and grammatical markers of uncertainty can be automatically analyzed to assign an evidential weight to the proposition, which can be used to assess the credibility of the information extracted from English text.
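    As a loose sketch of the kind of lexical scoring described above, the fragment below discounts a credibility weight for each uncertainty cue it finds; the cue list, the weights and the evidential_weight function are invented for illustration and are not the thesis's model.

```python
# A rough, hypothetical illustration of cue-based evidential weighting;
# the cue lexicon and weights are invented for this sketch.

HEDGE_WEIGHTS = {
    "might": 0.4, "may": 0.5, "possibly": 0.4, "probably": 0.7,   # hedges
    "reportedly": 0.5, "allegedly": 0.4,                          # hearsay markers
    "believe": 0.6, "suspect": 0.5,                               # "mindsay" markers
    "will": 0.8,                                                  # future action, not yet fact
}

def evidential_weight(sentence: str) -> float:
    """Start from full credibility and discount for each uncertainty cue found."""
    weight = 1.0
    for token in sentence.lower().split():
        weight *= HEDGE_WEIGHTS.get(token.strip(".,;:"), 1.0)
    return weight

print(evidential_weight("The plant is operational."))                            # 1.0
print(evidential_weight("Sources say the plant might possibly be operational."))  # 0.16
```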