7,781 research outputs found

    Uncertainty in Ontologies: Dempster-Shafer Theory for Data Fusion Applications

    Ontologies are attracting growing interest in Data Fusion applications, where they serve as a semantic tool for describing and reasoning about sensor data, objects, relations and general domain theories. In addition, uncertainty is perhaps one of the most important characteristics of the data and information handled by Data Fusion. However, by their very nature, ontologies describe only asserted, veracious facts about the world. Several probabilistic, fuzzy and evidential approaches already exist to fill this gap, and this paper recaps the most popular of them; none, however, meets our purposes exactly. We therefore constructed a Dempster-Shafer ontology that can be imported into any specific domain ontology and that enables us to instantiate it in an uncertain manner. We also developed a Java application that enables reasoning about these uncertain ontological instances. Comment: Workshop on Theory of Belief Functions, Brest, France (2010).
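
    The abstract describes the ontology and its Java reasoner only at a high level. For context, the central operation of Dempster-Shafer theory that such a reasoner relies on, Dempster's rule of combination, can be sketched as follows; this is a generic Python illustration rather than the paper's implementation, and the frame of discernment and mass values are invented for the example.

```python
from itertools import product

def combine_dempster(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc                  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: the sources are incompatible")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two sensors reporting on a hypothetical frame {car, truck}
m_sensor1 = {frozenset({"car"}): 0.6, frozenset({"car", "truck"}): 0.4}
m_sensor2 = {frozenset({"truck"}): 0.3, frozenset({"car", "truck"}): 0.7}
print(combine_dempster(m_sensor1, m_sensor2))
```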

    Linguistic probability theory

    In recent years, probabilistic knowledge-based systems such as Bayesian networks and influence diagrams have come to the fore as a means of representing and reasoning about complex real-world situations. Although some of the probabilities used in these models may be obtained statistically, where this is impossible or simply inconvenient modellers rely on expert knowledge. Experts, however, typically find it difficult to specify exact probabilities, and conventional representations cannot reflect any uncertainty they may have. In this way, the use of conventional point probabilities can damage the accuracy, robustness and interpretability of acquired models. With these concerns in mind, psychometric researchers have demonstrated that fuzzy numbers are good candidates for representing the inherent vagueness of probability estimates, and the fuzzy community has responded with two distinct theories of fuzzy probabilities. This thesis, however, identifies formal and presentational problems with these theories which render them unable to represent even very simple scenarios. This analysis leads to the development of a novel and intuitively appealing alternative: a theory of linguistic probabilities patterned after the standard Kolmogorov axioms of probability theory. Since fuzzy numbers lack algebraic inverses, the resulting theory is weaker than, but generalises, its classical counterpart. Nevertheless, it is demonstrated that analogues of classical probabilistic concepts such as conditional probability and random variables can be constructed. In the classical theory, representation theorems mean that the distinction between mass/density distributions and probability measures can usually be ignored; similar results are proven for linguistic probabilities. From these results it is shown that directed acyclic graphs annotated with linguistic probabilities (under certain identified conditions) represent systems of linguistic random variables. It is then demonstrated that these linguistic Bayesian networks can utilise adapted best-of-breed Bayesian network algorithms (junction-tree-based inference and Bayes' ball irrelevancy calculation). These algorithms are implemented in ARBOR, an interactive design, editing and querying tool for linguistic Bayesian networks. To explore the applications of these techniques, a realistic example drawn from the domain of forensic statistics is developed. In this domain the knowledge engineering problems cited above are especially pronounced and expert estimates are commonplace; moreover, robust conclusions are of unusually critical importance. An analysis of the resulting linguistic Bayesian network for assessing evidential support in glass-transfer scenarios highlights the potential utility of the approach.
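
    The thesis's central observation that fuzzy numbers lack algebraic inverses, and hence that linguistic probability theory must be weaker than its classical counterpart, can be illustrated with triangular fuzzy numbers; the class below and the example value "about 0.3" are illustrative and not taken from the thesis.

```python
from dataclasses import dataclass

@dataclass
class TFN:
    """Triangular fuzzy number (left support, peak, right support)."""
    l: float
    m: float
    r: float

    def __add__(self, other):
        return TFN(self.l + other.l, self.m + other.m, self.r + other.r)

    def __neg__(self):
        return TFN(-self.r, -self.m, -self.l)

# "About 0.3": a vague expert probability estimate (values are illustrative)
p = TFN(0.2, 0.3, 0.4)
print(p + (-p))   # a fuzzy number spread around 0, not the crisp number 0,
                  # so fuzzy addition has no true inverse
```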

    Construction of optimal prediction intervals for load forecasting problems

    Short-term load forecasting is fundamental to the reliable and efficient operation of power systems. Despite its importance, accurate prediction of loads remains difficult, and uncertainties often significantly degrade the performance of load forecasting models. Moreover, no index is available to indicate the reliability of predicted values. The objective of this study is to construct prediction intervals for future loads instead of forecasting their exact values. The delta technique is applied to construct prediction intervals for the outcomes of neural network models. Statistical measures are developed for the quantitative and comprehensive evaluation of prediction intervals, and based on these measures a new cost function is designed for shortening the length of prediction intervals without compromising their coverage probability. Simulated annealing is used to minimise this cost function and adjust the neural network parameters. The results clearly show that the proposed method for constructing prediction intervals outperforms the traditional delta technique and yields prediction intervals that are more reliable and useful in practice than exact point predictions.
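
    The abstract does not reproduce the paper's cost function; the sketch below shows one common way to trade off interval width against coverage probability in the same spirit, with the target coverage, the penalty weight eta and the toy data all chosen arbitrarily for illustration.

```python
import numpy as np

def interval_cost(y_true, lower, upper, target_coverage=0.9, eta=50.0):
    """Penalise wide intervals and, heavily, under-coverage (illustrative form)."""
    y_true, lower, upper = map(np.asarray, (y_true, lower, upper))
    covered = (y_true >= lower) & (y_true <= upper)
    picp = covered.mean()                                            # coverage probability
    pinaw = (upper - lower).mean() / (y_true.max() - y_true.min())   # normalised mean width
    penalty = np.exp(-eta * (picp - target_coverage)) if picp < target_coverage else 0.0
    return pinaw * (1.0 + penalty)

# Toy data: intervals of half-width 2 around noisy targets
rng = np.random.default_rng(0)
y = rng.normal(size=200)
print(interval_cost(y, y - 2.0, y + 2.0))
```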

    Closed-loop feedback computation model of dynamical reputation based on the local trust evaluation in business-to-consumer e-commerce

    Trust and reputation are important factors that influence the success of both traditional transactions in physical social networks and modern e-commerce in virtual Internet environments. Trust is difficult to define and quantify because it has both subjective and objective characteristics at the same time. A well-reported issue with reputation management systems in business-to-consumer (BtoC) e-commerce is the “all good reputation” problem. To address this problem, a new computational model of reputation is proposed in this paper. The ratings given by each customer are treated as basic trust-score events, and the time series of ratings is aggregated into the sellers’ local temporal trust scores using the Beta distribution. A logical model of trust and reputation is established based on an analysis of the dynamic relationship between trust and reputation, and for a single good with repeat transactions an iterative mathematical model of trust and reputation is established with a closed-loop feedback mechanism. Numerical experiments on repeated transactions recorded over a period of 24 months are performed. The results show that the proposed method provides guidance both for theoretical research into trust and reputation and for the practical design of reputation systems in BtoC e-commerce.
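
    The exact aggregation formula is not given in the abstract; a standard Beta-distribution trust estimate of the kind the paper builds on takes the counts of positive and negative ratings and returns the posterior mean under a uniform prior, as in the minimal sketch below (the rating counts are invented).

```python
def beta_trust(positive, negative):
    """Expected trust under a Beta(positive + 1, negative + 1) posterior (uniform prior)."""
    return (positive + 1) / (positive + negative + 2)

# A seller with 40 positive and 3 negative ratings (illustrative numbers)
print(round(beta_trust(40, 3), 3))
```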

    Fuzzy-Bayesian-network-based Safety Risk Analysis in Railway Passenger Transport

    This study presents a fuzzy Bayesian network (FBN) method to analyze how different risk control strategies influence the safety risk of railway passenger transport. Based on the fuzzy probabilities of basic events determined by an expert group decision method, the proposed FBN method can reasonably predict the probability of railway passenger safety risk. It is also shown that controlling risk within the safety management of railway passenger transport is the most effective way to reduce the probability of safety risk in railway passenger transport.
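
    The abstract does not spell out how expert judgements become fuzzy probabilities; a common recipe consistent with the description is to collect triangular fuzzy estimates from the expert group, average them, and defuzzify by the centroid, as in this illustrative sketch (the expert values are invented).

```python
def aggregate_expert_estimates(estimates):
    """Average triangular fuzzy estimates (l, m, r) and defuzzify by the centroid."""
    n = len(estimates)
    l = sum(e[0] for e in estimates) / n
    m = sum(e[1] for e in estimates) / n
    r = sum(e[2] for e in estimates) / n
    return (l + m + r) / 3.0   # crisp probability assigned to the basic event

# Three experts judge the probability of a basic event (illustrative values)
experts = [(0.01, 0.03, 0.05), (0.02, 0.04, 0.06), (0.01, 0.02, 0.04)]
print(round(aggregate_expert_estimates(experts), 4))
```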

    Platonic model of mind as an approximation to neurodynamics

    A hierarchy of the approximations involved in simplifying microscopic theories, from the sub-cellular to the whole-brain level, is presented. A new approximation to neural dynamics is described, leading to a Platonic-like model of mind based on psychological spaces. Objects and events in these spaces correspond to quasi-stable states of brain dynamics and may be interpreted from a psychological point of view. The Platonic model bridges the gap between the neurosciences and the psychological sciences. Static and dynamic versions of the model are outlined, and Feature Space Mapping, a neurofuzzy realization of the static version of the Platonic model, is described. Categorization experiments with human subjects are analyzed from the neurodynamical and Platonic-model points of view.

    Bayesian Inference in Processing Experimental Data: Principles and Basic Applications

    This report introduces the general ideas and some basic methods of Bayesian probability theory applied to physics measurements. Our aim is to make the reader familiar, through examples rather than rigorous formalism, with concepts such as: model comparison (including the automatic Ockham's Razor filter provided by the Bayesian approach); parametric inference; quantification of the uncertainty about the value of physical quantities, also taking into account systematic effects; the role of marginalization; posterior characterization; predictive distributions; hierarchical modelling and hyperparameters; Gaussian approximation of the posterior and recovery of conventional methods, especially maximum-likelihood and chi-square fits under well-defined conditions; conjugate priors, transformation invariance and maximum-entropy-motivated priors; and Monte Carlo estimates of expectation, including a short introduction to Markov Chain Monte Carlo methods. Comment: 40 pages, 2 figures; invited paper for Reports on Progress in Physics.
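
    As a minimal taste of the conjugate-prior machinery the report covers, the sketch below updates a Gaussian prior for a physical quantity with a single measurement of known uncertainty; the numbers are illustrative and the example is generic, not taken from the report.

```python
def gaussian_posterior(prior_mean, prior_sd, data_mean, data_sd):
    """Posterior of a Gaussian mean given one measurement with known noise and a conjugate Gaussian prior."""
    prior_prec = 1.0 / prior_sd**2
    data_prec = 1.0 / data_sd**2
    post_prec = prior_prec + data_prec                     # precisions add
    post_mean = (prior_mean * prior_prec + data_mean * data_prec) / post_prec
    return post_mean, post_prec**-0.5                      # posterior mean and standard deviation

# Prior belief about a physical quantity vs. one measurement (numbers are illustrative)
print(gaussian_posterior(prior_mean=10.0, prior_sd=2.0, data_mean=12.0, data_sd=1.0))
```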