
    An Algebraic Theory for Data Linkage

    There are countless sources of data available to governments, companies, and citizens, which can be combined for good or evil. We analyse the concepts of combining data from common sources and linking data from different sources. We model the data and its information content to be found in a single source by an ordered partial monoid, and the transfer of information between sources by different types of morphisms. To capture the linkage between a family of sources, we use a form of Grothendieck construction to create an ordered partial monoid that brings together the global data of the family in a single structure. We apply our approach to database theory and axiomatic structures in approximate reasoning. Thus, ordered partial monoids provide a foundation for the algebraic study of information gathering in its most primitive form.
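    The abstract leaves the algebra abstract, so the following is only a minimal Python sketch of an ordered partial monoid of data sources: facts ordered by inclusion and combined by union, with combination undefined when sources conflict. The Source, leq, and combine names and the conflict test are illustrative assumptions, not definitions from the paper.

        from dataclasses import dataclass

        # Sketch of an ordered partial monoid of data sources (illustrative):
        # the order is inclusion of fact sets, combination is union, and the
        # operation is partial: it is undefined when two sources conflict.
        @dataclass(frozen=True)
        class Source:
            facts: frozenset  # e.g. {("alice", "age", 34), ...}

            def leq(self, other: "Source") -> bool:
                """Information order: self is no more informative than other."""
                return self.facts <= other.facts

        def conflicting(a: Source, b: Source) -> bool:
            """Toy conflict test: disagreement on the same (entity, attribute)."""
            index = {(e, k): v for (e, k, v) in a.facts}
            return any((e, k) in index and index[(e, k)] != v
                       for (e, k, v) in b.facts)

        def combine(a: Source, b: Source):
            """Partial monoid operation: union of facts, undefined on conflict."""
            if conflicting(a, b):
                return None  # combination is only partially defined
            return Source(a.facts | b.facts)

        census = Source(frozenset({("alice", "age", 34)}))
        registry = Source(frozenset({("alice", "city", "Bern")}))
        linked = combine(census, registry)
        assert linked is not None and census.leq(linked) and registry.leq(linked)

    Linking a whole family of sources, as in the paper's Grothendieck-style construction, would then amount to repeatedly combining such sources along shared identifiers wherever the partial operation is defined.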

    A Probabilistic Framework for Security Scenarios with Dependent Actions

    This work addresses the growing need for meaningful probabilistic analysis of security. We propose a framework that integrates the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. This allows us to perform probabilistic evaluation of attack–defense scenarios involving dependent actions. To improve the efficiency of our computations, we make use of inference algorithms from Bayesian networks and encoding techniques from constraint reasoning. We discuss the algebraic theory underlying our framework and point out several generalizations which are possible thanks to the use of semiring theory.
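    As a point of reference for what such an evaluation computes, here is a toy bottom-up evaluation of an attack tree in Python, under the strong assumption that basic actions are independent and each occurs only once; handling dependent actions via Bayesian networks, the paper's actual contribution, is precisely what this naive fold cannot do. The node structure and example probabilities are illustrative.

        # Toy bottom-up evaluation of an attack tree. Assumes independent
        # leaf actions, each occurring once; this is the baseline the paper
        # generalizes, not the paper's Bayesian-network method.
        def evaluate(node) -> float:
            """Return the success probability of a (kind, payload) node."""
            kind, payload = node
            if kind == "leaf":
                return payload                        # basic action probability
            probs = [evaluate(child) for child in payload]
            if kind == "and":                         # all sub-goals must succeed
                p = 1.0
                for q in probs:
                    p *= q
                return p
            if kind == "or":                          # at least one succeeds
                p = 1.0
                for q in probs:
                    p *= (1.0 - q)
                return 1.0 - p
            raise ValueError(f"unknown node kind: {kind}")

        steal_data = ("or", [
            ("and", [("leaf", 0.6), ("leaf", 0.5)]),  # phish, then escalate
            ("leaf", 0.1),                            # physical break-in
        ])
        print(evaluate(steal_data))                   # 1 - 0.7 * 0.9 = 0.37

    Once leaves may repeat or depend on one another, the product rules above are wrong, which is why the framework reaches for Bayesian-network inference and, more generally, semiring-valued computations.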

    A frequentist framework of inductive reasoning

    Reacting against the limitation of statistics to decision procedures, R. A. Fisher proposed for inductive reasoning the use of the fiducial distribution, a parameter-space distribution of epistemological probability transferred directly from limiting relative frequencies rather than computed according to the Bayes update rule. The proposal is developed as follows using the confidence measure of a scalar parameter of interest. (With the restriction to one-dimensional parameter space, a confidence measure is essentially a fiducial probability distribution free of complications involving ancillary statistics.) A betting game establishes a sense in which confidence measures are the only reliable inferential probability distributions. The equality between the probabilities encoded in a confidence measure and the coverage rates of the corresponding confidence intervals ensures that the measure's rule for assigning confidence levels to hypotheses is uniquely minimax in the game. Although a confidence measure can be computed without any prior distribution, previous knowledge can be incorporated into confidence-based reasoning. To adjust a p-value or confidence interval for prior information, the confidence measure from the observed data can be combined with one or more independent confidence measures representing previous agent opinion. (The former confidence measure may correspond to a posterior distribution with frequentist matching of coverage probabilities.) The representation of subjective knowledge in terms of confidence measures rather than prior probability distributions preserves approximate frequentist validity.
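    To make the notion of a confidence measure concrete, here is a minimal Python sketch for the simplest case the abstract describes, a scalar parameter: the confidence distribution of a normal mean with known sigma, built from the standard pivot. The function names are illustrative, and the paper's betting game and combination machinery are not reproduced.

        # Sketch of a one-dimensional confidence measure: the confidence
        # distribution of a normal mean with known sigma, from the pivot
        # (xbar - theta) * sqrt(n) / sigma ~ N(0, 1). Illustrative only.
        from statistics import NormalDist

        std_normal = NormalDist()

        def confidence_cdf(theta: float, xbar: float, sigma: float, n: int) -> float:
            """C(theta): mass the confidence measure assigns to mu <= theta."""
            return std_normal.cdf((theta - xbar) * n ** 0.5 / sigma)

        def confidence_interval(xbar, sigma, n, level=0.95):
            """Central interval whose confidence-measure mass equals its coverage."""
            z = std_normal.inv_cdf((1 + level) / 2)
            half = z * sigma / n ** 0.5
            return xbar - half, xbar + half

        lo, hi = confidence_interval(xbar=10.2, sigma=2.0, n=25, level=0.95)
        mass = confidence_cdf(hi, 10.2, 2.0, 25) - confidence_cdf(lo, 10.2, 2.0, 25)
        print(lo, hi, mass)   # the measure assigns exactly 0.95 to the 95% interval

    The printed mass equalling the nominal coverage is the matching property the abstract highlights: probabilities encoded in the measure coincide with frequentist coverage rates.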

    On the Notion of Similarity in Case-Based Reasoning

    The semantics of similarity measures is studied and reduced to the evidence theory of Dempster and Shafer. Applications are given for classification and configuration; the latter additionally uses utility theory.
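    Since the reduction runs through Dempster-Shafer theory, the following Python sketch shows the one operation everything hinges on, Dempster's rule of combination, on a small frame of candidate cases. How a given similarity measure is translated into mass functions is exactly what the paper supplies, so the masses below are illustrative assumptions.

        # Sketch of Dempster's rule of combination over frozenset focal
        # elements; the example masses are illustrative, not from the paper.
        from itertools import product

        def dempster_combine(m1: dict, m2: dict) -> dict:
            """Combine two mass functions, renormalizing away conflict."""
            combined, conflict = {}, 0.0
            for (a, p), (b, q) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + p * q
                else:
                    conflict += p * q          # mass falling on the empty set
            if conflict >= 1.0:
                raise ValueError("totally conflicting evidence")
            return {s: v / (1.0 - conflict) for s, v in combined.items()}

        CASES = frozenset({"c1", "c2", "c3"})
        # Evidence from two similarity-based comparisons (illustrative masses):
        m_shape = {frozenset({"c1", "c2"}): 0.7, CASES: 0.3}
        m_color = {frozenset({"c2", "c3"}): 0.6, CASES: 0.4}
        print(dempster_combine(m_shape, m_color))   # most mass lands on {"c2"}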

    Special Topics


    Contents

    Resolution is an often-used method for deduction in propositional logic. Here a proper organisation of deduction is proposed that avoids redundant computations. It is based on the generic framework of decompositions and local computations introduced by Shafer and Shenoy (1990). The system contains the two basic operations on information, namely marginalization (or projection) and combination, the latter being an idempotent operation in the present case. The theory permits the conception of an architecture for distributed computing. As an important application, assumption-based…
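    Here is a minimal Python sketch of the two operations named above, on sets of propositional clauses: combination as set union (which is idempotent, as the text notes) and marginalization as elimination of a variable by resolution. The clause encoding and function names are illustrative, not the paper's.

        # Valuation-algebra operations on clause sets (illustrative encoding):
        # a clause is a frozenset of signed literals like ("x", True).
        def combine(phi: frozenset, psi: frozenset) -> frozenset:
            """Combination of clause sets; idempotent: combine(phi, phi) == phi."""
            return phi | psi

        def marginalize(phi: frozenset, var: str) -> frozenset:
            """Project var out of phi by resolving every pos/neg clause pair."""
            pos = {c for c in phi if (var, True) in c}
            neg = {c for c in phi if (var, False) in c}
            rest = phi - pos - neg
            resolvents = set()
            for c in pos:
                for d in neg:
                    r = (c - {(var, True)}) | (d - {(var, False)})
                    if not any((v, not s) in r for (v, s) in r):  # drop tautologies
                        resolvents.add(frozenset(r))
            return frozenset(rest | resolvents)

        # {a -> b, b -> c, a}: eliminating b should yield a -> c (plus a).
        kb = frozenset({
            frozenset({("a", False), ("b", True)}),
            frozenset({("b", False), ("c", True)}),
            frozenset({("a", True)}),
        })
        assert combine(kb, kb) == kb                  # idempotence
        print(marginalize(kb, "b"))                   # contains ¬a ∨ c, and a

    Organising these eliminations along a decomposition of the variable set is what lets the Shafer-Shenoy architecture avoid the redundant resolutions of naive clause saturation.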

    An algebraic study of argumentation systems and evidence theory

    Argumentation systems make it possible to find arguments in favour of and against hypotheses. These hypotheses can be accepted as true or must be refuted as false according to whether the arguments supporting or refuting them are considered valid. The likelihood of arguments can possibly be measured by probabilities. Argumentation systems then permit defining numerical degrees of support of hypotheses as the probability that the arguments supporting them are true. Similarly, numerical degrees of plausibility of hypotheses can be defined as the probability that the arguments refuting them do not hold. These probabilistic argumentation systems then lead to a Dempster-Shafer theory of evidence. In this paper, an algebraic theory of argumentation systems is first developed, based on general logical consequence relations and the notion of an allocation of support. In particular, a computational theory for argumentation systems using local computations on hypertrees is studied, building on Shafer's paper "An axiomatic study of computations in hypertrees". This is then extended to probabilistic argumentation systems.
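    The two numerical degrees the abstract defines can be illustrated with a brute-force Python sketch: assumptions carry independent probabilities, and a caller-supplied test says whether a scenario (truth assignment to the assumptions) supports or refutes the hypothesis. Everything below, including the example structure, is an illustrative assumption, not the paper's formalism.

        # Degrees of support (dsp) and plausibility (dpl) by enumerating
        # scenarios over independent probabilistic assumptions (illustrative).
        from itertools import product

        def degrees(assumptions: dict, supports, refutes):
            """Return (dsp, dpl): mass of supporting scenarios, and one
            minus the mass of refuting scenarios."""
            names = list(assumptions)
            dsp = refuting_mass = 0.0
            for values in product([True, False], repeat=len(names)):
                scenario = dict(zip(names, values))
                p = 1.0
                for name, value in scenario.items():
                    p *= assumptions[name] if value else 1.0 - assumptions[name]
                if supports(scenario):
                    dsp += p
                if refutes(scenario):
                    refuting_mass += p
            return dsp, 1.0 - refuting_mass

        # Hypothesis h: supported when a1 and a2 both hold, refuted when a3
        # holds while a1 fails (illustrative structure, not from the paper).
        probs = {"a1": 0.9, "a2": 0.7, "a3": 0.2}
        dsp, dpl = degrees(probs,
                           supports=lambda s: s["a1"] and s["a2"],
                           refutes=lambda s: s["a3"] and not s["a1"])
        print(dsp, dpl)    # dsp = 0.63, dpl = 1 - 0.02 = 0.98

    Read as belief and plausibility of h, the pair (dsp, dpl) is exactly the Dempster-Shafer interval the abstract says probabilistic argumentation systems give rise to; the hypertree machinery replaces this exponential enumeration with local computations.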