General combination rules for qualitative and quantitative beliefs
Martin and Osswald \cite{Martin07} have recently proposed several
generalizations of combination rules for quantitative beliefs in order to
manage conflict and to account for the specificity of the experts' responses.
Since experts usually express themselves in natural language with linguistic
labels, Smarandache and Dezert \cite{Li07} have introduced a mathematical
framework for dealing directly with qualitative beliefs as well. In this paper
we recall some elements of our previous works and propose new combination
rules developed for the fusion of both qualitative and quantitative beliefs.
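One common way to make linguistic labels computable is to index an ordered label set and define saturated arithmetic on the indices. The sketch below is illustrative only: the label names, the set size, and the saturated-addition operator are stand-ins, not the exact operators of the cited framework.

```python
# Ordered linguistic labels L_0 < L_1 < ... < L_m (hypothetical label set).
LABELS = ["impossible", "unlikely", "uncertain", "likely", "certain"]

def q_add(i, j):
    """Saturated addition on label indices: L_i + L_j = L_min(i+j, m)."""
    return min(i + j, len(LABELS) - 1)

# "unlikely" (L_1) combined with "uncertain" (L_2) yields "likely" (L_3);
# sums past the top label saturate at "certain".
assert LABELS[q_add(1, 2)] == "likely"
assert LABELS[q_add(3, 3)] == "certain"
```

Saturation keeps every combined belief inside the original label set, mirroring how quantitative masses stay in [0, 1].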
Ignorance and indifference
The epistemic state of complete ignorance is not a probability distribution. In it, we assign the same, unique, ignorance degree of belief to any contingent outcome and each of its contingent, disjunctive parts. That this is the appropriate way to represent complete ignorance is established by two instruments, each individually strong enough to identify this state. They are the principle of indifference (PI) and the notion that ignorance is invariant under certain redescriptions of the outcome space, here developed into the 'principle of invariance of ignorance' (PII). Both instruments are so innocuous as almost to be platitudes. Yet the literature in probabilistic epistemology has misdiagnosed them as paradoxical or defective since they generate inconsistencies when conjoined with the assumption that an epistemic state must be a probability distribution. To underscore the need to drop this assumption, I express PII in its most defensible form as relating symmetric descriptions and show that paradoxes still arise if we assume the ignorance state to be a probability distribution. Copyright 2008 by the Philosophy of Science Association. All rights reserved
A Simple Proportional Conflict Redistribution Rule
We propose a first alternative combination rule to the WAO (Weighted Average
Operator) recently proposed by Josang, Daniel and Vannoorenberghe, called the
Proportional Conflict Redistribution rule (denoted PCR1). PCR1 and WAO are
particular cases of the WO (Weighted Operator), because the conflicting mass
is redistributed with respect to some weighting factors. In this first PCR
rule, the proportionalization is done for each non-empty set with respect to
the non-zero sum of its corresponding mass matrix column - instead of its mass
column average as in WAO - but the results are the same, as Ph. Smets has
pointed out. We also extend WAO (which gives no solution here) to the
degenerate case where all column sums of all non-empty sets are zero; the
conflicting mass is then transferred to the non-empty disjunctive form of all
non-empty sets taken together. If this disjunctive form happens to be empty,
one considers an open world (i.e. the frame of discernment might contain new
hypotheses) and transfers all the conflicting mass to the empty set. In
addition to WAO, we propose a general formula for PCR1 (which coincides with
WAO in non-degenerate cases).
Comment: 21 pages
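The redistribution idea described above can be sketched numerically: combine two basic belief assignments conjunctively, then give each non-empty set a share of the total conflicting mass proportional to its mass column sum. The frame, the two expert assignments, and the helper name `pcr1` below are hypothetical examples, not taken from the paper.

```python
from itertools import product

# Hypothetical two-expert basic belief assignments on the frame {A, B}.
m1 = {frozenset({'A'}): 0.6, frozenset({'B'}): 0.3, frozenset({'A', 'B'}): 0.1}
m2 = {frozenset({'A'}): 0.2, frozenset({'B'}): 0.7, frozenset({'A', 'B'}): 0.1}

def pcr1(m1, m2):
    """Conjunctive combination, then redistribution of the total conflicting
    mass to each non-empty set, proportionally to its mass column sum
    c(X) = m1(X) + m2(X)."""
    combined = {}
    conflict = 0.0
    for (x, wx), (y, wy) in product(m1.items(), m2.items()):
        inter = x & y
        if inter:  # non-empty intersection: keep the product mass
            combined[inter] = combined.get(inter, 0.0) + wx * wy
        else:      # empty intersection: accumulate conflicting mass
            conflict += wx * wy
    sets = set(m1) | set(m2)
    col = {x: m1.get(x, 0.0) + m2.get(x, 0.0) for x in sets}
    d = sum(col.values())  # assumed non-zero (non-degenerate case)
    return {x: combined.get(x, 0.0) + conflict * col[x] / d for x in sets}

result = pcr1(m1, m2)
assert abs(sum(result.values()) - 1.0) < 1e-9  # masses still sum to 1
```

The degenerate case the abstract discusses (all column sums zero) is not handled in this sketch; there `d` would vanish and the transfer to the disjunctive form would apply instead.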
An introduction to DSmT
The management and combination of uncertain, imprecise, fuzzy and even
paradoxical or highly conflicting sources of information has always been, and
still remains today, of prime importance for the development of reliable
modern information systems involving artificial reasoning. In this
introduction, we present a survey of our recent theory of plausible and
paradoxical reasoning, known as Dezert-Smarandache Theory (DSmT), developed
for dealing with imprecise, uncertain and conflicting sources of information.
We focus our presentation on the foundations of DSmT and on its most important
rules of combination, rather than on surveying the specific applications of
DSmT available in the literature. Several simple examples are given throughout
the presentation to show the efficiency and the generality of this new
approach.
Nonparametric Bounds and Sensitivity Analysis of Treatment Effects
This paper considers conducting inference about the effect of a treatment (or
exposure) on an outcome of interest. In the ideal setting where treatment is
assigned randomly, under certain assumptions the treatment effect is
identifiable from the observable data and inference is straightforward.
However, in other settings such as observational studies or randomized trials
with noncompliance, the treatment effect is no longer identifiable without
relying on untestable assumptions. Nonetheless, the observable data often do
provide some information about the effect of treatment, that is, the parameter
of interest is partially identifiable. Two approaches are often employed in
this setting: (i) bounds are derived for the treatment effect under minimal
assumptions, or (ii) additional untestable assumptions are invoked that render
the treatment effect identifiable and then sensitivity analysis is conducted to
assess how inference about the treatment effect changes as the untestable
assumptions are varied. Approaches (i) and (ii) are considered in various
settings, including assessing principal strata effects, direct and indirect
effects and effects of time-varying exposures. Methods for drawing formal
inference about partially identified parameters are also discussed.
Comment: Published at http://dx.doi.org/10.1214/14-STS499 in Statistical
Science (http://www.imstat.org/sts/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
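As an illustration of approach (i), for a binary outcome the no-assumption ("worst-case") bounds on the average treatment effect can be computed directly from the observable joint distribution, by filling in each unobserved potential outcome with its extreme values 0 and 1. The sketch below assumes binary Y and T; the function name `manski_bounds` and the simulated data are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
t = rng.integers(0, 2, n)   # observed treatment indicator (not assumed random)
y = rng.integers(0, 2, n)   # observed binary outcome

def manski_bounds(y, t):
    """Worst-case bounds on ATE = E[Y(1)] - E[Y(0)] for binary Y,
    obtained by bounding each counterfactual mean between 0 and 1."""
    p_t1 = t.mean()
    p_t0 = 1.0 - p_t1
    p_y1_t1 = ((y == 1) & (t == 1)).mean()   # P(Y=1, T=1)
    p_y1_t0 = ((y == 1) & (t == 0)).mean()   # P(Y=1, T=0)
    # E[Y(1)] is bounded by setting unobserved Y(1) among controls to 0 or 1.
    ey1_lo, ey1_hi = p_y1_t1, p_y1_t1 + p_t0
    ey0_lo, ey0_hi = p_y1_t0, p_y1_t0 + p_t1
    return ey1_lo - ey0_hi, ey1_hi - ey0_lo

lo, hi = manski_bounds(y, t)
# For binary outcomes these no-assumption bounds always have width exactly 1,
# so they never identify the sign of the effect on their own.
assert abs((hi - lo) - 1.0) < 1e-9
```

Tightening these bounds is precisely what the additional untestable assumptions in approach (ii) buy, at the cost of requiring a sensitivity analysis.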