Updating Non-Additive Probabilities -- A Geometric Approach
A geometric approach, analogous to the approach used in the additive case, is proposed to determine the conditional expectation with non-additive probabilities. The conditional expectation is then applied to (i) updating the probability when new information becomes available; and (ii) defining the notion of independence of non-additive probabilities and of Nash equilibrium.
Keywords: updating, non-additive probabilities, conditional expectation
Evidence functions: a compositional approach to information
The discrete case of Bayes' formula is considered the paradigm of information acquisition. Prior and posterior probability functions, as well as likelihood functions, called evidence functions, are compositions following the Aitchison geometry of the simplex, and thus have vector character. Bayes' formula becomes a vector addition. The Aitchison norm of an evidence function is introduced as a scalar measurement of information. A fictitious fire scenario serves as illustration. Two different inspections of affected houses are considered. Two questions are addressed: (a) what information is provided by the outcomes of the inspections, and (b) which is the most informative inspection.
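The compositional reading of Bayes' formula described above can be sketched numerically: perturbation in the simplex is componentwise multiplication followed by closure, and the Aitchison norm is the Euclidean norm of the centred log-ratio (clr) transform. A minimal sketch, with function names and the three-hypothesis example invented for illustration:

```python
import numpy as np

def closure(x):
    """Normalize a positive vector onto the unit simplex."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(p, q):
    """Aitchison perturbation: componentwise product, then closure.
    This is exactly the discrete Bayes update: posterior = prior (+) evidence."""
    return closure(np.asarray(p) * np.asarray(q))

def aitchison_norm(p):
    """Euclidean norm of the clr transform -- the paper's scalar information measure."""
    logp = np.log(np.asarray(p, dtype=float))
    clr = logp - logp.mean()
    return float(np.linalg.norm(clr))

prior = closure([1, 1, 1])            # uniform prior over three hypotheses
evidence = closure([0.7, 0.2, 0.1])   # likelihood (evidence function), closed to the simplex
posterior = perturb(prior, evidence)  # Bayes' formula as vector addition
```

Note that the uniform prior is the neutral element of the perturbation (its Aitchison norm is zero), so here the posterior coincides with the evidence function, which carries all the information.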
Markov Chain Monte Carlo Based on Deterministic Transformations
In this article we propose a novel MCMC method based on deterministic
transformations T: X × D → X, where X is the state-space and D is some set
which may or may not be a subset of X. We refer to our new methodology as
Transformation-based Markov chain Monte Carlo (TMCMC). One of the remarkable
advantages of our proposal is that even if the underlying target distribution
is very high-dimensional, deterministic transformation of a one-dimensional
random variable is sufficient to generate an appropriate Markov chain that is
guaranteed to converge to the high-dimensional target distribution. Apart from
clearly leading to massive computational savings, this idea of
deterministically transforming a single random variable very generally leads to
excellent acceptance rates, even though all the random variables associated
with the high-dimensional target distribution are updated in a single block.
Since it is well-known that joint updating of many random variables using
Metropolis-Hastings (MH) algorithm generally leads to poor acceptance rates,
TMCMC, in this regard, seems to provide a significant advance. We validate our
proposal theoretically, establishing the convergence properties. Furthermore,
we show that TMCMC can be very effectively adopted for simulating from doubly
intractable distributions.
TMCMC is compared with MH using the well-known Challenger data, demonstrating
the effectiveness of the former in the case of highly correlated variables.
Moreover, we apply our methodology to a challenging posterior simulation
problem associated with the geostatistical model of Diggle et al. (1998),
updating 160 unknown parameters jointly using a deterministic transformation
of a one-dimensional random variable. The results show remarkable computational
savings along with good convergence properties and acceptance rates.
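The additive variant of the idea above can be sketched as follows: a single one-dimensional draw ε is attached random signs per coordinate, so the whole block is updated at once, and for an additive transformation the Jacobian is 1, leaving a plain π(x')/π(x) acceptance ratio. This is a minimal sketch, not the paper's implementation; the target (a standard multivariate normal), the scale, and all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Log-density of a standard d-dimensional normal (illustrative target)."""
    return -0.5 * np.dot(x, x)

def additive_tmcmc(d=10, n_iter=5000, scale=0.5):
    """Additive-transformation TMCMC sketch: all d coordinates are updated in
    one block using a single one-dimensional epsilon; the acceptance ratio
    needs no Jacobian term for the additive move."""
    x = np.zeros(d)
    accepted = 0
    chain = np.empty((n_iter, d))
    for t in range(n_iter):
        eps = abs(rng.normal(0.0, scale))       # one 1-d random variable
        b = rng.choice([-1.0, 1.0], size=d)     # random signs, one per coordinate
        proposal = x + b * eps
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
            accepted += 1
        chain[t] = x
    return chain, accepted / n_iter

chain, rate = additive_tmcmc()
```

Even though all ten coordinates move together, only one uniform and one normal draw (plus the signs) are consumed per iteration, which is the source of the computational savings the abstract describes.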
A method of classification for multisource data in remote sensing based on interval-valued probabilities
An axiomatic approach to interval-valued (IV) probabilities is presented, in which an IV probability is defined by a pair of set-theoretic functions satisfying some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated, calling for more intelligent decision-making strategies. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally large data into smaller and more manageable pieces based on global statistical correlation information. By this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
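The combination-of-evidence step can be illustrated with a standard Dempster–Shafer-style rule, which produces exactly the kind of [belief, plausibility] interval the IV framework works with. This is a generic sketch, not the paper's axiomatic construction; the frame of discernment and the sensor masses are invented for illustration:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements
    are frozensets; conflicting mass is discarded and renormalized."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    norm = 1.0 - conflict
    return {s: v / norm for s, v in combined.items()}

def belief(m, hypothesis):
    """Lower bound: total mass committed to subsets of the hypothesis."""
    return sum(v for s, v in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    """Upper bound: total mass not contradicting the hypothesis."""
    return sum(v for s, v in m.items() if s & hypothesis)

theta = frozenset({"forest", "water", "urban"})
# hypothetical masses from two sources (MSS-like and SAR-like); numbers invented
m_mss = {frozenset({"forest"}): 0.6, theta: 0.4}
m_sar = {frozenset({"forest", "water"}): 0.7, theta: 0.3}
m = combine(m_mss, m_sar)
h = frozenset({"forest"})
interval = (belief(m, h), plausibility(m, h))  # [Bel, Pl] brackets the IV probability
```

A decision rule over IV probabilities must then compare such intervals rather than point values, which is what makes the decision process "rather complicated" in the abstract's terms.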
Dynamically consistent Choquet random walk and real investments
In the real investments literature, the investigated cash flow is assumed to follow some known stochastic process (e.g. Brownian motion), and the criterion for deciding between investments is the discounted utility of their cash flows. However, for most new investments the investor may be ambiguous about the representation of uncertainty. In order to take such ambiguity into account, we rely on a discounted Choquet expected utility in our model. In such a setting some problems must be dealt with, notably dynamic consistency; here it is obtained in a recursive model through a weakened version of the axiom. Mimicking the Brownian motion as the limit of a random walk for the investment payoff process, we describe the latter as a binomial tree with capacities instead of exact probabilities on its branches and characterize its properties in the limit. We show that most results in the real investments literature remain tractable in this enlarged setting, while leaving more room for ambiguity, as both the mean and the variance of the underlying stochastic process are modified in our ambiguous model.
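The recursive evaluation on such a tree can be sketched with a symmetric capacity ν(up) = ν(down) = c: at each node the two-outcome Choquet integral weights the better continuation value by c, and backward induction replaces the usual risk-neutral expectation. A minimal sketch, with c, the terminal payoffs, and the discount factor all invented for illustration (c = 1/2 recovers the standard fifty-fifty expectation):

```python
def choquet_pair(x_up, x_down, c):
    """One-step Choquet expectation for two outcomes under a symmetric
    capacity nu(up) = nu(down) = c: the better outcome gets weight c."""
    hi, lo = max(x_up, x_down), min(x_up, x_down)
    return c * hi + (1.0 - c) * lo

def choquet_tree_value(payoffs, c, discount=1.0):
    """Backward induction on a recombining binomial tree; payoffs lists
    the terminal values, top node first."""
    level = list(payoffs)
    while len(level) > 1:
        level = [discount * choquet_pair(level[i], level[i + 1], c)
                 for i in range(len(level) - 1)]
    return level[0]

v_ambiguous = choquet_tree_value([4.0, 2.0, 1.0], c=0.3)   # c < 1/2: ambiguity aversion
v_neutral = choquet_tree_value([4.0, 2.0, 1.0], c=0.5)     # additive benchmark
```

With c below one half the good outcome is underweighted at every node, so the ambiguous value sits below the additive benchmark, mirroring the abstract's point that both the mean and the variance of the limiting process are modified.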
Comparison of experts in the non-additive case
We adapt the model of comparison of experts initiated by Lehrer («Comparison of experts», JME 98) to a context of uncertainty which cannot be modelled by expected utility. We examine the robustness of Lehrer's results in this new context. Unlike the expected utility case, there exist several ways to define the strategies used to compare the experts; we propose some of them which guarantee a positive value of information.
Keywords: non-additive preferences, experts
Lüders' and quantum Jeffrey's rules as entropic projections
We prove that the standard quantum mechanical description of a quantum state
change due to measurement, given by Lüders' rules, is a special case of the
constrained maximisation of a quantum relative entropy functional. This result
is a quantum analogue of the derivation of the Bayes--Laplace rule as a special
case of the constrained maximisation of relative entropy. The proof is provided
for the Umegaki relative entropy of density operators over a Hilbert space as
well as for the Araki relative entropy of normal states over a W*-algebra. We
also introduce a quantum analogue of Jeffrey's rule, derive it in the same way
as above, and discuss the meaning of these results for quantum Bayesianism.
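The state changes the abstract refers to can be stated concretely for density matrices: the non-selective Lüders update is ρ' = Σ_i P_i ρ P_i over a projective measurement {P_i}, and the selective rule conditions on one outcome and renormalizes by tr(P ρ P). A minimal qubit sketch, with the measurement basis and the initial |+⟩ state chosen purely for illustration:

```python
import numpy as np

def lueders_update(rho, projectors):
    """Non-selective Lueders rule: rho' = sum_i P_i rho P_i (trace-preserving)."""
    return sum(P @ rho @ P for P in projectors)

def selective_update(rho, P):
    """Selective Lueders rule: condition on outcome P, renormalize by tr(P rho P)."""
    num = P @ rho @ P
    return num / np.trace(num).real

# computational-basis measurement on a qubit (illustrative)
P0 = np.array([[1, 0], [0, 0]], dtype=complex)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())           # pure |+><+| state
rho_post = lueders_update(rho, [P0, P1])    # off-diagonal terms are destroyed
```

The non-selective map kills the off-diagonal coherences while preserving the trace, which is the measurement-induced state change that the paper recovers as a constrained maximization of a quantum relative entropy.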