Approximate Discrete Probability Distribution Representation using a Multi-Resolution Binary Tree
Computing and storing probabilities is a hard problem as soon as one has to deal with complex distributions over multiple random variables. The efficient representation of probability distributions is a central problem, in terms of computational efficiency, in the field of probabilistic reasoning. The main difficulty arises when dealing with joint probability distributions over a set of random variables: they are usually represented as huge probability arrays. In this paper, a new method based on a binary-tree representation
is introduced in order to store very large joint distributions efficiently. Our approach approximates any multidimensional joint distribution using an adaptive discretization of the space. We make the assumption that the lower the probability mass of a particular region of feature space, the larger the discretization step. This assumption leads to a representation that is highly optimized in terms of time and memory. Other advantages of our approach are the ability to refine the distribution dynamically whenever needed, leading to a more accurate representation of the probability
distribution, and an anytime representation of the distribution.
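The adaptive-discretization idea described in this abstract can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: a node covering an interval of the discretized space stores its total probability mass and is split only when that mass exceeds a threshold, so low-probability regions keep a coarse discretization step.

```python
# Hypothetical sketch of the multi-resolution binary-tree idea: each node
# covers an interval of the discretized space and stores its total mass;
# only high-mass regions are split further, so low-probability regions
# keep a coarse discretization step. Names and thresholds are made up.

class Node:
    def __init__(self, lo, hi, mass):
        self.lo, self.hi, self.mass = lo, hi, mass
        self.left = self.right = None

    def refine(self, density, threshold):
        """Split this node if its mass exceeds the threshold."""
        if self.mass <= threshold or self.hi - self.lo < 2:
            return
        mid = (self.lo + self.hi) // 2
        left_mass = sum(density(i) for i in range(self.lo, mid))
        self.left = Node(self.lo, mid, left_mass)
        self.right = Node(mid, self.hi, self.mass - left_mass)
        self.left.refine(density, threshold)
        self.right.refine(density, threshold)

    def prob(self, i):
        """Approximate P(X = i): leaves spread their mass uniformly."""
        if self.left is None:
            return self.mass / (self.hi - self.lo)
        return (self.left if i < self.left.hi else self.right).prob(i)

# Toy discrete density on {0, ..., 15}, concentrated near 0.
weights = [8, 4, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
total = sum(weights)
density = lambda i: weights[i] / total

root = Node(0, 16, 1.0)
root.refine(density, threshold=0.2)
```

Refining further (the "anytime" property mentioned above) just means calling `refine` again with a smaller threshold: the tree gets deeper where the mass is, and the approximation tightens without rebuilding the coarse regions.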
Why the Quantum Must Yield to Gravity
After providing an extensive overview of the conceptual elements -- such as
Einstein's `hole argument' -- that underpin Penrose's proposal for
gravitationally induced quantum state reduction, the proposal is constructively
criticised. Penrose has suggested a mechanism for objective reduction of
quantum states with postulated collapse time T = h/E, where E is an
ill-definedness in the gravitational self-energy stemming from the profound
conflict between the principles of superposition and general covariance. Here
it is argued that, even if Penrose's overall conceptual scheme for the
breakdown of quantum mechanics is unreservedly accepted, his formula for the
collapse time of superpositions reduces to T → ∞ (E → 0) in the strictly
Newtonian regime, which is the domain of his proposed experiment to corroborate
the effect. A suggestion is made to rectify this situation. In particular,
recognising the cogency of Penrose's reasoning in the domain of full `quantum
gravity', it is demonstrated that an appropriate experiment which could in
principle corroborate his argued `macroscopic' breakdown of superpositions is
not the one involving non-rotating mass distributions as he has suggested, but
a Leggett-type SQUID or BEC experiment involving superposed mass distributions
in relative rotation. The demonstration thereby brings out one of the
distinctive characteristics of Penrose's scheme, rendering it empirically
distinguishable from other state reduction theories involving gravity. As an
aside, a new geometrical measure of gravity-induced deviation from quantum
mechanics in the manner of Penrose is proposed, but now for the canonical
commutation relations [Q, P] = iħ.
Comment: 33 pages (TeX, uses mtexsis) plus 3 figures (epsf). To appear in ``Physics Meets Philosophy at the Planck Scale'' (Cambridge University Press). Two footnotes added.
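The collapse-time formula T = h/E quoted in the abstract can be given a rough numerical feel. The snippet below is an order-of-magnitude illustration only: as a proxy for E (which the abstract stresses is ill-defined), it uses the Newtonian gravitational self-energy of a uniform sphere, E = (3/5)GM²/R, an assumption for illustration rather than Penrose's precise prescription.

```python
# Order-of-magnitude illustration of T = hbar / E. As a stand-in for the
# ill-defined gravitational self-energy difference E, we use the
# self-energy of a uniform sphere, E = (3/5) G M^2 / R. This is an
# illustrative assumption, not Penrose's exact prescription.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34     # reduced Planck constant, J s

def collapse_time(mass_kg, radius_m):
    energy = 0.6 * G * mass_kg**2 / radius_m   # J
    return HBAR / energy                        # s

# A ~1 micrometre water droplet (M ~ 4e-15 kg, R ~ 1e-6 m):
t = collapse_time(4e-15, 1e-6)   # of order a fraction of a second
```

Shrinking the mass sends the self-energy to zero and the computed collapse time to infinity, which is the E → 0, T → ∞ behaviour the abstract argues dominates the strictly Newtonian regime.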
Conservative reasoning about epistemic uncertainty for the probability of failure on demand of a 1-out-of-2 software-based system in which one channel is “possibly perfect”
In earlier work (Littlewood and Rushby 2012) (henceforth LR), an analysis was presented of a 1-out-of-2 software-based system in which one channel was “possibly perfect”. It was shown that, at the aleatory level, the system pfd (probability of failure on demand) could be bounded above by the product of the pfd of channel A and the pnp (probability of non-perfection) of channel B. This result was presented as a way of avoiding the well-known difficulty that, for two certainly-fallible channels, failures of the two will be dependent, i.e. the system pfd cannot be expressed simply as a product of the channel pfds. A price paid in this new approach for avoiding the issue of failure dependence is that the result is conservative. Furthermore, a complete analysis requires that account be taken of epistemic uncertainty – here concerning the numeric values of the two parameters pfdA and pnpB. Unfortunately this introduces a different, difficult problem of dependence: estimating the dependence between an assessor’s beliefs about the parameters. The work reported here avoids this problem by obtaining results that require only an assessor’s marginal beliefs about the individual channels, i.e. they do not require knowledge of the dependence between these beliefs. The price paid is further conservatism in the results.
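The aleatory bound described in this abstract is simple arithmetic, which a few lines make concrete. The numeric values below are invented for illustration; only the form of the bound, pfd_sys ≤ pfdA × pnpB, comes from the abstract.

```python
# Illustration of the LR aleatory bound for a 1-out-of-2 system in which
# channel B is "possibly perfect":
#     P(system fails on demand) <= pfd_A * pnp_B.
# The numeric values below are made up for illustration.

def system_pfd_bound(pfd_a, pnp_b):
    """Conservative upper bound on the 1-out-of-2 system pfd."""
    return pfd_a * pnp_b

pfd_a = 1e-3   # assessed pfd of the certainly-fallible channel A
pnp_b = 1e-2   # assessed probability that channel B is NOT perfect

bound = system_pfd_bound(pfd_a, pnp_b)   # 1e-5
```

The point of the bound is that it sidesteps modelling the dependence between the two channels' failures: if B happens to be perfect the system cannot fail, and if not, the worst case is that B fails whenever A does.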
An Empirical Comparison of Three Inference Methods
In this paper, an empirical evaluation of three inference methods for
uncertain reasoning is presented in the context of Pathfinder, a large expert
system for the diagnosis of lymph-node pathology. The inference procedures
evaluated are (1) Bayes' theorem, assuming evidence is conditionally
independent given each hypothesis; (2) odds-likelihood updating, assuming
evidence is conditionally independent given each hypothesis and given the
negation of each hypothesis; and (3) an inference method related to the
Dempster-Shafer theory of belief. Both expert-rating and decision-theoretic
metrics are used to compare the diagnostic accuracy of the inference methods.
Comment: Appears in Proceedings of the Fourth Conference on Uncertainty in Artificial Intelligence (UAI 1988).
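The first two inference schemes named in the abstract can be sketched on a made-up two-hypothesis problem. The numbers below are illustrative, not from Pathfinder; note that when conditional independence is assumed both given H and given not-H, the two schemes coincide.

```python
# Sketch of (1) Bayes' theorem with conditional independence given each
# hypothesis, and (2) odds-likelihood updating, on a toy two-hypothesis
# problem with two pieces of evidence. Numbers are invented.

prior_h   = 0.3
like_h    = [0.8, 0.6]   # P(e_i | H)
like_noth = [0.2, 0.5]   # P(e_i | not H)

# (1) Naive-Bayes posterior:
num, den = prior_h, 1 - prior_h
for lh, ln in zip(like_h, like_noth):
    num *= lh
    den *= ln
posterior_bayes = num / (num + den)

# (2) Odds-likelihood updating: multiply prior odds by likelihood ratios.
# Equivalent to (1) here because independence is assumed given H AND
# given not-H -- the extra assumption the abstract attributes to scheme (2).
odds = prior_h / (1 - prior_h)
for lh, ln in zip(like_h, like_noth):
    odds *= lh / ln
posterior_odds = odds / (1 + odds)
```

With more than two hypotheses the negation "not H" lumps several hypotheses together, and the independence-given-negation assumption of scheme (2) becomes a genuine extra restriction rather than a restatement of scheme (1).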
There Is No Pure Empirical Reasoning
The justificatory force of empirical reasoning always depends upon the existence of some synthetic, a priori justification. The reasoner must begin with justified, substantive constraints on both the prior probability of the conclusion and certain conditional probabilities; otherwise, all possible degrees of belief in the conclusion are left open given the premises. Such constraints cannot in general be empirically justified, on pain of infinite regress. Nor does subjective Bayesianism offer a way out for the empiricist. Despite often-cited convergence theorems, subjective Bayesians cannot hold that any empirical hypothesis is ever objectively justified in the relevant sense. Rationalism is thus the only alternative to an implausible skepticism.
Complementary Lipschitz continuity results for the distribution of intersections or unions of independent random sets in finite discrete spaces
We prove that intersections and unions of independent random sets in finite
spaces achieve a form of Lipschitz continuity. More precisely, given the
distribution of a random set, the function mapping any random set
distribution to the distribution of its intersection (under an independence
assumption) with that set is Lipschitz continuous with unit Lipschitz
constant, provided the space of random set distributions is endowed with a
metric defined as the norm distance between inclusion functionals, also
known as commonalities. Moreover, the function mapping any random set
distribution to the distribution of its union (under an independence
assumption) with that set is Lipschitz continuous with unit Lipschitz
constant when the space of random set distributions is endowed with a
metric defined as the norm distance between hitting functionals, also
known as plausibilities.
Using the epistemic random set interpretation of belief functions, we also
discuss the ability of these distances to yield conflict measures. All the
proofs in this paper are derived in the framework of Dempster-Shafer belief
functions. Aside from the discussion on conflict measures, it is
straightforward to transcribe the proofs into general (not necessarily
epistemic) random set terminology.
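A numerical check conveys the key identity behind the intersection result: under independence, the commonality of the intersection of two random sets is the product of their commonalities, q₁₂(A) = q₁(A)·q₂(A), which immediately gives a unit Lipschitz constant for the sup-norm distance between commonalities, since every q(A) ≤ 1. The frame and mass functions below are invented for illustration.

```python
# Check that, under independence, the commonality of the intersection of
# two random sets equals the product of their commonalities. This is the
# identity behind the unit Lipschitz constant claimed for the commonality
# metric. Frame and mass assignments are made up.

from itertools import combinations

FRAME = frozenset({'a', 'b', 'c'})

def subsets(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def commonality(m):
    """q(A) = sum of m(B) over all focal sets B containing A."""
    return {A: sum(v for B, v in m.items() if A <= B)
            for A in subsets(FRAME)}

def conjunctive(m1, m2):
    """Unnormalised intersection of independent random sets."""
    out = {}
    for B1, v1 in m1.items():
        for B2, v2 in m2.items():
            out[B1 & B2] = out.get(B1 & B2, 0.0) + v1 * v2
    return out

m1 = {frozenset({'a', 'b'}): 0.6, FRAME: 0.4}
m2 = {frozenset({'b', 'c'}): 0.5, frozenset({'b'}): 0.5}

q1, q2 = commonality(m1), commonality(m2)
q12 = commonality(conjunctive(m1, m2))
# q12(A) == q1(A) * q2(A) for every A in the power set of FRAME
```

Given the product form, the Lipschitz bound is immediate: |q₁(A)q(A) − q₂(A)q(A)| = q(A)·|q₁(A) − q₂(A)| ≤ |q₁(A) − q₂(A)|. The dual identity for unions and plausibilities (hitting functionals) works the same way.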