
    Complementary Lipschitz continuity results for the distribution of intersections or unions of independent random sets in finite discrete spaces

    We prove that intersections and unions of independent random sets in finite spaces achieve a form of Lipschitz continuity. More precisely, given the distribution of a random set Ξ, the function mapping any random set distribution to the distribution of its intersection (under the independence assumption) with Ξ is Lipschitz continuous with unit Lipschitz constant when the space of random set distributions is endowed with the metric given by the L_k norm distance between inclusion functionals, also known as commonalities. Likewise, the function mapping any random set distribution to the distribution of its union (under the independence assumption) with Ξ is Lipschitz continuous with unit Lipschitz constant when the space is endowed with the metric given by the L_k norm distance between hitting functionals, also known as plausibilities. Using the epistemic random set interpretation of belief functions, we also discuss the ability of these distances to yield conflict measures. All the proofs in this paper are derived in the framework of Dempster-Shafer belief functions. Aside from the discussion on conflict measures, it is straightforward to transcribe the proofs into general (not necessarily epistemic) random set terminology.
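    The intersection case can be seen in one line (a sketch in standard Dempster-Shafer notation; the commonality symbol q and the frame Ω are ours, not quoted from the abstract): for independent random sets the commonality of the intersection factorises, q_{Ξ1∩Ξ}(A) = q_{Ξ1}(A) q_Ξ(A), and since 0 ≤ q_Ξ(A) ≤ 1,

        d_k(\Xi_1 \cap \Xi,\; \Xi_2 \cap \Xi)
            = \Big( \sum_{A \subseteq \Omega} q_\Xi(A)^k \,\big| q_{\Xi_1}(A) - q_{\Xi_2}(A) \big|^k \Big)^{1/k}
            \le \Big( \sum_{A \subseteq \Omega} \big| q_{\Xi_1}(A) - q_{\Xi_2}(A) \big|^k \Big)^{1/k}
            = d_k(\Xi_1, \Xi_2).

    The union case is symmetric, with the missing functionals 1 − pl, which also multiply under independence, in place of the commonalities.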

    Density evolution for SUDOKU codes on the erasure channel

    Codes based on SUDOKU puzzles are discussed, and belief propagation decoding is introduced for the erasure channel. Despite the non-linearity of the code constraints, it is argued that density evolution can be used to analyse code performance due to the invariance of the code under alphabet permutation. The belief propagation decoder for erasure channels operates by exchanging messages containing sets of possible values. Accordingly, density evolution tracks the probability mass functions of the set cardinalities. The equations governing the mapping of those probability mass functions are derived and calculated for variable and constraint nodes, and decoding thresholds are computed for long SUDOKU codes with random interleavers. Funded in part by the European Research Council under ERC grant agreement 259663 and by the FP7 Network of Excellence NEWCOM# under grant agreement 318306. This is the accepted manuscript. The final version is available from IEEE at http://dx.doi.org/10.1109/ISTC.2014.6955120.
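    As a concrete illustration of the set-valued messages (a minimal sketch, not the paper's decoder; the function name constraint_node and the brute-force permutation enumeration are our assumptions for a toy alphabet), the extrinsic message a SUDOKU constraint sends to each cell is the set of symbols consistent with the other cells' incoming sets:

        # A toy SUDOKU constraint node for erasure-channel belief propagation:
        # messages are sets of still-possible symbols, and the node enumerates
        # the alphabet permutations consistent with the other cells' sets.
        from itertools import permutations

        def constraint_node(incoming, alphabet):
            """incoming: one set of candidate symbols per cell; the constraint
            forces the cells to hold a permutation of `alphabet`."""
            n = len(incoming)
            outgoing = [set() for _ in range(n)]
            for perm in permutations(alphabet, n):
                for i in range(n):
                    # Extrinsic rule: the message to cell i ignores cell i's own input.
                    if all(perm[j] in incoming[j] for j in range(n) if j != i):
                        outgoing[i].add(perm[i])
            return outgoing

        # Two cells already decoded, two fully erased, over a size-4 alphabet:
        print(constraint_node([{1}, {2}, {1, 2, 3, 4}, {1, 2, 3, 4}], (1, 2, 3, 4)))
        # -> [{1, 3, 4}, {2, 3, 4}, {3, 4}, {3, 4}]

    A variable node then simply intersects its incoming messages with the channel output, and density evolution tracks the distribution of the cardinalities of these messages over the random interleaver.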

    Choice functions as a tool to model uncertainty

    Our aim is to develop a tool for modelling different types of assessments about the uncertain value of some random variable. One well-known and widely used way to model uncertainty is using probability mass functions. However, such probability mass functions are not general enough to model, for instance, a total lack of knowledge. A very successful tool for modelling more general types of assessments is coherent sets of desirable gambles. These have many applications in credal networks, predictive inference, conservative reasoning, and so on. However, they are not capable of modelling beliefs corresponding to 'or' statements, for example the belief that a coin has two equal sides of unknown type: either twice heads or twice tails. Such more general assessments can be modelled with coherent choice functions. The first thing we do is relate coherent choice functions to coherent sets of desirable gambles, which yields an expression for the most conservative coherent choice function compatible with a coherent set of desirable gambles. Next, we study the order-theoretic properties of coherent choice functions. In order for our theory of choice functions to be successful, we need a good conditioning rule. We propose a very intuitive one, and show that it coincides with the usual one for coherent sets of desirable gambles, and therefore also leads to Bayes’s rule. To conclude, we show how to elegantly deal with assessments of indifference.
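    One natural reading of that correspondence (an assumed construction for illustration, not the paper's exact expression) is that the choice function induced by a set of desirable gambles D rejects an option f whenever some alternative g on offer makes the exchange g − f desirable:

        # Sketch of the choice function induced by a set of desirable gambles:
        # keep f unless trading f for some other option g on the table is
        # itself a desirable transaction (g - f lies in D).
        def choice(options, desirable):
            """options: gambles as tuples of payoffs, one per state;
            desirable: predicate implementing the set D of desirable gambles."""
            def minus(g, f):
                return tuple(gi - fi for gi, fi in zip(g, f))
            return [f for f in options
                    if not any(desirable(minus(g, f)) for g in options if g != f)]

        # Example D: non-zero non-negative gambles, plus gambles with positive
        # expectation under p = (0.5, 0.5) (the strictly desirable gambles of
        # a linear prevision).
        def desirable(f):
            return (all(x >= 0 for x in f) and any(x > 0 for x in f)) \
                or 0.5 * f[0] + 0.5 * f[1] > 0

        print(choice([(1, -1), (-1, 1), (2, 2), (0, 0)], desirable))
        # -> [(2, 2)]  (every other option is dominated by (2, 2))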

    The total belief theorem

    In this paper, motivated by the treatment of conditional constraints in the data association problem, we state and prove the generalisation of the law of total probability to belief functions, as finite random sets. Our results apply to the case in which Dempster’s conditioning is employed. We show that the solution to the resulting total belief problem is in general not unique, whereas it is unique when the a priori belief function is Bayesian. Examples and case studies underpin the theoretical contributions. Finally, our results are compared to previous related work on the generalisation of Jeffrey’s rule by Spies and Smets.
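    In symbols (a paraphrase in standard belief-function notation, not quoted from the paper), the classical statement

        P(A) = \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)

    becomes: given an a priori belief function Bel_0 on a partition Π = {B_1, …, B_n} of Θ and a conditional belief function Bel_i on each element B_i, find a belief function Bel on Θ whose Dempster conditioning on each B_i coincides with Bel_i and whose restriction to Π coincides with Bel_0; the non-uniqueness result says several such Bel may exist unless Bel_0 is Bayesian.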

    On the Privacy of Sublinear-Communication Jaccard Index Estimation via Min-hash Sketching

    The min-hash sketch is a well-known technique for low-communication approximation of the Jaccard index between two input sets. Moreover, there is a folklore belief that min-hash sketch based protocols protect the privacy of the inputs. In this paper, we investigate this folklore to quantify the privacy of the min-hash sketch. We begin our investigation by considering the privacy of min-hash in a centralized setting where the hash functions are chosen by the min-hash functionality and are unknown to the participants. We show that in this case the min-hash output satisfies the standard definition of differential privacy (DP) without any additional noise. This immediately yields a privacy-preserving sublinear-communication semi-honest 2PC protocol based on FHE where the hash function is evaluated homomorphically. To improve the efficiency of this protocol, we next consider an implementation in the random oracle model. Here, the protocol participants jointly sample public prefixes for domain separation of the random oracle, and locally evaluate the resulting hash functions on their input sets. Unfortunately, we show that in this public hash function setting, the min-hash output is no longer DP. We therefore consider the notion of distributional differential privacy (DDP) introduced by Bassily et al. (FOCS 2013). We show that if the honest party's set has sufficiently high min-entropy then the output of the min-hash functionality achieves DDP, again without any added noise. This yields a more efficient semi-honest two-party protocol in the random oracle model, where parties first locally hash their input sets and then perform a 2PC for comparison. By proving that our protocols satisfy DP and DDP respectively, our results formally confirm and qualify the folklore belief that min-hash based protocols protect the privacy of their inputs.
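    For reference, the underlying estimator (a minimal sketch of standard min-hash, not the paper's protocol; the seeded SHA-256 hash construction is our assumption) exploits the fact that two sets' minima under a shared random hash collide with probability equal to their Jaccard index:

        # k-repetition min-hash: each party keeps only its minimum hash value
        # per seed; the Jaccard index is estimated by the collision rate.
        import hashlib

        def h(seed, x):
            # One hash function per seed, via a domain-separated SHA-256 input.
            return int.from_bytes(hashlib.sha256(f"{seed}|{x}".encode()).digest(), "big")

        def minhash_sketch(items, k):
            return [min(h(seed, x) for x in items) for seed in range(k)]

        def jaccard_estimate(sk_a, sk_b):
            # Pr[min-hash collision] = Jaccard index, so the fraction of
            # agreeing coordinates across k independent hashes estimates it.
            return sum(a == b for a, b in zip(sk_a, sk_b)) / len(sk_a)

        A, B = set(range(0, 80)), set(range(40, 120))
        est = jaccard_estimate(minhash_sketch(A, 256), minhash_sketch(B, 256))
        print(f"estimate {est:.3f} vs true {len(A & B) / len(A | B):.3f}")

    Only the k sketch values are exchanged, which is what makes the communication sublinear in the set sizes; the paper's question is precisely what these values leak about the inputs.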

    Random sets and exact confidence regions

    An important problem in statistics is the construction of confidence regions for unknown parameters. In most cases, asymptotic distribution theory is used to construct confidence regions, so any coverage probability claims only hold approximately, for large samples. This paper describes a new approach, using random sets, which allows users to construct exact confidence regions without appeal to asymptotic theory. In particular, if the user-specified random set satisfies a certain validity property, confidence regions obtained by thresholding the induced data-dependent plausibility function are shown to have the desired coverage probability. Comment: 14 pages, 2 figures.
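    A minimal sketch of the thresholding step (an illustrative textbook-style example, not taken from the paper): for a single observation X ~ N(θ, 1), predicting the auxiliary variable Z = X − θ with the valid random set {z : |z| ≤ |Z′|}, Z′ ~ N(0, 1), induces the plausibility pl_x(θ) = 1 − |2Φ(x − θ) − 1|, and the region {θ : pl_x(θ) > α} has exact 1 − α coverage.

        # Exact confidence region by thresholding a plausibility function.
        from math import erf, sqrt

        def Phi(z):
            # Standard normal CDF.
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))

        def plausibility(theta, x):
            return 1.0 - abs(2.0 * Phi(x - theta) - 1.0)

        def confidence_region(x, alpha, grid):
            # {theta : pl_x(theta) > alpha}; coverage is exact, not asymptotic.
            return [t for t in grid if plausibility(t, x) > alpha]

        grid = [i / 100 for i in range(-500, 501)]
        region = confidence_region(0.0, 0.05, grid)
        print(region[0], region[-1])  # approximately -1.96 and 1.96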

    Inferential models: A framework for prior-free posterior probabilistic inference

    Posterior probabilistic statistical inference without priors is an important but so far elusive goal. Fisher's fiducial inference, Dempster-Shafer theory of belief functions, and Bayesian inference with default priors are attempts to achieve this goal but, to date, none has given a completely satisfactory picture. This paper presents a new framework for probabilistic inference, based on inferential models (IMs), which not only provides data-dependent probabilistic measures of uncertainty about the unknown parameter, but does so with an automatic long-run frequency calibration property. The key to this new approach is the identification of an unobservable auxiliary variable associated with observable data and unknown parameter, and the prediction of this auxiliary variable with a random set before conditioning on data. Here we present a three-step IM construction, and prove a frequency-calibration property of the IM's belief function under mild conditions. A corresponding optimality theory is developed, which helps to resolve the non-uniqueness issue. Several examples are presented to illustrate this new approach. Comment: 29 pages with 3 figures. Main text is the same as the published version. Appendix B is an addition, not in the published version, that contains some corrections and extensions of two of the main theorems.
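    The three-step construction can be summarised as follows (standard IM notation, a paraphrase rather than a quotation from the paper):

        \begin{aligned}
        &\text{A-step:}\quad \text{associate data and parameter via } X = a(\theta, U),\ U \sim \mathsf{P}_U;\\
        &\text{P-step:}\quad \text{predict the unobserved } U \text{ with a random set } \mathcal{S} \sim \mathsf{P}_{\mathcal{S}};\\
        &\text{C-step:}\quad \Theta_x(\mathcal{S}) = \bigcup_{u \in \mathcal{S}} \{\theta : x = a(\theta, u)\},\qquad
        \mathsf{bel}_x(A) = \mathsf{P}_{\mathcal{S}}\{\Theta_x(\mathcal{S}) \subseteq A\}.
        \end{aligned}

    Validity of the random set for predicting the auxiliary variable is what delivers the long-run frequency calibration of the resulting belief function.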