
    How to Generate and use Universal Samplers

    The random oracle is an idealization that allows us to model a hash function as an oracle that outputs a uniformly random string for any input. We introduce the notion of a universal sampler scheme, which extends the random oracle to a method of sampling securely from arbitrary distributions. We describe several applications that provide a natural motivation for this notion; these include generating the trusted parameters for many schemes from just a single trusted setup. We further demonstrate the versatility of universal samplers by showing how they give rise to simple constructions of identity-based encryption and multiparty key exchange. In particular, we construct adaptively secure non-interactive multiparty key exchange in the random oracle model based on indistinguishability obfuscation, obtaining the first known construction of adaptively secure NIKE without complexity leveraging. We then show how to transform any random oracle into a universal sampler scheme, based on indistinguishability obfuscation. At the heart of our construction and proof is a new technique we call "delayed backdoor programming" that we believe will have other applications.
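
    As a rough illustration of the interface only (not the paper's obfuscation-based construction), the Python toy below assumes a hypothetical setup/sample API: the public parameters fix, for every description of a sampling circuit, a single induced sample. Here that sample is derived naively by hashing the description, which conveys the functionality but none of the security guarantees the paper is after.

        import hashlib
        import os
        import random

        # Toy universal-sampler interface (hypothetical names, illustrative only).
        # setup() outputs public parameters U; sample(U, d, draw) returns the one
        # sample induced for the distribution described by d.  Deriving the
        # randomness by hashing d is the naive approach; the actual scheme uses
        # an obfuscated program to achieve its security notion.

        def setup(security_parameter: int = 128) -> bytes:
            """Public universal-sampler parameters (here: just a random seed)."""
            return os.urandom(security_parameter // 8)

        def sample(params: bytes, description: bytes, draw):
            """Return the sample induced for `description`.

            `draw` plays the role of the sampling circuit: it maps a seeded
            random.Random instance to one draw from the intended distribution.
            """
            seed = hashlib.sha256(params + description).digest()
            return draw(random.Random(seed))   # same description -> same sample

        # Example: derive "trusted parameters" for a toy scheme once and for all.
        U = setup()
        toy_modulus = sample(U, b"toy-scheme-v1", lambda rng: rng.getrandbits(64) | 1)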

    Sampling of min-entropy relative to quantum knowledge

    Let X_1, ..., X_n be a sequence of n classical random variables and consider a sample of r positions selected at random. Then, except with probability exponentially small in r, the min-entropy of the sample is no smaller than roughly an r/n fraction of the total min-entropy of all positions X_1, ..., X_n, which is optimal. Here, we show that this statement, originally proven by Vadhan [LNCS, vol. 2729, Springer, 2003] for the purely classical case, remains true when the min-entropy is measured relative to a quantum system. Because min-entropy quantifies the amount of randomness that can be extracted from a given random variable, our result can be used to prove the soundness of locally computable extractors in a context where side information may be quantum-mechanical. In particular, it implies that key agreement in the bounded-storage model (using a standard sample-and-hash protocol) is fully secure against quantum adversaries, thus solving a long-standing open problem.
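
    In symbols, the statement reads roughly as follows (a schematic LaTeX rendering; the paper works with smooth min-entropy and spells out the exact error terms and constants):

        % S is a uniformly random subset of {1, ..., n} of size r; E is the quantum side information.
        \[
          \Pr_{S}\Bigl[\, H_{\min}(X_S \mid E) \;\gtrsim\; \frac{r}{n}\, H_{\min}(X_1 \ldots X_n \mid E) \Bigr] \;\ge\; 1 - 2^{-\Omega(r)} .
        \]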

    Flexible constrained sampling with guarantees for pattern mining

    Pattern sampling has been proposed as a potential solution to the infamous pattern explosion. Instead of enumerating all patterns that satisfy the constraints, individual patterns are sampled proportional to a given quality measure. Several sampling algorithms have been proposed, but each of them has its limitations when it comes to 1) flexibility in terms of quality measures and constraints that can be used, and/or 2) guarantees with respect to sampling accuracy. We therefore present Flexics, the first flexible pattern sampler that supports a broad class of quality measures and constraints, while providing strong guarantees regarding sampling accuracy. To achieve this, we leverage the perspective on pattern mining as a constraint satisfaction problem and build upon the latest advances in sampling solutions in SAT as well as existing pattern mining algorithms. Furthermore, the proposed algorithm is applicable to a variety of pattern languages, which allows us to introduce and tackle the novel task of sampling sets of patterns. We introduce and empirically evaluate two variants of Flexics: 1) a generic variant that addresses the well-known itemset sampling task and the novel pattern set sampling task as well as a wide range of expressive constraints within these tasks, and 2) a specialized variant that exploits existing frequent itemset techniques to achieve substantial speed-ups. Experiments show that Flexics is both accurate and efficient, making it a useful tool for pattern-based data exploration.
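
    To make the task concrete, the Python sketch below samples an itemset with probability proportional to a quality measure, here its support, subject to a simple constraint. This is a naive enumerate-then-sample baseline, not the Flexics algorithm, which avoids exhaustive enumeration by sampling from a constraint model.

        from itertools import combinations
        import random

        def support(itemset, transactions):
            """Number of transactions containing every item of `itemset`."""
            s = set(itemset)
            return sum(1 for t in transactions if s <= t)

        def sample_itemset(transactions, items, min_size=1, max_size=3, rng=random):
            """Draw one itemset, with probability proportional to its support,
            among itemsets whose size lies in [min_size, max_size] and that
            occur at least once in the data (a simple frequency constraint)."""
            candidates, weights = [], []
            for k in range(min_size, max_size + 1):
                for itemset in combinations(items, k):
                    w = support(itemset, transactions)
                    if w > 0:
                        candidates.append(itemset)
                        weights.append(w)
            return rng.choices(candidates, weights=weights, k=1)[0]

        db = [{"a", "b", "c"}, {"a", "b"}, {"b", "c"}, {"a", "c", "d"}]
        print(sample_itemset(db, items=sorted({i for t in db for i in t})))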

    MCMC with Strings and Branes: The Suburban Algorithm (Extended Version)

    Motivated by the physics of strings and branes, we develop a class of Markov chain Monte Carlo (MCMC) algorithms involving extended objects. Starting from a collection of parallel Metropolis-Hastings (MH) samplers, we place them on an auxiliary grid, and couple them together via nearest neighbor interactions. This leads to a class of "suburban samplers" (i.e., spread out Metropolis). Coupling the samplers in this way modifies the mixing rate and speed of convergence for the Markov chain, and can in many cases allow a sampler to more easily overcome free energy barriers in a target distribution. We test these general theoretical considerations by performing several numerical experiments. For suburban samplers with a fluctuating grid topology, performance is strongly correlated with the average number of neighbors. Increasing the average number of neighbors above zero initially leads to an increase in performance, though there is a critical connectivity with effective dimension d_eff ~ 1, above which "groupthink" takes over, and the performance of the sampler declines.
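
    A minimal Python sketch of the coupling idea, assuming a one-dimensional ring of walkers and an illustrative quadratic nearest-neighbor penalty (the paper studies more general grid topologies and couplings):

        import math
        import random

        def suburban_sweep(xs, log_pi, coupling=0.1, step=0.5, rng=random):
            """One Metropolis sweep over a ring of walkers that all target
            log_pi but are penalized for disagreeing with their neighbors."""
            n = len(xs)
            for i in range(n):
                left, right = xs[(i - 1) % n], xs[(i + 1) % n]
                def local(x):   # own target plus coupling to the two neighbors
                    return log_pi(x) - coupling * ((x - left) ** 2 + (x - right) ** 2)
                x_new = xs[i] + rng.gauss(0.0, step)
                if rng.random() < math.exp(min(0.0, local(x_new) - local(xs[i]))):
                    xs[i] = x_new
            return xs

        # Bimodal target with a free-energy barrier between the modes.
        log_pi = lambda x: math.log(math.exp(-(x - 3.0) ** 2) + math.exp(-(x + 3.0) ** 2) + 1e-300)
        xs = [0.0] * 8
        for _ in range(10_000):
            suburban_sweep(xs, log_pi)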

    No imminent quantum supremacy by boson sampling

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of photons in linear optics, which has sparked interest as a rapid way to demonstrate this quantum supremacy. Photon statistics are governed by intractable matrix functions known as permanents, which suggests that sampling from the distribution obtained by injecting photons into a linear-optical network could be solved more quickly by a photonic experiment than by a classical computer. The contrast between the apparently formidable challenge faced by any classical sampling algorithm and the apparently near-term experimental resources required for a large boson sampling experiment has raised expectations that quantum supremacy by boson sampling is on the horizon. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. While the largest boson sampling experiments reported so far used 5 photons, our classical algorithm, based on Metropolised independence sampling (MIS), allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. We argue that the impact of experimental photon losses means that demonstrating quantum supremacy by boson sampling would require a step change in technology.
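
    A generic Metropolised independence sampling skeleton in Python; the boson sampling specifics (the permanent-based target and the classical proposal distribution used in the paper) are not reproduced here.

        import math
        import random

        def mis_chain(log_pi, log_q, sample_q, n_steps, rng=random):
            """MIS: proposals come from a fixed distribution q, independent of the
            current state, and are accepted with prob. min(1, pi(y)q(x)/(pi(x)q(y)))."""
            x = sample_q(rng)
            log_w_x = log_pi(x) - log_q(x)      # importance log-weight of current state
            states = []
            for _ in range(n_steps):
                y = sample_q(rng)               # independent proposal
                log_w_y = log_pi(y) - log_q(y)
                if rng.random() < math.exp(min(0.0, log_w_y - log_w_x)):
                    x, log_w_x = y, log_w_y
                states.append(x)
            return states

        # Toy check: target N(0, 1), proposal N(0, 2).
        xs = mis_chain(log_pi=lambda x: -0.5 * x * x,
                       log_q=lambda x: -0.25 * x * x,
                       sample_q=lambda rng: rng.gauss(0.0, math.sqrt(2.0)),
                       n_steps=20_000)
        print(sum(xs) / len(xs))                # should be close to 0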

    Counting and Generating Terms in the Binary Lambda Calculus (Extended version)

    In a paper entitled "Binary lambda calculus and combinatory logic", John Tromp presents a simple way of encoding lambda calculus terms as binary sequences. In what follows, we study the number of binary strings of a given size that represent lambda terms and derive results from their generating functions, in particular that the number of terms of size n grows roughly like 1.963447954...^n. In the second part, we use this approach to generate random lambda terms using Boltzmann samplers.
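
    A short Python sketch of the counting recurrence, assuming Tromp's encoding (de Bruijn index i written as i ones followed by a zero, the prefix 00 for an abstraction, and 01 for an application); the recurrence below is a straightforward consequence of that encoding rather than code from the paper.

        from functools import lru_cache

        @lru_cache(maxsize=None)
        def count(n: int, m: int) -> int:
            """Number of lambda terms whose binary encoding has size n and whose
            free de Bruijn indices are at most m (m = 0 gives closed terms)."""
            if n < 2:
                return 0
            total = 1 if n - 1 <= m else 0        # a variable: index n-1, size n
            total += count(n - 2, m + 1)          # 00 <body>, one more index in scope
            total += sum(count(k, m) * count(n - 2 - k, m)   # 01 <fun> <arg>
                         for k in range(2, n - 3))
            return total

        closed = [count(n, 0) for n in range(2, 40)]
        # Ratios of successive nonzero counts tend toward the growth rate quoted above.
        print([round(b / a, 3) for a, b in zip(closed, closed[1:]) if a > 0])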