
    New bounds on classical and quantum one-way communication complexity

    In this paper we provide new bounds on classical and quantum distributional communication complexity in the two-party, one-way model of communication. In the classical model, our bound extends the well-known upper bound of Kremer, Nisan and Ron to include non-product distributions. We show that for a boolean function f: X \times Y \to \{0,1\}, a non-product distribution \mu on X \times Y, and a constant \epsilon \in (0,1/2): D_\epsilon^{1,\mu}(f) = O((I(X:Y)+1) vc(f)), where D_\epsilon^{1,\mu}(f) is the one-way distributional communication complexity of f with error at most \epsilon under \mu, vc(f) is the Vapnik-Chervonenkis dimension of f, and I(X:Y) is the mutual information, under \mu, between the random inputs of the two parties. For a non-boolean function f: X \times Y \to [k], we show a similar upper bound on D_\epsilon^{1,\mu}(f) in terms of k, I(X:Y) and the pseudo-dimension of f' = f/k. In the quantum one-way model we provide a lower bound on the distributional communication complexity, under product distributions, of a function f in terms of the well-studied complexity measure of f known as the rectangle bound or the corruption bound of f. We show that for a non-boolean total function f: X \times Y \to Z and a product distribution \mu on X \times Y, Q_{\epsilon^3/8}^{1,\mu}(f) = \Omega(rec_\epsilon^{1,\mu}(f)), where Q_{\epsilon^3/8}^{1,\mu}(f) is the quantum one-way distributional communication complexity of f with error at most \epsilon^3/8 under \mu and rec_\epsilon^{1,\mu}(f) is the one-way rectangle bound of f with error at most \epsilon under \mu. Similarly, for a non-boolean partial function f: X \times Y \to Z \cup \{*\} and a product distribution \mu on X \times Y, we show Q_{\epsilon^6/(2 \cdot 15^4)}^{1,\mu}(f) = \Omega(rec_\epsilon^{1,\mu}(f)).
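
    The quantity vc(f) in the classical bound above treats the rows f(x, .) of the communication matrix as a set system over Y. As a concrete illustration (ours, not taken from the paper), here is a brute-force Python sketch that computes this quantity for a small boolean function given as a 0/1 matrix; the greater-than example is chosen only to make the definition tangible.

        from itertools import combinations

        def vc_dimension(M):
            # Brute-force VC dimension of the rows of a 0/1 matrix M:
            # a set S of columns is shattered if every 0/1 labelling of S
            # appears as some row of M restricted to S.
            n_cols = len(M[0])
            best = 0
            for d in range(1, n_cols + 1):
                if any(len({tuple(row[j] for j in S) for row in M}) == 2 ** d
                       for S in combinations(range(n_cols), d)):
                    best = d    # some size-d column set is shattered; try d + 1
                else:
                    break       # shattering is monotone in d, so we can stop
            return best

        # Toy example: greater-than on a 4 x 4 grid, f(x, y) = 1 iff x > y.
        M = [[1 if x > y else 0 for y in range(4)] for x in range(4)]
        print(vc_dimension(M))  # prints 1: greater-than has VC dimension 1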

    On the communication complexity of sparse set disjointness and exists-equal problems

    In this paper we study the two-player randomized communication complexity of the sparse set disjointness and the exists-equal problems, and give matching lower and upper bounds (up to constant factors) for any number of rounds for both of these problems. In the sparse set disjointness problem, each player receives a k-subset of [m] and the goal is to determine whether the sets intersect. For this problem, we give a protocol that communicates a total of O(k\log^{(r)}k) bits over r rounds and errs with very small probability. Here we can take r=\log^{*}k to obtain an O(k)-total-communication, \log^{*}k-round protocol with exponentially small error probability, improving on the O(k)-bit, O(\log k)-round, constant-error-probability protocol of Håstad and Wigderson from 1997. In the exists-equal problem, the players receive vectors x,y\in [t]^n and the goal is to determine whether there exists a coordinate i such that x_i=y_i. Namely, the exists-equal problem is the OR of n equality problems. Observe that exists-equal is an instance of sparse set disjointness with k=n, hence the protocol above applies here as well, giving an O(n\log^{(r)}n) upper bound. Our main technical contribution in this paper is a matching lower bound: we show that when t=\Omega(n), any r-round randomized protocol for the exists-equal problem with error probability at most 1/3 must have a message of size \Omega(n\log^{(r)}n). Our lower bound holds even for super-constant r <= \log^*n, showing that any O(n)-bit exists-equal protocol must use \log^*n - O(1) rounds.
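
    The multi-round protocols in this paper are more involved, but the basic hashing idea behind one-round sparse set disjointness protocols fits in a few lines. The following Python fragment is a hypothetical illustration, not the paper's protocol: with shared randomness, Alice hashes each of her k elements to O(\log k) bits and sends the hashes; Bob reports an intersection iff one of his elements collides. The error is one-sided, and iterating such rounds with shrinking hash lengths is what drives O(k\log^{(r)}k)-style bounds.

        import random

        P = (1 << 61) - 1       # Mersenne prime for a simple 2-universal hash

        def one_round_disjointness(A, B, seed=None):
            # One-round sketch for k-sparse set disjointness: Alice sends the
            # hashes of her k elements (about 2k log k bits in total) under a
            # shared random hash h; Bob reports an intersection iff one of
            # his elements hashes into Alice's set.  Errors are one-sided:
            # an intersecting pair is always detected, and a disjoint pair
            # is misreported only if two distinct elements collide under h,
            # which happens with probability O(k^2 / 2^bits).
            k = max(len(A), len(B))
            bits = 2 * max(1, k.bit_length()) + 4
            rng = random.Random(seed)            # stands in for shared randomness
            a, b = rng.randrange(1, P), rng.randrange(P)
            h = lambda x: ((a * x + b) % P) % (1 << bits)
            alice_msg = {h(x) for x in A}        # Alice's single message to Bob
            return any(h(y) in alice_msg for y in B)

        print(one_round_disjointness({1, 5, 9, 13}, {2, 6, 10, 14}, seed=1))  # False (w.h.p. over h)
        print(one_round_disjointness({1, 5, 9, 13}, {2, 9, 10, 14}, seed=1))  # True: 9 is shared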

    Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression

    This paper provides tight bounds on the Rényi entropy of a function of a discrete random variable with a finite number of possible values, where the considered function is not one-to-one. To that end, a tight lower bound on the Rényi entropy of a discrete random variable with finite support is derived as a function of the size of the support and the ratio of the maximal to minimal probability masses. This work was inspired by the recently published paper by Cicalese et al., which focuses on the Shannon entropy, and it strengthens and generalizes the results of that paper to Rényi entropies of arbitrary positive orders. In view of these generalized bounds and the works by Arikan and Campbell, non-asymptotic bounds are derived for guessing moments and lossless data compression of discrete memoryless sources. The paper was published in the Entropy journal (special issue on Probabilistic Methods in Information Theory, Hypothesis Testing, and Coding), vol. 20, no. 12, paper no. 896, November 22, 2018, available online at https://www.mdpi.com/1099-4300/20/12/89
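
    For readers who want to experiment with the quantities in this abstract, the following short Python helper (ours, not from the paper) computes the Rényi entropy of a finite distribution for an arbitrary positive order, alongside the ratio of maximal to minimal probability masses in which the bounds are stated; the paper's actual bounds are not reproduced here.

        import math

        def renyi_entropy(p, alpha):
            # H_alpha(P) = (1 / (1 - alpha)) * log2(sum_i p_i^alpha) in bits,
            # for alpha > 0, alpha != 1; the limit alpha -> 1 is the Shannon
            # entropy and alpha -> infinity gives the min-entropy -log2(p_max).
            q = [x for x in p if x > 0]
            if math.isinf(alpha):
                return -math.log2(max(q))
            if alpha == 1:
                return -sum(x * math.log2(x) for x in q)
            return math.log2(sum(x ** alpha for x in q)) / (1 - alpha)

        p = [0.5, 0.25, 0.125, 0.125]
        beta = max(p) / min(x for x in p if x > 0)   # ratio p_max / p_min = 4.0
        for a in (0.5, 1, 2, math.inf):
            print(a, round(renyi_entropy(p, a), 4))  # H_alpha is non-increasing in alpha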