47 research outputs found

    Strong direct product conjecture holds for all relations in public coin randomized one-way communication complexity

    Full text link
    Let f subset of X x Y x Z be a relation. Let the public-coin one-way communication complexity of f, with worst-case error 1/3, be denoted R^{1,pub}_{1/3}(f). We show that if only o(k R^{1,pub}_{1/3}(f)) communication is provided for computing f^k (k independent copies of f), then the success probability is exponentially small in k. This settles the strong direct product conjecture for all relations in public-coin one-way communication complexity. We also show a new tight characterization of public-coin one-way communication complexity, which strengthens the tight characterization shown in [J., Klauck, Nayak 08]. We use this new characterization to prove our direct product result; it may also be of independent interest. Comment: ver 2, 11 pages, proofs simplified.
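
    As a schematic restatement of the direct product statement above (the hidden constants are not given in this summary, and the notation suc(f^k, c) for the best success probability of a c-bit public-coin one-way protocol for f^k is introduced here only for illustration):

        \[
          \text{if } c = o\!\left(k \cdot R^{1,\mathrm{pub}}_{1/3}(f)\right)
          \quad \text{then} \quad
          \mathrm{suc}\!\left(f^{k}, c\right) \;\le\; 2^{-\Omega(k)} .
        \]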

    Quantum Advantage on Information Leakage for Equality

    Full text link
    We prove a lower bound on the information leakage of any classical protocol computing the equality function in the simultaneous message passing (SMP) model. Our bound is valid in the finite-length regime and is strong enough to demonstrate a quantum advantage in terms of information leakage for practical quantum protocols. We prove our bound by obtaining an improved finite-size version of the communication bound due to Babai and Kimmel, relating randomized communication to deterministic communication in the SMP model. We then relate information leakage to randomized communication through a series of reductions. We first provide alternative characterizations of information leakage, which let us link it to average-length communication in the presence of shared randomness (pairwise, with the referee). A Markov inequality links this with bounded-length communication, and a Newman-type argument allows us to go from shared to private randomness. The only reduction in which we incur more than a logarithmic additive factor is the Markov inequality; in particular, our compression method is essentially tight for the SMP model with average-length communication. Comment: 23 pages, 2 figures.
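
    The Markov-inequality step mentioned above is the standard truncation argument; as a sketch (the parameter lambda is free here, and the exact trade-off used in the paper is not given in this summary), for a protocol Pi whose communication length on input (x,y) has expectation E|Pi(x,y)| over the randomness:

        \[
          \Pr\bigl[\, |\Pi(x,y)| \;\ge\; \lambda \cdot \mathbb{E}\,|\Pi(x,y)| \,\bigr] \;\le\; \frac{1}{\lambda},
        \]

    so truncating the protocol at length lambda * E|Pi(x,y)| (and answering arbitrarily when truncation occurs) yields a bounded-length protocol whose error increases by at most 1/lambda.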

    One-Shot Federated Learning: Theoretical Limits and Algorithms to Achieve Them

    Full text link
    We consider distributed statistical optimization in the one-shot setting, where there are m machines, each observing n i.i.d. samples. Based on its observed samples, each machine sends a B-bit-long message to a server. The server then collects the messages from all machines and estimates a parameter that minimizes an expected convex loss function. We investigate the impact of the communication constraint B on the expected error and derive a tight lower bound on the error achievable by any algorithm. We then propose an estimator, which we call the Multi-Resolution Estimator (MRE), whose expected error (when B >= log(mn)) meets the aforementioned lower bound up to poly-logarithmic factors and is thereby order-optimal. We also address the problem of learning under a tiny communication budget, and present lower and upper error bounds when B is a constant. The expected error of MRE, unlike that of existing algorithms, tends to zero as the number of machines m goes to infinity, even when the number of samples per machine n remains bounded by a constant. This property of the MRE algorithm makes it applicable in new machine learning paradigms where m is much larger than n.
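
    The abstract does not describe the MRE construction itself. As a purely illustrative baseline for the one-shot setting it studies (m machines, n samples each, one B-bit message per machine), here is a minimal sketch in which every machine uniformly quantizes its local empirical mean to B bits and the server averages the decoded values; the function names, the quantizer range and the scalar setting are all assumptions made for illustration, not the MRE algorithm.

        import numpy as np

        def quantize(value, bits, lo=-4.0, hi=4.0):
            """Uniformly quantize a scalar in [lo, hi] to an integer code of `bits` bits."""
            levels = 2 ** bits
            clipped = float(np.clip(value, lo, hi))
            return int(round((clipped - lo) / (hi - lo) * (levels - 1)))

        def dequantize(code, bits, lo=-4.0, hi=4.0):
            """Map an integer code back to a representative value of its quantization cell."""
            levels = 2 ** bits
            return lo + code / (levels - 1) * (hi - lo)

        def one_shot_estimate(samples_per_machine, bits):
            """Each machine sends one `bits`-bit message; the server averages the decoded values."""
            codes = [quantize(np.mean(s), bits) for s in samples_per_machine]   # one message per machine
            return float(np.mean([dequantize(c, bits) for c in codes]))

        # Toy run: m machines, n samples each, drawn around an unknown scalar mean.
        rng = np.random.default_rng(0)
        m, n, B, true_mean = 1000, 5, 8, 0.3
        data = [true_mean + rng.normal(0.0, 1.0, size=n) for _ in range(m)]
        print(one_shot_estimate(data, B))   # approximately 0.3 for large m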

    New Results in the Simultaneous Message Passing Model

    Full text link
    Consider the following Simultaneous Message Passing (SMP) model for computing a relation f subset of X x Y x Z. In this model Alice, on input x in X, and Bob, on input y in Y, send one message each to a third party, the Referee, who then outputs a z in Z such that (x,y,z) in f. We first show optimal 'direct sum' results for all relations f in this model, in both the quantum and classical settings, in the situation where we allow shared resources between Alice and the Referee and between Bob and the Referee (shared entanglement in quantum protocols and public coins in classical protocols), but no shared resource between Alice and Bob. This implies that, in this model, the communication required to compute k simultaneous instances of f, with constant success overall, is at least k times the communication required to compute one instance with constant success. In particular this implies an earlier direct sum result, shown by Chakrabarti, Shi, Wirth and Yao, 2001, for the Equality function (and a class of other so-called robust functions), in the classical SMP model with no shared resources between any parties. Furthermore, we investigate the gap between the SMP model and the one-way model in communication complexity and exhibit a partial function that is exponentially more expensive in the former, even when quantum communication with entanglement is allowed there, than in the latter, even when the latter is deterministic. Comment: 16 pages, version

    On Communication Cost of Distributed Statistical Estimation and Dimensionality

    Full text link
    We explore the connection between dimensionality and communication cost in distributed learning problems. Specifically, we study the problem of estimating the mean theta of an unknown d-dimensional Gaussian distribution in the distributed setting. In this problem, the samples from the unknown distribution are distributed among m different machines. The goal is to estimate the mean theta at the optimal minimax rate while communicating as few bits as possible. We show that in this setting the communication cost scales linearly in the number of dimensions, i.e., one needs to deal with different dimensions individually. Applying this result to previous lower bounds for one dimension in the interactive setting [ZDJW13] and to our improved bounds for the simultaneous setting, we prove new lower bounds of Omega(md/log(m)) and Omega(md) for the bits of communication needed to achieve the minimax squared loss, in the interactive and simultaneous settings respectively. To complement this, we also demonstrate an interactive protocol achieving the minimax squared loss with O(md) bits of communication, which improves upon the simple simultaneous protocol by a logarithmic factor. Given the strong lower bounds in the general setting, we initiate the study of distributed parameter estimation problems with structured parameters. Specifically, when the parameter is promised to be s-sparse, we show a simple thresholding-based protocol that achieves the same squared loss while saving a factor of d/s in communication. We conjecture that the tradeoff between communication and squared loss demonstrated by this protocol is essentially optimal up to a logarithmic factor. Comment: to appear at NIPS'14 with oral presentation.
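
    The thresholding protocol for the s-sparse case is only named above, not specified. A hypothetical two-stage sketch in the same spirit (all names, the stage split and the top-s rule are assumptions made for illustration; this is not necessarily the protocol from the paper): a few machines send full d-dimensional local means, the server thresholds their average to guess the support, and the remaining machines send only those s coordinates, saving roughly a d/s factor in communication.

        import numpy as np

        def sparse_mean_protocol(machine_samples, s, support_fraction=0.1):
            """Hypothetical two-stage thresholding protocol (illustration only).

            Stage 1: a small fraction of machines report full d-dimensional local means;
                     the server averages them and keeps the s largest coordinates as the support.
            Stage 2: the remaining machines report only those s coordinates;
                     the server averages them into the final estimate.
            """
            m = len(machine_samples)
            d = machine_samples[0].shape[1]
            k = max(1, int(support_fraction * m))

            # Stage 1: estimate the support from a few full-length messages.
            rough = np.mean([samples.mean(axis=0) for samples in machine_samples[:k]], axis=0)
            support = np.argsort(np.abs(rough))[-s:]   # keep the s largest entries (surrogate for thresholding)

            # Stage 2: everyone else sends only s coordinates (roughly a d/s saving per message).
            est = np.zeros(d)
            est[support] = np.mean(
                [samples.mean(axis=0)[support] for samples in machine_samples[k:]], axis=0
            )
            return est

        # Toy run: d-dimensional Gaussian samples with an s-sparse mean.
        rng = np.random.default_rng(1)
        d, s, m, n = 100, 5, 200, 20
        theta = np.zeros(d)
        theta[:s] = 1.0
        data = [theta + rng.normal(0.0, 1.0, size=(n, d)) for _ in range(m)]
        print(np.round(sparse_mean_protocol(data, s)[:8], 2))   # first five entries near 1, the rest near 0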

    A lower bound for bounded round quantum communication complexity of set disjointness

    Full text link
    We consider the class of functions whose value depends only on the intersection of the inputs X_1, X_2, ..., X_t; that is, for each F in this class there is an f_F: 2^{[n]} \to {0,1} such that F(X_1, X_2, ..., X_t) = f_F(X_1 \cap X_2 \cap ... \cap X_t). We show that the t-party k-round communication complexity of F is Omega(s_m(f_F)/k^2), where s_m(f_F) stands for the 'monotone sensitivity of f_F' and is defined by s_m(f_F) := max_{S \subseteq [n]} |{i : f_F(S \cup {i}) \neq f_F(S)}|. For two-party quantum communication protocols for the set disjointness problem, this implies that the two parties must exchange Omega(n/k^2) qubits. For k = 1, our lower bound matches the Omega(n) lower bound observed by Buhrman and de Wolf (based on a result of Nayak), and for 2 <= k <= n^{1/4} it improves the lower bound of Omega(sqrt{n}) shown by Razborov. (For protocols with no restriction on the number of rounds, we can conclude that the two parties must exchange Omega(n^{1/3}) qubits. This, however, falls short of the optimal Omega(sqrt{n}) lower bound shown by Razborov.) Comment: 15 pages, content added and modified, references added.
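
    To make the definition concrete, a small brute-force sketch (exponential in n, for illustration only; the helper names are assumed). For set disjointness, F(X_1, ..., X_t) = 1 iff the intersection is empty, so f_F(S) = 1 iff S is the empty set; the maximum is attained at S = emptyset, giving s_m(f_F) = n, which matches the Omega(n/k^2) bound quoted above.

        from itertools import combinations

        def monotone_sensitivity(f, n):
            """Brute-force s_m(f) = max over S of |{i : f(S union {i}) != f(S)}| (exponential in n)."""
            best = 0
            for r in range(n + 1):
                for subset in combinations(range(n), r):
                    S = frozenset(subset)
                    flips = sum(1 for i in range(n) if f(S | {i}) != f(S))
                    best = max(best, flips)
            return best

        # f_F for set disjointness: f_F(S) = 1 iff S is the empty set.
        n = 6
        f_disj = lambda S: 1 if len(S) == 0 else 0
        print(monotone_sensitivity(f_disj, n))   # prints 6, i.e. s_m(f_F) = n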

    Rényi Information Complexity and an Information Theoretic Characterization of the Partition Bound

    Full text link
    We introduce a new information-theoretic complexity measure IC_infty for 2-party functions which is a lower bound on communication complexity, and which has the two leading lower bounds on communication complexity as its natural relaxations: (external) information complexity (IC) and the logarithm of partition complexity (prt), which have so far appeared conceptually quite different from each other. IC_infty is an external information complexity measure based on Rényi mutual information of order infinity. In the definition of IC_infty, relaxing the order of the Rényi mutual information from infinity to 1 yields IC, while log(prt) is obtained by replacing protocol transcripts with what we term "pseudotranscripts": this omits the interactive nature of a protocol and only requires that the probability of any transcript, given the inputs x and y to the two parties, factorizes into two terms which depend on x and y separately. Further understanding IC_infty might have consequences for important direct-sum problems in communication complexity, as it lies between communication complexity and information complexity. We also show that applying both of the above relaxations simultaneously to IC_infty gives a complexity measure that is lower bounded by the (logarithm of the) relaxed partition complexity, a complexity measure introduced by Kerenidis et al. (FOCS 2012). We obtain a sharper connection between (external) information complexity and relaxed partition complexity than Kerenidis et al., using an arguably more direct proof. Comment: Full version of paper appearing at ICALP 201
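
    For orientation, the order-infinity Rényi quantity underlying such a measure is built from the max-divergence; its standard form for discrete distributions P and Q is given below (how exactly the paper assembles this into the mutual-information-like measure IC_infty is not spelled out in this summary):

        \[
          D_{\infty}(P \,\|\, Q) \;=\; \log \max_{x \,:\, P(x) > 0} \frac{P(x)}{Q(x)},
        \]

    and relaxing the order from infinity to 1 turns this max-divergence into the Kullback-Leibler divergence underlying ordinary (external) information complexity.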

    Direct Sum Theorem for Bounded Round Quantum Communication Complexity

    Full text link
    We prove a direct sum theorem for bounded-round entanglement-assisted quantum communication complexity. To do so, we use the fully quantum definitions of information cost and complexity that we recently introduced, together with two facts: information is a lower bound on communication, and a direct sum property holds for quantum information complexity. We then give a protocol for compressing a single copy of a protocol down to its quantum information cost, up to terms depending on the number of rounds and the allowed increase in error. Two important tools used to derive this protocol are a smooth conditional min-entropy bound for a one-shot quantum state redistribution protocol, and the quantum substate theorem of Jain, Radhakrishnan and Sen (FOCS'02), which transforms this bound into a von Neumann conditional entropy bound. This result further establishes the newly introduced notions of quantum information cost and complexity as the correct quantum generalisations of the classical ones in the standard communication complexity setting. Finding such a quantum generalisation of information complexity was one of the open problems recently raised by Braverman (STOC'12). Comment: 25 pages, no figure, part of prelims taken from arXiv:1404.373
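
    Schematically, the argument described above chains three facts; the notation QCC_r and QIC_r for r-round quantum communication and information complexity is used here only for illustration, and the exact round- and error-dependent losses are not given in this summary:

        \[
          \mathrm{QCC}_{r}(f^{k}) \;\ge\; \mathrm{QIC}_{r}(f^{k})
          \;\ge\; k \cdot \mathrm{QIC}_{r}(f)
          \;\gtrsim\; k \cdot \mathrm{QCC}_{r}(f) ,
        \]

    where the first inequality is "information lower-bounds communication", the second is the direct sum property of quantum information complexity, and the last step (which hides the round- and error-dependent factors) comes from compressing a single protocol down to its quantum information cost.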

    The information cost of quantum memoryless protocols

    Full text link
    We consider memoryless quantum communication protocols, where the two parties do not possess any memory besides their classical input, and they take turns performing unitary operations on a pure quantum state that they exchange between them. Most known quantum protocols are of this type, and a deep connection between memoryless protocols and Bell inequality violations has recently been explored by Buhrman et al. We study the information cost of memoryless quantum protocols by looking at a canonical problem: bounded-round quantum communication protocols for the one-bit AND function. We directly prove a tight lower bound of Theta(log(k)/k) on the information cost of AND for k-round memoryless quantum protocols, for the input distribution needed for the Disjointness function. It is not clear whether memoryless protocols allow for a reduction between AND and Disjointness, due to the absence of private workspaces. We enhance the model by allowing the players to keep, in their private classical workspace, some classical private coins in addition to their input. Surprisingly, we show that every quantum protocol can be transformed into an equivalent quantum protocol with private coins that is perfectly private, i.e. the players learn only the value of the function and nothing more. Last, we consider the model where the players are allowed to use one-shot coins, i.e. private coins that can be used only once during the protocol. While in the classical case private coins and one-shot coins are equivalent, in the quantum case we prove that they are not. More precisely, we show that every quantum memoryless protocol with one-bit inputs that uses one-shot coins can be transformed into a memoryless quantum protocol without private coins, without increasing its information cost by much. Hence, while private coins always allow for private quantum protocols, one-shot coins do not. Comment: Corrected typos in the abstract

    A strong direct product theorem in terms of the smooth rectangle bound

    Full text link
    A strong direct product theorem states that, in order to solve k instances of a problem, if we provide less than k times the resource required to compute one instance, then the probability of overall success is exponentially small in k. In this paper, we consider the model of two-way public-coin communication complexity and show a strong direct product theorem for all relations in terms of the smooth rectangle bound, introduced by Jain and Klauck as a generic lower bound method in this model. Our result therefore uniformly implies a strong direct product theorem for all relations for which an (asymptotically) optimal lower bound can be provided using the smooth rectangle bound, for example Inner Product, Greater-Than, Set-Disjointness and Gap-Hamming Distance. Our result also implies near-optimal direct product results for several important functions and relations used to show exponential separations between classical and quantum communication complexity, for which near-optimal lower bounds are provided using the rectangle bound, for example by Raz [1999], Gavinsky [2008] and Klartag and Regev [2011]. In fact, we are not aware of any relation for which it is known that the smooth rectangle bound does not provide an optimal lower bound. The smooth rectangle bound subsumes many of the other lower bound methods, for example the rectangle bound (a.k.a. the corruption bound), the smooth discrepancy bound (a.k.a. the gamma_2 bound) which in turn subsumes the discrepancy bound, the subdistribution bound and the conditional min-entropy bound. We show our result using information-theoretic arguments. A key tool we use is a sampling protocol due to Braverman [2012], or rather a modification of it used by Kerenidis, Laplante, Lerays, Roland and Xiao [2012]. Comment: 12 pages, version 3, improved parameters in the main result
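
    In schematic form, the hierarchy described above reads as follows (up to constants and small additive terms; the shorthand srec, rec, sdisc and disc for the smooth rectangle, rectangle, smooth discrepancy and discrepancy bounds is used here only for illustration):

        \[
          R^{\mathrm{pub}}_{1/3}(f) \;\gtrsim\; \mathrm{srec}(f) \;\gtrsim\;
          \max\bigl\{ \mathrm{rec}(f),\ \mathrm{sdisc}(f) \bigr\},
          \qquad
          \mathrm{sdisc}(f) \;\gtrsim\; \mathrm{disc}(f),
        \]

    so a strong direct product theorem stated in terms of srec carries over to every relation whose optimal lower bound is (asymptotically) given by one of the subsumed bounds.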