
    Communication Lower Bounds for Statistical Estimation Problems via a Distributed Data Processing Inequality

    We study the tradeoff between the statistical error and communication cost of distributed statistical estimation problems in high dimensions. In the distributed sparse Gaussian mean estimation problem, each of the $m$ machines receives $n$ data points from a $d$-dimensional Gaussian distribution with unknown mean $\theta$, which is promised to be $k$-sparse. The machines communicate by message passing and aim to estimate the mean $\theta$. We provide a tight (up to logarithmic factors) tradeoff between the estimation error and the number of bits communicated between the machines. This directly leads to a lower bound for the distributed sparse linear regression problem: to achieve the statistical minimax error, the total communication is at least $\Omega(\min\{n,d\}m)$, where $n$ is the number of observations that each machine receives and $d$ is the ambient dimension. These lower bounds improve upon [Sha14, SD'14] by allowing a multi-round, iterative communication model. We also give the first optimal simultaneous protocol in the dense case for mean estimation. As our main technique, we prove a distributed data processing inequality, a generalization of the usual data processing inequality, which might be of independent interest and useful for other problems.
    Comment: To appear at STOC 2016. Fixed typos in Theorem 4.5 and incorporated reviewers' suggestions.
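    For orientation, the classical data processing inequality that the paper generalizes (stated here in its textbook form, not as the paper's theorem) says that post-processing cannot create information: for any Markov chain $\theta \to X \to Y$,
    \[ I(\theta; Y) \le I(\theta; X). \]
    The distributed variant proved in the paper plays the analogous role for the transcript of a low-communication protocol, bounding how much information the exchanged messages can reveal about $\theta$; its precise statement is given in the paper itself.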

    Brascamp-Lieb Inequality and Its Reverse: An Information Theoretic View

    We generalize a result by Carlen and Cordero-Erausquin on the equivalence between the Brascamp-Lieb inequality and the subadditivity of relative entropy by allowing for random transformations (a broadcast channel). This leads to a unified perspective on several functional inequalities that have been gaining popularity in the context of proving impossibility results. We demonstrate that the information-theoretic dual of the Brascamp-Lieb inequality is a convenient setting for proving properties such as data processing, tensorization, convexity, and Gaussian optimality. Consequences of the latter include an extension of the Brascamp-Lieb inequality allowing for Gaussian random transformations, the determination of the multivariate Wyner common information for Gaussian sources, and a multivariate version of Nelson's hypercontractivity theorem. Finally, we present an information-theoretic characterization of a reverse Brascamp-Lieb inequality involving a random transformation (a multiple access channel).
    Comment: 5 pages; to be presented at ISIT 2016.
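    For reference, a standard geometric form of the Brascamp-Lieb inequality (the deterministic version, not the channel-augmented one treated in the paper) reads: for surjective linear maps $B_j : \mathbb{R}^n \to \mathbb{R}^{n_j}$, exponents $c_j \ge 0$, and nonnegative $f_j \in L^1(\mathbb{R}^{n_j})$,
    \[ \int_{\mathbb{R}^n} \prod_{j=1}^m f_j(B_j x)^{c_j} \, dx \;\le\; C \prod_{j=1}^m \left( \int_{\mathbb{R}^{n_j}} f_j \right)^{c_j}, \]
    and by the Carlen and Cordero-Erausquin equivalence the same best constant $C$ governs the entropic dual
    \[ h(X) \;\le\; \sum_{j=1}^m c_j \, h(B_j X) + \log C \]
    over random vectors $X$ on $\mathbb{R}^n$ with finite differential entropy. The subadditivity of relative entropy mentioned in the abstract is the relative-entropy version of this dual.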

    Probabilistic Error Upper Bounds For Distributed Statistical Estimation

    The size of modern datasets has spurred interest in distributed statistical estimation. We consider a scenario in which randomly drawn data is spread across a set of machines, and the task is to provide an estimate of the location parameter from which the data was drawn. We provide a one-shot protocol for computing this estimate that generalizes results from Braverman et al. [2], who give a protocol under the assumption that the distribution is Gaussian, as well as from Duchi et al. [4], who assume that the distribution is supported on the compact set [−1, 1]. Like that of Braverman et al., our protocol is optimal in the case that the distribution is Gaussian.
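    To make the one-shot (simultaneous-message) setting concrete, here is a minimal illustrative sketch, not the paper's protocol: each machine quantizes its local sample mean to a few bits and sends a single message, and a coordinator averages the decoded values. All names and parameters (local_message, estimate, num_bits) are hypothetical, and for simplicity the sketch assumes the location parameter lies in a known bounded interval, unlike the unbounded Gaussian case handled in the paper.

    import numpy as np

    def local_message(samples, num_bits=8, lo=-1.0, hi=1.0):
        """One machine's message: its local sample mean, uniformly
        quantized to num_bits bits on the interval [lo, hi].
        Illustrative only; not the protocol of the paper."""
        levels = 2 ** num_bits
        mean = float(np.clip(np.mean(samples), lo, hi))
        # index of the nearest quantization level
        return round((mean - lo) / (hi - lo) * (levels - 1))

    def estimate(messages, num_bits=8, lo=-1.0, hi=1.0):
        """Coordinator decodes each quantized mean and averages them."""
        levels = 2 ** num_bits
        decoded = [lo + idx * (hi - lo) / (levels - 1) for idx in messages]
        return float(np.mean(decoded))

    # Example: m machines, n samples each, true location parameter theta
    rng = np.random.default_rng(0)
    m, n, theta = 20, 100, 0.3
    msgs = [local_message(rng.normal(theta, 1.0, size=n)) for _ in range(m)]
    print(estimate(msgs))  # close to theta; error = statistical + quantization

    Each machine communicates only num_bits bits and there is a single round, which is what makes the protocol one-shot; the interest of the paper is achieving optimal error in this regime without the Gaussian or bounded-support assumptions of the prior work it generalizes.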