
    Unified lower bounds for interactive high-dimensional estimation under information constraints

    We consider the task of distributed parameter estimation using interactive protocols subject to local information constraints such as bandwidth limitations, local differential privacy, and restricted measurements. We provide a unified framework enabling us to derive a variety of (tight) minimax lower bounds for different parametric families of distributions, both continuous and discrete, under any ℓ_p loss. Our lower bound framework is versatile and yields "plug-and-play" bounds that are widely applicable to a large range of estimation problems. In particular, our approach recovers bounds obtained using data processing inequalities and Cramér–Rao bounds, two alternative approaches for proving lower bounds in our setting of interest. Further, for the families considered, we complement our lower bounds with matching upper bounds.
    Comment: Significant improvements: handle sparse parameter estimation, simplify and generalize argument
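    The ℓ_p minimax risk that such lower bounds target can be written in the standard form (a textbook formulation, not quoted from the paper):

    ```latex
    \mathcal{R}^{*}_{p}(\Theta)
      \;=\;
      \inf_{\hat{\theta}} \, \sup_{\theta \in \Theta} \;
      \mathbb{E}_{\theta}\!\left[ \lVert \hat{\theta} - \theta \rVert_{p}^{p} \right]
    ```

    where the infimum ranges over all estimators computable from the transcripts of the constrained interactive protocol; a minimax lower bound asserts that this quantity exceeds a given rate for every protocol satisfying the information constraint.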

    On Collaboration in Distributed Parameter Estimation with Resource Constraints

    We study sensor/agent data collection and collaboration policies for parameter estimation, accounting for resource constraints and for correlation between observations collected by distinct sensors/agents. Specifically, we consider a group of sensors/agents, each of which samples from different variables of a multivariate Gaussian distribution and has different estimation objectives, and we formulate a sensor/agent's data collection and collaboration policy design problem as a Fisher information maximization (or Cramér–Rao bound minimization) problem. When knowledge of the correlation between variables is available, we analytically identify two particular scenarios: (1) where the knowledge of the correlation between samples cannot be leveraged for collaborative estimation, and (2) where the optimal data collection policy involves investing scarce resources to collaboratively sample and transfer information that is not of immediate interest and whose statistics are already known, with the sole goal of increasing confidence in the estimate of the parameter of interest. When knowledge of certain correlations is unavailable but collaboration may still be worthwhile, we propose novel ways to apply multi-armed bandit algorithms to learn the optimal data collection and collaboration policy in our distributed parameter estimation problem, and we demonstrate through simulations that the proposed algorithms, DOUBLE-F, DOUBLE-Z, UCB-F, and UCB-Z, are effective.
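    The Fisher-information/Cramér–Rao connection underlying the policy-design objective can be illustrated with a minimal sketch, assuming a single scalar Gaussian mean with known variance (an illustrative simplification, not the paper's multivariate, multi-agent setting):

    ```python
    import numpy as np

    def cramer_rao_bound(sigma2: float, n: int) -> float:
        """Variance lower bound for any unbiased estimator of a Gaussian mean.

        For n i.i.d. samples from N(theta, sigma2), the Fisher information is
        I(theta) = n / sigma2, so the Cramer-Rao bound is 1 / I(theta).
        Maximizing Fisher information is therefore equivalent to minimizing
        this bound, which is the duality the abstract invokes.
        """
        fisher_info = n / sigma2
        return 1.0 / fisher_info

    # The sample mean attains the bound: check empirically over many trials.
    rng = np.random.default_rng(0)
    sigma2, n, trials = 4.0, 50, 20000
    estimates = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n)).mean(axis=1)
    empirical_var = estimates.var()
    crb = cramer_rao_bound(sigma2, n)  # 4.0 / 50 = 0.08
    ```

    The empirical variance of the sample mean across trials sits close to the bound, confirming that the sample mean is an efficient estimator in this toy setting.
    
    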

    Beyond Classical Statistics: Optimality In Transfer Learning And Distributed Learning

    In modern statistical learning practice, statisticians deal with increasingly large, complicated, and structured data sets. Better-structured data and powerful analytic resources create new opportunities in the learning process, but large data sets also raise new challenges stemming from limits on computation, communication resources, or privacy. Under a decision-theoretic framework, statistical optimality must be reconsidered for new types of data and new constraints. Within the framework of minimax theory, this thesis addresses the following four problems:
    1. The first part develops an optimality theory for transfer learning for nonparametric classification and establishes a near-optimal adaptive classifier.
    2. The second part studies distributed Gaussian mean estimation with known variance under communication constraints; the exact distributed minimax rate of convergence is derived under three different communication protocols.
    3. The third part studies distributed Gaussian mean estimation with unknown variance under communication constraints; the results show that the additional communication cost depends on the type of underlying communication protocol.
    4. The fourth part investigates the minimax optimality and communication cost of adaptation for distributed nonparametric function estimation under communication constraints.
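    The distributed-estimation setting of parts 2–4 can be sketched with a toy protocol in which m machines each quantize their local sample mean to b bits before a server averages. This is a hypothetical simultaneous-message scheme for illustration, not the thesis's constructions:

    ```python
    import numpy as np

    def distributed_estimate(samples: np.ndarray, bits: int,
                             lo: float = -4.0, hi: float = 4.0) -> float:
        """Estimate a Gaussian mean from quantized local means.

        samples: shape (m, n); each row is one machine's local data.
        Each machine sends a b-bit midpoint-uniform quantization of its
        local mean over [lo, hi]; the server averages the m messages.
        Quantization adds roughly delta^2 / 12 variance per machine on top
        of the statistical variance, giving a communication/accuracy
        trade-off of the kind studied under communication constraints.
        """
        levels = 2 ** bits
        delta = (hi - lo) / levels
        local_means = samples.mean(axis=1)
        idx = np.clip(np.floor((local_means - lo) / delta), 0, levels - 1)
        quantized = lo + (idx + 0.5) * delta  # midpoint of each cell
        return float(quantized.mean())

    rng = np.random.default_rng(1)
    theta, m, n = 0.3, 100, 50
    samples = rng.normal(theta, 1.0, size=(m, n))
    est = distributed_estimate(samples, bits=8)
    ```

    With 8 bits per machine the quantization cells are narrow (delta ≈ 0.03), so the estimate is close to the centralized sample mean; shrinking the bit budget widens the cells and degrades the rate.
    
    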