    Differentially Private Numerical Vector Analyses in the Local and Shuffle Model

    Numerical vector aggregation plays a crucial role in privacy-sensitive applications, such as distributed gradient estimation in federated learning and statistical analysis of key-value data. In the context of local differential privacy, this study provides a tight minimax error bound of $O(\frac{ds}{n\epsilon^2})$, where $d$ is the dimension of the numerical vector and $s$ is the number of non-zero entries. By converting the conditional/unconditional numerical mean estimation problem into a frequency estimation problem, we develop an optimal and efficient mechanism called Collision. In contrast, existing methods exhibit sub-optimal error rates of $O(\frac{d^2}{n\epsilon^2})$ or $O(\frac{ds^2}{n\epsilon^2})$. For unconditional mean estimation, we further leverage the negative correlation between two frequencies in each dimension and propose the CoCo mechanism, which reduces estimation errors for mean values beyond Collision. Moreover, to surpass the error barrier of local privacy, we examine privacy amplification in the shuffle model for the proposed mechanisms and derive precisely tight amplification bounds. Our experiments validate and compare our mechanisms with existing approaches, demonstrating significant error reductions for frequency estimation and mean estimation on numerical vectors.

    Comment: Full version of "Hiding Numerical Vectors in Local Private and Shuffled Messages" (IJCAI 2021)
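    To make the problem setting concrete, below is a minimal, illustrative sketch of local-DP mean estimation for sparse numerical vectors. It is not the paper's Collision or CoCo mechanism (whose details are not given in this abstract): it assumes each user holds a vector in [-1, 1]^d with exactly s non-zero entries, samples one non-zero coordinate, rounds its value to a single bit without bias, and privatizes the bit with randomized response; the server debiases and rescales to estimate per-dimension means. The function names (`perturb`, `estimate_means`) and this particular 1-bit scheme are assumptions for illustration.

```python
# Illustrative baseline only; NOT the Collision/CoCo mechanism from the paper.
# Each user samples one of its s non-zero coordinates, rounds the value in
# [-1, 1] to a {-1, +1} bit without bias, and flips the bit via randomized
# response with privacy budget epsilon. The server debiases and rescales.
import math
import random


def perturb(vector, epsilon):
    """User side: report (coordinate index, privatized bit).

    Assumes the vector has s non-zero entries, each in [-1, 1]."""
    support = [j for j, v in enumerate(vector) if v != 0.0]
    j = random.choice(support)                               # sample one held coordinate
    v = max(-1.0, min(1.0, vector[j]))
    bit = 1 if random.random() < (1.0 + v) / 2.0 else -1     # unbiased 1-bit rounding
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)   # randomized response
    return j, bit if random.random() < p_keep else -bit


def estimate_means(reports, d, n, s, epsilon):
    """Server side: unbiased estimate of the per-dimension mean over all n users."""
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    debias = 1.0 / (2.0 * p_keep - 1.0)      # undo the expected bit flip
    sums = [0.0] * d
    for j, bit in reports:
        sums[j] += bit * debias
    # Each non-zero entry is reported with probability 1/s, hence the factor s.
    return [s * t / n for t in sums]


if __name__ == "__main__":
    random.seed(0)
    d, s, n, eps = 20, 3, 50_000, 2.0
    users = []
    for _ in range(n):
        v = [0.0] * d
        for j in random.sample(range(d), s):
            v[j] = random.uniform(-1.0, 1.0)
        users.append(v)
    reports = [perturb(v, eps) for v in users]
    est = estimate_means(reports, d, n, s, eps)
    truth = [sum(u[j] for u in users) / n for j in range(d)]
    print("max abs error:", max(abs(a - b) for a, b in zip(est, truth)))
```

    The sketch only illustrates the report-and-debias workflow; per the abstract, the paper's mechanisms achieve the tight $O(\frac{ds}{n\epsilon^2})$ error and additionally benefit from privacy amplification in the shuffle model.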