
    Quantifying Differential Privacy in Continuous Data Release under Temporal Correlations

    Differential Privacy (DP) has received increasing attention as a rigorous privacy framework. Many existing studies employ traditional DP mechanisms (e.g., the Laplace mechanism) as primitives to continuously release private data and protect privacy at each time point (i.e., event-level privacy); these approaches assume that the data at different time points are independent, or that adversaries have no knowledge of the correlations between them. However, continuously generated data tend to be temporally correlated, and such correlations can be acquired by adversaries. In this paper, we investigate the potential privacy loss of a traditional DP mechanism under temporal correlations. First, we analyze the privacy leakage of a DP mechanism under temporal correlations that can be modeled by a Markov chain. Our analysis reveals that the event-level privacy loss of a DP mechanism may increase over time. We call this unexpected privacy loss temporal privacy leakage (TPL). Although TPL may increase over time, we find that its supremum exists in some cases. Second, we design efficient algorithms for calculating TPL. Third, we propose data release mechanisms that convert any existing DP mechanism into one that protects against TPL. Experiments confirm that our approach is efficient and effective.
    Comment: Accepted in the TKDE special issue "Best of ICDE 2017". arXiv admin note: substantial text overlap with arXiv:1610.0754
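
    As a concrete illustration of the primitive this abstract refers to, the sketch below applies the standard Laplace mechanism independently at each time point of a count stream. It is a minimal, hypothetical example (the function name and parameters are not taken from the paper): each release is epsilon-DP in isolation, but under temporal correlations known to the adversary the effective event-level loss can grow, which is the temporal privacy leakage the paper quantifies.

```python
import numpy as np

def laplace_release(true_counts, epsilon, sensitivity=1.0):
    """Event-level DP release: add Laplace(sensitivity / epsilon) noise at each time point."""
    rng = np.random.default_rng()
    scale = sensitivity / epsilon
    return [c + rng.laplace(0.0, scale) for c in true_counts]

# Releasing a hypothetical daily count stream with epsilon = 1 per time point.
# If the counts follow a Markov model known to the adversary, the effective
# event-level privacy loss can exceed the nominal epsilon over time (TPL).
noisy_stream = laplace_release([120, 118, 131, 127], epsilon=1.0)
print(noisy_stream)
```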

    On Differential Privacy and Traffic State Estimation Problem for Connected Vehicles

    This letter focuses on the problem of traffic state estimation for highway networks with junctions in the form of on- and off-ramps, while maintaining differential privacy of the traffic data. Two types of sensors are considered: fixed sensors, such as inductive loop detectors, and connected vehicles, which provide traffic density and speed data. The celebrated nonlinear second-order Aw-Rascle-Zhang (ARZ) model is used to describe the traffic dynamics and is formulated as a nonlinear state-space difference equation. Sensitivity relations are derived for the given data and then used to formulate a differentially private mechanism that adds Gaussian noise to the data. A Moving Horizon Estimation (MHE) approach is implemented for traffic state estimation using a linearized ARZ model. MHE is compared with Kalman filter variants, namely the Extended Kalman Filter, the Ensemble Kalman Filter, and the Unscented Kalman Filter. Several research and engineering questions are formulated, and analysis is performed to answer them.
    Comment: To appear in the 61st IEEE Conference on Decision and Control (CDC), Cancun, Mexico, December 2022. arXiv admin note: text overlap with arXiv:2209.0284
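
    The mechanism described above adds sensitivity-calibrated Gaussian noise to the sensor measurements. The sketch below shows the classical Gaussian mechanism with the textbook calibration sigma = sqrt(2 ln(1.25/delta)) * Delta_2 / epsilon (valid for epsilon < 1); it is an assumption-laden illustration, not the paper's exact sensitivity derivation or noise calibration, and all names and values are hypothetical.

```python
import numpy as np

def gaussian_mechanism(data, l2_sensitivity, epsilon, delta):
    """Perturb a measurement vector with Gaussian noise for (epsilon, delta)-DP.

    Uses the classical calibration sigma = sqrt(2 ln(1.25/delta)) * Delta_2 / epsilon,
    valid for epsilon < 1. The paper derives the sensitivity of the traffic data
    itself; here it is simply passed in as l2_sensitivity.
    """
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    rng = np.random.default_rng()
    return data + rng.normal(0.0, sigma, size=np.shape(data))

# Privatizing a hypothetical density/speed measurement vector before it is fed
# to the MHE or Kalman-filter estimators.
measurement = np.array([42.0, 38.5, 55.2])
private_measurement = gaussian_mechanism(measurement, l2_sensitivity=2.0,
                                         epsilon=0.5, delta=1e-5)
print(private_measurement)
```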

    PRUDEnce: A system for assessing privacy risk vs utility in data sharing ecosystems

    Data describing human activities are an important source of knowledge, useful for understanding individual and collective behavior and for developing a wide range of user services. Unfortunately, this kind of data is sensitive, because people's whereabouts may allow the re-identification of individuals in a de-identified database. Therefore, before sharing such data, Data Providers must apply some form of anonymization to lower the privacy risk, but they must also be aware of, and able to control, the resulting data quality, since these two factors are often in a trade-off. In this paper we propose PRUDEnce (Privacy Risk versus Utility in Data sharing Ecosystems), a system enabling a privacy-aware ecosystem for sharing personal data. It is based on a methodology for assessing both the empirical (not theoretical) privacy risk associated with the users represented in the data, and the data quality that is guaranteed when only users not at risk are retained. Our proposal supports the Data Provider in exploring a repertoire of possible data transformations, with the aim of selecting one that yields an adequate trade-off between data quality and privacy risk. We study the practical effectiveness of our proposal on three data formats underlying many services, defined on real mobility data, i.e., presence data, trajectory data, and road segment data.
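
    To make the notion of empirical privacy risk more concrete, the sketch below estimates a re-identification risk for mobility data under a simple simulated attack in which the adversary knows h locations visited by a user. This attack model, and every name in the code, is an illustrative assumption, not the definition used by PRUDEnce; a Data Provider could compare such a risk distribution against a data-quality measure for each candidate transformation when choosing a trade-off.

```python
import random

def empirical_reidentification_risk(trajectories, h=2, trials=100, seed=0):
    """Illustrative empirical risk per user under an h-location background-knowledge attack.

    trajectories: dict mapping user_id -> set of visited locations.
    For each trial the adversary learns h of the user's locations; the risk of
    that trial is 1 / (number of users whose data contains those locations),
    and the per-user risk is the average over random trials.
    """
    rng = random.Random(seed)
    risks = {}
    for user, locs in trajectories.items():
        total = 0.0
        for _ in range(trials):
            known = set(rng.sample(sorted(locs), min(h, len(locs))))
            matches = sum(1 for other in trajectories.values() if known <= other)
            total += 1.0 / matches
        risks[user] = total / trials
    return risks

# Tiny hypothetical dataset: users and the locations they visited.
data = {"u1": {"A", "B", "C"}, "u2": {"A", "B", "D"}, "u3": {"C", "D", "E"}}
print(empirical_reidentification_risk(data, h=2))
```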

    Secure Mix-Zones for Privacy Protection of Road Network Location Based Services Users
