1,655 research outputs found

    A novel temporal perturbation based privacy-preserving scheme for real-time monitoring systems

    In real-time monitoring systems, participants' privacy can easily be exposed when adversaries obtain the time series of sensing measurements accurately. To address these privacy issues, a number of privacy-preserving schemes have been designed for various monitoring applications. However, these schemes either lack consideration for temporal privacy, offer little resistance to filtering attacks, or introduce time delays that reduce utility. In this paper, we introduce a lightweight temporal perturbation-based scheme in which sensor readings are buffered and reordered to obfuscate the temporal information of the original measurement stream under differential privacy. In addition, we design server-side operations that exploit the data utility of measurements from a large number of sensors. We evaluate the performance of the proposed scheme through both rigorous theoretical analysis and extensive simulation experiments, in comparison with related existing schemes. The results show that the proposed scheme preserves both temporal privacy and measurement privacy with filter resistance, and achieves better performance in terms of computational overhead and the data utility of real-time aggregation and individual accumulation.
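    The buffering-and-reordering idea in this abstract can be sketched as follows. This is an illustrative Python sketch, not the authors' actual scheme: the buffer size, the Laplace noise parameters, and the `perturb_stream` helper are all assumptions, and a final partial buffer is simply dropped for brevity.

```python
import random

def laplace_noise(scale, rng):
    # The difference of two i.i.d. exponentials is Laplace(0, scale).
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def perturb_stream(readings, buffer_size, epsilon, sensitivity, seed=0):
    """Buffer readings, then release each full buffer in shuffled order
    with Laplace noise, obfuscating both timing and values."""
    rng = random.Random(seed)
    out, buf = [], []
    for r in readings:
        buf.append(r)
        if len(buf) == buffer_size:
            rng.shuffle(buf)  # hide the temporal order within the buffer
            scale = sensitivity / epsilon
            out.extend(v + laplace_noise(scale, rng) for v in buf)
            buf.clear()
    return out  # a trailing partial buffer is dropped in this sketch
```

    A smaller epsilon yields stronger measurement privacy at the cost of noisier aggregates, while the buffer size trades temporal privacy against release delay.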

    Emerging privacy challenges and approaches in CAV systems

    The growth of Internet-connected devices, Internet-enabled services and Internet of Things systems continues at a rapid pace, and their application to transport systems is heralded as game-changing. Numerous developing CAV (Connected and Autonomous Vehicle) functions, such as traffic planning, optimisation, management, and safety-critical and cooperative autonomous driving applications, rely on data from various sources. The efficacy of these functions is highly dependent on the dimensionality, amount and accuracy of the data being shared. It holds, in general, that the greater the amount of data available, the greater the efficacy of the function. However, much of this data is privacy-sensitive, including personal, commercial and research data. Location data and its correlation with identity and temporal data can help infer other personal information, such as home/work locations, age, job, behavioural features, habits, and social relationships. This work categorises the emerging privacy challenges and solutions for CAV systems and identifies the knowledge gaps for future research that will minimise and mitigate privacy concerns without hampering the efficacy of these functions.

    Smart Meter Privacy: A Utility-Privacy Framework

    End-user privacy in smart meter measurements is a well-known challenge in the smart grid. The solutions offered thus far have been tied to specific technologies, such as batteries, or to assumptions on data usage. Existing solutions have also not quantified the loss of benefit (utility) that results from any such privacy-preserving approach. Using tools from information theory, a new framework is presented that abstracts both the privacy and the utility requirements of smart meter data. This leads to a novel privacy-utility trade-off problem that is tractable under minimal assumptions. Specifically, for a stationary Gaussian Markov model of the electricity load, it is shown that the optimal utility-and-privacy-preserving solution requires filtering out frequency components that are low in power, and this approach appears to encompass most of the proposed privacy approaches. Comment: Accepted for publication and presentation at the IEEE SmartGridComm 201
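    The abstract's conclusion, that the optimal solution filters out low-power frequency components, can be illustrated with a toy spectral filter. This is only a sketch of the filtering idea, not the paper's information-theoretic construction; the DFT helpers and the power threshold are assumptions introduced here.

```python
import cmath

def dft(x):
    # Forward DFT with 1/n normalization.
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) / n
            for k in range(n)]

def idft(X):
    # Inverse DFT matching the normalization above; returns real parts.
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real
            for t in range(n)]

def suppress_low_power(load, threshold):
    """Zero every frequency component whose power is below `threshold`,
    then reconstruct the (privacy-filtered) load signal."""
    X = dft(load)
    X = [c if abs(c) ** 2 >= threshold else 0j for c in X]
    return idft(X)
```

    With a threshold of zero the signal is reconstructed unchanged; raising the threshold progressively removes the weak components that, per the paper, carry the most privacy-sensitive detail.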

    Trading Indistinguishability-based Privacy and Utility of Complex Data

    The collection and processing of complex data, such as structured data or infinite streams, facilitates novel applications. At the same time, it raises privacy requirements from the data owners. Consequently, data administrators use privacy-enhancing technologies (PETs) to sanitize the data, which are frequently based on indistinguishability-based privacy definitions. When engineering PETs, a well-known challenge is the privacy-utility trade-off. Although the literature covers a number of trade-offs, there are still combinations of involved entities, privacy definition, type of data, and application for which valuable trade-offs are missing. In this thesis, for two important groups of applications processing complex data, we study (a) which indistinguishability-based privacy and utility requirements are relevant, (b) whether existing PETs solve the trade-off sufficiently, and (c) how to design novel PETs that extend the state of the art substantially in terms of methodology as well as achieved privacy or utility. Overall, we provide four contributions divided into two parts. In the first part, we study applications that analyze structured data with distance-based mining algorithms. We reveal that an essential utility requirement is the preservation of the pairwise distances of the data items. Consequently, we propose distance-preserving encryption (DPE), together with a general procedure for engineering respective PETs by leveraging existing encryption schemes. As a proof of concept, we apply it to SQL log mining, which is useful for database performance tuning. In the second part, we study applications that monitor query results over infinite streams. To this end, w-event differential privacy is the state of the art. Here, PETs use mechanisms that typically add noise to query results. First, we study state-of-the-art mechanisms with respect to the utility they provide. Conducting the largest benchmark to date that fulfills requirements derived from the limitations of prior experimental studies, we contribute new insights into the strengths and weaknesses of existing mechanisms. One of the most unexpected, yet explainable, results is a baseline supremacy: one of the two baseline mechanisms delivers high or even the best utility. A natural follow-up question is whether baseline mechanisms already provide reasonable utility, so, second, we perform a case study from the area of electricity grid monitoring, revealing two results. First, achieving reasonable utility is only possible under weak privacy requirements. Second, the utility measured with application-specific utility metrics decreases faster than the sanitization error, which is used as the utility metric in most studies, suggests. As a third contribution, we propose a novel differential privacy-based privacy definition called Swellfish privacy. It allows tuning utility beyond incremental w-event mechanism design by supporting time-dependent privacy requirements. Formally, as well as by experiments, we prove that it increases utility significantly. In total, our thesis contributes substantially to the research field and reveals directions for future research.
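    The distance-preservation requirement from the first part of the thesis can be illustrated with a minimal sketch: any isometry, here a rotation plus translation under the hypothetical name `dpe_encrypt`, keeps pairwise distances intact, which is the property DPE must guarantee. The thesis's actual cryptographic construction is not shown here.

```python
import math

def dpe_encrypt(points, angle, shift):
    """Toy isometry: rotate 2-D points by `angle`, then translate by `shift`.
    Pairwise Euclidean distances are exactly preserved."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y + shift[0], s * x + c * y + shift[1])
            for x, y in points]

def dist(p, q):
    # Euclidean distance between two 2-D points.
    return math.hypot(p[0] - q[0], p[1] - q[1])
```

    A distance-based mining algorithm such as clustering run on the transformed points produces the same groupings as on the originals, since it only consults pairwise distances.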

    Preserving Both Privacy and Utility in Network Trace Anonymization

    As network security monitoring grows more sophisticated, there is an increasing need to outsource such tasks to third-party analysts. However, organizations are usually reluctant to share their network traces due to privacy concerns over sensitive information, e.g., network and system configuration, which may potentially be exploited for attacks. In cases where data owners are convinced to share their network traces, the data are typically subjected to certain anonymization techniques, e.g., CryptoPAn, which replaces real IP addresses with prefix-preserving pseudonyms. However, most such techniques either are vulnerable to adversaries with prior knowledge about some network flows in the traces, or require heavy data sanitization or perturbation, both of which may result in a significant loss of data utility. In this paper, we aim to preserve both privacy and utility by shifting the trade-off from one between privacy and utility to one between privacy and computational cost. The key idea is for the analysts to generate and analyze multiple anonymized views of the original network traces; these views are designed to be sufficiently indistinguishable even to adversaries armed with prior knowledge, which preserves privacy, whereas one of the views yields true analysis results that are privately retrieved by the data owner, which preserves utility. We present the general approach and instantiate it based on CryptoPAn. We formally analyze the privacy of our solution and experimentally evaluate it using real network traces provided by a major ISP. The results show that our approach can significantly reduce the level of information leakage (e.g., less than 1% of the information leaked by CryptoPAn) with comparable utility.
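    The prefix-preserving property attributed to CryptoPAn can be illustrated with a toy keyed construction. The real CryptoPAn uses AES; the SHA-256-based `pp_bit` here is purely an assumption for the sketch. The key point is that output bit i depends only on input bits 0..i-1 and bit i itself, so addresses sharing a prefix map to pseudonyms sharing a prefix of the same length.

```python
import hashlib

def pp_bit(key, prefix_bits):
    # Keyed pseudorandom bit derived from the bits seen so far.
    h = hashlib.sha256(key + prefix_bits.encode()).digest()
    return h[0] & 1

def anonymize_ip(ip, key=b"demo-key"):
    """Toy prefix-preserving pseudonymization of a dotted-quad IPv4 address:
    bit i is XORed with a keyed function of the first i bits."""
    bits = "".join(f"{int(octet):08b}" for octet in ip.split("."))
    out = "".join(str(int(b) ^ pp_bit(key, bits[:i]))
                  for i, b in enumerate(bits))
    return ".".join(str(int(out[i:i + 8], 2)) for i in range(0, 32, 8))
```

    Because pseudonyms are deterministic under a fixed key, flow-level analysis still works, which is exactly why prior knowledge about some flows can endanger the rest of the trace, the weakness the paper's multi-view approach addresses.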