9 research outputs found

    A Privacy-Preserving Asynchronous Averaging Algorithm based on Shamir’s Secret Sharing

    Get PDF
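    Only the title of this entry is listed here. For background, below is a minimal Python sketch of Shamir's (t, n) secret sharing over a prime field and of the linearity that makes it useful for privacy-preserving averaging: summing the nodes' shares pointwise and reconstructing reveals only the sum of the private values. This is the generic textbook construction, not necessarily the paper's asynchronous protocol; the threshold, field size, and toy values are illustrative choices.

```python
# Generic Shamir (t, n) secret sharing over a prime field, plus the additive trick
# that reveals only the SUM of several secrets. Illustrative only; not the paper's protocol.
import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for the toy values below

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    # The share for participant i is the degree-(t-1) polynomial evaluated at x = i.
    return [(i, sum(c * pow(i, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Three nodes hold private values and secret-share them with threshold t = 2.
values = [10, 20, 33]
all_shares = [share(v, t=2, n=3) for v in values]

# Shares are linear in the secrets, so participants can add the shares they hold;
# reconstructing from the summed shares reveals only the sum, never the inputs.
summed = [(x, sum(node_shares[k][1] for node_shares in all_shares) % PRIME)
          for k, (x, _) in enumerate(all_shares[0])]
print(reconstruct(summed[:2]) / len(values))  # -> 21.0, the average of 10, 20, 33
```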

    Privacy-Preserving Distributed Average Consensus based on Additive Secret Sharing

    Get PDF
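    Again only the title is listed. As a point of reference, here is a minimal sketch of additive secret sharing applied to a private sum/average: each node splits its value into random shares that sum to it, sends one share to each peer, and only the aggregated partial sums are published. This is the generic building block rather than the paper's full consensus protocol; the modulus and the three-node example are illustrative.

```python
# Additive secret sharing for a privacy-preserving sum/average. Generic construction
# for illustration; the paper's distributed consensus protocol may differ.
import random

MODULUS = 2**32  # arithmetic is done modulo a public constant

def additive_shares(value, n):
    """Split `value` into n random shares that sum to `value` modulo MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Each of three nodes splits its private value and sends one share to each peer.
values = [4, 7, 10]
n = len(values)
shares = [additive_shares(v, n) for v in values]   # shares[i][j] is node i's share for node j

# Node j adds up the shares it received; publishing these partial sums reveals
# only their total, i.e. the sum of all private inputs.
partial_sums = [sum(shares[i][j] for i in range(n)) % MODULUS for j in range(n)]
total = sum(partial_sums) % MODULUS
print(total / n)  # -> 7.0, the network average; individual values stay hidden
```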

    ODIN: Obfuscation-based privacy-preserving consensus algorithm for Decentralized Information fusion in smart device Networks

    Get PDF
    The large spread of sensors and smart devices in urban infrastructures is motivating research in the area of the Internet of Things (IoT) to develop new services and improve citizens’ quality of life. Sensors and smart devices generate large amounts of measurement data by sensing the environment, and these data enable services such as the control of power consumption or traffic density. To deal with such a large amount of information and provide accurate measurements, service providers can adopt information fusion, which, given the decentralized nature of urban deployments, can be performed by means of consensus algorithms. These algorithms allow distributed agents to (iteratively) compute linear functions on the exchanged data and to make decisions based on the outcome, without the support of a central entity. However, the use of consensus algorithms raises several security concerns, especially when private or security-critical information is involved in the computation. In this article we propose ODIN, a novel algorithm that enables information fusion over encrypted data. ODIN is a privacy-preserving extension of the popular consensus gossip algorithm, which prevents distributed agents from having direct access to the data while they iteratively reach consensus; agents cannot access even the final consensus value, but can only retrieve partial information (e.g., a binary decision). ODIN uses efficient additive obfuscation and proxy re-encryption during the update steps, and garbled circuits to make final decisions on the obfuscated consensus. We discuss the security of our proposal and show its practicability and efficiency on real-world resource-constrained devices, developing a prototype implementation for Raspberry Pi devices.
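    To ground the abstract, the sketch below shows the plain randomized gossip averaging that ODIN extends: randomly chosen neighbour pairs repeatedly average their values until every node holds the network mean. In ODIN these exchanges are additively obfuscated and proxy re-encrypted, and the final decision is taken inside a garbled circuit; none of that machinery is reproduced here, so the node values, topology, and threshold below are purely illustrative.

```python
# Plaintext randomized gossip averaging (the baseline ODIN builds on). ODIN replaces
# these cleartext exchanges with obfuscated, re-encrypted updates and releases only a
# partial output (e.g. a threshold decision) via a garbled circuit.
import random

def gossip_average(values, edges, rounds=2000, seed=0):
    """values: node_id -> local measurement; edges: list of (i, j) neighbour pairs."""
    rng = random.Random(seed)
    x = dict(values)
    for _ in range(rounds):
        i, j = rng.choice(edges)     # one random link wakes up per round
        avg = (x[i] + x[j]) / 2      # pairwise averaging step (preserves the network sum)
        x[i] = x[j] = avg
    return x

values = {0: 12.0, 1: 30.0, 2: 18.0, 3: 24.0}
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
state = gossip_average(values, edges)
print(state)            # every entry approaches the global mean 21.0
# The kind of binary decision ODIN would release while keeping the value itself hidden:
print(state[0] > 20.0)  # hypothetical threshold, for illustration only
```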

    On the Relationship Between Inference and Data Privacy in Decentralized IoT Networks

    Full text link
    In a decentralized Internet of Things (IoT) network, a fusion center receives information from multiple sensors to infer a public hypothesis of interest. To prevent the fusion center from abusing the sensor information, each sensor sanitizes its local observation using a local privacy mapping, which is designed to achieve both inference privacy of a private hypothesis and data privacy of the sensors' raw observations. Various inference and data privacy metrics have been proposed in the literature. We introduce the concepts of privacy implication and non-guarantee to study the relationships between these privacy metrics. We propose an optimization framework in which both local differential privacy (data privacy) and information privacy (inference privacy) metrics are incorporated. In the parametric case, where the sensor observations' distributions are known a priori, we propose a two-stage local privacy mapping at each sensor and show that such an architecture is able to achieve information privacy and local differential privacy to within the predefined budgets. For the nonparametric case, where the sensor distributions are unknown, we adopt an empirical optimization approach. Simulation and experimental results demonstrate that our proposed approaches allow the fusion center to accurately infer the public hypothesis while protecting both inference and data privacy.
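    As an illustration of the data-privacy half of this setup, the sketch below shows an epsilon-local-differentially-private randomized-response mapping that a sensor could apply to a quantized observation before reporting it. The paper's two-stage design additionally shapes the released distribution for inference (information) privacy; that stage is not reproduced here, and the alphabet and budget are made-up values.

```python
# k-ary randomized response: a standard local privacy mapping satisfying epsilon-LDP.
# Illustrates only the data-privacy component, not the paper's two-stage architecture.
import math
import random

def randomized_response(observation, alphabet, epsilon, rng=random):
    """Report the true symbol with probability e^eps / (e^eps + k - 1), else a random other symbol."""
    k = len(alphabet)
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if rng.random() < p_true:
        return observation
    return rng.choice([a for a in alphabet if a != observation])

alphabet = [0, 1, 2, 3]   # hypothetical quantized sensor readings
reports = [randomized_response(2, alphabet, epsilon=1.0) for _ in range(10)]
print(reports)  # mostly 2, occasionally another symbol; the ratio of any two output
                # probabilities is bounded by e^1.0, which is the local DP guarantee
```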

    Learning With Privacy in Consensus + Obfuscation

    No full text
    We examine the interplay between learning and privacy over multiagent consensus networks. The learning objective of each individual agent consists of computing some global network statistic, and is accomplished by means of a consensus protocol. The privacy objective consists of preventing inference of the individual agents' data from the information exchanged during the consensus stages, and is accomplished by adding artificial noise to the observations (obfuscation). An analytical characterization of the learning and privacy performance is provided, with reference to a consensus-perturbing and to a consensus-preserving obfuscation strategy.
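    A small numerical sketch of the distinction drawn here, under the common assumption that obfuscation means adding noise to the initial observations of an average-consensus recursion: independent noise shifts the consensus value (consensus-perturbing), whereas noise constrained to sum to zero over the network leaves it unchanged (consensus-preserving). The ring topology, weights, and noise level are illustrative and not taken from the paper.

```python
# Average consensus x(k+1) = W x(k) on a 5-node ring, comparing independent noise
# (biases the limit) with zero-sum noise (preserves the limit). Illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 5
x = rng.uniform(0.0, 10.0, size=n)   # private observations of the n agents

# Doubly stochastic mixing matrix for a ring: weight 1/3 on self and both neighbours.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1 / 3

def consensus(x0, W, iters=200):
    x = x0.copy()
    for _ in range(iters):
        x = W @ x
    return x

noise = rng.normal(0.0, 2.0, size=n)
perturbing = consensus(x + noise, W)                  # independent noise: limit is offset
preserving = consensus(x + noise - noise.mean(), W)   # zero-sum noise: limit unchanged

print(x.mean(), perturbing[0], preserving[0])
# preserving[0] matches the true average; perturbing[0] is offset by noise.mean()
```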
