95,350 research outputs found

    Protecting Patient Privacy: Strategies for Regulating Electronic Health Records Exchange

    Get PDF
The report offers policymakers 10 recommendations to protect patient privacy as New York State develops a centralized system for sharing electronic medical records. Those recommendations include:
    • Require that the electronic systems employed by HIEs be capable of sorting and segregating medical information in order to comply with the privacy protections guaranteed by New York and federal law. Presently, they cannot.
    • Offer patients the right to opt out of the system altogether. Currently, people's records can be uploaded to the system without their consent.
    • Require that patient consent forms offer clear information-sharing options. The forms should give patients three options: opt in and allow providers access to their electronic medical records, opt out except in the event of a medical emergency, or opt out altogether.
    • Prohibit and sanction the misuse of medical information. New York must protect patients from potential bad actors: the small minority of providers who may abuse information out of fear, prejudice or malice.
    • Prohibit the health information-sharing networks from selling data. The State Legislature should pass legislation barring the networks from selling patients' private health information

    PRUDEnce: A system for assessing privacy risk vs utility in data sharing ecosystems

    Get PDF
Data describing human activities are an important source of knowledge, useful for understanding individual and collective behavior and for developing a wide range of user services. Unfortunately, such data are sensitive, because people's whereabouts may allow the re-identification of individuals in a de-identified database. Data Providers must therefore apply some form of anonymization before sharing the data to lower the privacy risk, while also being aware of, and able to control, the resulting data quality, since the two factors are often in a trade-off. In this paper we propose PRUDEnce (Privacy Risk versus Utility in Data sharing Ecosystems), a system enabling a privacy-aware ecosystem for sharing personal data. It is based on a methodology for assessing both the empirical (not theoretical) privacy risk associated with the users represented in the data, and the data quality that can be guaranteed when only users not at risk are retained. Our proposal supports the Data Provider in exploring a repertoire of possible data transformations, with the aim of selecting one that yields an adequate trade-off between data quality and privacy risk. We study the practical effectiveness of our proposal on three data formats underlying many services, defined on real mobility data: presence data, trajectory data and road-segment data
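The selection loop described here can be sketched as follows. This is an illustrative toy, not the authors' code: the utility proxy (number of distinct values retained) and the uniqueness-based empirical risk measure are deliberately crude stand-ins.

```python
# Toy sketch of the PRUDEnce idea: measure empirical re-identification
# risk for each candidate transformation, then pick the transformation
# that keeps the most information while meeting a risk threshold.
from collections import Counter

def empirical_risk(records):
    """Fraction of users whose (transformed) record is unique, i.e.
    re-identifiable by an adversary who knows that record."""
    counts = Counter(records.values())
    unique = sum(1 for r in records.values() if counts[r] == 1)
    return unique / len(records)

def select_transformation(records, transformations, max_risk=0.2):
    """Return the transformation with the highest utility (here, number
    of distinct values retained) among those meeting the risk threshold."""
    best, best_utility = None, -1
    for name, t in transformations.items():
        transformed = {u: t(r) for u, r in records.items()}
        risk = empirical_risk(transformed)
        utility = len(set(transformed.values()))  # crude utility proxy
        if risk <= max_risk and utility > best_utility:
            best, best_utility = name, utility
    return best

# Presence data: user -> visited cell id (hypothetical example)
records = {"u1": 101, "u2": 101, "u3": 102, "u4": 103, "u5": 103}
transformations = {
    "identity": lambda c: c,
    "coarse":   lambda c: c // 10,   # merge nearby cells
}
print(select_transformation(records, transformations))  # → identity
```

Tightening `max_risk` pushes the selection toward coarser transformations, which is exactly the risk/quality dial the system exposes to the Data Provider.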

    A Privacy Protection Mechanism for Mobile Online Social Networks

    Full text link
A location-sharing system is the most critical component of mobile online social networks (MOSNs). Service providers store huge amounts of users' location information, and earlier work could guarantee neither location privacy nor social-network privacy to the user. To strengthen privacy against an inside attacker mounted by the service provider in MOSNs, we introduce a new architecture with multiple servers and a protected solution that supports location sharing among both friends and strangers. The friend set submitted with each user query is divided into multiple subsets by the location server, and when a user queries the server, data can be retrieved only for registered users rather than for all users. We implement privacy with three security levels: high, medium and low. Alongside location sharing, the scheme offers checkability of the search results returned from the servers. We also prove that the new construction is secure under a stronger security model with enhanced privacy

    Immutable Autobiography of Smart Cars Leveraging Blockchain Technology

    Get PDF
The popularity of smart cars is increasing around the world as they offer a wide range of services and conveniences. These smart cars are equipped with a variety of sensors generating a large amount of data, much of it critical. Moreover, multiple parties are involved in the lifespan of a smart car, such as manufacturers, car owners, government agencies, and third-party service providers, who also generate data about the vehicle. Beyond the challenge of managing and sharing data among these entities in a secure and privacy-friendly way, there is a trust deficit around some types of data that remain under the custody of the car owner (e.g. satellite-navigation and mileage data) and can easily be manipulated. In this paper, we propose a blockchain-assisted architecture enabling the owner of a smart car to create an immutable record of all data generated within its lifespan, called the autobiography of a car. We also explain how trust in this record is guaranteed by the immutability characteristic of the blockchain. Furthermore, the paper describes how the proposed architecture enables a secure and privacy-preserving mechanism for sharing smart car data among different parties
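The immutability argument rests on hash chaining: each record commits to the hash of its predecessor, so any retroactive edit breaks every subsequent link. A minimal sketch (names are illustrative; a real deployment would anchor these hashes on a blockchain rather than a Python list):

```python
# Minimal hash-chain sketch of the "immutable autobiography" idea: each
# event records the hash of the previous one, so changing any past entry
# invalidates the chain from that point on.
import hashlib, json

def append_event(chain, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    prev = "0" * 64
    for entry in chain:
        body = {"event": entry["event"], "prev": entry["prev"]}
        ok = (entry["prev"] == prev and
              entry["hash"] == hashlib.sha256(
                  json.dumps(body, sort_keys=True).encode()).hexdigest())
        if not ok:
            return False
        prev = entry["hash"]
    return True

chain = []
append_event(chain, {"type": "mileage", "km": 12000})
append_event(chain, {"type": "service", "part": "brakes"})
print(verify(chain))            # True: chain is consistent
chain[0]["event"]["km"] = 9000  # tamper with a past record
print(verify(chain))            # False: tampering is detected
```

Publishing only the latest hash to a public ledger is enough to make the whole history verifiable, which is why the architecture can keep the bulk of the data off-chain.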

    Data-Driven and Game-Theoretic Approaches for Privacy

    Get PDF
In the past few decades, there has been a remarkable shift in the boundary between public and private information. Information technology and electronic communications allow service providers (businesses) to collect large amounts of data. However, this "data collection" process can put the privacy of users at risk and lead to user reluctance to accept services or share data. This dissertation first investigates privacy-sensitive consumer-retailer/service-provider interactions under different scenarios, and then focuses on a unified framework for information-theoretic privacy and privacy mechanisms that can be learned directly from data. Existing approaches such as differential privacy or information-theoretic privacy try to quantify privacy risk but do not capture the subjective experience and heterogeneous expression of privacy sensitivity. The first part of the dissertation introduces models to study consumer-retailer interaction problems and to better understand how retailers/service providers can balance their revenue objectives while remaining sensitive to user privacy concerns. Three scenarios are considered: (i) consumer-retailer interaction via personalized advertisements; (ii) incentive mechanisms that electrical utility providers must offer to privacy-sensitive consumers with alternative energy sources; and (iii) the market viability of offering privacy-guaranteed free online services. Game-theoretic models capture the behaviors of both consumers and retailers and provide insights for retailers seeking to maximize profit when interacting with privacy-sensitive consumers. Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. In the second part, a novel context-aware privacy framework called generative adversarial privacy (GAP) is introduced. Inspired by recent advancements in generative adversarial networks, GAP allows the data holder to learn the privatization mechanism directly from the data. Under GAP, finding the optimal privacy mechanism is formulated as a constrained minimax game between a privatizer and an adversary. For appropriately chosen adversarial loss functions, GAP provides privacy guarantees against strong information-theoretic adversaries. Both synthetic and real-world datasets are used to show that GAP can greatly reduce the adversary's capability of inferring private information at a small cost of distorting the data.
    Doctoral Dissertation, Electrical Engineering, 201
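As a toy numerical illustration of the privacy-distortion trade-off that GAP searches over (this is not the adversarial training itself; the data model and the threshold adversary are synthetic assumptions):

```python
# A private bit S shifts the mean of an observed value X; the privatizer
# adds noise, and a simple threshold adversary tries to recover S. More
# noise lowers the adversary's inference accuracy at the price of higher
# distortion of the released data.
import numpy as np

rng = np.random.default_rng(0)
n = 20000
s = rng.integers(0, 2, n)              # private attribute
x = s + 0.3 * rng.standard_normal(n)   # released value correlated with s

for noise in (0.0, 0.5, 2.0):
    x_priv = x + noise * rng.standard_normal(n)  # privatization mechanism
    guess = (x_priv > 0.5).astype(int)           # adversary's inference
    acc = (guess == s).mean()                    # inference accuracy
    distortion = ((x_priv - x) ** 2).mean()      # utility cost
    print(f"noise={noise:.1f}  adversary acc={acc:.2f}  distortion={distortion:.2f}")
```

GAP's contribution is to learn the mechanism (here, the hand-picked noise level) and the adversary jointly from data, rather than sweeping a fixed family as this sketch does.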

    Locally Differentially Private Gradient Tracking for Distributed Online Learning over Directed Graphs

    Full text link
Distributed online learning has proven extremely effective in solving large-scale machine-learning problems over streaming data. However, information sharing between learners in distributed learning also raises concerns about the potential leakage of individual learners' sensitive data. To mitigate this risk, differential privacy, widely regarded as the "gold standard" for privacy protection, has been employed in many existing results on distributed online learning. However, these results often face a fundamental tradeoff between learning accuracy and privacy. In this paper, we propose a locally differentially private gradient-tracking based distributed online learning algorithm that successfully circumvents this tradeoff. We prove that the proposed algorithm converges in mean square to the exact optimal solution while ensuring rigorous local differential privacy, with the cumulative privacy budget guaranteed to be finite even as the number of iterations tends to infinity. The algorithm is applicable even when the communication graph among learners is directed. To the best of our knowledge, this is the first result that simultaneously ensures learning accuracy and rigorous local differential privacy in distributed online learning over directed graphs. We evaluate our algorithm's performance on multiple benchmark machine-learning applications, including logistic regression on the "Mushrooms" dataset and CNN-based image classification on the "MNIST" and "CIFAR-10" datasets. The experimental results confirm that the proposed algorithm outperforms existing counterparts in both training and testing accuracies.
    Comment: 21 pages, 4 figures
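The local-DP ingredient, clipping each shared gradient and adding Laplace noise calibrated to the clipping bound, can be sketched as follows. This shows only the perturbation mechanism, not the paper's gradient-tracking convergence scheme; the parameter values are illustrative.

```python
# Each learner perturbs the gradient it shares with Laplace noise scaled
# to the gradient's L1 clipping bound: two clipped gradients differ by at
# most 2*clip in L1 norm, so Laplace scale 2*clip/epsilon gives
# epsilon-local differential privacy for one release.
import numpy as np

def ldp_gradient(grad, clip=1.0, epsilon=0.5, rng=None):
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad, 1)
    if norm > clip:                    # clip to bound L1 sensitivity
        grad = grad * (clip / norm)
    scale = 2 * clip / epsilon         # Laplace scale for sensitivity 2*clip
    return grad + rng.laplace(0.0, scale, size=grad.shape)

rng = np.random.default_rng(1)
true_grad = np.array([0.6, -0.2])
noisy = ldp_gradient(true_grad, clip=1.0, epsilon=0.5, rng=rng)
print(noisy)  # a perturbed gradient safe to share with neighbors
```

The noise is unbiased, which is what lets averaging-style schemes recover accuracy over many iterations; the paper's contribution is keeping the *cumulative* budget finite while still converging exactly.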

    A privacy-friendly gaming framework in smart electricity and water grids

    Get PDF
Serious games can be used to push consumers of common-pool resources toward socially responsible consumption patterns. However, gamified interactions can result in privacy leaks and potential misuse of player-provided data. In the Smart Grid ecosystem, a smart metering framework providing some basic cryptographic primitives can enable the implementation of serious games in a privacy-friendly manner. This paper presents a smart metering architecture in which users have access to their own high-frequency data and can use them as input to a multi-party secure protocol. Authenticity and correctness of the data are guaranteed by the use of a public blockchain. The framework enables a gaming platform to administer a set of team game activities aimed at promoting more sustainable usage of energy and water. We discuss and assess the performance of a protocol based on Shamir's secret-sharing scheme, which enables the members of a team to calculate their overall consumption and to compare it with that of other teams without disclosing individual energy usage data. Additionally, the protocol prevents the gaming platform from learning the meter readings of the players (either individual or aggregated) or their challenge objectives
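The secret-sharing step can be sketched as follows, assuming a 3-player team and illustrative field parameters. Because Shamir sharing is linear, players can add the shares they hold locally, and reconstructing the summed shares reveals only the team total, never an individual reading.

```python
# Each team member Shamir-shares their meter reading among the players;
# players sum the shares they hold, and reconstructing the summed shares
# yields the team's total consumption without exposing any single reading.
import random

P = 2_147_483_647  # a Mersenne prime large enough for summed readings

def share(secret, n, t):
    """Split `secret` into n Shamir shares with reconstruction threshold t."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at 0 over GF(P)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

readings = [230, 180, 410]          # individual consumptions (private)
n, t = 3, 3
all_shares = [share(r, n, t) for r in readings]
# Player k sums the k-th share of every reading locally:
summed = [(k + 1, sum(s[k][1] for s in all_shares) % P) for k in range(n)]
print(reconstruct(summed))  # → 820, the team total
```

Any coalition smaller than the threshold `t` learns nothing about an individual reading, which is the property the paper relies on against both curious teammates and the gaming platform.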