
    Identifying privacy risks in distributed data services: A model-driven approach

    Online services are becoming increasingly data-centric; they collect, process, analyze, and anonymously disclose growing amounts of personal data. It is crucial that such systems are engineered in a privacy-aware manner in order to satisfy both the privacy requirements of the user and the legal privacy regulations that the system operates under. How can system developers be better supported to create privacy-aware systems and helped to understand and identify privacy risks? Model-Driven Engineering (MDE) offers a principled approach to engineering systems software. The capture of shared domain knowledge in models, together with corresponding tool support, can increase developers' understanding. In this paper, we argue for the application of MDE approaches to engineering privacy-aware systems. We present a general-purpose privacy model and methodology that can be used to analyse and identify privacy risks in systems that comprise both access control and data pseudonymization enforcement technologies. We evaluate this method using a case-study-based approach and show how the model can be applied to engineer privacy-aware systems and privacy policies that reduce the risk of unintended disclosure.
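
    For intuition only, here is a minimal Python sketch of the kind of check such a model enables: flows of personal data that reach a sink with neither access control nor pseudonymization are flagged as risks. The DataFlow fields and the risk rule are illustrative assumptions, not the paper's actual metamodel.

        # Minimal illustrative sketch: a system is modeled as a set of data flows;
        # a flow is a privacy risk when personal data reaches a sink that is
        # neither access-controlled nor pseudonymized.
        from dataclasses import dataclass

        @dataclass
        class DataFlow:
            source: str
            sink: str
            personal_data: bool        # does the flow carry personal data?
            access_controlled: bool    # is the sink guarded by access control?
            pseudonymized: bool        # is the data pseudonymized before disclosure?

        def privacy_risks(flows):
            """Return flows where personal data is disclosed without any safeguard."""
            return [f for f in flows
                    if f.personal_data and not (f.access_controlled or f.pseudonymized)]

        flows = [
            DataFlow("patient_db", "research_portal", True, False, True),
            DataFlow("patient_db", "public_dashboard", True, False, False),  # risky
        ]
        for f in privacy_risks(flows):
            print(f"Risk: {f.source} -> {f.sink} discloses raw personal data")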

    On trust and privacy in context-aware systems

    Recent advances in networking, handheld computing, and sensor technologies have led to the emergence of context-aware systems. The vast amounts of personal information collected by such systems have led to growing concerns about the privacy of their users. Users concerned about their private information are likely to refuse participation in such systems. Therefore, it is quite clear that for any context-aware system to be acceptable to users, mechanisms for controlling access to personal information are a necessity. According to Alan Westin, "privacy is the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information is communicated to others" [1]. Within this context we can classify users as either information owners or information receivers. It is also acknowledged that information owners are willing to disclose personal information if this disclosure is potentially beneficial. So, the acceptance of any context-aware system depends on the provision of mechanisms for fine-grained control of the disclosure of personal information incorporating an explicit notion of benefit.
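
    As a toy illustration of disclosure control with an explicit notion of benefit (not the authors' mechanism), the sketch below picks the finest location granularity whose owner-assigned sensitivity is still outweighed by the expected benefit of disclosing it; the granularity levels and scores are assumptions.

        # Toy sketch: disclose context at the finest granularity whose
        # owner-assigned sensitivity is covered by the expected benefit.
        GRANULARITIES = ["none", "city", "street", "exact"]

        def disclosure_level(sensitivity_by_level, benefit):
            """Return the finest granularity whose sensitivity the benefit outweighs."""
            chosen = "none"
            for level in GRANULARITIES[1:]:
                if benefit >= sensitivity_by_level[level]:
                    chosen = level
            return chosen

        # The owner rates how sensitive each location granularity is (arbitrary units).
        sensitivity = {"city": 1, "street": 3, "exact": 8}
        print(disclosure_level(sensitivity, benefit=4))    # -> "street"
        print(disclosure_level(sensitivity, benefit=0.5))  # -> "none"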

    Privacy-Aware Processing of Biometric Templates by Means of Secure Two-Party Computation

    The use of biometric data for person identification and access control is gaining more and more popularity. Handling biometric data, however, requires particular care, since biometric data is indissolubly tied to the identity of the owner, hence raising important security and privacy issues. This chapter focuses on the latter, presenting an innovative approach that, by relying on tools borrowed from Secure Two-Party Computation (STPC) theory, makes it possible to process the biometric data in encrypted form, thus eliminating any risk that private biometric information is leaked during an identification process. The basic concepts behind STPC are reviewed together with the basic cryptographic primitives needed to achieve privacy-aware processing of biometric data in an STPC context. The two main approaches proposed so far, namely homomorphic encryption and garbled circuits, are discussed, and the way such techniques can be used to develop a full biometric matching protocol is described. Some general guidelines to be used in the design of a privacy-aware biometric system are given, so as to allow the reader to choose the most appropriate tools depending on the application at hand.
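
    A minimal sketch of the homomorphic-encryption route, using the third-party python-paillier (phe) package: the server computes an encryption of the squared Euclidean distance between its reference template and the client's encrypted probe without ever seeing the probe in the clear. The helper names and toy vectors are assumptions, not the chapter's protocol, and a real system would still need a further subprotocol (e.g., a garbled-circuit comparison) to match the distance against an acceptance threshold.

        # Sketch of privacy-aware matching with additively homomorphic (Paillier)
        # encryption via the `phe` package. The client holds probe y; the server
        # holds reference x and computes E(||x - y||^2); only the client decrypts.
        from phe import paillier

        def encrypt_probe(pub, y):
            # Client sends E(y_i) for each component and E(sum y_i^2).
            return [pub.encrypt(v) for v in y], pub.encrypt(sum(v * v for v in y))

        def encrypted_distance(pub, enc_y, enc_y_sq, x):
            # Server: E(d) = E(sum y_i^2) + E(sum x_i^2) + sum_i E(y_i) * (-2 x_i)
            enc_d = enc_y_sq + pub.encrypt(sum(v * v for v in x))
            for enc_yi, xi in zip(enc_y, x):
                enc_d = enc_d + enc_yi * (-2 * xi)
            return enc_d

        pub, priv = paillier.generate_paillier_keypair(n_length=1024)
        probe, reference = [3, 1, 4], [2, 1, 5]
        enc_y, enc_y_sq = encrypt_probe(pub, probe)
        print(priv.decrypt(encrypted_distance(pub, enc_y, enc_y_sq, reference)))  # 2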

    Keeping Context In Mind: Automating Mobile App Access Control with User Interface Inspection

    Recent studies observe that the app foreground is the most striking component that influences access control decisions on mobile platforms, as users tend to deny permission requests lacking visible evidence. However, none of the existing permission models provides a systematic approach that can automatically answer the question: is the resource access indicated by the app foreground? In this work, we present the design, implementation, and evaluation of COSMOS, a context-aware mediation system that bridges the semantic gap between foreground interaction and background access, in order to protect system integrity and user privacy. Specifically, COSMOS learns from a large set of apps with similar functionalities and user interfaces to construct generic models that detect outliers at runtime. It can be further customized to satisfy specific user privacy preferences by continuously evolving with user decisions. Experiments show that COSMOS achieves both high precision and high recall in detecting malicious requests. We also demonstrate the effectiveness of COSMOS in capturing specific user preferences using the decisions collected from 24 users and illustrate that COSMOS can be easily deployed on smartphones as a real-time guard with a very low performance overhead.
    Comment: Accepted for publication in IEEE INFOCOM'201
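
    As a toy stand-in for the idea (not COSMOS itself, which learns its models from many functionally similar apps), the sketch below checks whether a permission request is indicated by text visible in the foreground UI; the permission-to-keyword hints are assumptions.

        # Toy illustration: a background resource access is "indicated" by the
        # foreground if the requested permission matches words on the visible screen.
        # COSMOS replaces this keyword heuristic with models learned from many apps.
        PERMISSION_HINTS = {
            "ACCESS_FINE_LOCATION": {"map", "nearby", "navigate", "directions"},
            "RECORD_AUDIO": {"voice", "record", "call", "speak"},
            "READ_CONTACTS": {"contacts", "invite", "friends"},
        }

        def foreground_indicates(permission, ui_text):
            words = set(ui_text.lower().split())
            return bool(PERMISSION_HINTS.get(permission, set()) & words)

        # A flashlight screen requesting location looks like an outlier and is flagged.
        print(foreground_indicates("ACCESS_FINE_LOCATION", "Tap to toggle flashlight"))        # False
        print(foreground_indicates("ACCESS_FINE_LOCATION", "Find nearby restaurants on the map"))  # True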

    Permissions Snapshots: Assessing Users' Adaptation to the Android Runtime Permission Model

    The Android operating system recently changed its security- and privacy-related permission model, offering its users the ability to control the resources that applications are allowed to access on their devices. This major change to the traditional coarse-grained permission system had long been anticipated by privacy-aware users. This paper presents the first study that analyzes Android users' adaptation to the fine-grained runtime permission model with respect to their security and privacy controls. We gathered anonymous data from 50 participants who downloaded our application and answered questions related to the new permission model. The results indicate that the majority of users prefer the new model. We also collected data that capture users' security controls at the time of the study. Our analysis shows that individuals make consistent choices regarding the resources they allow various applications to access.

    A Privacy-aware Data Access System for Automotive Applications

    The introduction of information technology (IT) in modern vehicles enables a plethora of new applications, ranging from value-added services up to autonomous driving vehicles. However, it also introduces new threats with regard to IT security and privacy. In this paper, we discuss these new privacy issues and propose a privacy-aware data access system for automotive applications. Our system informs the user about all privacy aspects and enables them to control third-party access to their personal data. We developed an easily usable human-machine interface (HMI) and an underlying policy system to control data flows that is compliant with the European General Data Protection Regulation (GDPR). Our system can be easily integrated into future automotive architectures.
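
    A simplified sketch of consent-based mediation in this spirit (not the paper's policy system): consent is recorded per data category, recipient, and purpose, can be withdrawn, and a third-party request succeeds only against a matching entry. All field names and example values are assumptions.

        # Illustrative consent store: the vehicle owner grants or revokes consent
        # per (data category, recipient, purpose); third-party requests are checked
        # against these entries before any data flows out of the vehicle.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Consent:
            category: str   # e.g. "gps_trace", "driving_style"
            recipient: str  # e.g. "insurance_app"
            purpose: str    # e.g. "pay_how_you_drive"

        class PolicyStore:
            def __init__(self):
                self.consents = set()

            def grant(self, consent):       # set via the HMI
                self.consents.add(consent)

            def revoke(self, consent):      # GDPR-style withdrawal of consent
                self.consents.discard(consent)

            def allows(self, category, recipient, purpose):
                return Consent(category, recipient, purpose) in self.consents

        store = PolicyStore()
        store.grant(Consent("gps_trace", "insurance_app", "pay_how_you_drive"))
        print(store.allows("gps_trace", "insurance_app", "pay_how_you_drive"))  # True
        print(store.allows("gps_trace", "ad_network", "profiling"))             # False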

    A Privacy Management Architecture for Patient-Controlled Personal Health Record System

    Patient-controlled personal health record systems can help make health care safer, cheaper, and more convenient by enabling patients to 1) grant any care provider access to their complete personal health records anytime, from anywhere, 2) avoid repeated tests, and 3) control their privacy transparently. In this paper, we present the architecture of our Privacy-aware Patient-controlled Personal Health Record (P3HR) system, through which a patient can view her integrated health history and share her health information transparently with others (e.g., healthcare providers). Access to the health information of a particular patient is completely controlled by that patient. We also carry out an intuitive security and privacy analysis of the P3HR system architecture, considering different types of security attacks. Finally, we describe a prototype implementation of the P3HR system that we developed, reflecting the specific context of Japanese society. The most important advantage of the P3HR system over other existing systems is that it most likely provides complete privacy protection without losing data accuracy. Unlike traditional partially anonymous health records (e.g., those using k-anonymity or l-diversity), the health records in P3HR are closer to complete anonymity, yet preserve data accuracy. Our approach makes it very unlikely that patients could be identified by an attacker from their anonymous health records in the P3HR system.
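
    The anonymity-with-accuracy idea can be sketched as follows (a simplification, not the P3HR design itself): records live under a random pseudonym with no identifying fields, and only the patient holds and selectively shares the identity-to-pseudonym link. All class names and the example record are assumptions.

        # Simplified sketch: the store keeps only pseudonymous records; the patient
        # is the only party that can link an identity to a pseudonym, and hands the
        # pseudonym to providers they choose, so data stay accurate yet anonymous.
        import secrets

        class RecordStore:                       # holds pseudonymous records only
            def __init__(self):
                self.records = {}

            def add(self, pseudonym, entry):
                self.records.setdefault(pseudonym, []).append(entry)

            def read(self, pseudonym):
                return self.records.get(pseudonym, [])

        class Patient:                           # sole holder of the id <-> pseudonym link
            def __init__(self, name):
                self.name = name
                self.pseudonym = secrets.token_hex(16)

            def share_with(self, provider):      # patient-controlled grant
                provider.known_pseudonyms[self.name] = self.pseudonym

        class Provider:
            def __init__(self):
                self.known_pseudonyms = {}

        store, alice, clinic = RecordStore(), Patient("alice"), Provider()
        alice.share_with(clinic)
        store.add(clinic.known_pseudonyms["alice"], "HbA1c 5.4% (example entry)")
        print(store.read(alice.pseudonym))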

    Enforcing privacy via access control and data perturbation.

    With the increasing availability of large collections of personal and sensitive information to a wide range of user communities, services should take more responsibility for data privacy when disseminating information, which requires data sharing control. In most cases, data are stored in a repository at the site of the domain server, which takes full responsibility for their management. The data can be provided to known recipients, or published without restriction on recipients. To ensure that such data are used without breaching privacy, proper access control models and privacy protection methods are needed. This thesis presents an approach to protect personal and sensitive information that is stored on one or more data servers. There are three main privacy requirements that need to be considered when designing a system for privacy-preserving data access.

    The first requirement is privacy-aware access control. In traditional privacy-aware contexts, built-in conditions or granular access control are used to assign user privileges at a fine-grained level. Very frequently, users and their privileges are diverse. Hence, it is necessary to deploy proper access control on both subject and object servers that imposes conditions on carrying out user operations. This thesis defines a dual privacy-aware access control model, consisting of a subject server that manages user privileges and an object server that deals with granular data. Both servers extract user operations and server conditions from the original requests and convert them to privacy labels that contain access control attributes. In cross-domain cases, traditional solutions adopt roaming tables to support multiple-domain access. However, building roaming tables for all domains is costly, and maintaining these tables can become an issue. Furthermore, when roaming occurs, the party responsible for multi-domain data management has to be clearly identified. In this thesis, a roaming adjustment mechanism is presented for both subject and object servers. By defining such a dual-server control model and request process flow, the responsibility for data administration can be properly managed.

    The second requirement is the consideration of access purpose, namely why the subject requests access to the object and how the subject is going to use the object. Existing solutions overlook the different interpretations of purposes in distinct domains. This thesis proposes a privilege-oriented, purpose-based method that enhances the privacy-aware access control model described above. It includes a component that interprets the subject's intention and the conditions imposed by the servers on operations, and a component that caters for object types and the object owner's intention.

    The third requirement is maintaining data utility while protecting privacy when data are shared without restriction on recipients. Most existing approaches achieve a high level of privacy at the expense of data usability; to the best of our knowledge, no existing solution is able to keep both. This thesis combines data privacy protection with data utility by building a framework that defines a privacy protection process flow. It also includes two data privacy protection algorithms, based on Chebyshev polynomials and fractal sequences respectively. Experiments show that both algorithms are resistant to two main data privacy attacks, with little loss of accuracy.
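
    To make the privacy-label idea concrete, here is a generic Python sketch of purpose-based checking with a subject-side label and an object-side policy. The fields and policies are invented for illustration; the sketch does not reproduce the thesis's actual model or its Chebyshev- and fractal-based perturbation algorithms.

        # Generic sketch in the spirit of purpose-based, label-driven access control.
        # The subject server turns a request into a privacy label; the object server
        # grants access only if the label's purpose and role are allowed for the item.
        from dataclasses import dataclass

        @dataclass
        class PrivacyLabel:          # produced by the subject server from the request
            role: str
            operation: str           # "read" | "write"
            purpose: str             # why the subject wants the object

        OBJECT_POLICY = {            # enforced by the object server, per data item
            "diagnosis": {"allowed_purposes": {"treatment"}, "allowed_roles": {"doctor"}},
            "billing":   {"allowed_purposes": {"payment", "audit"}, "allowed_roles": {"clerk", "auditor"}},
        }

        def object_server_allows(label, item):
            policy = OBJECT_POLICY.get(item)
            return bool(policy
                        and label.purpose in policy["allowed_purposes"]
                        and label.role in policy["allowed_roles"])

        print(object_server_allows(PrivacyLabel("doctor", "read", "treatment"), "diagnosis"))  # True
        print(object_server_allows(PrivacyLabel("doctor", "read", "marketing"), "diagnosis"))  # False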