444,290 research outputs found

    Machine-Readable Privacy Certificates for Services

    Privacy-aware processing of personal data on the web of services requires managing a number of issues arising from both the technical and the legal domain. Several approaches have been proposed for matching privacy requirements (on the client's side) with privacy guarantees (on the service provider's side). Still, the assurance of effective data protection (when possible) relies on substantial human effort and exposes organizations to significant (non-)compliance risks. In this paper we put forward the idea that a privacy certification scheme producing and managing machine-readable artifacts in the form of privacy certificates can play an important role towards the solution of this problem. Digital privacy certificates represent the reasons why a privacy property holds for a service and describe the privacy measures supporting it. Privacy certificates can also be used to automatically select services whose certificates match the client policies (privacy requirements). Our proposal relies on an evolution of the conceptual model developed in the Assert4Soa project and on a certificate format specifically tailored to represent privacy properties. To validate our approach, we present a worked-out instance showing how the privacy property Retention-based unlinkability can be certified for a banking financial service.
    Comment: 20 pages, 6 figures
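    The matching step described in this abstract, selecting services whose certificates satisfy a client policy, can be sketched as follows. This is an illustrative assumption, not the actual Assert4Soa certificate format: the field names, property names, and services here are all hypothetical.

```python
# Hypothetical sketch of matching machine-readable privacy certificates
# against client privacy requirements. The certificate fields and the
# property vocabulary are illustrative, not the Assert4Soa format.

from dataclasses import dataclass, field

@dataclass
class PrivacyCertificate:
    service: str
    properties: set = field(default_factory=set)  # certified privacy properties
    measures: set = field(default_factory=set)    # supporting privacy measures

def matches(cert: PrivacyCertificate, required_properties: set) -> bool:
    """A certificate satisfies a client policy if it certifies
    every property the policy requires."""
    return required_properties <= cert.properties

certs = [
    PrivacyCertificate("bank-svc", {"retention-based-unlinkability"},
                       {"data-minimisation", "pseudonymisation"}),
    PrivacyCertificate("ad-svc", {"consent-logging"}),
]

policy = {"retention-based-unlinkability"}
selected = [c.service for c in certs if matches(c, policy)]
print(selected)  # ['bank-svc']
```

    With certificates in machine-readable form, this selection runs automatically on the client side instead of relying on human review of privacy statements.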

    Making GDPR Usable: A Model to Support Usability Evaluations of Privacy

    We introduce a new model for evaluating privacy that builds on the criteria proposed by the EuroPriSe certification scheme by adding usability criteria. Our model is visually represented as a cube, called the Usable Privacy Cube (or UP Cube), where each of its three axes of variability captures, respectively: the rights of the data subjects, privacy principles, and usable privacy criteria. We slightly reorganize the criteria of EuroPriSe to fit the UP Cube model, i.e., we show how EuroPriSe can be viewed as a combination of only rights and principles, forming the two axes at the basis of our UP Cube. In this way we also bring out two perspectives on privacy: that of the data subjects and that of the controllers/processors. We define usable privacy criteria based on usability goals that we have extracted from the whole text of the General Data Protection Regulation. The criteria are designed to produce measurements of the level of usability with which the goals are reached. Specifically, we measure effectiveness, efficiency, and satisfaction, considering both objective and perceived usability outcomes, producing measures of accuracy and completeness, measures of resource utilization (e.g., time, effort, money), and measures resulting from satisfaction scales. In the long run, the UP Cube is meant to be the model behind a new certification methodology capable of evaluating the usability of privacy, to the benefit of common users. For industries, considering the usability of privacy would also allow for greater business differentiation, beyond GDPR compliance.
    Comment: 41 pages, 2 figures, 1 table, and appendixes
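    The three usability measures the abstract names are standard usability-engineering quantities. A minimal numerical sketch, assuming simple task-based definitions (the formulas and the example numbers are illustrative, not the UP Cube methodology itself):

```python
# Illustrative usability measures for a usable-privacy goal such as
# "the data subject can exercise the right of access". The specific
# formulas are common usability-engineering conventions, assumed here,
# not taken from the UP Cube certification methodology.

def effectiveness(tasks_completed_correctly: int, tasks_attempted: int) -> float:
    """Objective effectiveness: accuracy/completeness of task outcomes."""
    return tasks_completed_correctly / tasks_attempted

def efficiency(effectiveness_score: float, time_spent_s: float) -> float:
    """Resource-based efficiency: effectiveness per unit of time spent."""
    return effectiveness_score / time_spent_s

def satisfaction(likert_responses: list[int], scale_max: int = 5) -> float:
    """Perceived satisfaction, normalised from a Likert-scale questionnaire."""
    return sum(likert_responses) / (len(likert_responses) * scale_max)

eff = effectiveness(8, 10)   # 8 of 10 users exercised the right correctly
print(eff, efficiency(eff, 120.0), satisfaction([4, 5, 3, 4]))
```

    A certification scheme would aggregate such per-goal scores across all usable-privacy criteria rather than report them in isolation.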

    A Flexible Network Approach to Privacy of Blockchain Transactions

    To preserve privacy, blockchains can be equipped with dedicated mechanisms to anonymize participants. However, these mechanisms often take only the abstraction layer of blockchains into account, whereas observations of the underlying network traffic can reveal the originator of a transaction request. Previous solutions either provide topological privacy that can be broken by attackers controlling a large number of nodes, or offer strong cryptographic privacy but are inefficient to the point of practical unusability. Further, there is no flexible way to trade privacy against efficiency to adjust to practical needs. We propose a novel approach that combines existing mechanisms to obtain quantifiable and adjustable cryptographic privacy, further improved by augmented statistical measures that prevent frequent attacks with lower resources. This approach achieves flexibility for the privacy and efficiency requirements of different blockchain use cases.
    Comment: 6 pages, 2018 IEEE 38th International Conference on Distributed Computing Systems (ICDCS)

    Stop the Abuse of Gmail!

    Gmail, a highly anticipated webmail application made by Google, has been criticized by privacy advocates for breaching wiretapping laws, even before its release from beta testing. Gmail's large storage space and its automated processes for scanning the content of incoming messages and creating advertisements based on the scanned terms have enraged privacy groups at an international level. This iBrief will compare Gmail's practices with those of its peers and conclude that its practices and procedures are consistent with the standards of the webmail industry. The iBrief will then propose additional measures Gmail could institute to further protect webmail users' privacy and alleviate the concerns of privacy advocates.

    The challenge to privacy from ever increasing state surveillance: a comparative perspective

    This article explores how internet surveillance in the name of counter-terrorism challenges privacy. International terrorism poses serious threats to the societies it affects. The counter-terrorism measures adopted since 2001 have sought to limit the advance of terrorism but, in the process, also created enormous challenges for (transnational) constitutionalism. Long-held and cherished principles relating to democracy, the rule of law and the protection of a wide range of human rights have come under increasing strain. Legislative authority to shoot down hijacked aircraft or to use lethal drones against suspected terrorists affects the right to life; waterboarding of prisoners and other inhumane practices contravene the prohibition of torture; extraordinary renditions and black sites circumvent constitutionally protected rights and processes, including the right to freedom and security, the right to a fair trial and due process for suspected terrorists; ill-defined terrorism offences undermine the rule of law and personal freedom; blanket suspicion of Muslims as terror sympathisers impacts on freedom of religion and leads to unfair discrimination; and mass surveillance of communication sweeps away the right to privacy. In Part II, the article analyses the international dimension of counter-terrorism measures and the conceptualisation of data protection and privacy in the European Union (‘EU’), the United States of America (‘US’) and Australia. Part III compares the different concepts of data protection and privacy, and explores the prospects of an international legal framework for the protection of privacy. Part IV concludes that work on international data protection and privacy standards, while urgently needed, remains a long-term vision with particularly uncertain prospects as far as anti-terrorism and national security measures are concerned.

    Information Extraction Under Privacy Constraints

    A privacy-constrained information extraction problem is considered where, for a pair of correlated discrete random variables (X,Y) governed by a given joint distribution, an agent observes Y and wants to convey to a potentially public user as much information about Y as possible without compromising the amount of information revealed about X. To this end, the so-called rate-privacy function is introduced to quantify the maximal amount of information (measured in terms of mutual information) that can be extracted from Y under a privacy constraint between X and the extracted information, where privacy is measured using either mutual information or maximal correlation. Properties of the rate-privacy function are analyzed, and information-theoretic and estimation-theoretic interpretations of it are presented for both the mutual information and maximal correlation privacy measures. It is also shown that the rate-privacy function admits a closed-form expression for a large family of joint distributions of (X,Y). Finally, the rate-privacy function under the mutual information privacy measure is considered for the case where (X,Y) has a joint probability density function, by studying the problem where the extracted information is a uniform quantization of Y corrupted by additive Gaussian noise. The asymptotic behavior of the rate-privacy function is studied as the quantization resolution grows without bound, and it is observed that not all of the properties of the rate-privacy function carry over from the discrete to the continuous case.
    Comment: 55 pages, 6 figures. Improved the organization and added detailed literature review
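    The two quantities being traded off, utility I(Y;U) and leakage I(X;U) for a disclosure mechanism producing U from Y, can be computed numerically for any small discrete example. The joint distribution and the mechanism below are illustrative assumptions chosen for the sketch; the paper's rate-privacy function would be the maximum of I(Y;U) over all mechanisms meeting a leakage budget.

```python
# Numerical sketch of the quantities in the rate-privacy trade-off: for a
# joint pmf of (X, Y) and a mechanism (channel) from Y to the disclosed U,
# compute the utility I(Y;U) and the privacy leakage I(X;U). The joint
# distribution and mechanism below are illustrative, not from the paper.

import numpy as np

def mutual_information(p_ab: np.ndarray) -> float:
    """I(A;B) in bits for a joint pmf given as a 2-D array."""
    pa = p_ab.sum(axis=1, keepdims=True)
    pb = p_ab.sum(axis=0, keepdims=True)
    mask = p_ab > 0
    return float((p_ab[mask] * np.log2(p_ab[mask] / (pa @ pb)[mask])).sum())

# Joint pmf of (X, Y): rows index X, columns index Y; X is correlated with Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

# Mechanism Q(u|y): a binary symmetric channel with crossover 0.2 applied to Y.
q_u_given_y = np.array([[0.8, 0.2],
                        [0.2, 0.8]])

p_yu = p_xy.sum(axis=0)[:, None] * q_u_given_y  # joint pmf of (Y, U)
p_xu = p_xy @ q_u_given_y                       # joint pmf of (X, U)

print(f"I(Y;U) = {mutual_information(p_yu):.3f} bits")  # utility
print(f"I(X;U) = {mutual_information(p_xu):.3f} bits")  # privacy leakage
```

    Because U is a noisy function of Y alone, the data-processing inequality guarantees the leakage I(X;U) cannot exceed the utility I(Y;U); here the noise reduces leakage about X much faster than it reduces information about Y.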