
    Internet Localization of Multi-Party Relay Users: Inherent Friction Between Internet Services and User Privacy

    Internet privacy is increasingly important on the modern Internet. Users are looking to control the trail of data that they leave behind on the systems they interact with. Multi-Party Relay (MPR) architectures lower the traditional barriers to adoption of privacy-enhancing technologies on the Internet. Unlike legacy architectures, MPRs can offer privacy guarantees without incurring significant performance penalties. Apple's iCloud Private Relay is a recently deployed MPR service, creating the potential for widespread consumer adoption of the architecture. However, many current Internet-scale systems are designed around assumptions that may no longer hold for users of privacy-enhancing systems like Private Relay. There is an inherent tension between systems that rely on data about users -- the estimated location of a user based on their IP address, for example -- and the trend towards a more private Internet. This work studies a core function widely used to control network and application behavior, IP geolocation, in the context of iCloud Private Relay usage. We measure the location accuracy of popular IP geolocation services against the location dataset that Apple publicly releases to explicitly aid in geolocating Private Relay users. We characterize geolocation service performance across several dimensions, including country, IP version, infrastructure provider, and time. Our findings lead us to conclude that existing approaches to IP geolocation (e.g., frequently updated databases) perform inadequately for users of the MPR architecture. For example, we find median location errors of more than 1,000 miles in some countries for IPv4 addresses using IP2Location. We conclude that new, privacy-focused techniques for inferring user location may be required as privacy becomes a default user expectation on the Internet.
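    The location-error metric behind figures like "median errors >1,000 miles" can be sketched as a great-circle distance between a geolocation database's estimate and the egress location in Apple's published feed. The coordinates and city pairing below are hypothetical, purely for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    R = 3958.8  # mean Earth radius in miles
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical case: a geolocation database places a Private Relay egress
# IP near New York, while the published feed maps it to Los Angeles.
db_estimate = (40.71, -74.01)    # database guess (hypothetical)
apple_truth = (34.05, -118.24)   # published egress location (hypothetical)
error = haversine_miles(*db_estimate, *apple_truth)  # roughly 2,400 miles
```

    Repeating this per address and taking the median per country would reproduce the kind of summary statistic the abstract reports.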

    Privacy Perils of Open Data and Data Sharing: A Case Study of Taiwan's Open Data Policy and Practices

    Governments and private sector players have hopped on the open data train in the past few years. Both the government and civil society in Taiwan are exploring the opportunities provided by the data stored in the public and private sectors. While they have been enjoying the benefits of sharing and moving data among various databases, the government and some players in the private sector have also created tremendous privacy challenges by inappropriately gathering and processing personal data. The amended Personal Data Protection Act was originally enacted as a regulatory mechanism to protect personal data and create economic benefits by enhancing the use of public and private sector data. In reality, the Act has instead harmed Taiwan's data privacy situation in this big data era. This article begins with an overview of Taiwan's open data policy history and its current practices. Next, the article analyzes cases in which data sharing practices between different sectors have given rise to privacy controversies, with a particular focus on 2020, when Taiwan used data surveillance in response to the COVID-19 pandemic. Finally, this article flags problems related to an open data system, including the protection of sensitive data, de-identification, the right to consent and opt out, and the ambiguity of "public interest," and concludes by proposing a feasible architecture for the implementation of a more sensible open data system with privacy-enhancing characteristics.

    Mandatory Enforcement of Privacy Policies using Trusted Computing Principles

    Modern communication systems and information technology create significant new threats to information privacy. In this paper, we discuss the need for proper privacy protection in cooperative intelligent transportation systems (cITS), one instance of such systems. We outline general principles for data protection and their legal basis, and argue why purely legal protection is insufficient. Strong privacy-enhancing technologies need to be deployed in cITS to protect user data while it is generated and processed. As data minimization cannot always prevent the need to disclose relevant personal information, we introduce the new concept of mandatory enforcement of privacy policies. This concept empowers users and data subjects to tightly couple their data with privacy policies and rely on the system to impose such policies on any data processors. We also describe the PRECIOSA Privacy-enforcing Runtime Architecture that exemplifies our approach. Moreover, we show how an application can utilize this architecture by applying it to a pay-as-you-drive (PAYD) car insurance scenario.
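    The core idea of coupling data with a policy that a trusted runtime enforces can be sketched as a "sticky policy" data structure. All class and field names below are illustrative, not the actual PRECIOSA API; in the real architecture, enforcement would be backed by trusted computing rather than application code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyPolicy:
    """Policy attached to a data item (fields are illustrative)."""
    allowed_purposes: frozenset
    max_precision_km: float

@dataclass(frozen=True)
class PolicedDatum:
    """A data item tightly coupled with its privacy policy."""
    value: tuple            # e.g. a (lat, lon) position sample
    policy: PrivacyPolicy

def process(datum, purpose):
    """A trusted runtime would refuse any use outside the attached policy."""
    if purpose not in datum.policy.allowed_purposes:
        raise PermissionError(f"purpose '{purpose}' not permitted by policy")
    return datum.value

# PAYD insurance scenario: billing is a permitted purpose, marketing is not.
policy = PrivacyPolicy(frozenset({"billing"}), max_precision_km=1.0)
sample = PolicedDatum((48.137, 11.575), policy)
ok = process(sample, "billing")      # permitted use succeeds
try:
    process(sample, "marketing")     # rejected by the enforcement layer
    denied = False
except PermissionError:
    denied = True
```

    The point of mandatory enforcement is that the data processor cannot bypass `process`: the policy travels with the data, and the runtime, not the processor, decides what is allowed.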

    Semi-Adversarial Networks: Convolutional Autoencoders for Imparting Privacy to Face Images

    In this paper, we design and evaluate a convolutional autoencoder that perturbs an input face image to impart privacy to a subject. Specifically, the proposed autoencoder transforms an input face image such that the transformed image can be successfully used for face recognition but not for gender classification. To train this autoencoder, we propose a novel training scheme, referred to in this work as semi-adversarial training. Training is facilitated by attaching a semi-adversarial module, consisting of a pseudo gender classifier and a pseudo face matcher, to the autoencoder. The objective function used to train this network has three terms: one to ensure that the perturbed image is a realistic face image; another to ensure that the gender attributes of the face are confounded; and a third to ensure that biometric recognition performance on the perturbed image is not impacted. Extensive experiments confirm the efficacy of the proposed architecture in extending gender privacy to face images.
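    The shape of the three-term objective can be sketched as follows. The term definitions, weights, and the cross-entropy-against-uniform formulation for the gender term are assumptions for illustration; the paper's exact losses may differ:

```python
import math

def realism_term(recon_error):
    # Reconstruction error: keeps the perturbed output a realistic face image.
    return recon_error

def gender_confusion_term(p_male):
    # Cross-entropy against the uniform target (0.5, 0.5): minimized when the
    # pseudo gender classifier is maximally confused about the attribute.
    return -(0.5 * math.log(p_male) + 0.5 * math.log(1 - p_male))

def match_term(similarity):
    # Penalizes any drop in pseudo face-matcher similarity, so biometric
    # recognition on the perturbed image is preserved.
    return 1.0 - similarity

def semi_adversarial_loss(recon_error, p_male, similarity, w=(1.0, 1.0, 1.0)):
    """Weighted sum of the three training terms (weights are illustrative)."""
    return (w[0] * realism_term(recon_error)
            + w[1] * gender_confusion_term(p_male)
            + w[2] * match_term(similarity))
```

    Note the "semi-adversarial" asymmetry: the gender term rewards confusing one auxiliary network while the match term rewards satisfying the other, so the autoencoder is adversarial to only one of its two attached modules.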