
    When the bureaucrat promises to safeguard your online privacy: Dissecting the contents of privacy statements on Dutch municipal websites

    Various studies show that displaying a privacy statement on an organization's website can be a potent yet simple way of gaining clients' and users' trust, which encourages them to complete transactions with the organization through its website. Empirical studies that analyze the contents of privacy statements on commercial websites are profuse, while privacy statements posted on the websites of non-commercial organizations have been largely ignored by researchers. In this study, the contents of privacy statements on Dutch municipal websites are analyzed. Using the important provisions of the Wet Bescherming Persoonsgegevens (WBP), the Dutch Personal Data Protection Act, the study also examined whether the contents of privacy statements conform to the existing law on privacy protection in the Netherlands. We also looked into the availability and findability of privacy statements on Dutch municipal websites. Three important findings resulted from this study: first, not all municipalities post privacy statements on their websites; second, most municipalities do not ensure that their online privacy statements are findable; and third, privacy statements on Dutch municipal websites emphasize diverging assurances and promises, with some privacy policies containing all the important provisions of the WBP and others offering only general, and sometimes rather vague, guarantees.

    Using the blockchain to enable transparent and auditable processing of personal data in cloud-based services: Lessons from the Privacy-Aware Cloud Ecosystems (PACE) project

    The architecture of cloud-based services is typically opaque and intricate. As a result, data subjects cannot exercise adequate control over their personal data, and overwhelmed data protection authorities must spend their limited resources in costly forensic efforts to ascertain instances of non-compliance. To address these data protection challenges, a group of computer scientists and socio-legal scholars joined forces in the Privacy-Aware Cloud Ecosystems (PACE) project to design a blockchain-based privacy-enhancing technology (PET). This article presents the fruits of this collaboration, highlighting the capabilities and limits of our PET, as well as the challenges we encountered during our interdisciplinary endeavour. In particular, we explore the barriers to interdisciplinary collaboration between law and computer science that we faced, and how these two fields' different expectations as to what technology can do for data protection law compliance had an impact on the project's development and outcome. We also explore the overstated promises of techno-regulation, and the practical and legal challenges that militate against the implementation of our PET: most industry players have no incentive to deploy it, the transaction costs of running it make it prohibitively expensive, and there are significant clashes between the blockchain's decentralised architecture and the GDPR's requirements that hinder its deployability. We share the insights and lessons we learned from our efforts to overcome these challenges, hoping to inform other interdisciplinary projects that are increasingly important to shape a data ecosystem that promotes the protection of our personal data.
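    As a minimal sketch only (assuming a simplified design, not the PACE project's actual architecture), the following snippet illustrates the general idea behind blockchain-style auditable processing of personal data: each processing event is appended to a hash-chained log, so later tampering with any recorded event breaks the chain and can be detected by an auditor or a data subject. All names and fields here are illustrative assumptions.

```python
# Toy hash-chained audit log for personal-data processing events.
# Illustrative only; not the PACE project's design.
import hashlib
import json
import time

def append_event(log, controller, purpose, data_subject):
    """Append a processing event, linking it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    event = {
        "controller": controller,
        "purpose": purpose,
        "data_subject": data_subject,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    payload = json.dumps(event, sort_keys=True).encode()
    event["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(event)
    return log

def verify_chain(log):
    """Recompute each entry's hash to check that nothing was altered."""
    prev_hash = "0" * 64
    for event in log:
        body = {k: v for k, v in event.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != event["hash"]:
            return False
        prev_hash = event["hash"]
    return True

log = []
append_event(log, "cloud-provider-A", "billing", "subject-123")
append_event(log, "cloud-provider-B", "analytics", "subject-123")
print(verify_chain(log))  # True; altering any recorded field would make this False
```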

    Peeling Back the Student Privacy Pledge

    Education software is a multi-billion dollar industry that is rapidly growing. The federal government has encouraged this growth through a series of initiatives that reward schools for tracking and aggregating student data. Amid this increasingly digitized education landscape, parents and educators have begun to raise concerns about the scope and security of student data collection. Industry players, rather than policymakers, have so far led efforts to protect student data. Central to these efforts is the Student Privacy Pledge, a set of standards that providers of digital education services have voluntarily adopted. By many accounts, the Pledge has been a success. Since its introduction in 2014, over 300 companies have signed on, indicating widespread commitment to the Pledge’s seemingly broad protections for student privacy. This industry participation is encouraging, but the Pledge does not contain any meaningful oversight or enforcement provisions. This Article analyzes whether signatory companies are actually complying with the Pledge rather than just paying lip service to its goals. By looking to the privacy policies and terms of service of a sample of the Pledge’s signatories, I conclude that noncompliance may be a significant and prevalent issue. Consumers of education software have some power to hold signatories accountable, but their oversight abilities are limited. This Article argues that the federal government, specifically the Federal Trade Commission, is best positioned to enforce compliance with the Pledge and should hold Pledge signatories to their promises.

    The Exploitation of Web Navigation Data: Ethical Issues and Alternative Scenarios

    Nowadays, users' browsing activity on the Internet is not completely private, because many entities collect and use such data for either legitimate or illegitimate purposes. The implications are serious, ranging from a person who unknowingly exposes private information to an unknown third-party entity, to a company that is unable to control the flow of its information to the outside world. As a result, users have lost control over their private data on the Internet. In this paper, we present the entities involved in the collection and use of users' data. Then, we highlight the ethical issues that arise for users, companies, scientists and governments. Finally, we present some alternative scenarios and suggestions for these entities to address such ethical issues.

    Privacy in an Ambient World

    Privacy is a prime concern in today's information society. To protect the privacy of individuals, enterprises must follow certain privacy practices while collecting or processing personal data. In this chapter we look at the setting where an enterprise collects private data on its website, processes it inside the enterprise and shares it with partner enterprises. In particular, we analyse three different privacy systems that can be used in the different stages of this lifecycle. One of them is the Audit Logic, recently introduced, which can be used to keep data private when it travels across enterprise boundaries. We conclude with an analysis of the features and shortcomings of these systems.

    Slave to the Algorithm? Why a 'Right to an Explanation' Is Probably Not the Remedy You Are Looking For

    Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive. However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be provided by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted both by the type of explanation sought, the dimensionality of the domain and the type of user seeking an explanation. However, “subject-centric explanations” (SCEs), which focus on particular regions of a model around a query, show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical versus decompositional explanations), in dodging developers’ worries of intellectual property or trade secrets disclosure. Based on our analysis, we fear that the search for a “right to an explanation” in the GDPR may be at best distracting, and at worst nurture a new kind of “transparency fallacy.” But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure (“right to be forgotten”) and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments and certification and privacy seals, may have the seeds we can use to make algorithms more responsible, explicable, and human-centered.
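    The distinction between pedagogical and subject-centric explanations can be illustrated with a small sketch (an assumption-laden toy example, not the authors' method): a black-box model is probed only from the outside, and a simple weighted linear surrogate is fitted to its predictions in a neighbourhood around one query instance. The model, data and query below are invented for illustration.

```python
# Toy "pedagogical", subject-centric explanation: learn a local linear
# surrogate of a black-box model around a single query point.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Stand-in black box: a random forest trained on synthetic data.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0).astype(int)
black_box = RandomForestClassifier(random_state=0).fit(X, y)

def subject_centric_explanation(model, query, n_samples=1000, scale=0.3):
    """Fit a local linear surrogate around `query` using only the model's outputs."""
    # Perturb the query point to explore the model's behaviour nearby.
    neighbourhood = query + rng.normal(scale=scale, size=(n_samples, query.size))
    preds = model.predict_proba(neighbourhood)[:, 1]
    # Weight samples by proximity to the query, so closer points matter more.
    weights = np.exp(-np.linalg.norm(neighbourhood - query, axis=1) ** 2 / (2 * scale ** 2))
    surrogate = Ridge(alpha=1.0).fit(neighbourhood, preds, sample_weight=weights)
    return surrogate.coef_  # per-feature local influence on the prediction

query = np.array([0.2, -1.0, 0.5, 0.0])
print(subject_centric_explanation(black_box, query))
```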