
    Slave to the Algorithm? Why a 'Right to an Explanation' Is Probably Not the Remedy You Are Looking For

    Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive. However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be provided by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted both by the type of explanation sought, the dimensionality of the domain and the type of user seeking an explanation. However, “subject-centric explanations” (SCEs) focussing on particular regions of a model around a query show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical versus decompositional explanations) in dodging developers’ worries of intellectual property or trade secrets disclosure. Based on our analysis, we fear that the search for a “right to an explanation” in the GDPR may be at best distracting, and at worst nurture a new kind of “transparency fallacy.” But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure (“right to be forgotten”) and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments and certification and privacy seals, may have the seeds we can use to make algorithms more responsible, explicable, and human-centered.

    After Over-Privileged Permissions: Using Technology and Design to Create Legal Compliance

    Consumers in the mobile ecosystem can putatively protect their privacy with the use of application permissions. However, this requires the mobile device owners to understand permissions and their privacy implications. Yet, few consumers appreciate the nature of permissions within the mobile ecosystem, often failing to appreciate the privacy permissions that are altered when updating an app. Even more concerning is the lack of understanding of the wide use of third-party libraries, most of which are installed with automatic permissions, that is, permissions that must be granted to allow the application to function appropriately. Unsurprisingly, many of these third-party permissions violate consumers’ privacy expectations and thereby become “over-privileged” to the user. Consequently, an obscurity of privacy expectations between what is practiced by the private sector and what is deemed appropriate by the public sector is exhibited. Despite the growing attention given to privacy in the mobile ecosystem, legal literature has largely ignored the implications of mobile permissions. This article seeks to address this omission by analyzing the impacts of mobile permissions and the privacy harms experienced by consumers of mobile applications. The authors call for the review of industry self-regulation and the overreliance upon simple notice and consent. Instead, the authors set out a plan for greater attention to be paid to socio-technical solutions, focusing on better privacy protections and technology embedded within the automatic permission-based application ecosystem.

    A Human-centric Perspective on Digital Consenting: The Case of GAFAM

    According to different legal frameworks such as the European General Data Protection Regulation (GDPR), an end-user's consent constitutes one of the well-known legal bases for personal data processing. However, research has indicated that the majority of end-users have difficulty in understanding what they are consenting to in the digital world. Moreover, it has been demonstrated that marginalized people are confronted with even more difficulties when dealing with their own digital privacy. In this research, we use an enactivist perspective from cognitive science to develop a basic human-centric framework for digital consenting. We argue that the action of consenting is a sociocognitive action and includes cognitive, collective, and contextual aspects. Based on the developed theoretical framework, we present our qualitative evaluation of the consent-obtaining mechanisms implemented and used by the five big tech companies, i.e., Google, Amazon, Facebook, Apple, and Microsoft (GAFAM). The evaluation shows that these companies have failed in their efforts to empower end-users by considering the human-centric aspects of the action of consenting. We use this approach to argue that their consent-obtaining mechanisms violate principles of fairness, accountability and transparency. We then suggest that our approach may raise doubts about the lawfulness of the obtained consent—particularly considering the basic requirements of lawful consent within the legal framework of the GDPR.

    Mobile Privacy and Business-to-Platform Dependencies: An Analysis of SEC Disclosures

    This Article systematically examines the dependence of mobile apps on mobile platforms for the collection and use of personal information through an analysis of Securities and Exchange Commission (SEC) filings of mobile app companies. The Article uses these disclosures to find systematic evidence of how app business models are shaped by the governance of user data by mobile platforms, in order to reflect on the role of platforms in privacy regulation more generally. The analysis of SEC filings documented in the Article produces new and unique insights into the data practices and data-related aspects of the business models of popular mobile apps and shows the value of SEC filings for privacy law and policy research more generally. The discussion of SEC filings and privacy builds on regulatory developments in SEC disclosures and cybersecurity of the last decade. The Article also connects to recent regulatory developments in the U.S. and Europe, including the General Data Protection Regulation, the proposals for a new ePrivacy Regulation, and a Regulation of fairness in business-to-platform relations.

    In Defense of the Long Privacy Statement

    The Color of Algorithms: An Analysis and Proposed Research Agenda for Deterring Algorithmic Redlining
