8,451 research outputs found

    Unilateral Invasions of Privacy

    Get PDF
    Most people seem to agree that individuals have too little privacy, and most proposals to address that problem focus on ways to give those users more information about, and more control over, how information about them is used. Yet in nearly all cases, information subjects are not the parties who make decisions about how information is collected, used, and disseminated; instead, outsiders make unilateral decisions to collect, use, and disseminate information about others. These potential privacy invaders, acting without input from information subjects, are the parties to whom proposals to protect privacy must be directed. This Article develops a theory of unilateral invasions of privacy rooted in the incentives of potential outside invaders. It first briefly describes the different kinds of information flows that can result in losses of privacy and the private costs and benefits to the participants in these information flows. It argues that in many cases the relevant costs and benefits are those of an outsider deciding whether certain information flows occur. These outside invaders are more likely to act when their own private costs and benefits make particular information flows worthwhile, regardless of the effects on information subjects or on social welfare. And potential privacy invaders are quite sensitive to changes in these costs and benefits, unlike information subjects, for whom transaction costs can overwhelm incentives to make information more or less private. The Article then turns to privacy regulation, arguing that this unilateral-invasion theory sheds light on how effective privacy regulations should be designed. Effective regulations are those that help match the costs and benefits faced by a potential privacy invader with the costs and benefits to society of a given information flow. Law can help do so by raising or lowering the costs or benefits of a privacy invasion, but only after taking account of other costs and benefits faced by the potential privacy invader.

    Big Brother is Listening to You: Digital Eavesdropping in the Advertising Industry

    In the Digital Age, information is more accessible than ever. Unfortunately, that accessibility has come at the expense of privacy. Now, more and more personal information is in the hands of corporations and governments, for uses not known to the average consumer. Although these entities have long been able to keep tabs on individuals, with the advent of virtual assistants and “always-listening” technologies, the ease by which a third party may extract information from a consumer has only increased. The stark reality is that lawmakers have left the American public behind. While other countries have enacted consumer privacy protections, the United States has no satisfactory legal framework in place to curb data collection by greedy businesses or to regulate how those companies may use and protect consumer data. This Article contemplates one use of that data: digital advertising. Inspired by stories of suspiciously well-targeted advertisements appearing on social media websites, this Article additionally questions whether companies have been honest about their collection of audio data. To address the potential harms consumers may suffer as a result of this deficient privacy protection, this Article proposes a framework wherein companies must acquire users’ consent and the government must ensure that businesses do not use consumer information for harmful purposes.

    Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For

    Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive. However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be provided by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted by the type of explanation sought, the dimensionality of the domain, and the type of user seeking an explanation. However, “subject-centric explanations” (SCEs) focussing on particular regions of a model around a query show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical versus decompositional explanations) in dodging developers’ worries of intellectual property or trade secrets disclosure. Based on our analysis, we fear that the search for a “right to an explanation” in the GDPR may be at best distracting, and at worst nurture a new kind of “transparency fallacy.” But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure (“right to be forgotten”) and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments and certification and privacy seals, may have the seeds we can use to make algorithms more responsible, explicable, and human-centered.
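The “pedagogical” strategy the abstract contrasts with decompositional explanation (learning a model from outside rather than taking it apart) can be sketched concretely. The following is a minimal illustration, not the authors’ method: the names (`black_box`, `local_surrogate`) and the stand-in scoring function are invented for this sketch, but it shows the core move of a subject-centric explanation: fit a simple, interpretable surrogate to the black box’s input/output behaviour in a small region around one query, using only stdlib Python.

```python
import random

# Hypothetical stand-in for an opaque ML model: a non-linear scoring
# function we can only query, not inspect.
def black_box(x1, x2):
    return 0.6 * x1 * x1 + 0.4 * x2 + 0.1 * x1 * x2

def local_surrogate(model, query, radius=0.1, n=500, seed=0):
    """Pedagogical, subject-centric explanation: learn a linear surrogate
    from the black box's behaviour near one query point, without access
    to its internals (so no IP/trade-secret disclosure)."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        p = [q + rng.uniform(-radius, radius) for q in query]
        xs.append(p)
        ys.append(model(*p))
    # Ordinary least squares for y ~ w0 + w1*x1 + w2*x2 via the normal
    # equations (3x3 system solved by Gaussian elimination).
    A = [[1.0, x[0], x[1]] for x in xs]
    AtA = [[sum(A[k][i] * A[k][j] for k in range(n)) for j in range(3)]
           for i in range(3)]
    Aty = [sum(A[k][i] * ys[k] for k in range(n)) for i in range(3)]
    M = [row[:] + [b] for row, b in zip(AtA, Aty)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    w = [0.0] * 3
    for i in range(2, -1, -1):
        w[i] = (M[i][3] - sum(M[i][c] * w[c] for c in range(i + 1, 3))) / M[i][i]
    return w  # [intercept, weight for x1, weight for x2]

w0, w1, w2 = local_surrogate(black_box, query=[1.0, 0.5])
```

Near the query (1.0, 0.5) the stand-in model’s gradient is roughly (1.25, 0.5), so the surrogate weights land close to those local sensitivities. Those weights are the kind of “meaningful information about the logic of processing” such an explanation can offer a data subject, while disclosing nothing about the model’s internal structure.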

    Data-driven personalisation and the law - a primer: collective interests engaged by personalisation in markets, politics and law

    Interdisciplinary Workshop on ‘Data-Driven Personalisation in Markets, Politics and Law’, 28 June 2019. Southampton Law School will be hosting an interdisciplinary workshop on the topic of ‘Data-Driven Personalisation in Markets, Politics and Law’ on Friday 28 June 2019, which will explore the pervasive and growing phenomenon of ‘personalisation’ – from behavioural advertising in commerce and micro-targeting in politics, to personalised pricing and contracting and predictive policing and recruitment. This is a huge area which touches upon many legal disciplines as well as social science concerns and, of course, computer science and mathematics. Within law, it goes well beyond data protection law, raising questions for criminal law, consumer protection, competition and IP law, tort law, administrative law, human rights and anti-discrimination law, law and economics as well as legal and constitutional theory. We’ve written a position paper, https://eprints.soton.ac.uk/428082/1/Data_Driven_Personalisation_and_the_Law_A_Primer.pdf, which is designed to give focus and structure to a workshop that we expect will be strongly interdisciplinary, creative, thought-provoking and entertaining. We’d like to hear your thoughts! Call for papers: should you be interested in disagreeing, elaborating, confirming, contradicting, dismissing or just reflecting on anything in the paper and presenting those ideas at the workshop, send us an abstract by Friday 5 April 2019 (Ms Clare Brady, [email protected]). We aim to publish an edited popular law/social science book with the most compelling contributions after the workshop. Prof Uta Kohl, Prof James Davey, Dr Jacob Eisler

    I Know What You Will Do Next Summer: Informational Privacy and the Ethics of Data Analytics


    Privacy and the Internet of Things: Why Changing Expectations Demand Heightened Standards

    Entertainment consoles, wearable monitors, and security systems. For better or worse, internet-connected devices are revolutionizing the consumer products industry. Referred to broadly as the Internet of Things (IoT), this ‘smart’ technology is drastically increasing the means, scope, and frequency by which individuals communicate their personal information. This Note explores the disruptive impact of IoT consumer devices on the U.S.’s patchwork system of privacy protections. After presenting a high-level survey of several key regulatory issues, this Note argues that the proliferation of IoT devices exposes a fundamental flaw in the Katz “reasonable expectation of privacy” standard. As individual expectations of privacy rapidly and inevitably deteriorate, societal norms will follow suit, resulting in a Fourth Amendment standard that is incompatible with, and outdated in, this new, interconnected reality.

    Third Party Tracking in the Mobile Ecosystem

    Third party tracking allows companies to identify users and track their behaviour across multiple digital services. This paper presents an empirical study of the prevalence of third-party trackers on 959,000 apps from the US and UK Google Play stores. We find that most apps contain third party tracking, and the distribution of trackers is long-tailed with several highly dominant trackers accounting for a large portion of the coverage. The extent of tracking also differs between categories of apps; in particular, news apps and apps targeted at children appear to be amongst the worst in terms of the number of third party trackers associated with them. Third party tracking is also revealed to be a highly trans-national phenomenon, with many trackers operating in jurisdictions outside the EU. Based on these findings, we draw out some significant legal compliance challenges facing the tracking industry.
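The long-tailed concentration the paper reports can be quantified in a few lines. A minimal sketch on invented toy data (the app and tracker names below are illustrative, not drawn from the paper’s 959,000-app dataset): count how many apps each tracker appears in, then ask what share of all app-tracker observations the top few trackers account for.

```python
from collections import Counter

# Toy dataset: each app maps to the set of third-party trackers
# detected in it (names are hypothetical).
apps = {
    "news_app_1":  {"TrackCo", "AdNetX", "MetricsY", "PixelZ"},
    "news_app_2":  {"TrackCo", "AdNetX", "MetricsY"},
    "kids_game_1": {"TrackCo", "AdNetX", "PixelZ"},
    "kids_game_2": {"TrackCo", "MetricsY"},
    "utility_1":   {"TrackCo"},
    "utility_2":   set(),
}

def tracker_concentration(apps, top_k=2):
    """Share of all app-tracker pairs accounted for by the top_k most
    common trackers -- one simple measure of a long-tailed distribution."""
    counts = Counter(t for trackers in apps.values() for t in trackers)
    total = sum(counts.values())          # all app-tracker pairs observed
    top = counts.most_common(top_k)       # dominant trackers first
    return sum(c for _, c in top) / total, top

share, top = tracker_concentration(apps)
```

On this toy data the two most common trackers account for 8 of the 13 app-tracker observations (about 62%), mirroring the paper’s finding that a handful of dominant trackers cover a large portion of the ecosystem; per-category breakdowns (news, children’s apps) would follow the same counting pattern restricted to a subset of `apps`.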

    Protecting Consumers in the Age of the Internet of Things

    (Excerpt) IoT devices are an ever-increasing force of nature in our daily lives. They provide a multitude of essential benefits that we as a society have come to rely on. Thus, IoT devices are likely to continue to become irreplaceable tools. With the many benefits that these devices bring, they also bring a vast array of privacy and security issues that our society has not had to face until recently. Because of the new and prevalent risks associated with the IoT and because of the increasing harms to consumers, it is time for Congress to enact an IoT-specific data privacy and security law. Some of the provisions that Congress should consider including in such a law are reasonable security measures, notice and consent, data breach notification, a private right of action, and constraints on the way that manufacturers use and store consumer data.

    Tell the Smart House to Mind its Own Business!: Maintaining Privacy and Security in the Era of Smart Devices

    Consumers want convenience. That convenience often comes in the form of everyday smart devices that connect to the internet and assist with daily tasks. With the advancement of technology and the “Internet of Things” in recent years, convenience is at our fingertips more than ever before. Not only do consumers want convenience, they want to trust that their product is performing the task that they purchased it for and not exposing them to danger or risk. However, due to the increasing capabilities and capacities of smart devices, consumers are less likely to realize the implications of what they are agreeing to when they purchase and begin using these products. This Note will focus on the risks associated with smart devices, using smart home devices as an illustration. These devices have the ability to collect intimate details about the layout of the home and about those who live within it. The mere collection of this personal data opens consumers up to the risk of having their private information shared with unintended recipients whether the information is being sold to a third party or accessible to a hacker. Thus, to adequately protect consumers, it is imperative that they can fully consent to their data being collected, retained, and potentially distributed. This Note examines the law that is currently in place to protect consumers who use smart devices and argues that a void ultimately leaves consumers vulnerable. Current data privacy protection in the United States centers on the self-regulatory regime of “notice and choice.” This Note highlights how the self-regulatory notice-and-choice model fails to ensure sufficient protection for consumers who use smart devices and discusses the need for greater privacy protection in the era of the emerging Internet of Things. Ultimately, this Note proposes a state-level resolution and calls upon an exemplar state to experiment with privacy protection laws to determine the best way to regulate the Internet of Things.