    Social Data

    As online social media grow, it is increasingly important to distinguish between the different threats to privacy that arise from the conversion of our social interactions into data. One well-recognized threat is from the robust concentrations of electronic information aggregated into colossal databases. Yet much of this same information is also consumed socially and dispersed through a user interface to hundreds, if not thousands, of peer users. In order to distinguish relationally shared information from the threat of the electronic database, this essay identifies the massive amounts of personal information shared via the user interface of social technologies as “social data.” The main thesis of this essay is that, unlike electronic databases, which are the focus of the Fair Information Practice Principles (FIPPs), there are no commonly accepted principles to guide the recent explosion of voluntarily adopted practices, industry codes, and laws that address social data. This essay aims to remedy that by proposing three social data principles — a sort of FIPPs for the front-end of social media: the Boundary Regulation Principle, the Identity Integrity Principle, and the Network Integrity Principle. These principles can help courts, policymakers, and organizations create more consistent and effective rules regarding the use of social data.

    The Fight to Frame Privacy

    In his important new book, Nothing to Hide: The False Tradeoff Between Privacy and Security, Daniel Solove argues that if we continue to view privacy and security as diametrically opposed to each other, privacy will always lose. Solove argues that the predetermined abandonment of privacy in security-related disputes means that the structure of the privacy-security debate is inherently flawed. Solove understands that privacy is far too vital to our freedom and democracy to accept its inevitable demise. The central thesis of this Review is that Solove's polemic is a strong and desperately needed collection of frames that counterbalances the “nothing to hide” argument and other refrains so often used in privacy disputes. Nothing to Hide is succinct and accessible. In his ambitious quest to concisely respond to a wide range of problems, however, Solove risks leaving the reader unsatisfied, wanting more details about his proposals to untangle the tension between privacy and security. Yet this critique does not detract from the importance of this book as a collection of frames to counter a popular narrative in the privacy and security debate.

    Reviving Implied Confidentiality

    The law of online relationships has a significant flaw—it regularly fails to account for the possibility of an implied confidence. The established doctrine of implied confidentiality is, without explanation, almost entirely absent from online jurisprudence in environments where it has traditionally been applied offline, such as with sensitive data sets and intimate social interactions. Courts’ abandonment of implied confidentiality in online environments should have been foreseen. The concept has not been developed enough to be consistently applied in environments such as the Internet that lack obvious physical or contextual cues of confidence. This absence is significant because implied confidentiality could be the missing piece that helps resolve the problems caused by the disclosure of personal information on the Internet. This Article urges a revival of implied confidentiality by identifying from the relevant case law a set of implied confidentiality norms based upon party perception and inequality that courts should be, but are not, considering in online disputes. These norms are used to develop a framework for courts to better recognize implied agreements and relationships of trust in all contexts.

    Body Cameras and the Path to Redeem Privacy Law

    From a privacy perspective, the movement towards police body cameras seems ominous. The prospect of a surveillance device capturing massive amounts of data concerning people’s most vulnerable moments is daunting. These concerns are compounded by the fact that there is little consensus and few hard rules on how and for whom these systems should be built and used. But in many ways, this blank slate is a gift. Law and policy makers are not burdened by the weight of rules and technologies created in a different time for a different purpose. These surveillance and data technologies will be modern. Many of the risks posed by the systems will be novel as well. Our privacy rules must keep up. In this Article, I argue that police body cameras are an opportunity to chart a path past privacy law’s most vexing missteps and omissions. Specifically, lawmakers should avoid falling back on the “reasonable expectation of privacy” standard. Instead, they should use body cameras to embrace more nuanced theories of privacy, such as trust and obscurity. Trust-based relationships can be used to counter the harshness of the third party doctrine. The value of obscurity reveals the misguided nature of the argument that there is “no privacy in public.” Law and policy makers can also better protect privacy by creating rules that address how body cameras and data technologies are designed in addition to how they are used. Since body-camera systems implicate every stage of the modern data life cycle from collection to disclosure, they can serve as a useful model across industry and government. But if law and policy makers hope to show how privacy rules can be improved, they must act quickly. The path to privacy law’s redemption will stay clear for only so long.

    Website Design as Contract

    Few website users actually read or rely upon terms of use or privacy policies. Yet users regularly take advantage of and rely upon website design features like privacy settings. To reconcile the disparity between boilerplate legalese and website design, this article develops a theory of website design as contract. The ability to choose privacy settings, un-tag photos, and delete information is part of the negotiation between websites and users regarding their privacy. Yet courts invariably recognize only the boilerplate terms when analyzing online agreements. In this article, I propose that if significant website features are incorporated into the terms of use, or if these features induce reliance, they should be considered enforceable promises. For example, the ability to increase privacy settings could be legitimized as an offer by the website to protect information. A website design could also render an agreement unconscionable if it manipulated, exploited, or confused a user. Finally, website design could serve as evidence of a subsequent agreement, or “operational reality,” between the parties. By providing a theory of website design as contract, this article shifts the focus of online agreements away from unread standard-form legalese to an approach that more accurately reflects the agreement between websites and users.

    Unfair and Deceptive Robots

    Robots such as household helpers, personal digital assistants, automated cars, and personal drones are, or will soon be, available to consumers. These robots raise common consumer protection issues, such as fraud, privacy, data security, and risks to health, physical safety, and finances. Robots also raise new consumer protection issues, or at least call into question how existing consumer protection regimes might be applied to such emerging technologies. Yet it is unclear which legal regimes should govern these robots and what consumer protection rules for robots should look like. The thesis of this Article is that the FTC’s grant of authority and existing jurisprudence make it the preferable regulatory agency for protecting consumers who buy and interact with robots. The FTC has proven to be a capable regulator of communications, organizational procedures, and design, which are the three crucial concepts for safe consumer robots. Additionally, the structure and history of the FTC show that the agency is capable of fostering new technologies, as it did with the Internet. The agency generally defers to industry standards, avoids dramatic regulatory lurches, and cooperates with other agencies. Consumer robotics is an expansive field with great potential. A light but steady response by the FTC will allow the consumer robotics industry to thrive while preserving consumer trust and keeping consumers safe from harm.

    Falling On Deaf Ears: Is the Fail-Safe Triennial Exemption Provision in the Digital Millennium Copyright Act Effective in Protecting Fair Use?

    This Article examines whether the fail-safe triennial exemption provision of the DMCA is effective for its intended purpose: to serve as a countermeasure to the DMCA's anti-circumvention provisions by protecting the ability of the public to engage in non-infringing uses of copyrighted works. Ultimately, this Article concludes that there are too many faults in both the structure and the execution of the rule-making provision to meaningfully counteract the adverse effects of the anti-circumvention provisions of the DMCA. Specifically, the rule-making procedure explicitly prohibits exemptions to a class based on the use of the work. This amounts to a rejection of fair use principles, one of the very doctrines the exemption provision was designed to protect.

    Obscurity by Design

    Design-based solutions to confront technological privacy threats are becoming popular with regulators. However, these promising solutions have left the full potential of design untapped. With respect to online communication technologies, design-based solutions for privacy remain incomplete because they have yet to successfully address the trickiest aspect of the Internet—social interaction. This Article posits that privacy-protection strategies such as “Privacy by Design” face unique challenges with regard to social software and social technology due to their interactional nature. This Article proposes that design-based solutions for social technologies benefit from increased attention to user interaction, with a focus on the principles of “obscurity” rather than the expansive and vague concept of “privacy.” The main thesis of this Article is that obscurity is the optimal protection for most online social interactions and, as such, is a natural locus for design-based privacy solutions for social technologies. To that end, this Article develops a model of “obscurity by design” as a means to address the privacy problems inherent in social technologies and the Internet.

    Breached!: Why Data Security Law Fails and How to Improve It

    Digital connections permeate our lives—and so do data breaches. Given that we must be online for basic communication, finance, healthcare, and more, it is remarkable how difficult it is to secure our personal information. Despite the passage of many data security laws, data breaches are increasing at a record pace. In their book, BREACHED! WHY DATA SECURITY LAW FAILS AND HOW TO IMPROVE IT (Oxford University Press 2022), Professors Daniel Solove and Woodrow Hartzog argue that the law fails because, ironically, it focuses too much on the breach itself. Drawing insights from many fascinating stories about data breaches, Solove and Hartzog show how major breaches could have been prevented or mitigated through better rules and often inexpensive, non-cumbersome means. They also reveal why the current law is counterproductive. It pummels organizations that have suffered a breach but doesn’t recognize how others contribute to the breach. These outside actors include software companies that create vulnerable software, device companies that make insecure devices, government policymakers who write regulations that increase security risks, organizations that train people to engage in risky behaviors, and more. Although humans are the weakest link for data security, the law remains oblivious to the fact that policies and technologies are often designed with a poor understanding of human behavior. BREACHED! sets forth a holistic vision for data security law—one that holds all actors accountable, understands security broadly and in relationship to privacy, looks to prevention and mitigation rather than reaction, and is designed with people in mind. The book closes with a roadmap for how we can reboot law and policy surrounding data security.