255 research outputs found

    When Law Frees Us to Speak

    A central aim of online abuse is to silence victims. That effort is as regrettable as it is successful. In the face of cyberharassment and sexual-privacy invasions, women and marginalized groups retreat from online engagement. These documented chilling effects, however, are not inevitable. Beyond its deterrent function, the law has an equally important expressive role. In this Article, we highlight law’s capacity to shape social norms and behavior through education. We focus on a neglected dimension of law’s expressive role: its capacity to empower victims to express their truths and engage with others. Our argument is theoretical and empirical. We present new empirical research showing cyberharassment law’s salutary effects on women’s online expression. We then consider the implications of those findings for victims of sexual-privacy invasions.

    Fulfilling Government 2.0’s Promise with Robust Privacy Protections

    The public can now “friend” the White House and scores of agencies on social networks, virtual worlds, and video-sharing sites. The Obama Administration sees this trend as crucial to enhancing governmental transparency, public participation, and collaboration. As the President has underscored, government needs to tap into the public’s expertise because it doesn’t have all of the answers. To be sure, Government 2.0 might improve civic engagement. But it also might produce privacy vulnerabilities because agencies often gain access to individuals’ social network profiles, photographs, videos, and contact lists when interacting with individuals online. Little would prevent agencies from using and sharing individuals’ social media data for more than policymaking, including law enforcement, immigration, tax, and benefits matters. Although people may be prepared to share their views on health care and the environment with agencies and executive departments, they may be dismayed to learn that such policy collaborations carry a risk of government surveillance. This Essay argues that government should refrain from accessing individuals’ social media data on Government 2.0 sites. Agencies should treat these sites as one-way mirrors, where individuals can see government’s activities and engage in policy discussions but where government cannot use, collect, or distribute individuals’ social media information. A “one-way mirror” policy would facilitate democratic discourse, enhance government accountability, and protect privacy.

    Civil Rights in the Cyber World


    A Poor Mother’s Right to Privacy: A Review

    Collecting personal data is a feature of daily life. Businesses, advertisers, agencies, and law enforcement amass vast reservoirs of our personal data. This state of affairs—what I am calling the “collection imperative”—is justified in the name of efficiency, convenience, and security. The unbridled collection of personal data, meanwhile, leads to abuses. Public and private entities have disproportionate power over individuals and groups whose information they have amassed. Nowhere is that power disparity more evident than in the state’s surveillance of the indigent. Poor mothers, in particular, have vanishingly little privacy. Whether or not they receive subsidized prenatal care, poor mothers live under persistent and indiscriminate state surveillance. Professor Khiara Bridges’s book, The Poverty of Privacy Rights, advances the project of securing privacy for the most vulnerable among us. It shows how the moral construction of poverty, rather than legitimate concerns about prenatal care, animates the state’s surveillance of poor mothers. It argues that poor mothers have a constitutional right not to be known if the state’s data collection efforts demean and humiliate them for no good reason. The Poverty of Privacy Rights provides an important lens for rethinking the data collection imperative more generally. It supplies a theory not only on which a constitutional right to information privacy can be built but also on which positive law and norms can develop. Concepts of reciprocity may provide another analytical tool for understanding a potential right to be as unknown to government as it is to us.

    A New Compact for Sexual Privacy

    Intimate life is under constant surveillance. Firms track people’s periods, hot flashes, abortions, sexual assaults, sex toy use, sexual fantasies, and nude photos. Individuals hardly appreciate the extent of the monitoring, and even if they did, little can be done to curtail it. What is big business for firms is a big risk for individuals. The handling of intimate data undermines the values that sexual privacy secures—autonomy, dignity, intimacy, and equality. It can imperil people’s jobs, housing, insurance, and other crucial opportunities. More often than not, women and minorities shoulder a disproportionate share of the burden. Privacy law is failing us. Our consumer protection approach offers little protection. Not only is the private sector’s handling of intimate information largely unrestrained, but it is treated as normative. This Article offers a new compact for the protection of intimate information. Fundamental civil rights and liberties, along with consumer protection, are at stake. The new compact seeks to stem the tidal wave of collection, restrict certain uses of intimate data, and expand the suite of remedies available to courts. It draws upon the lessons of civil rights law in moving beyond procedural protections and in authorizing injunctive relief, including orders to stop processing intimate data.

    Privacy Injunctions

    Violations of intimate privacy can be never-ending. As long as nonconsensual pornography and deepfake sex videos remain online, privacy violations continue, as does the harm. This piece highlights the significance of injunctive relief in protecting intimate privacy and the legal reforms that can get us there. Injunctive relief is crucial for what it will say and do for victims and the groups to which they belong. It would have content platforms treat victims with the respect that they deserve, rather than as purveyors of their humiliation. It would say to victims that their intimate privacy matters and that sites specializing in intimate privacy violations are not lawless zones where their rights can be violated. For victims, the journey to reclaim their sexual and bodily autonomy, self-esteem and social esteem, and sense of physical safety proceeds slowly; the halting of the privacy violation lets that process begin. The crux of my proposal is straightforward: Lawmakers should empower courts to issue injunctive relief, directing content platforms that enable intimate privacy violations to remove, delete, or otherwise make unavailable intimate images, real or fake, that were hosted without written permission. They should amend Section 230 of the Communications Decency Act so that these enabling platforms can be sued for injunctive remedies. Market developments can fill some of the gaps as we wait for laws to protect intimate privacy as vigorously and completely as they should.

    A New Compact for Sexual Privacy

    Intimate life is under constant surveillance. Firms track people’s periods, hot flashes, abortions, sexual assaults, sex toy use, sexual fantasies, and nude photos. Individuals hardly appreciate the extent of the monitoring, and even if they did, little could be done to curtail it. What is big business for firms is a big risk for individuals. Corporate intimate surveillance undermines sexual privacy—the social norms that manage access to, and information about, human bodies, sex, sexuality, gender, and sexual and reproductive health. At stake are sexual autonomy, self-expression, dignity, intimacy, and equality. So are people’s jobs, housing, insurance, and other life opportunities. Women and minorities shoulder a disproportionate share of that burden. Privacy law is failing us. Not only is the private sector’s handling of intimate information largely unrestrained by American consumer protection law, but it is treated as inevitable and valuable. This Article offers a new compact for sexual privacy. Reform efforts should focus on stemming the tidal wave of collection, restricting uses of intimate data, and expanding the remedies available in court to include orders to stop processing intimate data.

    Extremist Speech, Compelled Conformity, and Censorship Creep

    Silicon Valley has long been viewed as a full-throated champion of First Amendment values. The dominant online platforms, however, have recently adopted speech policies and processes that depart from the U.S. model. In an agreement with the European Commission, the dominant tech companies have pledged to respond to reports of hate speech within twenty-four hours, a hasty process that may trade valuable expression for speedy results. Plans have been announced for an industry database that will allow the same companies to share hashed images of banned extremist content for review and removal elsewhere. These changes are less the result of voluntary market choices than of a bowing to governmental pressure. Companies’ policies about extremist content have been altered to stave off threatened European regulation. Far more than illegal hate speech or violent terrorist imagery is in EU lawmakers’ sights; so too are online radicalization and “fake news.” Newsworthy content and political criticism may end up being removed along with terrorist beheading videos, “kill lists” of U.S. servicemen, and instructions on how to bomb houses of worship. The impact of extralegal coercion will be far-reaching. Unlike national laws that are limited by geographic borders, terms-of-service agreements apply to platforms’ services on a global scale. Whereas local courts can order platforms only to block material viewed in their jurisdictions, a blacklist database raises the risk of global censorship. Companies should counter the serious potential for censorship creep with definitional clarity, robust accountability, detailed transparency, and ombudsman oversight.

    Mainstreaming Privacy Torts

    In 1890, Samuel Warren and Louis Brandeis proposed a privacy tort, and seventy years later William Prosser conceived of it as four wrongs. In both eras, privacy invasions primarily caused psychic and reputational wounds of a particular sort. Courts insisted upon significant proof due to those injuries’ alleged ethereal nature. Digital networks alter this calculus by exacerbating the injuries inflicted. Because humiliating personal information posted online has no expiration date, neither does individual suffering. Leaking databases of personal information and postings that encourage assaults invade privacy in ways that exact significant financial and physical harm. This dispels concerns that plaintiffs might recover for trivialities. Unfortunately, privacy tort law is ill-equipped to address these changes. Prosser built the modern privacy torts based on precedent and a desire to redress harm. Although Prosser’s privacy taxonomy succeeded in the courts because it blended theory and practice, it conceptually narrowed the interest that privacy tort law sought to protect. Whereas Warren and Brandeis conceived privacy tort law as protecting a person’s right to develop his “inviolate personality” free from unwanted publicity and access by others, Prosser saw it as addressing specific emotional, reputational, and proprietary injuries caused by four kinds of activities prevalent in the twentieth century. Courts have too often rigidly interpreted the four privacy torts, further confining their reach. As a result, Prosser’s privacy taxonomy often cannot address the privacy interests implicated by networked technologies. The solution lies in taking the best of what Prosser had to offer – his method of borrowing from doctrine and focusing on injury prevention and remedy – while ensuring that proposed solutions are transitional and dynamic. Any updates to privacy tort law should protect the broader set of interests identified by Warren and Brandeis, notably a person’s right to be free from unwanted disclosures of personal information so that he can develop his personality. While leaking databases and certain online postings compromise that interest, we should invoke mainstream tort remedies to address them, rather than conceiving unattainable new privacy torts. In addition to supplementing privacy tort law with traditional tort claims, courts should consider the ways that the internet magnifies privacy harms to ensure law’s recognition of them.

    Extremist Speech, Compelled Conformity, and Censorship Creep

    Silicon Valley has long been viewed as a full-throated champion of First Amendment values. The dominant online platforms, however, have recently adopted speech policies and processes that depart from the U.S. model. In an agreement with the European Commission, tech companies have pledged to respond to reports of hate speech within twenty-four hours, a hasty process that may trade valuable expression for speedy results. Plans have been announced for an industry database that will allow the same companies to share hashed images of banned extremist content for review and removal elsewhere. These changes are less the result of voluntary market choices than of a bowing to governmental pressure. Private speech rules and policies about extremist content have been altered to stave off threatened European regulation. Far more than illegal hate speech or violent terrorist imagery is in EU lawmakers’ sights; so too are online radicalization and “fake news.” Newsworthy content may end up being removed along with terrorist beheading videos, “kill lists” of U.S. servicemen, and instructions on how to blow up houses of worship. The impact of extralegal coercion will be far-reaching. Unlike national laws that are limited by geographic borders, terms-of-service agreements apply to platforms’ services on a global scale. Whereas local courts can order platforms only to block material viewed in their jurisdictions, a blacklist database raises the risk of total censorship. Companies should counter the serious potential for censorship creep with definitional clarity, robust accountability, detailed transparency, and ombudsman oversight.