
    Not in My Atlantic Yards: Examining Netroots’ Role in Eminent Domain Reform

    (Excerpt) Since the Supreme Court's decision in Kelo v. City of New London, which expanded the state's power to condemn private property and transfer it to other private owners under the Fifth Amendment, there have been significant calls to curb the power of eminent domain through statutory reform. Scholars and jurists in favor of eminent domain reform have asserted that legislation is needed to protect private property rights against the rising tide of state power, with many arguing that such reform should incorporate a public approval process into land use decisions. Those opposed to eminent domain reform argue that empowering the public in land use decisions is an imperfect process that slows development. This Note asserts that incorporating public approval into the eminent-domain process need not come at the cost of expediency. Rather, thanks to advances in technology, a public empowered by statutory reform can couple with grassroots Internet political activism—a concept popularly dubbed "netroots"—to create a new and more efficient approach to traditionally ineffective public forums. This Note uses the Atlantic Yards project as a case study in post-Kelo use of eminent domain. Part I will outline the role of Kelo in reshaping the debate around eminent domain. Part II will examine the history and controversy surrounding Atlantic Yards and illustrate how, despite significant Internet-facilitated community activism, the absence of a legal mechanism prevented landowners from effecting any change in the outcome of Forest City Ratner's $4.9 billion commercial and residential development plan in Brooklyn, New York. Finally, Part III will look at traditional methods for public forums in land use and propose a new format to elicit and accommodate public participation in land use decisions. It will argue that advances in technology and the proliferation of the Internet have increased community connectivity, involvement, and transparency and can be used to streamline the public-hearing process. Statutory reform that incorporates these advances can appease both sides of the eminent domain reform debate and create a more efficient and democratic system of land use.

    Re-Shaming the Debate: Social Norms, Shame, and Regulation in an Internet Age

    Advances in technological communication have dramatically changed the ways in which social norm enforcement is used to constrain behavior. Nowhere is this more powerfully demonstrated than through current events around online shaming and cyber harassment. Low cost, anonymous, instant, and ubiquitous access to the Internet has removed most—if not all—of the natural checks on shaming. The result is norm enforcement that is indeterminate, uncalibrated, and often tips into behavior punishable in its own right—thus generating a debate over whether the state should intervene to curb online shaming and cyber harassment. A few years before this change in technology, a group of legal scholars debated just the opposite, discussing the value of harnessing the power of social norm enforcement through shaming by using state shaming sanctions as a more efficient means of criminal punishment. Though the idea was discarded, many of their concerns were prescient and can inform today’s inverted new inquiry: whether the state should create limits on shaming and cyber bullying. Perhaps more importantly, the debate reintroduces the notion of thinking of shaming within the framework of social norm enforcement, thus clarifying the taxonomy of online shaming, cyber bullying, and cyber harassment. This Article ties together the current conversation around online shaming, cyber bullying, and cyber harassment with the larger legal discussion on social norms and shaming sanctions. It argues that the introduction of the Internet has altered the social conditions in which people speak and thus changed the way we perceive and enforce social norms. Accordingly, online shaming is (1) an over-determined punishment with indeterminate social meaning; (2) not a calibrated or measured form of punishment; and (3) of little or questionable accuracy in who and what it punishes.

    A New Taxonomy for Online Harms

    (Excerpt) Bullying is generally understood among academics and educators as having to meet three criteria: (1) it must be verbal or physical aggression; (2) it must be repeated over time; and (3) it must involve a power differential. When talking about cyber bullying, the aggression is mostly verbal, using “threats, blackmail . . . gossip and rumors,” and online personas or messages can be more cruel, vindictive, and mean. Though cyber bullying typically describes acts between children, the same acts by adults could also be considered cyber harassment. Unlike harassment, however, bullying does not have a history of criminal liability—though all 50 states have now passed anti-bullying legislation, such laws did not exist before 1999. But what about online harms that don’t fall into the definitions of cyber harassment or cyber bullying? How do you characterize the story of Walter Palmer, the Midwestern dentist vilified on- and offline for killing a lion on a hunting trip to Africa? Or Justine Sacco, the young woman whose racist Tweet about AIDS triggered viral worldwide outrage? Or Gene Cooley, the man run out of his small town in Georgia by anonymous and untruthful postings on an Internet message board?

    The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression

    For a decade and a half, Facebook has dominated the landscape of digital social networks, becoming one of the most powerful arbiters of online speech. Twenty-four hours a day, seven days a week, over two billion users leverage the platform to post, share, discuss, react to, and access content from all over the globe. Through a system of semipublic rules called “Community Standards,” Facebook has created a body of “laws” and a system of governance that dictate what users may say on the platform. In recent years, as this intricately built system to dispatch the company’s immense private power over the public right of speech has become more visible, Facebook has experienced intense pressure to become more accountable, transparent, and democratic, not only in how it creates its fundamental policies for speech but also in how it enforces them. In November 2018, after years of entreaty from the press, advocacy groups, and users, CEO and founder Mark Zuckerberg announced that Facebook would construct an independent oversight body to be researched, created, and launched within the year. The express purpose of this body was to serve as an appellate review system for user content and to make content-moderation policy recommendations to Facebook. This Feature empirically documents the creation of this institution, now called the Facebook Oversight Board. The Board is a historic endeavor both in scope and scale. The Feature traces the events and influences that led to Facebook’s decision to create the Oversight Board. It details the year-long process of creating the Board, relying on hundreds of hours of interviews and embedded research with the Governance Team charged with researching, planning, and building this new institution. The creation of the Oversight Board and its aims are a novel articulation of internet governance. This Feature illuminates the future implications of the new institution for global freedom of expression. 
Using the lens of adjudication, it analyzes what the Board is, what the Board means to users, and what the Board means for industry and governments. Ultimately, the Feature concludes that the Facebook Oversight Board has great potential to set new precedent for user participation in private platforms’ governance and a user right to procedure in content moderation.

    Of Systems Thinking and Straw Men

    (Excerpt) In Content Moderation as Systems Thinking, Professor Evelyn Douek, as the title suggests, endorses an approach to the people, rules, and processes governing online speech as one not of anecdote and doctrine but of systems thinking. She constructs this concept as a novel and superior understanding of the problems of online-speech governance as compared to those existent in what she calls the “standard [scholarly] picture of content moderation.” This standard picture of content moderation — which is roughly five years old — is “outdated and incomplete,” she argues. It is preoccupied with anecdotal, high-profile adjudications in which platforms make the right or wrong decision to take down certain speech and not focused enough on the platform’s design choices and invisible automated removal of content. It draws too heavily from First Amendment contexts, which leads to platforms assessing content moderation controversies as if they were individual judicial cases. Douek calls her approach “both ambitious and modest.” The modest part calls for structural and procedural regulatory reforms that center content moderation as “systems thinking.” The notion of systems thinking conveys a generalized approach of framing complexity as a whole comprised of dynamic relationships rather than the sum of segmented parts. The ambitious part is dismantling the standard picture of content moderation scholarship and challenging the resultant “accountability theater” created by platforms and lawmakers alike. In Douek’s view, it is this “stylized picture of content moderation” that is to blame for regulators assuming “that the primary way they can make social media platforms more publicly accountable is by requiring them to grant users ever more individual procedural rights.” There is much to like about understanding content moderation as a complex, dynamic, and ever-evolving system. 
Particularly useful for an article titled Content Moderation as Systems Thinking that calls for regulation of technology is the rich and detailed scholarship on content moderation in both sociotechnical theory and the law. Indeed, most of the academic work on content moderation is done by sociotechnical theory scholars who study content moderation and platform governance using systems-thinking and systems-theory frameworks. Sociotechnical systems theory posits that an organization is best understood and improved if all parts of the system — people, procedures, norms, culture, technology, infrastructure, and outcomes — are understood as relational and interdependent parts of a complex system. In analyzing private law under this theoretical framework, Professor Henry Smith describes systems as “a collection of elements and — crucially — the connections between and among them; complex systems are ones in which the properties of the system as a whole are difficult to infer from the properties of the parts.” Examples of systems abound at all levels of nature and society: from cognition to social networks or economies, or as Smith proposes, systems of law. Systems thinking, then, according to those who study it, is one step removed: “literally, a system of thinking about systems.” This definition is, of course, tautological; even the authors of the only article Douek cites on the topic seem confused. But the takeaway of “systems thinking” is much the same as that described by sociotechnical theory and by Smith: an “understanding of dynamic behavior, systems structure as a cause of that behavior, and the idea of seeing systems as wholes rather than parts” — wholes that create “emergent properties” whose origins cannot be traced to any one part or interplay of the system. It is both the ocean and the wave, the forest and the trees, as well as all of the interactions and the emergent properties that result.

    Facebook v. Sullivan: Public Figures and Newsworthiness in Online Speech

    In the United States, there are now two systems to adjudicate disputes about harmful speech. The first is older and more established: the legal system in which judges apply constitutional law to limit tort claims alleging injuries caused by speech. The second is newer and less familiar: the content-moderation system in which platforms like Facebook implement the rules that govern online speech. These platforms are not bound by the First Amendment. But, as it turns out, they rely on many of the tools used by courts to resolve tensions between regulating harmful speech and preserving free expression—particularly the entangled concepts of “public figures” and “newsworthiness.” This Article offers the first empirical analysis of how judges and content moderators have used these two concepts to shape the boundaries of free speech. It first introduces the legal doctrines developed by the “Old Governors,” exploring how courts have shaped the constitutional concepts of public figures and newsworthiness in the face of tort claims for defamation, invasion of privacy, and intentional infliction of emotional distress. The Article then turns to the “New Governors” and examines how Facebook’s content-moderation system channeled elements of the courts’ reasoning for imposing First Amendment limits on tort liability. By exposing the similarities and differences between how the two systems have understood these concepts, this Article offers lessons for both courts and platforms as they confront new challenges posed by online speech. It exposes the pitfalls of using algorithms to identify public figures; explores the diminished utility of setting rules based on voluntary involvement in public debate; and analyzes the dangers of ad hoc and unaccountable newsworthiness determinations. 
Both courts and platforms must adapt to the new speech ecosystem that companies like Facebook have helped create, particularly the way that viral content has shifted normative intuitions about who deserves harsher rules in disputes about harmful speech, be it in law or content moderation. Finally, the Article concludes by exploring what this comparison reveals about the structural role platforms play in today’s speech ecosystem and how it illuminates new solutions. These platforms act as legislature, executive, judiciary, and press—but without any separation of powers to establish checks and balances. A change to this model is already occurring at one platform: Facebook is creating a new Oversight Board that will hopefully provide due process to users on the platform’s speech decisions and transparency about how content-moderation policy is made, including how concepts related to newsworthiness and public figures are applied.

    How to Make Facebook's 'Supreme Court' Work

    The idea of a body that will decide what kind of content is allowed on the site is promising — but only if it’s done right.

    The Law of Facebook: Borders, Regulation and Global Social Media

    This paper provides an outline of the talks presented at the webinar event “The Law of Facebook: Borders, Regulation and Global Social Media” on 15 May 2020, jointly hosted by the City Law School Jean Monnet Chair of Law & Transatlantic Relations, the Institute for the Study of European Law (ISEL), and the International Law and Affairs Group (ILAG).

    Making Mobility: Folding Tricycle Attachment for Standard Wheelchairs in Tanzania

    Final report and team photo for Project 19 of ME450, Winter 2009 semester.
    Thousands of Tanzanians are unable to attend school, work, or participate in the community due to an immobilizing disability. Those lucky enough to have a wheelchair are frustrated by long trips across rough terrain. Hand-powered tricycles are easier for long trips but are too big to use inside or on a bus. The goal of this project is to produce a low-cost, foldable, and stowable tricycle attachment for standard wheelchairs in collaboration with an MIT course. This hybrid design will allow users to travel freely between regions and more effectively function within their community.
    Kathleen Sienko (Mechanical Engineering, U of M); Mentor: Amos Winter (MIT)
    http://deepblue.lib.umich.edu/bitstream/2027.42/62473/2/ME450 Winter2009 Team Photo - Project 19 - Folding Tricycle Attachment for Wheelchairs.JPG
    http://deepblue.lib.umich.edu/bitstream/2027.42/62473/1/ME450 Winter2009 Final Report - Project 19 - Folding Tricycle Attachment for Wheelchairs.pd

    Facts and Where to Find Them: Empirical Research on Internet Platforms and Content Moderation
