90 research outputs found

    Marriage Rights and the Good Life: A Sociological Theory of Marriage and Constitutional Law

    This is the first in a series of three Articles investigating the underappreciated role that the social theory of Emile Durkheim plays in the quest for the freedom to marry for gay Americans. To that end, this Article begins the discussion by examining the Durkheimian legal arguments that go unnoticed in equal protection and due process claims against marriage discrimination. This Article challenges two assumptions: first, that the most effective legal argument for marriage rights is a purely liberal one, and second, that the substance and rhetoric of liberal toleration cannot exist symbiotically in the marriage discrimination debate with a more robust politics based on the experiential social value of marriage and gay relationships. The freedom to marry is both a liberal right and a piece of the good life. Drawing on Durkheim, this Article discusses a sociological theory of marriage and argues that the constitutional case for the freedom to marry is not just about the rights of equal protection and due process, but also about the sociology of marriage. In other words, a successful constitutional argument depends on the recognition that marriage is a social good with both general and everyday demonstrable benefits for the married couple and society as a whole.

    Designing Without Privacy

    In Privacy on the Ground, the law and information scholars Kenneth Bamberger and Deirdre Mulligan showed that empowered chief privacy officers (CPOs) are pushing their companies to take consumer privacy seriously, integrating privacy into the designs of new technologies. But their work was just the beginning of a larger research agenda. CPOs may set policies at the top, but they alone cannot embed robust privacy norms into the corporate ethos, practice, and routine. As such, if we want the mobile apps, websites, robots, and smart devices we use to respect our privacy, we need to institutionalize privacy throughout the corporations that make them. In particular, privacy must be a priority among those actually doing the work of design on the ground — namely, engineers, computer programmers, and other technologists. This Article presents findings from an ethnographic study of how, if at all, technologists doing the work of technology product design think about privacy, integrate privacy into their work, and consider user needs in the design process. It also looks at how attorneys at private firms draft privacy notices for their clients. Based on these findings, this Article presents a narrative running in parallel to the one described by Bamberger and Mulligan. This alternative account, where privacy is narrow, limited, and barely factoring into design, helps explain why so many products seem to ignore our privacy expectations. The Article then proposes a framework for understanding how factors both exogenous (theory and law) and endogenous (corporate structure and individual cognitive frames and experience) to the corporation prevent the CPOs’ robust privacy norms from diffusing throughout technology companies and the industry as a whole. This framework also helps elucidate how reforms at every level — theory, law, organization, and individual experience — can incentivize companies to take privacy seriously, enhance organizational learning, and eliminate the cognitive biases that lead to discrimination in design.

    Safe Social Spaces

    Technologies that mediate social interaction can put our privacy and our safety at risk. Harassment, intimate partner violence and surveillance, data insecurity, and revenge porn are just a few of the harms that bedevil technosocial spaces and their users, particularly users from marginalized communities. This Article seeks to identify the building blocks of safe social spaces, or environments in which individuals can share personal information at low risk of privacy threats. Relying on analogies to offline social spaces—Alcoholics Anonymous meetings, teams of coworkers, and attorney-client relationships—this Article argues that if a social space is defined as an environment characterized by disclosure, then a safe social space is one in which disclosure norms are counterbalanced by equally powerful norms of trust that are both endogenously designed in and backed exogenously by law. Case studies of online social networks and social robots are used to show how both the design of and law governing technosocial spaces today not only fail to support trust, but actively undermine user safety by eroding trust and limiting the law’s regulatory power. The Article concludes with both design and law reform proposals to better build and protect trust and safe social spaces.

    Policing Queer Sexuality

    A Review of Vice Patrol: Cops, Courts, and the Struggle over Urban Gay Life Before Stonewall. By Anna Lvovsky.

    Privacy Law’s False Promise

    Privacy laws have never seemed stronger. New international, national, state, and local laws have been passed with the promise of greater protection for consumers. Courts across the globe are reclaiming the law’s power to limit collection of our data. And yet, our privacy seems more in danger now than ever, with frequent admissions of nefarious data use practices from social media, mobile apps, and e-commerce websites, among others. Why are privacy laws, seemingly more comprehensive than ever, not working to protect our privacy? This Article explains. Based on original primary source research—interviews with engineers, privacy professionals, and vendor executives; product demonstrations; webinars, blogs, and industry literature; and more—this Article argues that privacy law is failing to deliver its promised protections because it is undergoing a process of legal endogeneity: mere symbols of compliance are standing in for real privacy protections. Toothless trainings, audits, and paper trails, among other symbols, are being confused with actual adherence to privacy law, which has the effect of undermining the promise of greater privacy protection for consumers.

    Privacy, Sharing, and Trust: The Facebook Study

    Using sharing on Facebook as a case study, this Article presents empirical evidence suggesting that trust is a significant factor in individuals’ willingness to share personal information on online social networks. I then make two arguments: one that explains why Facebook is designed the way it is, and one that calls for legal protection against unfair manipulation of users. I argue that Facebook is built on trust: the trust that exists between friends and the trust that exists between users and the platform. In particular, I describe how Facebook designs its platform and interface to leverage the trust we have in our friends to nudge us to share. Sometimes, that helps create a dynamic social environment: knowing what our friends are doing helps us determine when it is safe to interact. Other times, Facebook leverages trust to manipulate us into sharing information with advertisers. This should give us pause. Because Facebook uses trust-based design, users may be confused about the privacy effects of their behavior. Federal and state consumer and privacy protection regulators should step in.

    Privacy’s Rights Trap


    Power, Process, and Automated Decision-Making

    Automated decision-making systems based on “big data”–powered algorithms and machine learning are just as prone to mistakes, biases, and arbitrariness as their human counterparts. The result is a technologically driven decision-making process that seems to defy interrogation, analysis, and accountability and, therefore, undermines due process.