    Architecture and philosophy: reflections on Arakawa and Gins

    This essay is a critical review of a recent Arakawa and Gins conference – an event that brought phenomenology and architecture into productive dialogue through interdisciplinary inquiry into the subtle and complex ways that embodied activity structures cognition and perception. To provide the reader with sufficient context to appreciate the ensuing discussion of Arakawa and Gins’s concepts and hypotheses, we open with an overview of their previous collaborations. We then transition to an analysis of a unique installation called 'Reading Room' and, immediately afterwards, provide exegetical commentary on select conference presentations. This commentary emphasizes phenomenological perspectives, especially ideas that Don Ihde and Shaun Gallagher conveyed. We conclude by outlining some of the most promising horizons of thought that the conference brought to our consideration.

    The Internet of Heirlooms and Disposable Things

    The Internet of Things (“IoT”) is here, and we seem to be going all in. We are trying to put a microchip in nearly every object that is not nailed down, and even a few that are. Soon, your cars, toasters, toys, and even your underwear will be wired up to make your lives better. The general thought seems to be that “Internet connectivity makes good objects great.” While the IoT might be incredibly useful, we should proceed carefully. Objects are not necessarily better simply because they are connected to the Internet. Often, the Internet can make objects worse and users worse off. Digital technologies can be hacked. Each new camera, microphone, and sensor adds another vector for attack and another point of surveillance in our everyday lives. The problem is that privacy and data security law has failed to recognize that some “things” are more dangerous than others as part of the IoT. Some objects, like coffee pots and dolls, can last long after the standard life cycle of software. Meanwhile, cheap, disposable objects, like baby wipes, might not be worth outfitting with the most secure hardware and software. Yet they are all part of the network. This essay argues that the nature of the “thing” in the IoT should play a more prominent role in privacy and data security law. The decision to wire up an object should be coupled with responsibilities to make sure its users are protected. Only then can we trust the Internet of Heirlooms and Disposable Things.
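
    To make the essay’s lifespan argument concrete, here is a minimal illustrative sketch (ours, not the essay’s) that flags devices whose expected physical lifespan outlasts their software support window; every device name and number below is hypothetical.

```python
# Illustrative sketch (not from the essay): flagging IoT devices whose
# expected physical lifespan outlasts their software support window.
# All device names and figures below are hypothetical.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    lifespan_years: float        # how long the physical object stays in use
    patch_support_years: float   # how long the vendor ships security updates

def unsupported_years(d: Device) -> float:
    """Years the device is likely to remain online with unpatched software."""
    return max(0.0, d.lifespan_years - d.patch_support_years)

devices = [
    Device("smart coffee pot", lifespan_years=10, patch_support_years=2),
    Device("connected doll", lifespan_years=8, patch_support_years=1),
    Device("disposable sensor tag", lifespan_years=0.1, patch_support_years=0),
]

for d in devices:
    gap = unsupported_years(d)
    status = f"{gap:.1f} unpatched years" if gap > 0 else "ok"
    print(f"{d.name}: {status}")
```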

    Surveillance as Loss of Obscurity

    Everyone seems concerned about government surveillance, yet we have a hard time agreeing when and why it is a problem and what we should do about it. When is surveillance in public unjustified? Does metadata raise privacy concerns? Should encrypted devices have a backdoor for law enforcement officials? Despite increased attention, surveillance jurisprudence and theory still struggle for coherence. A common thread for modern surveillance problems has been difficult to find. In this article we argue that the concept of ‘obscurity,’ which deals with the transaction costs involved in finding or understanding information, is the key to understanding and uniting modern debates about government surveillance. Obscurity can illuminate the different areas where transaction costs for surveillance are operative and explain why making surveillance hard but possible is the central issue in the government-surveillance debates. Obscurity can also explain why solutions to the government-surveillance problem should revolve around introducing friction and inefficiency into the process, whether legally, through procedural requirements like warrants, or technologically, through tools like robust encryption. Ultimately, obscurity can provide a clearer picture of why and when government surveillance is troubling. It provides a common thread for disparate surveillance theories and can be used to direct surveillance reform.
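
    A toy back-of-the-envelope model can make the transaction-cost framing vivid. The sketch below is our illustration, not the article’s: friction such as warrant requirements or strong encryption raises the per-target cost of surveillance, which keeps targeted surveillance feasible while pricing out dragnet collection. All costs and the budget are invented.

```python
# Toy model (not from the article): how per-target friction changes the
# economics of surveillance. All costs and budgets are invented.

def total_cost(per_target_cost: float, targets: int) -> float:
    """Total cost of surveilling `targets` people at a given per-target cost."""
    return per_target_cost * targets

population = 1_000_000
budget = 5_000_000.0  # hypothetical agency budget

scenarios = {
    "bulk metadata, no friction": 0.01,         # near-zero marginal cost
    "warrant required per target": 2_000.0,     # procedural friction
    "target uses strong encryption": 50_000.0,  # technical friction
}

for label, cost in scenarios.items():
    affordable = int(budget // cost)
    print(f"{label}: can cover {min(affordable, population):,} "
          f"of {population:,} people")
```

    Under these invented numbers, frictionless collection covers everyone, while warrants and encryption shrink coverage to thousands and then hundreds: surveillance stays possible but stops being cheap, which is the article’s “hard but possible” point.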

    Eye-Tracking in Virtual Reality: A Visceral Notice Approach for Protecting Privacy

    Eye-tracking is in our future. Across many fields, eye-tracking is growing in prominence. This paper focuses on eye-tracking in virtual reality as a case study to illuminate novel privacy risks and propose a governance response to them: a design shift that provides users with an experientially resonant means of understanding privacy threats. It is a strategy that Ryan Calo calls “visceral notice.” To make our case for visceral notice, we proceed as follows. First, we provide a concise account of how eye-tracking works, emphasizing its threat to autonomy and privacy. Second, we discuss the sensitive personal information that eye-tracking reveals, complications that limit what eye-tracking studies establish, and the comparative advantage large technology companies may have when tracking our eyes. Third, we explain why eye-tracking will likely be crucial for developing virtual reality technology. Fourth, we review Calo’s conception of visceral notice and offer suggestions for applying it to virtual reality to help users better appreciate eye-tracking risks. Finally, we consider seven objections to our proposals and provide counterpoints to them.
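
    As a concrete illustration of why gaze data is sensitive (our sketch, not the paper’s method): even the crudest processing of raw gaze samples, such as summing dwell time per object in a scene, yields an attention profile that hints at interests a user never chose to disclose. The gaze samples, sampling rate, and scene labels below are hypothetical.

```python
# Illustrative sketch (not from the paper): aggregating raw gaze samples
# into per-object dwell times, the crudest form of attention profiling.
# Sample data, sampling rate, and scene labels are hypothetical.

from collections import defaultdict

# (timestamp_ms, object the gaze ray hit in the virtual scene)
gaze_samples = [
    (0, "storefront_ad"), (16, "storefront_ad"), (33, "avatar_A"),
    (50, "avatar_A"), (66, "avatar_A"), (83, "storefront_ad"),
]
SAMPLE_INTERVAL_MS = 16.7  # ~60 Hz eye tracker

dwell_ms = defaultdict(float)
for _, target in gaze_samples:
    dwell_ms[target] += SAMPLE_INTERVAL_MS

# The resulting profile reveals where attention lingered.
for target, ms in sorted(dwell_ms.items(), key=lambda kv: -kv[1]):
    print(f"{target}: {ms:.0f} ms of attention")
```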

    Public Philosophy of Technology

    Philosophers of technology are not playing the public role that our own theoretical perspectives motivate us to take. A great variety of theories and perspectives within philosophy of technology, including those of Marcuse, Feenberg, Borgmann, Ihde, Michelfelder, Bush, Winner, Latour, and Verbeek, either support or directly call for various sorts of intervention—a call that we have failed to adequately heed. Barriers to such intervention are discussed, and three proposals for reform are advanced: (1) post-publication peer-reviewed reprinting of public philosophy, (2) increased emphasis on true open access publication, and (3) increased efforts to publicize and adapt traditional academic research.

    The Inconsentability of Facial Surveillance

    Governments and companies often use consent to justify the use of facial recognition technologies for surveillance. Many proposals for regulating facial recognition technology incorporate consent rules as a way to protect the people whose faces are being tagged and tracked. But consent is a broken regulatory mechanism for facial surveillance. The individual risks of facial surveillance are impossibly opaque, and our collective autonomy and obscurity interests aren’t captured or served by individual decisions. In this article, we argue that facial recognition technologies have a massive and likely fatal consent problem. We reconstruct some of Nancy Kim’s fundamental claims in Consentability: Consent and Its Limits, emphasizing how her consentability framework grants foundational priority to individual and social autonomy, integrates empirical insights into the cognitive limitations that significantly impact the quality of human decision-making when granting consent, and identifies social, psychological, and legal impediments that allow the pace and negative consequences of innovation to outstrip the protections of legal regulation. We also expand upon Kim’s analysis by arguing that valid consent cannot be given for face surveillance. Even if valid individual consent to face surveillance were possible, permission for such surveillance is in irresolvable conflict with our collective autonomy and obscurity interests. Additionally, there is good reason to be skeptical of consent as the justification for any use of facial recognition technology, including facial characterization, verification, and identification.

    Increasing the Transaction Costs of Harassment

    Wouldn’t it be nice if the rules, agreements, and guidelines designed to prevent online harassment were sufficient to curb improper behavior? As if. Wrongdoers are not always so easily deterred. Sometimes these approaches are about as effective as attacking tanks with toothpicks. As Danielle Citron contends in her critically important work, Hate Crimes in Cyberspace, the design of the Internet facilitates vitriol and abuse, even when they are legally, contractually, and normatively prohibited. Communicating almost effortlessly at a distance—sometimes anonymously and typically with minimized body language—can heighten emotional detachment and blunt moral sensitivity. Tragically, when a mediated environment makes it easy to harass others, harassment occurs, all things being equal.
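
    The title’s transaction-cost framing suggests a design response: if near-effortless communication enables abuse, small amounts of friction raise its cost. Below is a minimal sketch of one such mechanism, an escalating cooldown on rapid-fire messaging; it is our illustration of the idea, not a design proposed in the review.

```python
# Minimal sketch (our illustration, not from the review): an escalating
# cooldown that raises the transaction cost of rapid-fire messaging.

import time

class EscalatingCooldown:
    """Each message within the window doubles the wait before the next one."""

    def __init__(self, base_seconds: float = 1.0, window_seconds: float = 60.0):
        self.base = base_seconds
        self.window = window_seconds
        self.recent: list[float] = []  # timestamps of recent messages

    def wait_required(self, now: float) -> float:
        """Seconds the sender must still wait before posting again."""
        self.recent = [t for t in self.recent if now - t < self.window]
        if not self.recent:
            return 0.0
        next_allowed = self.recent[-1] + self.base * 2 ** (len(self.recent) - 1)
        return max(0.0, next_allowed - now)

    def record(self, now: float) -> None:
        self.recent.append(now)

limiter = EscalatingCooldown()
now = time.time()
for i in range(5):
    print(f"message {i}: must wait {limiter.wait_required(now):.0f}s")
    limiter.record(now)
```

    The waits grow 0s, 1s, 2s, 4s, 8s: negligible for ordinary conversation, but a meaningful tax on someone flooding a target with abuse.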

    Poverty Tourism, Justice and Policy

    On moral grounds, should poverty tourism be subject to specific policy constraints? This article responds by testing poverty tourism against the ethical guideposts of compensation justice, participative justice, and recognition justice, and against two case descriptions: favela tours in Rocinha and garbage dump tours in Mazatlan. The argument advanced is that the complexity of the social relationships involved in those tours requires policy-relevant research and solutions.

    Privacy Nicks: How the Law Normalizes Surveillance

    Privacy law is failing to protect individuals from being watched and exposed, despite stronger surveillance and data protection rules. The problem is that our rules look to social norms to set thresholds for privacy violations, but people can get used to being observed. In this article, we argue that by ignoring de minimis privacy encroachments, the law is complicit in normalizing surveillance. Privacy law helps acclimate people to being watched by ignoring smaller, more frequent, and more mundane privacy diminutions. We call these reductions “privacy nicks,” like the proverbial “thousand cuts” that lead to death. Privacy nicks come from the proliferation of cameras and biometric sensors on doorbells, glasses, and watches, and from the drift of surveillance and data analytics into new areas of our lives like travel, exercise, and social gatherings. Under our theory of privacy nicks as the Achilles heel of surveillance law, invasive practices become routine through repeated exposures that acclimate us to being vulnerable and watched in increasingly intimate ways. With acclimation comes resignation, and this shift in attitude biases how citizens and lawmakers view reasonable measures and fair tradeoffs. Because the law looks to norms and people’s expectations to set thresholds for what counts as a privacy violation, the normalization of these nicks results in a constant renegotiation of privacy standards to society’s disadvantage. When this happens, the legal and social threshold for rejecting invasive new practices keeps getting redrawn, excusing ever more aggressive intrusions. In effect, the test of what privacy law allows is whatever people will tolerate. There is no rule to stop us from tolerating everything. This article provides a new theory and terminology for understanding where privacy law falls short and suggests a way to escape the current surveillance spiral.
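
    The feedback loop the article describes (exposure erodes expectations, norms-based law tracks expectations, lower thresholds permit more exposure) can be made vivid with a toy simulation; the update rule and constants below are our invention, for illustration only.

```python
# Toy simulation (our construction, not the article's model): repeated
# tolerated "nicks" erode privacy expectations, and a norms-based legal
# threshold simply tracks them downward. Constants are invented.

expectation = 1.0   # privacy people currently expect (1.0 = full)
EROSION = 0.05      # fraction of expectation lost per tolerated nick

for nicks in range(0, 12, 2):
    threshold = expectation  # a norms-based test tracks current expectations
    print(f"after {nicks:2d} nicks: legal threshold = {threshold:.2f}")
    expectation *= (1 - EROSION) ** 2  # two more nicks before the next check

# Note that nothing in the loop sets a floor: the threshold can drift
# toward zero, mirroring "no rule to stop us from tolerating everything."
```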

    The Profiling Potential of Computer Vision and the Challenge of Computational Empiricism

    Computer vision and other biometric data science applications have commenced a new project of profiling people. Rather than using 'transaction-generated information', these systems measure the 'real world' and produce an assessment of the 'world state' - in this case, an assessment of some individual trait. Instead of using proxies or scores to evaluate people, they increasingly deploy a logic of revealing the truth about reality and the people within it. While these profiling knowledge claims are sometimes tentative, they increasingly suggest that only through computation can these excesses of reality be captured and understood. This article explores the bases of those claims in the systems of measurement, representation, and classification deployed in computer vision. It asks whether there is something new in this type of knowledge claim, sketches an account of a new form of computational empiricism being operationalised, and questions what kind of human subject is being constructed by these technological systems and practices. Finally, the article explores legal mechanisms for contesting the emergence of computational empiricism as the dominant knowledge platform for understanding the world and the people within it.
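
    To make the knowledge claim concrete: a typical trait-assessment pipeline maps an image through a learned feature extractor to a score that is then reported as a fact about the person. The sketch below shows only the shape of such a system; the feature extractor, scorer, and numbers are hypothetical stand-ins, and the point is the epistemic move, not a working classifier.

```python
# Schematic sketch (our illustration, not from the article) of a
# trait-assessment pipeline: measurement -> representation ->
# classification -> a claim about reality. The components are stand-ins.

import random

def extract_features(image_path: str) -> list[float]:
    """Stand-in for a learned feature extractor (e.g., a face-embedding CNN)."""
    random.seed(image_path)  # deterministic fake embedding per input
    return [random.random() for _ in range(128)]

def score_trait(embedding: list[float]) -> float:
    """Stand-in for a trained classifier head producing a 'trait' score."""
    return sum(embedding) / len(embedding)

def assess(image_path: str) -> dict:
    score = score_trait(extract_features(image_path))
    # The epistemically loaded step: a model output is reported as a
    # measured fact about the person, an assessment of the 'world state'.
    return {"subject": image_path, "trait_score": round(score, 3)}

print(assess("person_0431.jpg"))
```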