
    Dynamic privacy in public surveillance

    In implementing privacy protection in surveillance systems, designers must maximize privacy while retaining the system's purpose. One way to achieve this is to combine data-hiding techniques with context-aware policies governing access to securely collected and stored data.

    Context aware privacy in visual surveillance

    In this paper we present preliminary work implementing dynamic privacy in public surveillance. The aim is to maximise the privacy of those under surveillance, while giving an observer access to sufficient information to perform their duties. As these aspects are in conflict, a dynamic approach to privacy is required to balance the system's purpose with the system's privacy. Dynamic privacy is achieved by accounting for the situation, or context, within the environment. The context is determined by a number of visual features that are combined and then used to determine an appropriate level of privacy.

    Regulating Mass Surveillance as Privacy Pollution: Learning from Environmental Impact Statements

    Encroachments on privacy through mass surveillance greatly resemble the pollution crisis in that they can be understood as imposing an externality on the surveilled. This Article argues that this resemblance also suggests a solution: requiring those conducting mass surveillance in and through public spaces to disclose their plans publicly via an updated form of environmental impact statement, thus requiring an impact analysis and triggering a more informed public conversation about privacy. The Article first explains how mass surveillance is polluting public privacy and surveys the limited and inadequate doctrinal tools available to respond to mass surveillance technologies. Then, it provides a quick summary of the Privacy Impact Notices (PINs) proposal to make a case in principle for the utility and validity of PINs. Next, the Article explains how environmental law responded to a similar set of problems (taking the form of physical harms to the environment) with the National Environmental Policy Act of 1969 (NEPA), which imposed Environmental Impact Statement (EIS) requirements on environmentally sensitive projects. Given the limitations of the current federal privacy impact analysis requirement, the Article offers an initial sketch of what a PIN proposal would cover and its application to classic public spaces, as well as virtual spaces such as Facebook and Twitter. The Article also proposes that PINs apply to private and public data collection, including the NSA's surveillance of communications. By recasting privacy harms as a form of pollution and invoking a familiar (if not entirely uncontroversial) domestic regulatory solution either directly or by analogy, the PINs proposal seeks to present a domesticated form of regulation with the potential to ignite a regulatory dynamic by collecting information about the privacy costs of previously unregulated activities that should, in the end, lead to significant results without running afoul of potential U.S. constitutional limits that may constrain data retention and use policies. Finally, the Article addresses three counterarguments focusing on the First Amendment right to data collection, the inadequacy of EISs, and the supposed worthlessness of notice-based regimes.

    Dynamic privacy in a smart house environment

    A smart house can be regarded as a surveillance environment in which the person being observed carries out activities that range from intimate to more public. What can be observed depends on the activity, the person observing (e.g. a carer) and policy. In assisted living smart house environments, a single privacy policy, applied throughout, would be either too invasive for an occupant, or too restrictive for an observer, due to the conflicting goals of surveillance and private environments. Hence, we propose a dynamic method for altering the level of privacy in the environment based on the context, the situation within the environment, encompassing factors relevant to ensuring the occupant's safety and privacy. The context is mapped to an appropriate level of privacy, which is implemented by controlling access to data sources (e.g. video) using data hiding techniques. The aim of this work is to decrease the invasiveness of the technology, while retaining the purpose of the system.
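    The context-to-privacy-level mapping described in this abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the context features, level names, and rules are all hypothetical, standing in for whatever features and policy the system actually uses.

    ```python
    from enum import IntEnum

    class PrivacyLevel(IntEnum):
        """Illustrative privacy levels, ordered from least to most restrictive."""
        RAW = 0         # full video released (e.g. an emergency is detected)
        BLUR = 1        # identifying regions obscured by a data-hiding transform
        SILHOUETTE = 2  # only a silhouette of the occupant is rendered
        HIDDEN = 3      # no imagery released to the observer

    def privacy_level(context):
        """Map observed context features to a privacy level (hypothetical rules)."""
        if context.get("emergency"):              # safety overrides privacy
            return PrivacyLevel.RAW
        if context.get("activity") == "bathing":  # intimate activity: hide everything
            return PrivacyLevel.HIDDEN
        if context.get("observer") == "carer":    # trusted observer sees a silhouette
            return PrivacyLevel.SILHOUETTE
        return PrivacyLevel.BLUR                  # default for any other observer
    ```

    The key design point is that the level is recomputed as the context changes, so the same data source is rendered differently for different situations and observers, rather than under one static policy.
    
    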

    Drones and the Fourth Amendment: Redefining Expectations of Privacy

    Drones have gained notoriety as a weapon against foreign terrorist targets; yet, they have also recently made headlines as an instrument for domestic surveillance. With their sophisticated capabilities and continuously decreasing costs, it is not surprising that drones have attracted numerous consumers—most notably, law enforcement. Courts will likely soon have to decipher the limits on the government’s use of drones under the Fourth Amendment. But it is unclear where, or even whether, drones would fall under the current jurisprudence. Because of their diverse and sophisticated designs and capabilities, drones might be able to maneuver through the Fourth Amendment’s doctrinal loopholes. This Note advocates analyzing drones under an adapted approach to the reasonable-expectation-of-privacy test in Katz v. United States. Courts should focus more on the test’s oft-neglected first prong—whether a person exhibited a subjective expectation of privacy—and analyze what information falls within the scope of that expectation, excluding information knowingly exposed to the plain view of the public. This analysis also considers instances when, although a subjective expectation exists, it may be impossible or implausible to reasonably exhibit that expectation, a dilemma especially relevant to an analysis of drones. Courts that adopt the recommended analysis would have a coherent and comprehensible approach to factually dynamic cases challenging the constitutionality of drone surveillance. Until then, the constitutional uncertainties of these cases will likely linger.

    Privacy in Public and the contextual conditions of agency

    Current technology and surveillance practices make behaviors traceable to persons in unprecedented ways. This causes a loss of anonymity and of many privacy measures relied on in the past. These de facto privacy losses are by many seen as problematic for individual psychology, intimate relations and democratic practices such as free speech and free assembly. I share most of these concerns but propose that an even more fundamental problem might be that our very ability to act as autonomous and purposive agents relies on some degree of privacy, perhaps particularly as we act in public and semi-public spaces. I suggest that basic issues concerning action choices have been left largely unexplored, due to a series of problematic theoretical assumptions at the heart of privacy debates. One such assumption has to do with the influential conceptualization of privacy as pertaining to personal intimate facts belonging to a private sphere as opposed to a public sphere of public facts. As Helen Nissenbaum has pointed out, the notion of privacy in public sounds almost like an oxymoron given this traditional private-public dichotomy. I discuss her important attempt to defend privacy in public through her concept of ‘contextual integrity.’ Context is crucial, but Nissenbaum’s descriptive notion of existing norms seems to fall short of a solution. I here agree with Joel Reidenberg’s recent worries regarding any approach that relies on ‘reasonable expectations’. The problem is that in many current contexts we have no such expectations. Our contexts have already lost their integrity, so to speak. By way of a functional and more biologically inspired account, I analyze the relational and contextual dynamics of both privacy needs and harms. Through an understanding of action choice as situated and options and capabilities as relational, a more consequence-oriented notion of privacy begins to appear. I suggest that privacy needs, harms and protections are relational. Privacy might have less to do with seclusion and absolute transactional control than hitherto thought. It might instead hinge on capacities to limit the social consequences of our actions through knowing and shaping our perceptible agency and social contexts of action. To act with intent we generally need the ability to conceal during exposure. If this analysis is correct then relational privacy is an important condition for autonomous, purposive and responsible agency—particularly in public space. Overall, this chapter offers a first stab at a reconceptualization of our privacy needs as relational to contexts of action. In terms of ‘rights to privacy’ this means that we should expand our view from the regulation and protection of the information of individuals to questions of the kind of contexts we are creating. I am here particularly interested in what I call ‘unbounded contexts’, i.e. cases of context collapses, hidden audiences and even unknowable future agents.

    Bullying10K: A Large-Scale Neuromorphic Dataset towards Privacy-Preserving Bullying Recognition

    The prevalence of violence in daily life poses significant threats to individuals' physical and mental well-being. Using surveillance cameras in public spaces has proven effective in proactively deterring and preventing such incidents. However, concerns regarding privacy invasion have emerged due to their widespread deployment. To address the problem, we leverage Dynamic Vision Sensor (DVS) cameras to detect violent incidents while preserving privacy, since they capture pixel brightness variations instead of static imagery. We introduce the Bullying10K dataset, encompassing various actions, complex movements, and occlusions from real-life scenarios. It provides three benchmarks for evaluating different tasks: action recognition, temporal action localization, and pose estimation. With 10,000 event segments, totaling 12 billion events and 255 GB of data, Bullying10K contributes significantly by balancing violence detection with personal privacy preservation, and it also poses a new challenge for neuromorphic dataset research. It will serve as a valuable resource for training and developing privacy-protecting video systems. Bullying10K opens new possibilities for innovative approaches in these domains.
    Comment: Accepted at the 37th Conference on Neural Information Processing Systems (NeurIPS 2023) Track on Datasets and Benchmarks
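    The privacy property the abstract relies on can be seen in how DVS data is typically processed: each sensor event records only a change in brightness at a pixel, so accumulating events over a time window yields a motion map with no static appearance (faces, clothing, room contents are largely absent). The sketch below is a generic, assumed event representation, not the Bullying10K processing pipeline; the `(t, x, y, polarity)` tuple format and the function name are illustrative.

    ```python
    import numpy as np

    def accumulate_events(events, height, width):
        """Bin DVS events into a 2-channel event frame.

        events: iterable of (t, x, y, polarity) tuples, where polarity is
        +1 for a brightness increase and -1 for a decrease. Channel 0
        counts negative events, channel 1 counts positive events.
        """
        frame = np.zeros((2, height, width), dtype=np.int32)
        for t, x, y, p in events:
            frame[1 if p > 0 else 0, y, x] += 1
        return frame

    # A pixel that does not change brightness produces no events at all,
    # so static scene content never appears in the accumulated frame.
    events = [(0.001, 3, 2, +1), (0.002, 3, 2, -1), (0.003, 5, 4, +1)]
    frame = accumulate_events(events, 8, 8)
    ```

    Event frames like this (or richer voxel-grid variants) are a common input format for the action recognition and pose estimation benchmarks the dataset targets.
    
    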