
    PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features

    Eyewear devices, such as augmented reality displays, increasingly integrate eye tracking, but the first-person camera required to map a user's gaze to the visual scene can pose a significant threat to user and bystander privacy. We present PrivacEye, a method to detect privacy-sensitive everyday situations and automatically enable and disable the eye tracker's first-person camera using a mechanical shutter. To close the shutter in privacy-sensitive situations, the method uses a deep representation of the first-person video combined with rich features that encode users' eye movements. To open the shutter without visual input, PrivacEye detects changes in users' eye movements alone to gauge changes in the "privacy level" of the current situation. We evaluate our method on a first-person video dataset recorded in daily life situations of 17 participants, annotated by themselves for privacy sensitivity, and show that our method is effective in preserving privacy in this challenging setting. Comment: 10 pages, 6 figures, supplementary material.
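
    As a rough illustration of the two-phase control loop described in the abstract, the following Python sketch shows one way such a system could be structured. This is not the authors' implementation: the feature extractors, weights, and thresholds below are hypothetical placeholders. While the shutter is open, scene and gaze features are fused into a privacy score; once the shutter is closed, only the change in eye-movement features is available to decide when to reopen it.

    import numpy as np

    def scene_embedding(frame):
        # Placeholder for a deep representation of the egocentric frame (e.g., CNN features).
        return np.resize(frame.astype(float), 128) / 255.0

    def gaze_features(gaze_window):
        # Placeholder eye-movement statistics over a sliding window (e.g., fixation/saccade stats).
        return np.array([gaze_window.mean(), gaze_window.std(), np.abs(np.diff(gaze_window)).mean()])

    def privacy_score(scene_vec, gaze_vec, w_scene, w_gaze, bias=0.0):
        # Linear fusion of scene and gaze features; higher score = more privacy-sensitive.
        return float(scene_vec @ w_scene + gaze_vec @ w_gaze + bias)

    def control_shutter(frames, gaze_windows, close_thresh=0.5, reopen_thresh=1.0):
        # Yields "open"/"closed" states for a synchronized stream of frames and gaze windows.
        rng = np.random.default_rng(0)
        w_scene = rng.normal(size=128) * 0.01   # stand-ins for trained classifier weights
        w_gaze = rng.normal(size=3) * 0.1
        shutter_open = True
        last_gaze = gaze_features(gaze_windows[0])
        for frame, gaze in zip(frames, gaze_windows):
            g = gaze_features(gaze)
            if shutter_open:
                # Camera available: fuse scene and gaze features to detect sensitive situations.
                if privacy_score(scene_embedding(frame), g, w_scene, w_gaze) > close_thresh:
                    shutter_open = False
            else:
                # Shutter closed: only eye movements are available to gauge a change of situation.
                if np.linalg.norm(g - last_gaze) > reopen_thresh:
                    shutter_open = True
            last_gaze = g
            yield "open" if shutter_open else "closed"

    In the paper's setting, the random stand-in weights would be replaced by a classifier trained on annotated recordings, and the two thresholds tuned for the deployment.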

    Facing the Future: Protecting Human Rights in Policy Strategies for Facial Recognition Technology in Law Enforcement


    VIMES: A Wearable Memory Assistance System for Automatic Information Retrieval

    The advancement of artificial intelligence and wearable computing is driving radical innovation in cognitive applications. In this work, we propose VIMES, an augmented reality-based memory assistance system that helps recall declarative memory, such as whom the user meets and what they chat about. Through a collaborative method with 20 participants, we design VIMES, a system that runs on smartglasses, takes first-person audio and video as input, and extracts personal profiles and event information to display on the embedded display or a smartphone. We perform an extensive evaluation with 50 participants to show the effectiveness of VIMES for memory recall. VIMES outperforms (90% memory accuracy) other traditional methods such as self-recall (34%) while offering the best memory experience (Vividness, Coherence, and Visual Perspective all score over 4/5). The user study results show that most participants find VIMES useful (3.75/5) and easy to use (3.46/5). Peer reviewed.
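
    As a rough sketch of the pipeline the abstract describes (this is not the VIMES implementation; the recognizer, transcriber, and profile store below are hypothetical placeholders), the following Python outline maps first-person audio and video onto a personal profile plus current conversation topics for display on the glasses or a paired phone:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PersonProfile:
        name: str
        last_met: str       # e.g., date of the previous encounter
        last_topics: list   # what was discussed then

    @dataclass
    class MemoryCue:
        profile: Optional[PersonProfile]
        current_topics: list

    def recognize_person(video_frame) -> Optional[str]:
        # Placeholder: face recognition on the egocentric frame, returning a person ID or None.
        return None  # a real system would match against enrolled face embeddings

    def transcribe_and_summarize(audio_chunk) -> list:
        # Placeholder: speech-to-text plus keyword extraction for the ongoing conversation.
        return []    # a real system would run ASR and topic extraction here

    def build_memory_cue(video_frame, audio_chunk, profile_store: dict) -> MemoryCue:
        # Combine who is seen with what is being said into a cue for the display.
        person_id = recognize_person(video_frame)
        profile = profile_store.get(person_id) if person_id else None
        return MemoryCue(profile=profile, current_topics=transcribe_and_summarize(audio_chunk))

    def render_overlay(cue: MemoryCue) -> str:
        # Format the cue as a short overlay string for the embedded display or smartphone app.
        who = f"{cue.profile.name} (last met {cue.profile.last_met})" if cue.profile else "Unknown person"
        return who + " | topics: " + ", ".join(cue.current_topics or ["-"])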

    Examining power dynamics and user privacy in smart technology use among Jordanian households

    Smart technologies continue to raise concerns about privacy protection of both households and bystanders who may be targets of incidental or intentional monitoring. Domestic workers are an example of bystanders in smart homes who experience complex power dynamics and can be subjected to exploitative practices that are further facilitated through smart technology. Such power dynamics are rooted in complex social norms and customs, religious beliefs, and economics. While past research has focused on Western contexts to explore how smart technologies and power dynamics affect the privacy of households and smart home bystanders, there is limited understanding of the impact of such factors within non-Western contexts. This paper presents findings from 30 interviews with smart device users and bystanders (households and domestic workers), policy makers, and human and civil rights activists to explore smart home power dynamics in the Muslim Arab Middle Eastern (MAME) context of Jordan. We uncover how asymmetric socio-economic power dynamics between households and domestic workers influence smart technology privacy concerns, practices, and rights perceptions. Drawing on the findings of this study, we present recommendations for interventions to balance asymmetric power dynamics, improve bystanders’ agency and privacy protection, and prevent technology exploitation.

    Understanding factors behind IoT privacy -- A user's perspective on RF sensors

    While IoT sensors in physical spaces have provided utility and comfort in our lives, their instrumentation in private and personal spaces has led to growing concerns regarding privacy. The existing notion behind IoT privacy is that sensors whose data can easily be understood and interpreted by humans (such as cameras) are more privacy-invasive than sensors that are not human-understandable, such as RF (radio-frequency) sensors. However, given recent advancements in machine learning, we can not only make sensitive inferences on RF data but also translate between modalities. Thus, the existing notions of privacy for IoT sensors need to be revisited. In this paper, our goal is to understand what factors affect the privacy notions of a non-expert user (someone who is not well-versed in privacy concepts). To this end, we conduct an online study of 162 participants from the USA to find out what factors affect the privacy perception of a user regarding an RF-based device or sensor. Our findings show that a user's perception of privacy depends not only on the data collected by the sensor but also on the inferences that can be made from that data, familiarity with the device and its form factor, and the control a user has over the device design and its data policies. When the data collected by the sensor is not human-interpretable, it is the inferences that can be made from the data, and not the data itself, that users care about when making informed decisions regarding device privacy.

    A Design Space for Effective Privacy Notices.

    Notifying users about a system's data practices is supposed to enable users to make informed privacy decisions. Yet, current notice and choice mechanisms, such as privacy policies, are often ineffective because they are neither usable nor useful, and are therefore ignored by users. Constrained interfaces on mobile devices, wearables, and smart home devices connected in an Internet of Things exacerbate the issue. Much research has studied usability issues of privacy notices and many proposals for more usable privacy notices exist. Yet, there is little guidance for designers and developers on the design aspects that can impact the effectiveness of privacy notices. In this paper, we make multiple contributions to remedy this issue. We survey the existing literature on privacy notices and identify challenges, requirements, and best practices for privacy notice design. Further, we map out the design space for privacy notices by identifying relevant dimensions. This provides a taxonomy and consistent terminology of notice approaches to foster understanding and reasoning about notice options available in the context of specific systems. Our systemization of knowledge and the developed design space can help designers, developers, and researchers identify notice and choice requirements and develop a comprehensive notice concept for their system that addresses the needs of different audiences and considers the system's limitations and opportunities for providing notice.

    Privacy-enhancing technology and everyday augmented reality: understanding bystanders' varying needs for awareness and consent

    Fundamental to Augmented Reality (AR) headsets is their capacity to visually and aurally sense the world around them, necessary to drive the positional tracking that makes rendering 3D spatial content possible. This requisite sensing also opens the door for more advanced AR-driven activities, such as augmented perception, volumetric capture, and biometric identification, activities with the potential to expose bystanders to significant privacy risks. Existing Privacy-Enhancing Technologies (PETs) often safeguard against these risks at a low level, e.g., by instituting camera access controls. However, we argue that such PETs are incompatible with the need for always-on sensing given AR headsets' intended everyday use. Through an online survey (N=102), we examine bystanders' awareness of, and concerns regarding, potentially privacy-infringing AR activities; the extent to which bystanders' consent should be sought; and the level of granularity of information necessary to provide awareness of AR activities to bystanders. Our findings suggest that PETs should take into account the AR activity type and relationship to bystanders, selectively facilitating awareness and consent. In this way, we can ensure bystanders feel their privacy is respected by everyday AR headsets, and avoid unnecessary rejection of these powerful devices by society.