
    Exploring lifelog sharing and privacy

    The emphasis on exhaustive passive capture of images using wearable cameras such as the Autographer, a practice commonly known as lifelogging, has brought to the foreground the challenge of preserving privacy, in addition to that of presenting the vast number of images in a meaningful way. In this paper, we present a user study to understand the importance of an array of factors that are likely to influence lifeloggers to share their lifelog images within their online circles. The findings are a step forward in the emerging area at the intersection of HCI and privacy, helping to explore design directions for privacy-mediating techniques in lifelogging applications.

    Die Rolle von Privatheit bei Self-Tracking und Lifelogging: Eine nutzerzentrierte Literaturreview

    Measuring one's own experience and recording one's immediate surroundings are currently in vogue. However, both so-called self-quantification and so-called lifelogging are fundamentally based on the collection and processing of data. This review article therefore examines the existing studies on users' views of privacy in lifelogging and self-quantification. In addition to the motivators for adopting these practices, it considers possible privacy concerns and problems, the presence of privacy awareness, and possible privacy-preserving actions. Building on this overview, subsequent studies can identify research gaps and contribute to a further understanding of self-tracking.

    PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features

    Eyewear devices, such as augmented reality displays, increasingly integrate eye tracking, but the first-person camera required to map a user's gaze to the visual scene can pose a significant threat to user and bystander privacy. We present PrivacEye, a method to detect privacy-sensitive everyday situations and automatically enable and disable the eye tracker's first-person camera using a mechanical shutter. To close the shutter in privacy-sensitive situations, the method uses a deep representation of the first-person video combined with rich features that encode users' eye movements. To open the shutter without visual input, PrivacEye detects changes in users' eye movements alone to gauge changes in the "privacy level" of the current situation. We evaluate our method on a first-person video dataset recorded in daily-life situations of 17 participants, annotated by themselves for privacy sensitivity, and show that our method is effective in preserving privacy in this challenging setting.
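    The abstract describes a two-state control loop: while the camera runs, a classifier on combined scene and gaze features decides whether to close the shutter; once closed, a gaze-only detector decides when the situation has likely changed and the shutter may reopen. The following is a minimal sketch of such a loop, assuming hypothetical components (`scene_gaze_classifier`, `gaze_change_detector`) that stand in for the paper's actual models and are not taken from the source.

    ```python
    # Minimal sketch of a PrivacEye-style shutter control loop.
    # The classifier and detector objects are hypothetical placeholders,
    # not the authors' published models.
    from dataclasses import dataclass
    from typing import Any


    @dataclass
    class ShutterController:
        scene_gaze_classifier: Any   # flags privacy-sensitive situations from scene + gaze features
        gaze_change_detector: Any    # detects a change in "privacy level" from gaze features alone
        shutter_open: bool = True

        def step(self, frame: Any, gaze_features: Any) -> bool:
            if self.shutter_open:
                # With the camera running, combine the scene image and eye
                # movement features to decide whether to close the shutter.
                if self.scene_gaze_classifier.is_sensitive(frame, gaze_features):
                    self.shutter_open = False  # close the mechanical shutter
            else:
                # Without visual input, rely on eye movements alone to decide
                # whether the situation has likely become non-sensitive again.
                if self.gaze_change_detector.situation_changed(gaze_features):
                    self.shutter_open = True   # reopen the shutter
            return self.shutter_open
    ```

    In this reading, the key design choice is the asymmetry between the two transitions: closing the shutter may use the full scene image, but reopening it must be decided from eye movement features only, since no visual input is available while the shutter is closed.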