752 research outputs found

    Incorporation of privacy elements in space station design

    Privacy exists to the extent that individuals can control the degree of social contact that they have with one another. The opportunity to withdraw from other people serves a number of important psychological and social functions, and is in the interests of safety, high performance, and high quality of human life. Privacy requirements for Space Station crew members are reviewed, and architectural and other guidelines for helping astronauts achieve desired levels of privacy are suggested. In turn, four dimensions of privacy are discussed: the separation of activities by areas within the Space Station, controlling the extent to which astronauts have visual contact with one another, controlling the extent to which astronauts have auditory contact with one another, and odor control. Each section presents a statement of the problem, a review of general solutions, and specific recommendations. The report concludes with a brief consideration of how selection, training, and other procedures can also help Space Station occupants achieve satisfactory levels of seclusion.

    A framework for objective evaluation of privacy filters in video surveillance

    Extensive adoption of video surveillance, affecting many aspects of our daily lives, alarms the public about the increasing invasion of personal privacy. To address these concerns, many tools have been proposed for the protection of personal privacy in image and video. However, little is understood regarding the effectiveness of such tools, and especially their impact on the underlying surveillance tasks, which leads to a tradeoff between the preservation of privacy offered by these tools and the intelligibility of activities under video surveillance. In this paper, we investigate this privacy-intelligibility tradeoff by proposing an objective framework for the evaluation of privacy filters. We apply the proposed framework to a use case where the privacy of people is protected by obscuring faces, assuming an automated video surveillance system. We used several popular privacy protection filters, such as blurring, pixelization, and masking, and applied them at varying strengths to people's faces in several public datasets of video surveillance footage. The accuracy of a face detection algorithm was used as a measure of intelligibility (a face should be detected to perform a surveillance task), and the accuracy of a face recognition algorithm as a measure of privacy (a specific person should not be identified). Under these conditions, after application of an ideal privacy protection tool, an obfuscated face would be visible as a face but would not be correctly identified by the recognition algorithm. The experiments demonstrate that, in general, an increase in the strength of the privacy filters under consideration leads to an increase in privacy (i.e., a reduction in recognition accuracy) and to a decrease in intelligibility (i.e., a reduction in detection accuracy). Masking is also shown to be the most favorable filter across all tested datasets.
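    The three filter families named in the abstract can be sketched in a few lines. The implementations below are minimal, illustrative stand-ins written in plain NumPy, not the filters actually used in the paper; in the paper's framework each would be applied to a detected face region at varying strengths (block size, kernel size) before measuring detection and recognition accuracy.

```python
import numpy as np

def pixelate(img, block):
    """Pixelization: replace each block x block tile by its mean value."""
    h, w = img.shape[:2]
    out = img.copy().astype(float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]
            tile[...] = tile.mean(axis=(0, 1))
    return out.astype(img.dtype)

def mask(img, value=0):
    """Masking: replace the region entirely with a constant value."""
    return np.full_like(img, value)

def blur(img, k):
    """Box blur of a 2-D image via a separable k-tap averaging kernel."""
    out = img.astype(float)
    kernel = np.ones(k) / k
    for axis in (0, 1):
        out = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode='same'), axis, out)
    return out.astype(img.dtype)
```

    Increasing `block` or `k` corresponds to the "strength" axis of the privacy-intelligibility tradeoff described above.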

    Crowdsourcing Approach for Evaluation of Privacy Filters in Video Surveillance

    Extensive adoption of video surveillance, affecting many aspects of daily life, alarms the concerned public about the increasing invasion of personal privacy. To address these concerns, many tools have been proposed for the protection of personal privacy in image and video. However, little is understood regarding the effectiveness of such tools, and especially their impact on the underlying surveillance tasks. In this paper, we propose conducting a subjective evaluation using crowdsourcing to analyze the tradeoff between the preservation of privacy offered by these tools and the intelligibility of activities under video surveillance. As an example, the proposed method is used to compare several commonly employed privacy protection techniques, such as blurring, pixelization, and masking, applied to indoor surveillance video. A Facebook-based crowdsourcing application was developed specifically to gather the subjective evaluation data. Based on more than one hundred participants, the evaluation results demonstrate that the pixelization filter provides the best balance between privacy protection and intelligibility. The results obtained with the crowdsourcing application were compared with the results of previous work using more conventional subjective tests, showing that the two are highly correlated.

    MediaEval 2014 Visual Privacy Task: Geometrical Privacy Protection Tool

    This paper describes the EPFL privacy protection tool for the MediaEval 2014 Visual Privacy task. The goal of the task is to obscure faces, body silhouettes, and personal items of people in the provided surveillance clips to preserve their personal privacy. The EPFL privacy protection tool mainly relies on two privacy protection filters: a warping-based reversible filter, which obscures features with low visual detail (body silhouettes) by distorting them with randomized warping, and a morphing-based reversible filter, which obscures features with high visual detail (faces and personal items) by ‘replacing’ them with a graphical representation. The aim of this tool is to achieve an acceptable balance between privacy (how well privacy is protected) and intelligibility (how well the surveillance task can still be performed), as well as between privacy and pleasantness (how pleasant the protection looks). The results of three types of subjective evaluations, via crowdsourcing, practitioners, and stakeholders, provided by the organizers of the task, demonstrate that the EPFL privacy protection tool achieves a good overall balance between privacy, intelligibility, and pleasantness, while being secure and reversible at the same time.
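    The key property claimed above, obfuscation that is randomized yet reversible given a secret key, can be illustrated with a much simpler stand-in than the paper's warping filter: a key-seeded permutation of the pixels in a protected region. This is only a conceptual sketch of "reversible, keyed obfuscation", not EPFL's actual algorithm.

```python
import numpy as np

def scramble(region, key):
    """Reversibly obscure a region by permuting its pixels with a
    key-seeded random permutation (conceptual stand-in for the
    randomized warping filter; not the tool's real method)."""
    rng = np.random.default_rng(key)
    flat = region.reshape(-1)
    perm = rng.permutation(flat.size)
    return flat[perm].reshape(region.shape), perm

def unscramble(scrambled, perm):
    """Invert the permutation -- possible only for a holder of the key."""
    flat = scrambled.reshape(-1)
    out = np.empty_like(flat)
    out[perm] = flat
    return out.reshape(scrambled.shape)
```

    Reversibility is what distinguishes this class of filters from destructive ones like masking: an authorized party can recover the original evidence when needed.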

    Privacy Protection for Life-log System

    Tremendous advances in wearable computing and storage technologies enable us to record not just snapshots of an event but the whole human experience over a long period of time. Such a "life-log" system captures important events as they happen, rather than as an afterthought. Such a system has applications in many areas such as law enforcement, personal archives, police questioning, and medicine. Much of the existing effort focuses on the pattern recognition and information retrieval aspects of the system. On the other hand, the privacy issues raised by such an intrusive system have not received much attention from the research community. The objectives of this research project are two-fold: first, to construct a wearable life-log video system, and second, to provide a solution for protecting the identity of the subjects in the video while keeping the video useful. In this thesis work, we designed a portable wearable life-log system that implements audio distortion and face blocking in real time to protect the privacy of the subjects being recorded in the life-log video. For audio, our system automatically isolates the subject's speech and distorts it using a pitch-shifting algorithm to conceal the identity. For video, our system uses a real-time face detection, tracking, and blocking algorithm to obfuscate the faces of the subjects. Extensive experiments have been conducted on interview videos to demonstrate the ability of our system to protect the identity of the subject while maintaining the usability of the life-log video.
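    The simplest way to shift pitch is to resample the waveform, which is sketched below. Note that, unlike the phase-vocoder-style shifters typically used for real-time voice disguise (and presumably in the system above, whose exact algorithm is not given), naive resampling also changes the signal's duration; this is an illustration of the idea only.

```python
import numpy as np

def pitch_shift(signal, factor):
    """Naive pitch shift by resampling: factor > 1 raises the pitch
    (and shortens the signal), factor < 1 lowers it. Illustrative
    only -- practical shifters preserve duration."""
    n_out = int(len(signal) / factor)
    old_t = np.arange(len(signal))
    new_t = np.linspace(0, len(signal) - 1, n_out)
    return np.interp(new_t, old_t, signal)
```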

    UHD Video Dataset for Evaluation of Privacy

    Ultra High Definition (UHD) is one of the emerging immersive video technologies already available to the public, as even some smartphones are capable of capturing UHD video. The increasingly widespread availability of UHD-capable recording devices has important implications for privacy. This paper addresses the problem by proposing a publicly available UHD video dataset designed for the evaluation of privacy issues. The dataset depicts typical surveillance scenarios of people fighting, exchanging bags, walking, and stealing, in indoor and outdoor environments. The dataset also includes the data from a subjective assessment, which evaluated the impact of UHD on privacy compared to the currently common High Definition (HD) video and the declining Standard Definition (SD) video. The results of the assessment not only demonstrate that UHD is a significantly more privacy-intrusive technology than the HD and SD used today, but also quantify the degree of that intrusiveness.

    De-identification for privacy protection in multimedia content : A survey

    This document is the Accepted Manuscript version of the following article: Slobodan Ribaric, Aladdin Ariyaeeinia, and Nikola Pavesic, ‘De-identification for privacy protection in multimedia content: A survey’, Signal Processing: Image Communication, Vol. 47, pp. 131-151, September 2016, doi: https://doi.org/10.1016/j.image.2016.05.020. This manuscript version is distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.
    Privacy is one of the most important social and political issues in our information society, characterized by a growing range of enabling and supporting technologies and services. Amongst these are communications, multimedia, biometrics, big data, cloud computing, data mining, the internet, social networks, and audio-video surveillance. Each of these can potentially provide the means for privacy intrusion. De-identification is one of the main approaches to privacy protection in multimedia content (text, still images, audio and video sequences, and their combinations). It is a process for concealing or removing personal identifiers, or replacing them with surrogate personal identifiers, in order to prevent the disclosure and use of data for purposes unrelated to the purpose for which the information was originally obtained. Based on the proposed taxonomy, inspired by the Safe Harbour approach, the personal identifiers, i.e., the personally identifiable information, are classified as non-biometric, physiological and behavioural biometric, and soft biometric identifiers. In order to protect the privacy of an individual, all of the above identifiers have to be de-identified in multimedia content.
    This paper presents a review of the concepts of privacy and the linkage among privacy, privacy protection, and the methods and technologies designed specifically for privacy protection in multimedia content. The study provides an overview of de-identification approaches for non-biometric identifiers (text, hairstyle, dressing style, license plates), as well as for the physiological (face, fingerprint, iris, ear), behavioural (voice, gait, gesture) and soft-biometric (body silhouette, gender, age, race, tattoo) identifiers in multimedia documents.
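    The identifier taxonomy enumerated in the abstract maps naturally onto a small lookup structure. The groupings below follow the abstract's own lists; the function name is illustrative.

```python
# The survey's identifier taxonomy as a lookup table; category names
# and members follow the abstract above.
TAXONOMY = {
    "non-biometric": ["text", "hairstyle", "dressing style", "license plate"],
    "physiological biometric": ["face", "fingerprint", "iris", "ear"],
    "behavioural biometric": ["voice", "gait", "gesture"],
    "soft biometric": ["body silhouette", "gender", "age", "race", "tattoo"],
}

def category_of(identifier):
    """Return the taxonomy category of a personal identifier, or None."""
    for category, identifiers in TAXONOMY.items():
        if identifier in identifiers:
            return category
    return None
```

    A de-identification pipeline would use such a classification to dispatch each detected identifier to the appropriate obfuscation method.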

    Evaluation of privacy in high dynamic range video sequences

    The ability of high dynamic range (HDR) imaging to capture details in environments with high contrast has a significant impact on privacy in video surveillance. However, the extent to which HDR imaging affects privacy, compared to typical low dynamic range (LDR) imaging, is neither well studied nor well understood. Studying this question requires a suitable dataset of images and video sequences. Therefore, we have created PEViD-HDR, a publicly available HDR video dataset for privacy evaluation, which is an HDR extension of the existing Privacy Evaluation Video Dataset (PEViD). The PEViD-HDR dataset can support the evaluation of privacy protection tools, as well as demonstrate the importance of HDR imaging in video surveillance applications and its influence on the privacy-intelligibility trade-off. We conducted a preliminary subjective experiment demonstrating the usability of the created dataset for the evaluation of privacy issues in video. The results confirm that a tone-mapped HDR video contains more privacy-sensitive information and detail than a typical LDR video.
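    Tone mapping, the step that makes HDR content viewable in such an experiment, compresses a wide luminance range into a displayable one. A minimal global operator of the Reinhard L/(1+L) form is sketched below; the paper does not specify which operator it used, so this is an assumed, simplified example.

```python
import numpy as np

def tone_map(hdr_luminance):
    """Global Reinhard-style operator L / (1 + L): compresses HDR
    luminance into [0, 1), retaining detail in bright regions that
    plain clipping to a display range would discard."""
    L = np.asarray(hdr_luminance, dtype=float)
    return L / (1.0 + L)
```

    Because the mapping is monotonic and never saturates, shadow and highlight detail both survive, which is exactly why tone-mapped HDR footage can expose more privacy-sensitive detail than LDR footage.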

    The impact of privacy protection filters on gender recognition

    Deep learning-based algorithms have become increasingly efficient in recognition and detection tasks, especially when they are trained on large-scale datasets. This recent success has led to speculation that deep learning methods are comparable to, or even outperform, the human visual system in its ability to detect and recognize objects and their features. In this paper, we focus on the specific task of gender recognition in images that have been processed by privacy protection filters (e.g., blurring, masking, and pixelization) applied at different strengths. Assuming a privacy protection scenario, we compare the performance of state-of-the-art deep learning algorithms with a subjective evaluation obtained via crowdsourcing to understand how privacy protection filters affect both machine and human vision.
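    The core measurement in such a study is an accuracy-versus-filter-strength curve, computed identically for the classifier and for the crowd. The sketch below uses entirely hypothetical labels and predictions to show the bookkeeping, not the paper's data:

```python
# Hypothetical binary gender labels and classifier predictions at
# increasing privacy-filter strengths (0 = unfiltered). Illustrative only.
truth = [1, 0, 1, 1, 0, 0, 1, 0]
predictions = {
    0: [1, 0, 1, 1, 0, 0, 1, 0],  # unfiltered
    1: [1, 0, 1, 0, 0, 0, 1, 0],  # mild filtering
    2: [1, 1, 0, 0, 0, 1, 1, 0],  # strong filtering
}

def accuracy(pred, truth):
    """Fraction of predictions matching the ground-truth labels."""
    return sum(p == t for p, t in zip(pred, truth)) / len(truth)

# Accuracy as a function of filter strength
curve = {s: accuracy(p, truth) for s, p in predictions.items()}
```

    Comparing this curve for the algorithm against the same curve from crowdsourced answers shows whether a filter strength that defeats machine recognition also defeats human observers.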