16,110 research outputs found

    Money Walks: A Human-Centric Study on the Economics of Personal Mobile Data

    Full text link
    In the context of a myriad of mobile apps that collect personally identifiable information (PII) and a prospective marketplace for personal data, we investigate a user-centric monetary valuation of mobile PII. During a 6-week user study in a living lab deployment with 60 participants, we collected their daily valuations of 4 categories of mobile PII (communication, e.g., phone calls made/received; applications, e.g., time spent on different apps; location; and media, e.g., photos taken) at three levels of complexity (individual data points, aggregated statistics, and processed data, i.e., meaningful interpretations of the data). To obtain honest valuations, we employ a reverse second-price auction mechanism. Our findings show that the most sensitive and valued category of personal information is location. We report statistically significant associations between actual mobile usage, personal dispositions, and bidding behavior. Finally, we outline key implications for the design of mobile services and future markets of personal data. Comment: 15 pages, 2 figures. To appear in the ACM International Joint Conference on Pervasive and Ubiquitous Computing (Ubicomp 2014).
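
    A reverse second-price auction makes truthful bidding the dominant strategy: the lowest asker sells, but is paid the second-lowest ask, so no participant gains by misreporting their valuation. Below is a minimal sketch of that mechanism; the function name and the example bids are illustrative, not the study's code.

```python
def reverse_second_price_auction(bids):
    """Pick the winning seller in a reverse second-price auction.

    `bids` maps participant id -> asking price for their data.
    The lowest bidder wins the sale but is paid the second-lowest
    ask, so reporting one's true valuation is the dominant strategy.
    """
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1])
    winner, _ = ranked[0]       # lowest ask wins the sale
    _, payment = ranked[1]      # ...but is paid the second-lowest ask
    return winner, payment

# Example: Alice asks 0.50, Bob 0.80, Carol 1.20.
# Alice wins and is paid Bob's asking price of 0.80.
winner, payment = reverse_second_price_auction(
    {"alice": 0.50, "bob": 0.80, "carol": 1.20})
print(winner, payment)  # alice 0.8
```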

    Privacy Vulnerabilities in the Practices of Repairing Broken Digital Artifacts in Bangladesh

    Get PDF
    This paper presents a study on the privacy concerns associated with the practice of repairing broken digital objects in Bangladesh. Historically, the repair of old or broken technologies has received less attention in ICTD scholarship than design, development, or use. As a result, the potential privacy risks associated with repair practices have remained mostly unaddressed. This paper describes our three-month-long ethnographic study at ten major repair sites in Dhaka, Bangladesh. We show a variety of ways in which the privacy of an individual’s personal data may be compromised during the repair process. We also examine people’s perceptions around privacy in repair and its connections with their broader social and cultural values. Finally, we discuss the challenges and opportunities for future research to strengthen the repair ecosystem in developing countries. Taken together, our findings contribute to the growing discourse around post-use cycles of technology.

    Understanding face and eye visibility in front-facing cameras of smartphones used in the wild

    Get PDF
    Commodity mobile devices are now equipped with high-resolution front-facing cameras, allowing applications in biometrics (e.g., FaceID in the iPhone X), facial expression analysis, or gaze interaction. However, it is unknown how often users hold devices in a way that allows capturing their face or eyes, and how this impacts detection accuracy. We collected 25,726 in-the-wild photos taken from the front-facing camera of smartphones, as well as associated application usage logs. We found that the full face is visible about 29% of the time, and that in most cases the face is only partially visible. Furthermore, we identified an influence of users' current activity; for example, when watching videos, the eyes but not the entire face are visible 75% of the time in our dataset. We found that a state-of-the-art face detection algorithm performs poorly on photos taken with front-facing cameras. We discuss how these findings impact mobile applications that leverage face and eye detection, and derive practical implications to address the limitations of the state of the art.
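
    The face/eyes-only/none distinction the study draws can be approximated with off-the-shelf detectors. The sketch below uses OpenCV's stock Haar cascades, which are not the state-of-the-art detector evaluated in the paper; it simply labels each front-camera photo by what is visible.

```python
import cv2

# Stock OpenCV Haar cascades (an illustrative stand-in, not the
# detector the paper evaluated); paths come bundled with opencv-python.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def visibility_label(image_path):
    """Label a front-camera photo as 'face', 'eyes-only', or 'none'."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    if len(face_cascade.detectMultiScale(gray, 1.1, 5)) > 0:
        return "face"       # full (or near-full) face detected
    if len(eye_cascade.detectMultiScale(gray, 1.1, 5)) > 0:
        return "eyes-only"  # eyes visible but no full face
    return "none"
```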

    The Curious Case of the PDF Converter that Likes Mozart: Dissecting and Mitigating the Privacy Risk of Personal Cloud Apps

    Get PDF
    Third-party apps that work on top of personal cloud services such as Google Drive and Dropbox require access to the user's data in order to provide some functionality. Through detailed analysis of a hundred popular Google Drive apps from Google's Chrome store, we discover that the existing permission model is quite often misused: around two thirds of analyzed apps are over-privileged, i.e., they access more data than they need to function. In this work, we analyze three different permission models that aim to discourage users from installing over-privileged apps. In experiments with 210 real users, we discover that the most successful permission model is our novel ensemble method, which we call Far-reaching Insights. Far-reaching Insights inform users about the data-driven insights that apps can make about them (e.g., their topics of interest, collaboration and activity patterns, etc.). They thus seek to bridge the gap between what third parties can actually know about users and users' perception of their privacy leakage. The efficacy of Far-reaching Insights in bridging this gap is demonstrated by our results: Far-reaching Insights prove to be, on average, twice as effective as the current model in discouraging users from installing over-privileged apps. To promote general privacy awareness, we deploy a publicly available, privacy-oriented app store that uses Far-reaching Insights. Based on the knowledge extracted from the data of the store's users (over 115 gigabytes of Google Drive data from 1,440 users with 662 installed apps), we also delineate the ecosystem for third-party cloud apps from the standpoint of developers and cloud providers. Finally, we present several general recommendations that can guide other future works in the area of privacy for the cloud.
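
    In this setting, "over-privileged" reduces to a set comparison between the OAuth scopes an app requests and the scopes its declared functionality actually needs. A minimal sketch of that check follows; the two scope URLs are real Google Drive API scopes, but the example app and its "needed" set are made up.

```python
# Real Google Drive OAuth scopes (full access vs. per-file access).
FULL_ACCESS = "https://www.googleapis.com/auth/drive"
PER_FILE = "https://www.googleapis.com/auth/drive.file"

def is_over_privileged(requested_scopes, needed_scopes):
    """True if the app requests any scope beyond what it needs."""
    return bool(set(requested_scopes) - set(needed_scopes))

# Hypothetical PDF converter: it only needs access to the files the
# user opens with it, yet it requests full-Drive access.
pdf_converter = {"requested": [FULL_ACCESS], "needed": [PER_FILE]}
print(is_over_privileged(pdf_converter["requested"],
                         pdf_converter["needed"]))  # True
```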

    ConXsense - Automated Context Classification for Context-Aware Access Control

    Full text link
    We present ConXsense, the first framework for context-aware access control on mobile devices based on context classification. Previous context-aware access control systems often require users to laboriously specify detailed policies, or they rely on pre-defined policies that do not adequately reflect users' true preferences. We present the design and implementation of a context-aware framework that uses a probabilistic approach to overcome these deficiencies. The framework utilizes context sensing and machine learning to automatically classify contexts according to their security- and privacy-related properties. We apply the framework to two important smartphone-related use cases: protection against device misuse using a dynamic device lock, and protection against sensory malware. We ground our analysis in a sociological survey examining users' perceptions and concerns related to contextual smartphone security, and we analyze the effectiveness of our approach with real-world context data. We also demonstrate the integration of our framework with the FlaskDroid architecture for fine-grained access control enforcement on the Android platform. Comment: Recipient of the Best Paper Award.
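
    A minimal sketch of the probabilistic classification idea, assuming pre-extracted per-context features (e.g., counts of familiar Bluetooth devices and Wi-Fi access points, a GPS-based place-familiarity score). The feature set, labels, and classifier choice here are illustrative and may differ from the paper's.

```python
from sklearn.ensemble import RandomForestClassifier

# Toy training data: [familiar BT devices, familiar Wi-Fi APs,
# place familiarity in 0..1] -> context label.
X_train = [[5, 3, 1.0],   # many familiar devices, familiar place
           [0, 0, 0.1],   # strangers around, unfamiliar place
           [4, 2, 0.9],
           [1, 0, 0.2]]
y_train = ["safe", "unsafe", "safe", "unsafe"]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# The probability estimate lets a policy act only when the classifier
# is confident, e.g., engage the dynamic device lock in likely-unsafe
# contexts.
print(clf.predict_proba([[0, 1, 0.3]]))
```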

    "If You Can't Beat them, Join them": A Usability Approach to Interdependent Privacy in Cloud Apps

    Get PDF
    Cloud storage services like Dropbox and Google Drive have growing ecosystems of third-party apps that are designed to work with users' cloud files. Such apps often request full access to users' files, including files shared with collaborators. Hence, whenever a user grants access to a new vendor, she inflicts a privacy loss on herself and on her collaborators. Based on analyzing a real dataset of 183 Google Drive users and 131 third-party apps, we discover that collaborators inflict a privacy loss that is at least 39% higher than what users themselves cause. We take a step toward minimizing this loss by introducing the concept of History-based decisions. Simply put, users are informed at decision time about the vendors that have previously been granted access to their data. Thus, they can reduce their privacy loss by not installing apps from new vendors whenever possible. Next, we realize this concept by introducing a new privacy indicator that can be integrated within the cloud apps' authorization interface. Via a web experiment with 141 participants recruited from CrowdFlower, we show that our privacy indicator can significantly increase the user's likelihood of choosing the app that minimizes her privacy loss. Finally, we explore the network effect of History-based decisions via a simulation on top of large collaboration networks. We demonstrate that adopting such a decision-making process can reduce the growth of users' privacy loss by 70% in a Google Drive-based network and by 40% in an author collaboration network. This holds even though we assume neither that users cooperate nor that they exhibit altruistic behavior. To our knowledge, our work is the first to provide quantifiable evidence of the privacy risk that collaborators pose in cloud apps. We are also the first to mitigate this problem via a usable privacy approach. Comment: Authors' extended version of the paper published at CODASPY 2017.
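
    The History-based decision rule can be stated simply: among functionally equivalent apps, prefer one whose vendor already holds access, so no new vendor gains access to the user's (and her collaborators') files. The sketch below illustrates that rule; the vendor and app names are made up.

```python
def pick_app(candidate_apps, granted_vendors):
    """Among equivalent apps, minimize incremental privacy loss."""
    for app in candidate_apps:
        if app["vendor"] in granted_vendors:
            return app          # no new vendor learns anything new
    return candidate_apps[0]    # otherwise fall back to the first app

granted = {"AcmeSoft"}          # vendors granted access in the past
apps = [{"name": "PDF-Magic", "vendor": "NewCorp"},
        {"name": "PDF-Ace", "vendor": "AcmeSoft"}]
print(pick_app(apps, granted)["name"])  # PDF-Ace
```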

    Evaluating 'Prefer not to say' Around Sensitive Disclosures

    Get PDF
    As people's offline and online lives become increasingly entwined, the sensitivity of personal information disclosed online is increasing. Disclosures often occur through structured disclosure fields (e.g., drop-down lists). Prior research suggests these fields may limit privacy, with non-disclosing users being presumed to be hiding undesirable information. We investigated this around HIV status disclosure in online dating apps used by men who have sex with men. Our online study asked participants (N=183) to rate profiles where HIV status was either disclosed or undisclosed. We tested three designs for displaying undisclosed fields. Visibility of undisclosed fields had a significant effect on the way profiles were rated, and other profile information (e.g., ethnicity) could affect the inferences that develop around undisclosed information. Our research highlights the complexities of designing for non-disclosure and questions the voluntary nature of these fields. Further work is outlined to ensure that disclosure control is appropriately implemented around sensitive online disclosures.

    After Over-Privileged Permissions: Using Technology and Design to Create Legal Compliance

    Get PDF
    Consumers in the mobile ecosystem can putatively protect their privacy through application permissions. However, this requires mobile device owners to understand permissions and their privacy implications. Yet few consumers appreciate the nature of permissions within the mobile ecosystem, often failing to notice that privacy permissions can change when an app is updated. Even more concerning is the lack of understanding of the wide use of third-party libraries, most of which are installed with automatic permissions, that is, permissions that must be granted for the application to function appropriately. Unsurprisingly, many of these third-party permissions violate consumers’ privacy expectations and thereby become “over-privileged” with respect to the user. The result is a mismatch between the privacy practices of the private sector and what the public sector deems appropriate. Despite the growing attention given to privacy in the mobile ecosystem, legal literature has largely ignored the implications of mobile permissions. This article seeks to address that omission by analyzing the impacts of mobile permissions and the privacy harms experienced by consumers of mobile applications. The authors call for a review of industry self-regulation and the overreliance upon simple notice and consent. Instead, the authors set out a plan for greater attention to socio-technical solutions, focusing on better privacy protections and technology embedded within the automatic permission-based application ecosystem.

    Encouraging Privacy-Aware Smartphone App Installation: Finding out what the Technically-Adept Do

    Get PDF
    Smartphone apps can harvest very personal details from the phone with ease, which is a particular privacy concern. Unthinking installation of untrustworthy apps constitutes risky behaviour. This could be due to poor awareness or a lack of know-how: knowledge of how to go about protecting one's privacy. It seems that smartphone owners proceed with installation, ignoring any misgivings they might have, and thereby irretrievably sacrifice their privacy.