The Curious Case of the PDF Converter that Likes Mozart: Dissecting and Mitigating the Privacy Risk of Personal Cloud Apps
Third-party apps that work on top of personal cloud services such as Google
Drive and Dropbox require access to the user's data in order to provide some
functionality. Through detailed analysis of a hundred popular Google Drive apps
from Google's Chrome store, we discover that the existing permission model is
quite often misused: around two thirds of analyzed apps are over-privileged,
i.e., they access more data than is needed for them to function. In this work,
we analyze three different permission models that aim to discourage users from
installing over-privileged apps. In experiments with 210 real users, we
discover that the most successful permission model is our novel ensemble method
that we call Far-reaching Insights. Far-reaching Insights inform users
about the data-driven insights that apps can make about them (e.g., their
topics of interest, collaboration and activity patterns). Thus, they seek
to bridge the gap between what third parties can actually know about users and
users' perception of their privacy leakage. The efficacy of Far-reaching
Insights in bridging this gap is demonstrated by our results, as Far-reaching
Insights prove to be, on average, twice as effective as the current model in
discouraging users from installing over-privileged apps. In an effort to
promote general privacy awareness, we deploy a publicly available
privacy-oriented app store that uses Far-reaching Insights. Based on the knowledge
extracted from data of the store's users (over 115 gigabytes of Google Drive
data from 1440 users with 662 installed apps), we also delineate the ecosystem
for third-party cloud apps from the standpoint of developers and cloud
providers. Finally, we present several general recommendations that can guide
other future works in the area of privacy for the cloud
After Over-Privileged Permissions: Using Technology and Design to Create Legal Compliance
Consumers in the mobile ecosystem can putatively protect their privacy with the use of application permissions. However, this requires the mobile device owners to understand permissions and their privacy implications. Yet, few consumers appreciate the nature of permissions within the mobile ecosystem, often failing to appreciate the privacy permissions that are altered when updating an app. Even more concerning is the lack of understanding of the wide use of third-party libraries, most of which are installed with automatic permissions, that is, permissions that must be granted to allow the application to function appropriately. Unsurprisingly, many of these third-party permissions violate consumers' privacy expectations and thereby become "over-privileged" to the user. Consequently, an obscurity of privacy expectations between what is practiced by the private sector and what is deemed appropriate by the public sector is exhibited. Despite the growing attention given to privacy in the mobile ecosystem, legal literature has largely ignored the implications of mobile permissions. This article seeks to address this omission by analyzing the impacts of mobile permissions and the privacy harms experienced by consumers of mobile applications. The authors call for the review of industry self-regulation and the overreliance upon simple notice and consent. Instead, the authors set out a plan for greater attention to be paid to socio-technical solutions, focusing on better privacy protections and technology embedded within the automatic permission-based application ecosystem
Ethical guidelines for nudging in information security & privacy
There has recently been an upsurge of interest in the deployment of behavioural economics techniques in the information security and privacy domain. In this paper, we consider first the nature of one particular intervention, the nudge, and the way it exercises its influence. We contemplate the ethical ramifications of nudging, in its broadest sense, deriving general principles for ethical nudging from the literature. We extrapolate these principles to the deployment of nudging in information security and privacy. We explain how researchers can use these guidelines to ensure that they satisfy the ethical requirements during nudge trials in information security and privacy. Our guidelines also provide guidance to ethics review boards that are required to evaluate nudge-related research
Reengineering the user: Privacy concerns about personal data on smartphones.
Purpose: This paper aims to discuss the privacy and security concerns that have risen from the permissions model in the Android operating system, along with two shortcomings that have not been adequately addressed.
Design/methodology/approach: The impact of the applications' evolutionary increment of permission requests from both the user's and the developer's point of view is studied, and finally, a series of remedies against the erosion of users' privacy is proposed.
Findings: The results of this work indicate that, even though providing access to personal data of smartphone users is by definition neither problematic nor unlawful, today's smartphone operating systems do not provide an adequate level of protection for the user's personal data. However, there are several ideas that can significantly improve the situation and mitigate privacy concerns of users of smart devices.
Research limitations/implications: The proposed approach was evaluated through an examination of Android's permission model, although issues arise in other operating systems. The authors' future intention is to conduct a user study to measure the users' awareness and concepts surrounding privacy concerns to empirically investigate the above-mentioned suggestions.
Practical implications: The proposed suggestions in this paper, if adopted in practice, could significantly improve the situation and mitigate privacy concerns of users of smart devices.
Social implications: The recommendations proposed in this paper would strongly enhance the control of users over their personal data and improve their ability to distinguish legitimate apps from malware or grayware.
Originality/value: This paper emphasises two shortcomings of the permissions models of mobile operating systems which, in the authors' view, have not been adequately addressed to date, and proposes an inherent way for apps and other entities of the mobile computing ecosystem to commit to responsible and transparent practices on mobile users' privacy
Disposition toward privacy and information disclosure in the context of emerging health technologies.
Objective: We sought to present a model of privacy disposition and its development based on qualitative research on privacy considerations in the context of emerging health technologies.
Materials and methods: We spoke to 108 participants across 44 interviews and 9 focus groups to understand the range of ways in which individuals value (or do not value) control over their health information. Transcripts of interviews and focus groups were systematically coded and analyzed in ATLAS.ti for privacy considerations expressed by respondents.
Results: Three key findings from the qualitative data suggest a model of privacy disposition. First, participants described privacy-related behavior as both contextual and habitual. Second, there are motivations for and deterrents to sharing personal information that do not fit into the analytical categories of risks and benefits. Third, philosophies of privacy, often described as attitudes toward privacy, should be classified as a subtype of motivation or deterrent.
Discussion: This qualitative analysis suggests a simple but potentially powerful conceptual model of privacy disposition, or what makes a person more or less private. Components of privacy disposition are identifiable and measurable through self-report and therefore amenable to operationalization and further quantitative inquiry.
Conclusions: We propose this model as the basis for a psychometric instrument that can be used to identify types of privacy dispositions, with potential applications in research, clinical practice, system design, and policy
Privacy Leakage in Mobile Computing: Tools, Methods, and Characteristics
The number of smartphones, tablets, sensors, and connected wearable devices
is rapidly increasing. Today, in many parts of the globe, the number of
mobile computers has overtaken that of traditional personal computers.
This trend and the always-on nature of these devices have resulted in
increasing concerns over the intrusive nature of these devices and the privacy
risks that they impose on users or those associated with them. In this paper,
we survey the current state of the art on mobile computing research, focusing
on privacy risks and data leakage effects. We then discuss a number of methods,
recommendations, and ongoing research in limiting the privacy leakages and
associated risks posed by mobile computing
Human-Centered Design for Data-Sparse Tailored Privacy Information Provision
One way to reduce privacy risks for consumers when using the internet is to inform them better about the privacy practices they will encounter. Tailored privacy information provision could outperform the current practice, where information system providers do not much more than post unwieldy privacy notices. Paradoxically, this would require additional collection of data about consumers' privacy preferences, which constitute themselves sensitive information, so that sharing them may expose consumers to additional privacy risks. This chapter presents insights on how this paradoxical interplay can be outmaneuvered. We discuss different approaches for privacy preference elicitation, the data required, and how best to protect the sensitive data inevitably to be shared with technical privacy-preserving mechanisms. The key takeaway of this chapter is that we should put more thought into what we are building and using our systems for to allow for privacy through human-centered design instead of static, predefined solutions which do not meet consumer needs
Data, data, and even more data: Empowering users to make well-informed decisions about online privacy
- …