After Over-Privileged Permissions: Using Technology and Design to Create Legal Compliance
Consumers in the mobile ecosystem can putatively protect their privacy through application permissions. However, this requires mobile device owners to understand permissions and their privacy implications. Yet few consumers appreciate the nature of permissions within the mobile ecosystem, often failing to notice the permissions that change when an app is updated. Even more concerning is the lack of understanding of the wide use of third-party libraries, most of which are installed with automatic permissions, that is, permissions that must be granted for the application to function properly. Unsurprisingly, many of these third-party permissions violate consumers' privacy expectations and thereby become "over-privileged" from the user's perspective. Consequently, a gap emerges between the privacy practices of the private sector and the expectations deemed appropriate by the public sector. Despite the growing attention given to privacy in the mobile ecosystem, legal literature has largely ignored the implications of mobile permissions. This article seeks to address this omission by analyzing the impacts of mobile permissions and the privacy harms experienced by consumers of mobile applications. The authors call for a review of industry self-regulation and of the overreliance upon simple notice and consent. Instead, the authors set out a plan for greater attention to socio-technical solutions, focusing on better privacy protections and technology embedded within the automatic permission-based application ecosystem.
How Dangerous Permissions are Described in Android Apps' Privacy Policies?
Google requires Android apps that handle users' personal data, such as photos and contact information, to post a privacy policy describing comprehensively how the app collects, uses, and shares users' information. Unfortunately, while knowing why an app wants to access specific user information is considered very useful, the permissions screen in Android does not provide this information. Accordingly, users have reported concerns about apps requesting permissions that seem unrelated to the apps' functions. To advance toward practical solutions that can assist users in protecting their privacy, a technique that automatically discovers the rationales for the dangerous permissions requested by Android apps, by extracting them from the apps' privacy policies, would be a great advantage. Before being able to do so, however, it is important to bridge the gap between the technical terms used in Android permissions and the natural-language terminology of privacy policies. In this paper, we record the terminology used in Android apps' privacy policies to describe the usage of dangerous permissions. Our semi-automated approach employs natural language processing (NLP) and information extraction (IE) techniques to map privacy-policy terminology to Android dangerous permissions. The resulting mapping links 128 information types to Android dangerous permissions and produces semantic information that can then be used to extract the rationales for dangerous permissions from apps' privacy policies.
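The kind of terminology mapping this abstract describes can be sketched as a lookup from policy phrases to permission identifiers. This is an illustrative sketch only, not the authors' actual 128-type mapping: the permission names are real Android constants, but the phrase lists and the matching function are hypothetical examples.

```python
# Hypothetical excerpt of a policy-terminology-to-permission mapping.
# The Android permission strings are real; the phrases are illustrative.
POLICY_TERM_TO_PERMISSION = {
    "precise location": "android.permission.ACCESS_FINE_LOCATION",
    "approximate location": "android.permission.ACCESS_COARSE_LOCATION",
    "contact list": "android.permission.READ_CONTACTS",
    "voice recordings": "android.permission.RECORD_AUDIO",
}

def permissions_mentioned(policy_text: str) -> set:
    """Return the dangerous permissions whose mapped terminology
    appears in the (lower-cased) privacy-policy text."""
    text = policy_text.lower()
    return {perm for phrase, perm in POLICY_TERM_TO_PERMISSION.items()
            if phrase in text}

print(permissions_mentioned(
    "We collect your precise location and contact list to personalize ads."))
```

A real pipeline would of course rely on NLP rather than exact substring matching, but the output shape is the same: a set of dangerous permissions for which the policy offers some rationale.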
Longitudinal Analysis of Android Ad Library Permissions
This paper investigates changes over time in the behavior of Android ad libraries. Taking a sample of 100,000 apps, we extract and classify the ad libraries. By considering the release dates of the applications that use a specific ad library version, we estimate the release date for the library, and thus build a chronological map of the permissions used by various ad libraries over time. We find that the use of most permissions has increased over the last several years, and that more libraries are able to use permissions that pose particular risks to user privacy and security.
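The dating heuristic described above can be sketched in a few lines: estimate the release date of each ad-library version as the earliest release date among the apps that bundle it, then order the versions to form a chronological map of their permissions. This is a minimal sketch over hypothetical sample data, not the paper's implementation.

```python
# Minimal sketch of the release-date estimation heuristic.
# Input rows are hypothetical: (library, version, app_release_date, permissions).
from datetime import date

apps = [
    ("adlib", "1.0", date(2011, 3, 1),  {"INTERNET"}),
    ("adlib", "1.0", date(2011, 6, 9),  {"INTERNET"}),
    ("adlib", "2.0", date(2012, 5, 2),  {"INTERNET", "ACCESS_FINE_LOCATION"}),
    ("adlib", "2.0", date(2012, 1, 15), {"INTERNET", "ACCESS_FINE_LOCATION"}),
]

# (library, version) -> (estimated release date, permissions requested)
timeline = {}
for lib, ver, released, perms in apps:
    key = (lib, ver)
    # A library version cannot be newer than the oldest app shipping it,
    # so keep the earliest app release date seen for that version.
    if key not in timeline or released < timeline[key][0]:
        timeline[key] = (released, perms)

# Chronological map: versions ordered by estimated release date.
for (lib, ver), (est, perms) in sorted(timeline.items(), key=lambda kv: kv[1][0]):
    print(lib, ver, est.isoformat(), sorted(perms))
```

Comparing consecutive entries of such a timeline is what allows the growth in privacy-sensitive permissions to be observed.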
Trust and Privacy Permissions for an Ambient World
Ambient intelligence (AmI) and ubiquitous computing allow us to consider a future where computation is embedded into our daily social lives. This vision raises important questions and heightens the need to understand how people will trust such systems while achieving and maintaining privacy. We have therefore recently conducted a wide-reaching study of people's attitudes to potential AmI scenarios with a view to eliciting their privacy concerns. This chapter describes recent research related to privacy and trust with regard to ambient technology. The method used in the study is described and the findings are discussed.
The privacy and control paradoxes in the context of smartphone apps
This research examines how various factors, such as the degree of e-privacy concerns and control over data access permissions, can influence a user's intention to install a smartphone app. We conducted two survey-based experiments with 441 participants. In each experiment, we manipulated the degree of control over the number and type of data access permissions granted to different fictional apps. In Study 1, participants were informed about the set of permissions the apps required. In Study 2, participants indicated which individual permissions they were willing to grant to the apps. In both experiments, we assessed the level of e-privacy concerns, perceived app importance, and the intention to install the apps. The results suggest that the type of app plays a central role in determining both the perceived benefit of installing the app and the level of e-privacy concerns. The intention to install an app is more strongly associated with perceived app importance than with e-privacy concerns, especially when app importance is high and users have explicit control over which specific data access permissions they want to grant. The implications of these results are discussed in terms of the psychological factors involved in the app-installation decision-making process and the importance of promoting data protection by design.
When private information settles the bill: money and privacy in Google's market for smartphone applications
We shed light on a money-for-privacy trade-off in the market for smartphone applications ("apps"). Developers offer their apps cheaper in return for greater access to personal information, and consumers choose between lower prices and more privacy. We provide evidence for this pattern using data on 300,000 mobile applications obtained from the Android Market in 2012 and 2014, augmented with information from Alexa.com and Amazon Mechanical Turk. Our findings show that both the supply and the demand side of the market consider an app's ability to collect private information, measured by its use of privacy-sensitive permissions: (1) cheaper apps use more privacy-sensitive permissions; (2) installation numbers are lower for apps with sensitive permissions; and (3) circumstantial factors, such as the reputation of app developers, mitigate the strength of this relationship. Our results emerge consistently across several robustness checks, including panel data analysis, the use of matched "twin" pairs of apps, and various alternative measures of privacy-sensitiveness.
- …