The Curious Case of the PDF Converter that Likes Mozart: Dissecting and Mitigating the Privacy Risk of Personal Cloud Apps
Third-party apps that work on top of personal cloud services such as Google Drive and Dropbox require access to the user's data in order to provide their functionality. Through a detailed analysis of a hundred popular Google Drive apps from Google's Chrome Web Store, we discover that the existing permission model is quite often misused: around two-thirds of the analyzed apps are over-privileged, i.e., they access more data than they need to function. In this work, we analyze three different permission models that aim to discourage users from installing over-privileged apps. In experiments with 210 real users, we find that the most successful permission model is our novel ensemble method, which we call Far-reaching Insights. Far-reaching Insights inform users about the data-driven insights that apps can derive about them (e.g., their topics of interest, collaboration and activity patterns, etc.). They thus seek to bridge the gap between what third parties can actually learn about users and users' perception of their privacy leakage. Our results demonstrate the efficacy of Far-reaching Insights in bridging this gap: they prove to be, on average, twice as effective as the current model in discouraging users from installing over-privileged apps. To promote general privacy awareness, we deploy a publicly available privacy-oriented app store that uses Far-reaching Insights. Based on the knowledge extracted from the data of the store's users (over 115 gigabytes of Google Drive data from 1,440 users with 662 installed apps), we also delineate the ecosystem of third-party cloud apps from the standpoint of developers and cloud providers. Finally, we present several general recommendations that can guide future work in the area of privacy for the cloud.
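To make the notion of "over-privileged" concrete: a Drive app is over-privileged when it requests a broader OAuth scope than its functionality requires. A minimal Python sketch of that comparison follows; the scope URLs are real Google Drive scopes, but the app inventory and the needed-scope labels are hypothetical, and the paper itself publishes no such code.

# Sketch: flag Google Drive apps whose requested OAuth scopes exceed
# what their advertised functionality needs. The scope URLs are real
# Drive scopes; the app records below are invented for illustration.

# Broad scope granting access to all files in a user's Drive.
FULL_ACCESS = "https://www.googleapis.com/auth/drive"
# Narrow scope limited to files the app itself created or opened.
PER_FILE = "https://www.googleapis.com/auth/drive.file"

# Hypothetical inventory: app name -> (requested scopes, minimal scope needed).
apps = {
    "pdf-converter": ({FULL_ACCESS}, PER_FILE),    # converts one file at a time
    "drive-backup":  ({FULL_ACCESS}, FULL_ACCESS), # genuinely needs everything
}

def over_privileged(requested: set[str], needed: str) -> bool:
    """An app is over-privileged if it requests broader access than it needs."""
    return FULL_ACCESS in requested and needed != FULL_ACCESS

for name, (requested, needed) in apps.items():
    if over_privileged(requested, needed):
        print(f"{name}: requests full Drive access but only needs {needed}")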
Mobile Privacy and Business-to-Platform Dependencies: An Analysis of SEC Disclosures
This Article systematically examines the dependence of mobile apps on mobile platforms for the collection and use of personal information through an analysis of Securities and Exchange Commission (SEC) filings of mobile app companies. The Article uses these disclosures to find systematic evidence of how app business models are shaped by the governance of user data by mobile platforms, in order to reflect on the role of platforms in privacy regulation more generally. The analysis of SEC filings documented in the Article produces new and unique insights into the data practices and data-related aspects of the business models of popular mobile apps, and it shows the value of SEC filings for privacy law and policy research more generally. The discussion of SEC filings and privacy builds on regulatory developments in SEC disclosures and cybersecurity over the last decade. The Article also connects to recent regulatory developments in the U.S. and Europe, including the General Data Protection Regulation, the proposals for a new ePrivacy Regulation, and a Regulation on fairness in business-to-platform relations.
Ethical guidelines for nudging in information security & privacy
There has recently been an upsurge of interest in the deployment of behavioural economics techniques in the information security and privacy domain. In this paper, we first consider the nature of one particular intervention, the nudge, and the way it exercises its influence. We contemplate the ethical ramifications of nudging, in its broadest sense, deriving general principles for ethical nudging from the literature. We extrapolate these principles to the deployment of nudging in information security and privacy. We explain how researchers can use these guidelines to ensure that they satisfy the ethical requirements during nudge trials in information security and privacy. Our guidelines also provide guidance to ethics review boards that are required to evaluate nudge-related research.
A call for responsible innovation in mobile mental health: findings from a content analysis and ethical review of the depression app marketplace
Mobile mental health presents many ethical challenges in the wild. These ethical issues and associated values were explored through a content analysis and ethical review of the depression app marketplace. App search and data collection were performed in the Google Play Store (UK) and Apple iTunes (UK) between October and November 2018. Iterative data extraction and coding of ethical variables and values were conducted prior to the synthesis of issues and themes. The search found 353 unique apps for depression. Analysis uncovered a range of ethical issues, including: limited evidence of intervention validity, fidelity, and outcomes; insufficient safeguarding and duty of care; non-multisector development teams; lack of independent certification and regulation; lack of information and transparency for informed user choices; and concerns with privacy, confidentiality, and user permissions. These findings highlighted the presence and absence of ethical values in apps for depression, with most apps failing to reflect many key values. Our findings suggest a need for greater ethical value-sensitive design in mobile mental health. This is challenging given the field's multidisciplinarity and value conflicts. We encourage designers to adopt a responsible innovation approach to creating technologies that meet these ethical demands.
“And all the pieces matter...” Hybrid Testing Methods for Android App's Privacy Analysis
Smartphones have become inherent to the everyday life of billions of people worldwide, and they are used for activities such as gaming, interacting with our peers, or working. While extremely useful, smartphone apps also have drawbacks, as they can affect the security and privacy of users. Android devices hold a lot of personal data about users, including their social circles (e.g., contacts), usage patterns (e.g., app usage and visited websites), and their physical location. As in most software products, Android apps often include third-party code (Software Development Kits, or SDKs) to add functionality to the app without the need to develop it in-house. Android apps and the third-party components embedded in them are often interested in accessing such data, as the online ecosystem is dominated by data-driven business models and revenue streams such as advertising.
The research community has developed many methods and techniques for analyzing the privacy and security risks of mobile apps, mostly relying on two approaches: static code analysis and dynamic runtime analysis. Static analysis inspects the code and other resources of an app to detect potential app behaviors. While this makes static analysis easy to scale, it has drawbacks such as missing app behaviors when developers obfuscate the app's code to avoid scrutiny. Furthermore, since static analysis only reveals potential app behavior, its findings need to be confirmed, as it can also report false positives due to dead or legacy code. Dynamic analysis examines apps at runtime to provide actual evidence of their behavior. However, these techniques are harder to scale, as they need to be run on an instrumented device to collect runtime data. Similarly, the app needs to be stimulated with realistic inputs to exercise as many code paths as possible; while some automatic techniques exist for generating synthetic inputs, they have been shown to be insufficient.
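As a minimal illustration of the static side, the following Python sketch uses the androguard library's AnalyzeAPK helper to list the permissions an APK declares (assuming androguard is installed and an app.apk file is at hand; the pipelines used in this thesis are far more elaborate than this).

# Minimal static-analysis sketch (pip install androguard).
# Declared permissions are only *potential* behavior: a permission may
# never be exercised at runtime (dead or legacy code), which is exactly
# the false-positive risk described above.
from androguard.misc import AnalyzeAPK

a, d, dx = AnalyzeAPK("app.apk")  # APK object, bytecode, analysis object

print("Package:", a.get_package())
print("Declared permissions:")
for perm in a.get_permissions():
    print(" ", perm)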
In this thesis, we explore the benefits of combining static and dynamic analysis techniques so that they complement each other and their individual limitations are reduced. While most previous work has relied on these techniques in isolation, we combine their strengths in different and novel ways that allow us to further study privacy issues in the Android ecosystem. Namely, we demonstrate the potential of combining these complementary methods to study three interrelated issues:
⢠A regulatory analysis of parental control apps. We use a novel methodology that relies on
easy-to-scale static analysis techniques to pin-point potential privacy issues and violations of
current legislation by Android apps and their embedded SDKs. We rely on the results from our
static analysis to inform the way in which we manually exercise the apps, maximizing our ability
to obtain real evidence of these misbehaviors. We study 46 publicly available apps and find
instances of data collection and sharing without consent and insecure network transmissions
containing personal data. We also see that these apps fail to properly disclose these practices
in their privacy policy.
⢠A security analysis of the unauthorized access to permission-protected data without user consent.
We use a novel technique that combines the strengths of static and dynamic analysis, by
first comparing the data sent by applications at runtime with the permissions granted to each
app in order to find instances of potential unauthorized access to permission protected data.
Once we have discovered the apps that are accessing personal data without permission, we
statically analyze their code in order to discover covert- and side-channels used by apps and SDKs to circumvent the permission system. This methodology allows us to discover apps using
the MAC address as a surrogate for location data, two SDKs using the external storage as a
covert-channel to share unique identifiers and an app using picture metadata to gain unauthorized
access to location data.
⢠A novel SDK detection methodology that relies on obtaining signals observed both in the appâs
code and static resources and during its runtime behavior. Then, we rely on a tree structure
together with a confidence based system to accurately detect SDK presence without the need
of any a priory knowledge and with the ability to discern whether a given SDK is part of legacy
or dead code. We prove that this novel methodology can discover third-party SDKs with more
accuracy than state-of-the-art tools both on a set of purpose-built ground-truth apps and on a
dataset of 5k publicly available apps.
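To make the permission-versus-runtime comparison of the second case study concrete, here is a minimal Python sketch. The data-type-to-permission mapping uses real Android permission names, but the data model, function names, and example records are hypothetical simplifications of the actual instrumentation-based pipeline.

# Sketch: flag data types observed in runtime traffic for which the app
# holds no guarding permission. Such cases are candidates for covert- or
# side-channel abuse, to be confirmed by static analysis of the app's code.

# Which Android permission guards each observed data type (simplified).
GUARDING_PERMISSION = {
    "location":    "android.permission.ACCESS_FINE_LOCATION",
    "mac_address": "android.permission.ACCESS_FINE_LOCATION",  # location surrogate
    "imei":        "android.permission.READ_PHONE_STATE",
}

def unauthorized_accesses(granted: set[str], observed: set[str]) -> set[str]:
    """Data types sent at runtime without the matching permission."""
    return {d for d in observed & GUARDING_PERMISSION.keys()
            if GUARDING_PERMISSION[d] not in granted}

# Hypothetical app: no location permission, yet its traffic contains the
# MAC address, which can serve as a surrogate for location.
granted = {"android.permission.INTERNET"}
observed = {"mac_address"}
print(unauthorized_accesses(granted, observed))  # -> {'mac_address'}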
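Similarly, the tree-plus-confidence idea behind the third case study can be illustrated with a toy prefix tree over package names, where each node accumulates evidence from static and runtime signals; the class names, scoring, and threshold are invented for this example and are not the thesis's actual system.

# Toy sketch of SDK detection: a package-name prefix tree whose nodes
# accumulate signals (code, resources, runtime behavior). An SDK is only
# reported when enough evidence accumulates; static-only evidence with no
# runtime signal could instead indicate dead or legacy code.

class Node:
    def __init__(self) -> None:
        self.children: dict[str, "Node"] = {}
        self.signals = 0  # pieces of evidence seen for this prefix

class SdkTree:
    def __init__(self) -> None:
        self.root = Node()

    def add_signal(self, package: str) -> None:
        """Record one piece of evidence for a dotted package path."""
        node = self.root
        for part in package.split("."):
            node = node.children.setdefault(part, Node())
            node.signals += 1

    def detected(self, package: str, threshold: int = 2) -> bool:
        """Report an SDK as present only above a confidence threshold."""
        node = self.root
        for part in package.split("."):
            if part not in node.children:
                return False
            node = node.children[part]
        return node.signals >= threshold

tree = SdkTree()
tree.add_signal("com.example.ads")  # hypothetical SDK seen in the code
tree.add_signal("com.example.ads")  # seen again in runtime traffic
print(tree.detected("com.example.ads"))  # -> True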
With these three case studies, we highlight the benefits of combining static and dynamic analysis techniques for studying the privacy and security guarantees and risks of Android apps and third-party SDKs. Using these techniques in isolation would not have allowed us to investigate these privacy issues in depth: we would lack the ability to provide real evidence of potential breaches of legislation, to pinpoint the specific ways in which apps leverage covert and side channels to break Android's permission system, or to adapt to an ever-changing ecosystem of Android third-party companies.

The works presented in this thesis were partially funded within the framework of the following projects and grants:
⢠European Unionâs Horizon 2020 Innovation Action program (Grant Agreement No. 786741,
SMOOTH Project and Grant Agreement No. 101021377, TRUST AWARE Project).
⢠Spanish Government ODIO NºPID2019-111429RB-C21/PID2019-111429RBC22.
⢠The Spanish Data Protection Agency (AEPD)
⢠AppCensus Inc.This work has been supported by IMDEA Networks InstitutePrograma de Doctorado en IngenierĂa TelemĂĄtica por la Universidad Carlos III de MadridPresidente: Srdjan Matic.- Secretario: Guillermo SuĂĄrez-Tangil.- Vocal: Ben Stoc