On the security of mobile sensors
PhD Thesis
The age of sensor technology is upon us. Sensor-rich mobile devices
are ubiquitous. Smart-phones, tablets, and wearables are increasingly
equipped with sensors such as GPS, accelerometer, Near Field Communication
(NFC), and ambient sensors. Data provided by such sensors, combined
with the fast-growing computational capabilities on mobile platforms,
offer richer and more personalised apps. However, these sensors
introduce new security challenges to the users, and make sensor management
more complicated.
In this PhD thesis, we contribute to the field of mobile sensor security by
investigating a wide spectrum of open problems in this field covering attacks
and defences, standardisation and industrial approaches, and human
dimensions. We study the problems in detail and propose solutions.
First, we propose “Tap-Tap and Pay” (TTP), a sensor-based protocol to
prevent the Mafia attack in NFC payment. The Mafia attack is a special
type of Man-In-The-Middle attack which charges the user for something
more expensive than what she intends to pay by relaying transactions
to a remote payment terminal. In TTP, a user initiates the payment by
physically tapping her mobile phone against the reader. We observe that
this tapping causes transient vibrations at both devices which are measurable
by the embedded accelerometers. Our observations indicate that
these sensor measurements are closely correlated within the same tapping,
and different if obtained from different tapping events. By comparing the
similarity between the two measurements, the bank can distinguish
Mafia fraud from a legitimate NFC transaction. The experimental
results and the user feedback suggest the practical feasibility of TTP. As
compared with previous sensor-based solutions, ours is the only one that
works even when the attacker and the user are in nearby locations or share
similar ambient environments.
Second, we demonstrate an in-app attack based on a real-world problem
in contactless payment known as the card collision or card clash. A card
collision happens when more than one card (or NFC-enabled device) is
presented to the payment terminal’s field, and the terminal does not know
which card to choose. By performing experiments, we observe that the
implementation of contactless terminals in practice matches neither EMV
nor ISO standards (the two primary standards for smart card payment)
on card collision. Based on this inconsistency, we propose “NFC Payment
Spy”, a malicious app that tracks the user’s contactless payment transactions.
This app, running on a smart phone, simulates a card which
requests the payment information (amount, time, etc.) from the terminal.
When the phone and the card are both presented to a contactless
terminal (given that many people use mobile case wallets to travel light
and keep wallet essentials close to hand), our app can effectively win
the race against the card. This attack is the first privacy attack on
contactless payments based on the problem of card collision. By showing
the feasibility of this attack, we raise awareness of privacy and security
issues in contactless payment protocols and implementation, specifically
in the presence of new technologies for payment such as mobile platforms.
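The race condition the app exploits can be pictured with a toy model: whichever "card" answers the terminal's poll first receives the transaction. Everything below (names, latencies, the selection rule) is illustrative and is not taken from the EMV or ISO standards.

```python
# Toy model of the card-collision race described above: when two "cards"
# (a real contactless card and a phone emulating one) are in the field,
# the terminal here simply talks to whichever answers its poll first.
# All names and timings are invented for illustration.

def first_responder(cards):
    """Return the card with the lowest response latency (ms)."""
    return min(cards, key=lambda c: c["latency_ms"])

# A phone emulating a card can be tuned to answer faster than a passive
# card, so in this toy model the malicious app wins the race and receives
# the terminal's payment request (amount, time, ...) instead of the card.
real_card = {"name": "bank card", "latency_ms": 5.0}
spy_app   = {"name": "NFC Payment Spy", "latency_ms": 1.2}

winner = first_responder([real_card, spy_app])
```

Real anticollision behaviour depends on the terminal implementation, which is exactly the inconsistency with the standards that the attack relies on.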
Third, we show that, apart from attacking mobile devices by having access
to the sensors through native apps, we can also perform sensor-based
attacks via mobile browsers. We examine multiple browsers on Android
and iOS platforms and study their policies in granting permissions to
JavaScript code with respect to access to motion and orientation sensor
data. Based on our observations, we identify multiple vulnerabilities,
and propose “TouchSignatures” and “PINLogger.js”, two novel attacks in
which malicious JavaScript code listens to such sensor data measurements.
We demonstrate that, despite the much lower sampling rate (compared to
a native app), a remote attacker is able to learn sensitive user information
such as physical activities, phone call timing, touch actions (tap, scroll,
hold, zoom), and PINs based on these sensor data. This is the first report
of such a JavaScript-based attack. We disclosed these vulnerabilities to
the community; major mobile browser vendors classified the problem
as high-risk and fixed it accordingly.
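A minimal sketch of why even low-rate motion data leaks information: spikes in acceleration magnitude reveal when the screen is touched, which is a first step toward inferring taps and PINs. This is an illustration only, not the TouchSignatures or PINLogger.js code; the threshold and the synthetic trace are invented.

```python
import math

# Illustrative event detector: deviations of the acceleration magnitude
# from rest (~9.81 m/s^2) mark moments when the user touches the device.

def magnitude(sample):
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def touch_events(samples, baseline=9.81, threshold=0.5):
    """Return indices where |acceleration| deviates from rest by > threshold."""
    return [i for i, s in enumerate(samples)
            if abs(magnitude(s) - baseline) > threshold]

# Synthetic trace: device at rest except for two tap-like spikes.
trace = [(0.0, 0.0, 9.81)] * 10
trace[3] = (0.3, 0.2, 11.0)   # simulated tap
trace[7] = (0.1, 0.4, 10.9)   # simulated tap

events = touch_events(trace)
```

Once tap timings are recovered, classifying *which* key was hit requires a trained model over the sensor traces, which is where the attacks above go further.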
Finally, we investigate human dimensions in the problem of sensor management.
Although different types of attacks via sensors have been known for many years, the problem of data leakage caused by sensors has remained
unsolved. While working with W3C and browser vendors to fix
the identified problem, we came to appreciate the complexity of this problem
in practice and the challenge of balancing security, usability, and functionality.
We believe a major reason for this is that users are not fully
aware of these sensors and the associated risks to their privacy and security.
Therefore, we study user understanding of mobile sensors, specifically
their risk perceptions. This is the only research to date that studies risk
perceptions for a comprehensive list of mobile sensors (25 in total). We
interview participants from a range of backgrounds and provide them
with several self-report questionnaires. The results indicate that
people in general do not have a good understanding of the complexities
of these sensors, and hence find it hard to make informed security
judgements about them. We discuss how this observation, along with other
factors, renders many academic and industry solutions ineffective. This
makes the security and privacy issues of mobile sensors and other
sensor-enabled technologies an important topic for further investigation.
FutureWare: Designing a Middleware for Anticipatory Mobile Computing
Ubiquitous computing is moving from context-awareness to context-prediction. In order to build truly anticipatory systems,
developers have to deal with many challenges, from multimodal sensing to modeling context from sensed data, and, when necessary,
coordinating multiple predictive models across devices. Novel expressive programming interfaces and paradigms are needed for this
new class of mobile and ubiquitous applications.
In this paper we present FutureWare, a middleware for seamless development of mobile applications that rely on context prediction.
FutureWare exposes an expressive API that lifts the burden of mobile sensing, individual and group behavior modeling, and future-context
querying from the application developer. We implement FutureWare as an Android library, and through scenario-based testing and a
demo app we show that it represents an efficient way of supporting anticipatory applications, reducing the necessary coding effort by
two orders of magnitude.
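The kind of interface such a middleware might expose can be sketched as follows. The class and method names are invented for illustration and are not FutureWare's actual API, and the trivial frequency-count "model" merely stands in for its individual and group behaviour models.

```python
# Hypothetical sketch of a context-prediction interface: applications feed
# in context observations and query for the likely future context, without
# touching the underlying sensing or modeling machinery.

class ContextPredictor:
    """Predicts the next context label from a history of observations."""

    def __init__(self):
        self.history = []

    def observe(self, context):
        """Record one sensed context label (e.g., 'home', 'office')."""
        self.history.append(context)

    def predict_next(self):
        """Trivial frequency-based stand-in for a real behaviour model."""
        if not self.history:
            return None
        return max(set(self.history), key=self.history.count)

predictor = ContextPredictor()
for ctx in ["home", "commute", "office", "office", "office", "commute"]:
    predictor.observe(ctx)

likely = predictor.predict_next()
```

The point of such an API is that the application code above stays this short regardless of how sophisticated the underlying predictive models are.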
Smittestopp – A Case Study on Digital Contact Tracing
This open access book describes Smittestopp, the first Norwegian system for digital contact tracing of Covid-19 infections, developed in March and early April 2020. The system was deployed after five weeks of development and was active for a little more than two months, until a drop in infection levels in Norway and privacy concerns led to shutting it down. The intention of this book is twofold. First, it reports on the design choices made in the development phase. Second, as one of the only systems in the world that collected population data into a central database and was used for an entire population, Smittestopp offers experience of how the design choices impacted the system's operation. By sharing lessons learned and the challenges faced during the development and deployment of the technology, we hope that this book can be a valuable guide for experts from different domains, such as big data collection and analysis, application development, deployment to a national population, and digital tracing.
UNCOVERING AND MITIGATING UNSAFE PROGRAM INTEGRATIONS IN ANDROID
Android’s design philosophy encourages the integration of resources and functionalities from multiple parties, even with different levels of trust. Such program integrations, on one hand, connect every party in the Android ecosystem tightly on a single device. On the other hand, they can also pose severe security problems if the security design of the underlying integration schemes is not well thought out. This dissertation systematically evaluates the security design of three integration schemes on Android: framework modules, framework proxies, and third-party code embedding. With the security risks identified in each scheme, it concludes that program integrations on Android are unsafe. Furthermore, new frameworks have been designed and implemented to detect and mitigate the threats. The evaluation results on the prototypes demonstrate their effectiveness.
On the Security and Privacy Challenges in Android-based Environments
In the last decade, we have faced the rise of mobile devices as a fundamental tool in our everyday life.
Currently, there are over 6 billion smartphones, and 72% of them are Android devices.
The functionalities of smartphones are enriched by mobile apps, through which users can perform operations that in the past were possible only on desktop or laptop computers.
Moreover, users heavily rely on them to store even their most privacy-sensitive information.
However, apps often do not satisfy all minimum security requirements and can be targeted to indirectly attack other devices managed by or connected to them (e.g., IoT nodes) that may perform sensitive operations such as running health checks, controlling a smart car, or opening a smart lock.
This thesis discusses some research activities carried out to enhance the security and privacy of mobile apps by i) proposing novel techniques to detect and mitigate security vulnerabilities and privacy issues, and ii) defining techniques devoted to the security evaluation of apps interacting with complex environments (e.g., mobile-IoT-Cloud).
In the first part of this thesis, I focused on the security and privacy of mobile apps. Due to the widespread adoption of mobile apps, it is relatively straightforward for researchers or users to quickly retrieve an app that matches their tastes, as Google provides a reliable search engine. However, it is almost impossible to select apps according to a security footprint (e.g., all apps that enforce SSL pinning).
To overcome this limitation, I present APPregator, a platform that allows users to select apps according to a specific security footprint.
This tool aims to implement state-of-the-art static and dynamic analysis techniques for mobile apps and provide security researchers and analysts with a tool that makes it possible to search for mobile applications under specific functional or security requirements.
Regarding the security status of apps, I studied a particular class of mobile apps: hybrid apps, composed of web technologies and native technologies (i.e., Java or Kotlin). In this context, I studied a vulnerability that affects only hybrid apps: Frame Confusion.
Despite being discovered several years ago, this vulnerability is still very widespread.
I proposed a methodology implemented in FCDroid that exploits static and dynamic analysis techniques to detect and trigger the vulnerability automatically.
The results of an extensive analysis carried out through FCDroid on a set of the most downloaded apps from the Google Play Store prove that 6.63% (i.e., 1637/24675) of hybrid apps are potentially vulnerable to Frame Confusion.
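A toy static check in the spirit of the detection step (FCDroid itself combines static and dynamic analysis and triggers the vulnerability automatically): flag sources where a WebView both enables JavaScript and exposes a native bridge via addJavascriptInterface, since content loaded in a child frame can then reach the bridge too. The heuristic below is a deliberate simplification for illustration.

```python
import re

# Simplified pattern-based check over decompiled Java/Kotlin source text.
# addJavascriptInterface and setJavaScriptEnabled are real Android WebView
# APIs; a WebView combining both is a candidate for Frame Confusion when
# untrusted content can end up in a child frame.
BRIDGE = re.compile(r"addJavascriptInterface\s*\(")
JS_ON = re.compile(r"setJavaScriptEnabled\s*\(\s*true\s*\)")

def potentially_frame_confused(source: str) -> bool:
    """Flag sources that enable JS and expose a JS-to-native bridge."""
    return bool(BRIDGE.search(source)) and bool(JS_ON.search(source))

sample = """
webView.getSettings().setJavaScriptEnabled(true);
webView.addJavascriptInterface(new Bridge(), "Android");
"""

flagged = potentially_frame_confused(sample)
```

A real detector must also establish that frames can load remote content and must confirm exploitability dynamically, which is why a purely static grep like this over-approximates.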
A side effect of the analysis carried out through APPregator was the suggestion that very few apps have a privacy policy, despite the strict rules that the Google Play Store imposes in its Google Play Privacy Guidelines.
To empirically verify if that was the case, I proposed a methodology based on the combination of static analysis, dynamic analysis, and machine learning techniques.
The proposed methodology verifies whether each app contains a privacy policy compliant with the Google Play Privacy Guidelines, and if the app accesses privacy-sensitive information only upon the acceptance of the policy by the user.
I then implemented the methodology in a tool, 3PDroid, and evaluated a number of recent and most downloaded Android apps in the Google Play Store.
Experimental results suggest that over 95% of apps access sensitive user privacy information, but only a negligible subset of them (~1%) fully complies with the Google Play Privacy Guidelines.
Furthermore, the obtained results also suggest that user privacy can be put at risk by mobile apps that keep collecting a plethora of information about user and device behavior by relying on third-party analytics libraries.
However, collecting and using such data raises several privacy concerns, mainly because the end user - i.e., the actual data owner - is out of the loop in this collection process. The existing privacy-enhanced solutions that have emerged in recent years follow an “all or nothing” approach, leaving the user the sole option to accept or completely deny access to privacy-related data.
To overcome the current state-of-the-art limitations, I proposed a data anonymization methodology, called MobHide, that provides a compromise between the usefulness and privacy of the data collected and gives the user complete control over the sharing process.
For evaluating the methodology, I implemented it in a prototype called HideDroid and tested it on 4500 most-used Android apps of the Google Play Store between November 2020 and January 2021.
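The "middle ground" idea behind MobHide can be sketched as per-field generalisation under a user-chosen privacy level. The field names, levels, and rules below are invented for illustration and are not MobHide's actual anonymisation policy.

```python
# Toy per-field anonymiser: rather than all-or-nothing access, each
# analytics field is transformed according to a privacy level chosen by
# the user, who stays in control of what leaves the device.

def anonymise(event, level):
    """level 0 = share as-is, 1 = generalise, 2 = suppress."""
    out = {}
    for field, value in event.items():
        if level == 0:
            out[field] = value
        elif level == 1 and field == "age":
            out[field] = (value // 10) * 10        # 34 -> 30s bucket
        elif level == 1:
            out[field] = "generalised"             # coarse placeholder
        else:
            out[field] = None                      # suppressed
    return out

event = {"age": 34, "city": "Genoa"}
shared = anonymise(event, level=1)
```

The design point is the trade-off the thesis describes: generalised data keeps some analytics value while limiting what the third-party library learns.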
In the second part of this thesis, I extended privacy and security considerations outside the boundary of the single mobile device.
In particular, I focused on two scenarios.
The first is composed of an IoT device and a mobile app that integrate closely to perform specific actions.
From a security standpoint, this leads to a novel and unprecedented attack surface.
To deal with such threats, applying state-of-the-art security analysis techniques to each paradigm separately can be insufficient.
I claimed that novel analysis methodologies able to systematically analyze the ecosystem as a whole must be put forward.
To this aim, I introduced the idea of APPIoTTe, a novel approach to the security testing of Mobile-IoT hybrid ecosystems, as well as some notes on its implementation working on Android (Mobile) and Android Things (IoT) applications.
The second scenario is composed of an IoT device widespread in the Smart Home environment: the Smart Speaker.
Smart speakers are used to retrieve information, interact with other devices, and command various IoT nodes. To this aim, smart speakers typically take advantage of cloud architectures: vocal commands of the user are sampled, sent through the Internet to be processed, and transmitted back for local execution, e.g., to activate an IoT device.
Unfortunately, even if privacy and security are enforced through state-of-the-art encryption mechanisms, the features of the encrypted traffic, such as the throughput, the size of protocol data units, or the IP addresses, can leak critical information about the users' habits.
From this perspective, I showcase this kind of risk by exploiting machine learning techniques to develop black-box models that classify traffic and automatically implement privacy-leaking attacks.
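The traffic-analysis risk can be illustrated with a toy classifier over side-channel features of encrypted traffic. The feature values, command labels, and nearest-centroid model below are made up for illustration; the thesis trains proper machine-learning models on real captures.

```python
import math

# Toy nearest-centroid classifier over two side-channel features of
# encrypted traffic: (mean PDU size in bytes, packets per second).
# Centroid values are invented for illustration.
CENTROIDS = {
    "weather query": (420.0, 30.0),
    "music stream":  (1200.0, 90.0),
}

def classify(features):
    """Assign the observed traffic shape to the nearest known centroid."""
    def dist(label):
        return math.dist(features, CENTROIDS[label])
    return min(CENTROIDS, key=dist)

# An eavesdropper never decrypts anything, yet the traffic shape alone
# suggests which kind of voice command was issued.
guess = classify((1150.0, 85.0))
```

This is the core of the leak: encryption protects payloads, not throughput, packet sizes, or endpoints, and those features alone separate user behaviours.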
Android security: analysis and applications
The Android mobile system is home to millions of apps that offer a wide range of functionalities. Users rely on Android apps in various facets of daily life, including critical, e.g., medical, settings. Generally, users trust that apps perform their stated purpose safely and accurately. However, despite the platform’s efforts to maintain a safe environment, apps routinely manage to evade scrutiny. This dissertation analyzes Android app behavior and has revealed several weaknesses: lapses in device authentication schemes, deceptive practices such as apps covering their traces, as well as behavioral and descriptive inaccuracies in medical apps. Examining a large corpus of applications has revealed that suspicious behavior is often the result of lax oversight and can occur without an explicit intent to harm users. Nevertheless, flawed app behavior is present, and is especially problematic in apps that perform critical tasks. Additionally, manufacturers’ and app developers’ claims often do not mirror actual functionalities, e.g., as we reveal in our study of LG’s Knock Code authentication scheme, and as evidenced by the removal of Google Play medical apps due to overstated functionality claims. This dissertation makes the following contributions: (1) quantifying the security of LG’s Knock Code authentication method, (2) defining deceptive practices of self-hiding app behavior found in popular apps, (3) verifying abuses of device administrator features, (4) characterizing the medical app landscape found on Google Play, (5) detailing the claimed behaviors and conditions of medical apps using ICD codes and app descriptions, (6) verifying errors in medical score calculator app implementations, and (7) discerning how medical apps should be regulated within the jurisdiction of regulatory frameworks based on their behavior and the data acquired from users.
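The keyspace side of contribution (1) admits a back-of-the-envelope illustration: Knock Code patterns are tap sequences over a 2x2 grid. The tap-length bounds below are assumptions for illustration and may not match LG's actual limits; the dissertation's analysis also covers user choice, not just raw keyspace size.

```python
# Each tap in a Knock-Code-style scheme hits one of the 4 quadrants of a
# 2x2 grid, so there are 4**k distinct codes of length k. The total
# keyspace for a length range is the sum over allowed lengths.

GRID_POSITIONS = 4

def keyspace(min_len, max_len):
    """Number of tap sequences with lengths in [min_len, max_len]."""
    return sum(GRID_POSITIONS ** k for k in range(min_len, max_len + 1))

# Illustrative bounds: codes of 6 to 8 taps.
total = keyspace(6, 8)
```

Even a generous length range yields a keyspace far smaller than a short alphanumeric password, which is why user-chosen pattern distributions matter so much in the quantification.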