Shortest Path Computation with No Information Leakage
Shortest path computation is one of the most common queries in location-based
services (LBSs). Although particularly useful, such queries raise serious
privacy concerns. Exposing to a (potentially untrusted) LBS the client's
position and her destination may reveal personal information, such as social
habits, health condition, shopping preferences, lifestyle choices, etc. The
only existing method for privacy-preserving shortest path computation follows
the obfuscation paradigm; it prevents the LBS from inferring the source and
destination of the query with a probability higher than a threshold. This
implies, however, that the LBS still deduces some information (albeit not
exact) about the client's location and her destination. In this paper we aim at
strong privacy, where the adversary learns nothing about the shortest path
query. We achieve this via established private information retrieval
techniques, which we treat as black-box building blocks. Experiments on real,
large-scale road networks assess the practicality of our schemes.
Comment: VLDB201
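The abstract treats private information retrieval (PIR) as a black-box building block. As a hedged illustration of what such a block provides (this is not the paper's scheme), the classic two-server information-theoretic PIR lets a client fetch record i from a replicated database without either non-colluding server learning i:

```python
# Toy two-server XOR-based PIR: each server sees only a uniformly random
# bit vector, yet the XOR of the two answers reconstructs record i.
# Database contents and sizes here are illustrative assumptions.
import secrets

def pir_query(n, i):
    """Client: build one query per (non-colluding) server."""
    q1 = [secrets.randbelow(2) for _ in range(n)]  # uniformly random bits
    q2 = q1.copy()
    q2[i] ^= 1                                     # flip only position i
    return q1, q2

def pir_answer(db, q):
    """Server: XOR together the records selected by the query vector."""
    ans = 0
    for rec, bit in zip(db, q):
        if bit:
            ans ^= rec
    return ans

def pir_reconstruct(a1, a2):
    """Client: the two answers differ exactly by record i."""
    return a1 ^ a2

db = [7, 13, 42, 99]                               # toy database
q1, q2 = pir_query(len(db), 2)
record = pir_reconstruct(pir_answer(db, q1), pir_answer(db, q2))
print(record)  # 42
```

Each query vector in isolation is uniformly random, so neither server learns anything about i; this "learns nothing" guarantee is the property the paper's shortest-path schemes build on.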
An Empirical Study on Android for Saving Non-shared Data on Public Storage
With millions of apps available for download from official and third-party
markets, Android has become one of the most popular mobile platforms today.
These apps help people in all kinds of ways and thus have access to a great
deal of user data, which generally falls into three categories: sensitive data,
data to be shared with other apps, and non-sensitive data not to be shared with
others.
For the first two types of data, Android provides sound storage models: an
app's private sensitive data are saved to its private folder, which can be
accessed only by the app itself, and data to be shared are saved to public
storage (either the external SD card or the emulated SD card area on internal
flash memory). But for the last type, i.e., an app's non-sensitive and
non-shared data, Android's current storage model has a serious problem: it
essentially encourages an app to save such data to shared public storage that
can be accessed by other apps. At first glance this seems harmless, since those
data are non-sensitive after all, but it implicitly assumes that app developers
can correctly identify all sensitive data and prevent all possible information
leakage from private-but-non-sensitive data.
In this paper, we demonstrate that this assumption is invalid through a
thorough survey of information leaks in apps that follow Android's recommended
storage model for non-sensitive data. Our study shows that highly sensitive
information belonging to billions of users can easily be harvested by
exploiting this problematic storage model. Although our empirical study is
based on a limited set of apps, the identified problems are not isolated or
accidental bugs of the apps investigated; rather, they are rooted in the
vulnerable storage model recommended by Android. To mitigate the threat, we
also propose a defense framework.
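The flaw described above can be illustrated with a hypothetical sketch (all paths, file names, and contents here are made up, and a temporary directory stands in for a device's shared public storage such as /sdcard/): any co-installed app with storage access can read whatever another app leaves there.

```python
# Hedged illustration of the shared-storage threat model, not code from the
# paper: a "victim app" writes a supposedly non-shared file to public
# storage, and an "attacker app" reads it without any cooperation.
import os
import tempfile

# Victim app: writes its cache to a shared directory (stand-in for /sdcard/).
public_storage = tempfile.mkdtemp()
victim_file = os.path.join(public_storage, "SomeApp_cache.txt")
with open(victim_file, "w") as f:
    f.write("user_id=12345;last_query=clinic near me")

# Attacker app: simply walks the shared directory and collects every file.
leaked = []
for root, _dirs, files in os.walk(public_storage):
    for name in files:
        with open(os.path.join(root, name)) as f:
            leaked.append(f.read())

print(leaked[0])  # the victim's "non-sensitive" data, fully exposed
```

The point of the abstract is precisely that such files, judged non-sensitive by developers, routinely turn out to carry sensitive information.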
Inferring Person-to-person Proximity Using WiFi Signals
Today's societies are enveloped in an ever-growing telecommunication
infrastructure. This infrastructure offers important opportunities for sensing
and recording a multitude of human behaviors. Human mobility patterns are a
prominent example of such a behavior which has been studied based on cell phone
towers, Bluetooth beacons, and WiFi networks as proxies for location. However,
while mobility is an important aspect of human behavior, understanding complex
social systems requires studying not only the movement of individuals, but also
their interactions. Sensing social interactions on a large scale is a technical
challenge and many commonly used approaches---including RFID badges or
Bluetooth scanning---offer only limited scalability. Here we show that it is
possible, in a scalable and robust way, to accurately infer person-to-person
physical proximity from the lists of WiFi access points measured by smartphones
carried by the two individuals. Based on a longitudinal dataset of
approximately 800 participants with ground-truth interactions collected over a
year, we show that our model performs better than the current state-of-the-art.
Our results demonstrate the value of WiFi signals in social sensing as well as
potential threats to privacy that they imply.
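The core intuition can be sketched in a few lines (a hedged simplification: the paper's model is more sophisticated, and the BSSIDs and threshold below are illustrative assumptions): two phones that observe largely overlapping sets of WiFi access points are likely physically proximate.

```python
# Toy proximity inference from WiFi scans: compare the sets of access-point
# BSSIDs two phones see and flag high overlap as likely co-location.
def jaccard(scan_a, scan_b):
    """Jaccard similarity of the two phones' visible access-point sets."""
    a, b = set(scan_a), set(scan_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def likely_proximate(scan_a, scan_b, threshold=0.5):
    """Threshold is an illustrative assumption, not the paper's parameter."""
    return jaccard(scan_a, scan_b) >= threshold

phone1 = ["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02", "aa:bb:cc:00:00:03"]
phone2 = ["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02", "aa:bb:cc:00:00:09"]
phone3 = ["ff:ee:dd:00:00:07"]

print(likely_proximate(phone1, phone2))  # True  (overlap 2/4 = 0.5)
print(likely_proximate(phone1, phone3))  # False (no shared access points)
```

This also makes the privacy threat concrete: anyone who can collect WiFi scan lists from two devices can estimate whether their owners meet, without any location fix at all.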
User-centric Privacy Engineering for the Internet of Things
User privacy concerns are widely regarded as a key obstacle to the success of
modern smart cyber-physical systems. In this paper, we analyse, through an
example, some of the requirements that future data collection architectures of
these systems should implement to provide effective privacy protection for
users. Then, we give an example of how these requirements can be implemented in
a smart home scenario. Our example architecture allows the user to balance the
privacy risks with the potential benefits and take a practical decision
determining the extent of the sharing. Based on this example architecture, we
identify a number of challenges that must be addressed by future data
processing systems in order to achieve effective privacy management for smart
cyber-physical systems.
Comment: 12 Page