Security and privacy aspects of mobile applications for post-surgical care
Mobile technologies have the potential to improve patient monitoring, medical decision making and, in general, the efficiency and quality of health delivery. They also pose new security and privacy challenges. The objectives of this work are to (i) explore and define security and privacy requirements on the example of a post-surgical care application, and (ii) develop and test a pilot implementation. Studies of surgical outcomes indicate that timely treatment of the most common complications, in compliance with established post-surgical regimens, greatly improves success rates. The goal of our pilot application is to enable physicians to optimally synthesize and apply patient-directed best medical practices to prevent post-operative complications in an individualized, patient- and procedure-specific fashion. We propose a framework for a secure protocol that enables doctors to check for the most common complications for their patients during in-hospital post-surgical care. We also implemented our construction and cryptographic protocols as an iPhone application on iOS using existing cryptographic services and libraries.
Can Two Walk Together: Privacy Enhancing Methods and Preventing Tracking of Users
We present a new concern when collecting data from individuals that arises
from the attempt to mitigate privacy leakage in multiple reporting: tracking of
users participating in the data collection via the mechanisms added to provide
privacy. We present several definitions for untrackable mechanisms, inspired by
the differential privacy framework.
Specifically, we define the trackable parameter as the log of the maximum
ratio between the probability that a set of reports originated from a single
user and the probability that the same set of reports originated from two users
(with the same private value). We explore the implications of this new
definition. We show how differentially private and untrackable mechanisms can
be combined to achieve a bound for the problem of detecting when a certain user
changed their private value.
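The trackable parameter can be illustrated on a hypothetical toy mechanism (not the paper's construction): a RAPPOR-style scheme in which each user draws one permanent noisy bit and reports it in every round. All names and parameter values below are illustrative assumptions.

```python
import math

# Toy sketch: a user with private bit b = 1 draws one "permanent" noisy bit r
# (r = b with probability q) and then reports r in every round, in the style
# of permanent randomized response. Parameters are illustrative.
q = 0.75      # probability the permanent bit equals the true value
k = 10        # number of collected reports, all observed to equal 1

# One user: a single permanent draw of r = 1 explains all k identical reports.
p_single = q

# Two users with the same private value, each drawing a permanent bit
# independently and contributing some of the k reports: both draws must be 1.
p_two = q * q

# Trackable parameter: log of the max ratio between the probability that the
# reports came from one user and the probability they came from two users.
trackable = math.log(p_single / p_two)
print(trackable)   # log(1/q), approximately 0.2877
```

Because the ratio is 1/q, a small q (a noisier permanent bit, i.e. stronger everlasting privacy) makes the user *more* trackable in this toy setting, which mirrors the tension the abstract describes.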
Examining Google's deployed solution for everlasting privacy, we show that
RAPPOR (Erlingsson et al. ACM CCS, 2014) is trackable in our framework for the
parameters presented in their paper.
We analyze a variant of randomized response for collecting statistics of
single bits, Bitwise Everlasting Privacy, that achieves good accuracy and
everlasting privacy, while being only reasonably untrackable: its trackable
parameter grows linearly in the number of reports. For collecting statistics about data
from larger domains (for histograms and heavy hitters) we present a mechanism
that prevents tracking for a limited number of responses.
We also present the concept of Mechanism Chaining, using the output of one
mechanism as the input of another, in the scope of Differential Privacy, and
show that the chaining of an ε1-LDP mechanism with an ε2-LDP mechanism is
ln((e^(ε1+ε2) + 1) / (e^ε1 + e^ε2))-LDP and that this bound is tight.

Comment: 45 pages, 4 figures. To appear in FORC 2020.
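As an illustrative sketch of tightness (assuming binary randomized response as the LDP mechanism, with hypothetical parameter values), chaining two randomized responses attains exactly the LDP parameter ln((e^(ε1+ε2)+1)/(e^ε1+e^ε2)), which can be checked numerically:

```python
import math

def rr_keep_prob(eps):
    """Binary randomized response: report the true bit with this probability
    (this choice makes the mechanism eps-LDP)."""
    return math.exp(eps) / (1 + math.exp(eps))

# Chaining: feed the output of an eps1-LDP randomized response into an
# eps2-LDP randomized response (parameter values here are illustrative).
eps1, eps2 = 1.0, 0.5
p1, p2 = rr_keep_prob(eps1), rr_keep_prob(eps2)

# The chained mechanism preserves the input bit iff both stages keep it
# or both stages flip it.
p_chain = p1 * p2 + (1 - p1) * (1 - p2)
eps_chain = math.log(p_chain / (1 - p_chain))

# The chained-LDP bound stated in the abstract:
bound = math.log((math.exp(eps1 + eps2) + 1) / (math.exp(eps1) + math.exp(eps2)))
print(eps_chain, bound)  # equal up to floating-point error
```

Note that the chained parameter is strictly smaller than both ε1 and ε2, so chaining only ever strengthens the privacy guarantee; randomized response meets the bound exactly, witnessing its tightness.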