    The Serums Tool-Chain:Ensuring Security and Privacy of Medical Data in Smart Patient-Centric Healthcare Systems

    Digital technology is permeating all aspects of human society and life, making humans highly dependent on digital devices and on digital assistance, intelligence, and decision-making. A major concern of this dependence is the lack of human oversight or intervention in many of the ways humans use this technology. This reliance raises questions about how humans can trust such systems and how to ensure that digital technology behaves appropriately. This work considers recent developments and projects that combine digital technology and artificial intelligence with human society, focusing on critical scenarios where the failure of digital technology can lead to significant harm or even death. We explore how to build trust for users of digital technology in such scenarios, considering many different challenges for digital technology. The approaches applied and proposed here address user trust along many dimensions and aim to build collaborative and empowering use of digital technologies in critical aspects of human society.

    Advanced Probabilistic Couplings for Differential Privacy

    Differential privacy is a promising formal approach to data privacy, which provides a quantitative bound on the privacy cost of an algorithm that operates on sensitive information. Several tools have been developed for the formal verification of differentially private algorithms, including program logics and type systems. However, these tools do not capture fundamental techniques that have emerged in recent years, and cannot be used for reasoning about cutting-edge differentially private algorithms. Existing techniques fail to handle three broad classes of algorithms: 1) algorithms where privacy depends on accuracy guarantees, 2) algorithms that are analyzed with the advanced composition theorem, which shows slower growth in the privacy cost, and 3) algorithms that interactively accept adaptive inputs. We address these limitations with a new formalism extending apRHL, a relational program logic that has been used for proving differential privacy of non-interactive algorithms, and incorporating aHL, a (non-relational) program logic for accuracy properties. We illustrate our approach through a single running example, which exemplifies the three classes of algorithms and explores new variants of the Sparse Vector technique, a well-studied algorithm from the privacy literature. We implement our logic in EasyCrypt and formally verify privacy. We also introduce a novel coupling technique called optimal subset coupling that may be of independent interest.
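The Sparse Vector technique mentioned in this abstract can be sketched as follows. This is a minimal AboveThreshold variant; the function name, noise scales, and fixed seed are illustrative assumptions, not the paper's formalization:

```python
import numpy as np

def above_threshold(queries, data, threshold, epsilon, seed=0):
    """Sparse Vector (AboveThreshold): return the index of the first
    query whose noisy answer exceeds a noisy threshold.

    The key property is that the epsilon budget is fixed regardless of
    how many below-threshold queries are examined, which is why the
    technique is a staple of the differential-privacy literature.
    Assumes each query has sensitivity 1.
    """
    rng = np.random.default_rng(seed)
    # Half the budget perturbs the threshold, half perturbs the answers.
    noisy_t = threshold + rng.laplace(scale=2.0 / epsilon)
    for i, q in enumerate(queries):
        noisy_answer = q(data) + rng.laplace(scale=4.0 / epsilon)
        if noisy_answer >= noisy_t:
            return i  # halt after the first above-threshold report
    return None  # no query exceeded the noisy threshold
```

The interactive and adaptive variants verified in the paper generalize this loop, which is what makes the algorithm a natural running example for the extended apRHL logic.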

    Privacy Games: Optimal User-Centric Data Obfuscation

    In this paper, we design user-centric obfuscation mechanisms that impose the minimum utility loss for guaranteeing users' privacy. We optimize utility subject to a joint guarantee of differential privacy (indistinguishability) and distortion privacy (inference error). This double shield of protection limits the information leakage through the obfuscation mechanism as well as through posterior inference. We show that the privacy achieved through joint differential-distortion mechanisms against optimal attacks is as large as the maximum privacy that can be achieved by either of these mechanisms separately. Their utility cost is also no larger than what either the differential or the distortion mechanism imposes alone. We model the optimization problem as a leader-follower game between the designer of the obfuscation mechanism and the potential adversary, and design adaptive mechanisms that anticipate and protect against optimal inference algorithms. Thus, the obfuscation mechanism is optimal against any inference algorithm.
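As a point of comparison for the differential-privacy half of this double shield, a simple epsilon-differentially-private obfuscation channel over a discrete alphabet can be sketched as follows. This exponential-mechanism baseline is a hypothetical illustration, not the paper's optimal game-theoretic mechanism:

```python
import numpy as np

def exponential_obfuscation(values, epsilon):
    """Build an obfuscation channel P over a discrete set of values with
    P[i, j] = Pr[output = values[j] | input = values[i]] proportional to
    exp(-epsilon * |values[i] - values[j]| / 2).

    The factor 1/2 absorbs the normalization, so the channel satisfies
    P[i, j] <= exp(epsilon * |values[i] - values[k]|) * P[k, j]
    for all inputs i, k: an indistinguishability guarantee scaled by the
    distance between the two true values.
    """
    x = np.asarray(values, dtype=float)
    scores = -0.5 * epsilon * np.abs(x[:, None] - x[None, :])
    p = np.exp(scores)
    p /= p.sum(axis=1, keepdims=True)  # normalize each row to a distribution
    return p
```

The paper's mechanisms go further: they are chosen by solving the leader-follower game so that, in addition to such an indistinguishability bound, the adversary's optimal posterior inference is also bounded, at minimum utility loss.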