68 research outputs found

    How to Run Turing Machines on Encrypted Data

    Algorithms for computing on encrypted data promise to be a fundamental building block of cryptography. The way one models such algorithms has a crucial effect on the efficiency and usefulness of the resulting cryptographic schemes. As of today, almost all known schemes for fully homomorphic encryption, functional encryption, and garbling schemes work by modeling algorithms as circuits rather than as Turing machines. As a consequence of this modeling, evaluating an algorithm over encrypted data is as slow as the worst-case running time of that algorithm, a dire fact for many tasks. In addition, in settings where an evaluator needs a description of the algorithm itself in some encoded form, the cost of computing and communicating such an encoding is as large as the worst-case running time of the algorithm. In this work, we construct cryptographic schemes for computing Turing machines on encrypted data that avoid the worst-case problem. Specifically, we show:
    – An attribute-based encryption scheme for any polynomial-time Turing machine and Random Access Machine (RAM).
    – A (single-key and succinct) functional encryption scheme for any polynomial-time Turing machine.
    – A reusable garbling scheme for any polynomial-time Turing machine.
    These three schemes have the property that the size of a key or of a garbling for a Turing machine is very short: it depends only on the description of the Turing machine and not on its running time. Previously, the only existing constructions of such schemes were for depth-d circuits, where all the parameters grow with d. Our constructions remove this depth-d restriction, have short keys, and, moreover, avoid the worst-case running time.
    – A variant of a fully homomorphic encryption scheme for Turing machines, where one can evaluate a Turing machine M on an encrypted input x in time that depends on the running time of M on input x, as opposed to the worst-case runtime of M.
    Previously, such a result was known only for a restricted class of Turing machines, and it required an expensive preprocessing phase (with worst-case runtime); our constructions remove both restrictions. Our results are obtained via a reduction from SNARKs (Bitansky et al.) and an extractable variant of witness encryption, a scheme introduced by Garg et al. We prove that the new assumption is secure in the generic group model. We also point out the connection between (the variant of) witness encryption and the obfuscation of point-filter functions, as defined by Goldwasser and Kalai in 2005.
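    The cost gap this abstract targets can be illustrated on plaintext data with a minimal sketch (all names here are illustrative; no encryption is involved): a circuit-based scheme must be sized for a worst-case step bound and pays it on every input, whereas Turing-machine-style evaluation can stop as soon as the machine halts on the given input.

    ```python
    # Plaintext illustration of worst-case vs. input-specific running time.
    # No cryptography here: the point is only the cost model. t_max is the
    # worst-case step bound that a circuit-based scheme must always pay.

    def evaluate(step, state, t_max, input_specific):
        """Run a toy machine; `step` returns (next_state, halted)."""
        steps_paid = 0
        halted = False
        while steps_paid < t_max:
            if halted and input_specific:
                break          # TM-style evaluation: stop once M halts on x
            if not halted:
                state, halted = step(state)
            steps_paid += 1    # circuit-style evaluation keeps paying to t_max
        return state, steps_paid

    # Toy machine M: move right along the tape until the first 0, then halt.
    tape = [1, 1, 0] + [1] * 97
    def scan(pos):
        return (pos, True) if tape[pos] == 0 else (pos + 1, False)

    _, tm_cost = evaluate(scan, 0, t_max=100, input_specific=True)
    _, circ_cost = evaluate(scan, 0, t_max=100, input_specific=False)
    # tm_cost pays 3 steps on this input; circ_cost pays the full bound of 100
    ```

    The paper's contribution is achieving the `input_specific=True` cost profile while the data stays encrypted; this sketch only shows why that profile matters.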

    Dynamic consent: a possible solution to improve patient confidence and trust in how electronic patient records are used in medical research

    With one million people treated every 36 hours, routinely collected UK National Health Service (NHS) health data has huge potential for medical research. Advances in data acquisition from electronic patient records (EPRs) mean such data are increasingly digital and can be anonymised for research purposes. NHS England’s care.data initiative recently sought to increase the amount and availability of such data. However, controversy and uncertainty following the care.data public awareness campaign led to a delay in rollout, indicating that the success of EPR data for medical research may be threatened by a loss of patient and public trust. The sharing of sensitive health care data can only be done by maintaining such trust in a constantly evolving ethicolegal and political landscape. We propose that a dynamic consent model, whereby patients can electronically control consent over time and receive information about the uses of their data, provides a transparent, flexible, and user-friendly means to maintain public trust. This could leverage the huge potential of the EPR for medical research and, ultimately, patient and societal benefit.


    Reusable Garbled Deterministic Finite Automata from Learning With Errors


    Pre-Constrained Encryption

    In all existing encryption systems, the owner of the master secret key has the ability to decrypt all ciphertexts. In this work, we propose a new notion of pre-constrained encryption (PCE) where the owner of the master secret key does not have "full" decryption power. Instead, its decryption power is constrained in a pre-specified manner during system setup. We present formal definitions and constructions of PCE, and discuss societal applications and implications for some well-studied cryptographic primitives.
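    The functionality (though none of the security) of this notion can be sketched with a toy mock: a "master key" produced at setup that decrypts only ciphertexts whose public attribute satisfies a predicate fixed at setup time. The interface below is purely illustrative and is not the paper's construction; it uses no real cryptography.

    ```python
    # Toy functional sketch of pre-constrained encryption (PCE).
    # Illustrative only: the payload is stored in the clear, so this mock
    # captures the constrained-decryption interface, not any security.

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Ciphertext:
        attribute: str   # public attribute the predicate is evaluated on
        payload: str     # stands in for the actual encrypted message

    def setup(predicate: Callable[[str], bool]):
        """Return a 'master key' whose decryption power is pre-constrained."""
        def decrypt(ct: Ciphertext) -> Optional[str]:
            # Unlike ordinary encryption, the master key holder cannot
            # decrypt everything: decryption fails outside the predicate.
            return ct.payload if predicate(ct.attribute) else None
        return decrypt

    # Constrain the system at setup: only "flagged" records are decryptable.
    master_decrypt = setup(lambda attr: attr == "flagged")
    assert master_decrypt(Ciphertext("flagged", "m1")) == "m1"
    assert master_decrypt(Ciphertext("benign", "m2")) is None
    ```

    The point of the real primitive is that this constraint is cryptographically enforced against the key owner, not merely a software check as in the mock.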

    Interaction-Preserving Compilers for Secure Computation

    In this work we consider the following question: What is the cost of security for multi-party protocols? Specifically, given an insecure protocol where parties exchange (in the worst case) L bits in N rounds, is it possible to design a secure protocol with communication complexity close to L and N rounds? We systematically study this problem in a variety of settings and we propose solutions based on the intractability of different cryptographic problems. For the case of two parties we design an interaction-preserving compiler where the number of bits exchanged in the secure protocol approaches L and the number of rounds is exactly N, assuming the hardness of standard problems over lattices. For the more general multi-party case, we obtain the same result assuming either (i) an additional round of interaction or (ii) the existence of extractable witness encryption and succinct non-interactive arguments of knowledge. As a contribution of independent interest, we construct the first multi-key fully homomorphic encryption scheme with message-to-ciphertext ratio (i.e., rate) of 1 - o(1), assuming the hardness of the learning with errors (LWE) problem. We view our work as support for the claim that, as far as interaction and communication are concerned, one does not need to pay a significant price for security in multi-party protocols.

    Predictable arguments of knowledge

    We initiate a formal investigation of the power of predictability for argument-of-knowledge systems for NP. Specifically, we consider private-coin argument systems where the answer of the prover can be predicted, given the private randomness of the verifier; we call such protocols Predictable Arguments of Knowledge (PAoK). Our study encompasses a full characterization of PAoK, showing that such arguments can be made extremely laconic, with the prover sending a single bit, and can be assumed, without loss of generality, to have only one round (i.e., two messages) of communication. We additionally explore PAoK satisfying further properties (including zero-knowledge and the possibility of re-using the same challenge across multiple executions with the prover), present several constructions of PAoK relying on different cryptographic tools, and discuss applications to cryptography.
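    The shape of the predictability property can be shown with a toy one-round exchange: the verifier's private coins determine the prover's single-bit response in advance. In this mock the NP relation (square roots modulo a tiny prime) is deliberately easy, so the verifier can recover the witness itself and predictability is trivial; real PAoK constructions achieve it for hard relations with heavier tools. All names and parameters are illustrative.

    ```python
    # Toy interface sketch of a Predictable Argument of Knowledge (PAoK):
    # one round (challenge, response), and the verifier predicts the
    # prover's single-bit response from its own private coins before
    # receiving it. Illustrative only; the relation here is easy on purpose.

    import hashlib
    import secrets

    MOD = 101  # toy prime; statement y = w * w % MOD, witness w

    def h(*parts) -> int:
        data = "|".join(map(str, parts)).encode()
        return hashlib.sha256(data).digest()[0] & 1  # single-bit response

    def verifier_challenge():
        coins = secrets.randbits(32)  # verifier's private randomness
        return coins, coins           # in this toy the challenge is the coins

    def prover_respond(challenge, witness) -> int:
        return h(challenge, witness)  # deterministic given the challenge

    def verifier_predict(coins, statement) -> int:
        # Only possible because the toy relation is brute-forceable:
        # recover a witness from the statement, then predict the bit.
        w = next(w for w in range(MOD) if w * w % MOD == statement)
        return h(coins, w)

    w = 13
    y = w * w % MOD
    coins, chal = verifier_challenge()
    predicted = verifier_predict(coins, y)
    assert prover_respond(chal, w) == predicted  # bit predicted in advance
    ```

    The laconic aspect of the paper's characterization is mirrored here by the response being a single bit; the hard part, which this sketch does not attempt, is making prediction possible without the verifier learning the witness.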

    Tokenized Ecosystem of Personal Data - Exemplified on the Context of the Smart City

    Data-driven businesses, services, and even the smart cities of tomorrow depend on access to data, not only from machines but also personal data of consumers, clients, and citizens. Sustainable utilization of such data must be based on legal compliance, ethical soundness, and consent. Data subjects nowadays largely lack empowerment over the utilization and monetization of their personal data. To change this, we propose a tokenized ecosystem of personal data (TokPD), combining anonymization, referencing, encryption, decentralization, and functional layering to establish a privacy-preserving solution for the processing of personal data. This tokenized ecosystem is a more generalized variant of the smart-city ecosystem described in the preceding publication "Smart Cities of Self-Determined Data Subjects" (Frecè & Selzam 2017), with a focus towards further options for decentralization. We use the example of a smart city to demonstrate how TokPD ensures the data subjects’ privacy, grants the smart city access to a high number of new data sources, and simultaneously handles user consent to ensure compliance with modern data protection regulation.