5,924 research outputs found

    Contributions to Lattice-based Cryptography

    Get PDF
    Post-quantum cryptography (PQC) is a new and fast-growing area of cryptography. It focuses on developing cryptographic algorithms and protocols that resist quantum adversaries (i.e., adversaries with access to quantum computers). To construct a new PQC primitive, a designer must rely on a mathematical problem that is intractable for a quantum adversary. Many intractability assumptions are in use in PQC, and there is broad consensus in the research community that the most promising candidates are hard problems on lattices. However, lattice-based cryptography still needs further research to become more efficient and practical. The contributions of this thesis advance either the novelty or the practicality of lattice-based cryptographic systems.
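    As a rough illustration of the kind of lattice problem such constructions rest on, the sketch below generates toy learning-with-errors (LWE) samples. It is only a didactic aid with made-up, insecure parameters, not a construction from the thesis.

```python
import numpy as np

# Toy LWE sample generator: given a secret s, output (A, b = A*s + e mod q).
# All parameters are illustrative; real schemes use far larger dimensions,
# moduli, and carefully chosen error distributions.
n, m, q = 8, 16, 97                      # dimension, number of samples, modulus
rng = np.random.default_rng(0)

s = rng.integers(0, q, size=n)           # secret vector
A = rng.integers(0, q, size=(m, n))      # public uniform matrix
e = rng.integers(-2, 3, size=m)          # small noise terms
b = (A @ s + e) % q                      # noisy inner products

# Search-LWE asks to recover s from (A, b). Without the noise e this is plain
# linear algebra; with it, the problem is believed hard even for quantum
# algorithms at cryptographic parameter sizes.
print(A.shape, b.shape)
```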

    On the limits of engine analysis for cheating detection in chess

    Get PDF
    The integrity of online games has important economic consequences for both the gaming industry and players of all levels, from professionals to amateurs. Where there is a high likelihood of cheating, there is a loss of trust, and players will be reluctant to participate, particularly if this is likely to cost them money. Chess has been established online for around 25 years and is played over the Internet commercially. In that environment, where players are not physically present “over the board” (OTB), chess is one of the games most easily exploited by those who wish to cheat, because of the widespread availability of very strong chess-playing programs. Allegations of cheating even in OTB games have increased significantly in recent years, and have led to changes in the laws of the game that potentially impinge upon players’ privacy. In this work, we examine some of the difficulties inherent in identifying the covert use of chess-playing programs purely from an analysis of the moves of a game. Our approach is to examine in depth a large collection of games where there is confidence that cheating has not taken place, and to analyse those that could easily be misclassified. We conclude that there is a serious risk of finding numerous “false positives” and that, in general, it is unsafe to use just the moves of a single game as prima facie evidence of cheating. We also demonstrate that it is impossible to compute definitive values of the figures currently employed to measure similarity to a chess engine for a particular game, as the values inevitably vary at different search depths and, even under identical conditions, when multi-threaded evaluation is used.
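    As a minimal, hedged sketch of the kind of engine-agreement statistic discussed above (not the methodology of this paper), the snippet below uses the python-chess library with a locally installed UCI engine to count how often a game's moves coincide with the engine's first choice at one fixed depth. The engine path, PGN file name, and depth are assumptions to adjust; as the abstract notes, the resulting figure is not stable across depths or multi-threaded runs.

```python
import chess
import chess.engine
import chess.pgn

ENGINE_PATH = "stockfish"   # hypothetical path to a UCI engine binary
PGN_PATH = "games.pgn"      # hypothetical PGN file
DEPTH = 16                  # fixed, arbitrary search depth

def engine_match_rate(game: chess.pgn.Game, engine: chess.engine.SimpleEngine) -> float:
    """Fraction of the game's moves that equal the engine's top choice at DEPTH."""
    board = game.board()
    matches, total = 0, 0
    for move in game.mainline_moves():
        info = engine.analyse(board, chess.engine.Limit(depth=DEPTH))
        best = info["pv"][0] if "pv" in info else None
        matches += int(best == move)
        total += 1
        board.push(move)
    return matches / total if total else 0.0

engine = chess.engine.SimpleEngine.popen_uci(ENGINE_PATH)
try:
    with open(PGN_PATH) as pgn:
        game = chess.pgn.read_game(pgn)
        if game is not None:
            print(f"engine match rate: {engine_match_rate(game, engine):.2%}")
finally:
    engine.quit()
```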

    Lightweight Techniques for Private Heavy Hitters

    Full text link
    This paper presents a new protocol for solving the private heavy-hitters problem. In this problem, there are many clients and a small set of data-collection servers. Each client holds a private bitstring. The servers want to recover the set of all popular strings, without learning anything else about any client's string. A web-browser vendor, for instance, can use our protocol to figure out which homepages are popular, without learning any user's homepage. We also consider the simpler private subset-histogram problem, in which the servers want to count how many clients hold strings in a particular set without revealing this set to the clients. Our protocols use two data-collection servers and, in a protocol run, each client sends only a single message to the servers. Our protocols protect client privacy against arbitrary misbehavior by one of the servers, and our approach requires no public-key cryptography (except for secure channels) and no general-purpose multiparty computation. Instead, we rely on incremental distributed point functions, a new cryptographic tool that allows a client to succinctly secret-share the labels on the nodes of an exponentially large binary tree, provided that the tree has a single non-zero path. Along the way, we develop new general tools for providing malicious security in applications of distributed point functions. In an experimental evaluation with two servers on opposite sides of the U.S., the servers can find the 200 most popular strings among a set of 400,000 client-held 256-bit strings in 54 minutes. Our protocols are highly parallelizable. We estimate that with 20 physical machines per logical server, our protocols could compute heavy hitters over ten million clients in just over one hour of computation. (Comment: To appear in IEEE Security & Privacy 2021.)
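    To make the search structure concrete, here is an in-the-clear sketch (with no cryptography and toy inputs, so not the paper's protocol) of the prefix-tree search that heavy-hitter protocols of this kind perform: repeatedly extend each still-popular prefix by one symbol and keep only prefixes held by at least a threshold number of clients. In the real protocol these per-prefix counts are obtained obliviously from the two servers via incremental DPFs.

```python
from collections import Counter

def heavy_hitters(strings, nbits, threshold):
    """Return the length-nbits strings held by at least `threshold` clients,
    found by level-by-level expansion of popular prefixes."""
    candidates = [""]
    for level in range(1, nbits + 1):
        counts = Counter(
            s[:level]
            for s in strings
            if any(s.startswith(p) for p in candidates)
        )
        candidates = [prefix for prefix, c in counts.items() if c >= threshold]
    return candidates

# Toy client inputs (4-bit strings); the paper's setting uses 256-bit strings.
clients = ["0110", "0110", "0111", "1010", "0110"]
print(heavy_hitters(clients, nbits=4, threshold=2))   # ['0110']
```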

    Building Machines That Learn and Think Like People

    Get PDF
    Recent progress in artificial intelligence (AI) has renewed interest in building systems that learn and think like people. Many advances have come from using deep neural networks trained end-to-end in tasks such as object recognition, video games, and board games, achieving performance that equals or even beats humans in some respects. Despite their biological inspiration and performance achievements, these systems differ from human intelligence in crucial ways. We review progress in cognitive science suggesting that truly human-like learning and thinking machines will have to reach beyond current engineering trends in both what they learn, and how they learn it. Specifically, we argue that these machines should (a) build causal models of the world that support explanation and understanding, rather than merely solving pattern recognition problems; (b) ground learning in intuitive theories of physics and psychology, to support and enrich the knowledge that is learned; and (c) harness compositionality and learning-to-learn to rapidly acquire and generalize knowledge to new tasks and situations. We suggest concrete challenges and promising routes towards these goals that can combine the strengths of recent neural network advances with more structured cognitive models. (Comment: In press at Behavioral and Brain Sciences. Open call for commentary proposals (until Nov. 22, 2016). https://www.cambridge.org/core/journals/behavioral-and-brain-sciences/information/calls-for-commentary/open-calls-for-commentar)

    IST Austria Thesis

    Get PDF
    Many security definitions come in two flavors: a stronger “adaptive” flavor, where the adversary can arbitrarily make various choices during the course of the attack, and a weaker “selective” flavor, where the adversary must commit to some or all of their choices a priori. For example, in the context of identity-based encryption, selective security requires the adversary to decide on the identity of the attacked party at the very beginning of the game, whereas adaptive security allows the attacker to first see the master public key and some secret keys before making this choice. Often, it appears to be much easier to achieve selective security than it is to achieve adaptive security. A series of recent works shows how to cleverly achieve adaptive security in several such scenarios, including generalized selective decryption [Pan07][FJP15], constrained PRFs [FKPR14], and Yao’s garbled circuits [JW16]. Although the above works expressed vague intuition that they share a common technique, the connection was never made precise. In this work we present a new framework (published at Crypto ’17 [JKK+17a]) that connects all of these works and allows us to present them in a unified and simplified fashion. Having the framework in place, we show how to achieve adaptive security for proxy re-encryption schemes (published at PKC ’19 [FKKP19]) and provide the first adaptive security proofs for continuous group key agreement protocols (published at S&P ’21 [KPW+21]). Questioning the optimality of our framework, we then show that currently used proof techniques cannot lead to significantly better security guarantees for “graph-building” games (published at TCC ’21 [KKPW21a]). These games cover generalized selective decryption, as well as the security of prominent constructions for constrained PRFs, continuous group key agreement, and proxy re-encryption. Finally, we revisit the adaptive security of Yao’s garbled circuits and extend the analysis of Jafargholi and Wichs in two directions: while they prove adaptive security only for a modified construction with increased online complexity, we provide the first positive results for the original construction by Yao (published at TCC ’21 [KKP21a]). On the negative side, we prove that the results of Jafargholi and Wichs are essentially optimal by showing that no black-box reduction can provide a significantly better security bound (published at Crypto ’21 [KKPW21c]).
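    For context, the trivial way to lift selective to adaptive security is the folklore guessing ("complexity leveraging") argument, which is standard background rather than a result of this thesis: the reduction guesses the adversary's eventual adaptive choice in advance and aborts on a wrong guess, losing a factor equal to the number of possible choices, which is typically exponential. The frameworks discussed above aim at far smaller losses.

```latex
% Folklore complexity-leveraging bound (background, not a result of the thesis):
% guessing the adversary's adaptive choice out of $N$ possibilities gives
\[
  \mathrm{Adv}^{\mathrm{adaptive}}_{\Pi}(\mathcal{A})
    \;\le\; N \cdot \mathrm{Adv}^{\mathrm{selective}}_{\Pi}(\mathcal{B}),
  \qquad \text{e.g.\ } N = 2^{|\mathsf{id}|} \text{ for identity-based encryption.}
\]
```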

    Hiding secrets in public random functions

    Full text link
    Constructing advanced cryptographic applications often requires the ability to privately embed messages or functions in the code of a program. As an example, consider the task of building a searchable encryption scheme, which allows users to search over encrypted data and learn nothing other than the search result. Such a task is achievable if it is possible to embed the secret key of an encryption scheme into the code of a program that performs the "decrypt-then-search" functionality, and to guarantee that the code hides everything except its functionality. This thesis studies two cryptographic primitives that enable hiding secrets in the code of random functions.
    1. We first study the notion of a private constrained pseudorandom function (PCPRF). A PCPRF allows the holder of the PRF master secret key to derive a public constrained key that changes the functionality of the original key without revealing the constraint description. Such a notion closely captures the goal of privately embedding functions in the code of a random function. Our main contribution is constructing single-key secure PCPRFs for NC^1 circuit constraints based on the learning with errors assumption. Single-key secure PCPRFs were known to support a wide range of cryptographic applications, such as private-key deniable encryption and watermarking. In addition, we build reusable garbled circuits from PCPRFs.
    2. We then study how to construct cryptographic hash functions that satisfy strong random oracle-like properties. In particular, we focus on the notion of correlation intractability, which requires that, given the description of a function, it should be hard to find an input-output pair that satisfies any sparse relation. Correlation intractability captures the security properties required for, e.g., the soundness of the Fiat-Shamir heuristic, where the Fiat-Shamir transformation is a practical method of building signature schemes from interactive proof protocols. However, correlation intractability was shown to be impossible to achieve for certain length parameters, and was widely considered to be unobtainable. Our contribution is in building correlation intractable functions from various cryptographic assumptions. The security analyses of the constructions use the techniques of secretly embedding constraints in the code of random functions.
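    Since the Fiat-Shamir transformation comes up above, here is the textbook Schnorr instantiation of it as a minimal sketch (classic folklore, not a construction from this thesis), with deliberately tiny and insecure group parameters. The hash plays the verifier's role of issuing the challenge, which is exactly the step whose soundness correlation intractability is meant to formalize.

```python
import hashlib
import secrets

# Tiny Schnorr group: p = 2q + 1 with p, q prime; g = 4 generates the order-q
# subgroup. These parameters are toy values for illustration only.
p, q, g = 2039, 1019, 4

def H(*parts) -> int:
    """Fiat-Shamir challenge: hash the transcript down to an exponent mod q."""
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1          # secret key
    return x, pow(g, x, p)                    # (sk, pk)

def sign(x, msg):
    r = secrets.randbelow(q - 1) + 1          # prover's commitment randomness
    a = pow(g, r, p)                          # first message of the interactive proof
    c = H(a, msg)                             # challenge derived by hashing, not by a verifier
    z = (r + c * x) % q                       # response
    return a, z

def verify(y, msg, sig):
    a, z = sig
    c = H(a, msg)
    return pow(g, z, p) == (a * pow(y, c, p)) % p

sk, pk = keygen()
assert verify(pk, "hello", sign(sk, "hello"))
```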

    Simpler Constructions of Asymmetric Primitives from Obfuscation

    Get PDF
    We revisit constructions of asymmetric primitives from obfuscation and give simpler alternatives. We consider public-key encryption, (hierarchical) identity-based encryption ((H)IBE), and predicate encryption. Obfuscation has already been shown to imply PKE by Sahai and Waters (STOC '14) and full-fledged functional encryption by Garg et al. (FOCS '13). We simplify all these constructions and reduce the necessary assumptions on the class of circuits that the obfuscator needs to support. Our PKE scheme relies on just a PRG and does not need any puncturing. Our IBE and bounded HIBE schemes convert natural key-delegation mechanisms from (recursive) applications of puncturable PRFs to IBE and HIBE schemes. Our most technical contribution is an unbounded HIBE, which uses (public-coin) differing-inputs obfuscation for circuits and whose proof relies on a recent pebbling-based hybrid argument by Fuchsbauer et al. (ASIACRYPT '14). All our constructions are anonymous, support arbitrary inputs, and have compact keys and ciphertexts.

    Private Puncturable PRFs From Standard Lattice Assumptions

    Get PDF
    A puncturable pseudorandom function (PRF) has a master key k that enables one to evaluate the PRF at all points of the domain, and a punctured key k_x that enables one to evaluate the PRF at all points but one. The punctured key k_x reveals no information about the value of the PRF at the punctured point x. Punctured PRFs play an important role in cryptography, especially in applications of indistinguishability obfuscation. However, in previous constructions, the punctured key k_x completely reveals the punctured point x: given k_x it is easy to determine x. A private puncturable PRF is one where k_x reveals nothing about x. This concept was defined by Boneh, Lewi, and Wu, who showed the usefulness of private puncturing and gave constructions based on multilinear maps. The question is whether private puncturing can be built from a standard (weaker) cryptographic assumption. We construct the first privately puncturable PRF from standard lattice assumptions, namely from the hardness of learning with errors (LWE) and one-dimensional short integer solutions (1D-SIS), which have connections to the worst-case hardness of general lattice problems. Our starting point is the (non-private) PRF of Brakerski and Vaikuntanathan. We introduce a number of new techniques to enhance this PRF, from which we obtain a privately puncturable PRF. In addition, we also study the simulation-based definition of private constrained PRFs for general circuits, and show that the definition is not satisfiable.
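    To ground the definition, here is a minimal sketch of the folklore GGM-style puncturable PRF (standard background, not this paper's lattice construction). The punctured key consists of the siblings of the nodes on the path to x, which is precisely why it reveals x, the leakage that a private puncturable PRF removes.

```python
import hashlib

BITS = 16  # toy input length

def prg(seed: bytes, bit: int) -> bytes:
    """Length-doubling PRG, modelled here by a hash; bit selects the half."""
    return hashlib.sha256(bytes([bit]) + seed).digest()

def eval_prf(key: bytes, x: int) -> bytes:
    """GGM evaluation: walk the binary tree following the bits of x."""
    node = key
    for i in reversed(range(BITS)):
        node = prg(node, (x >> i) & 1)
    return node

def puncture(key: bytes, x: int):
    """Punctured key: the off-path sibling at every level of x's path.
    Note that the key openly encodes the bits of x (it is not private)."""
    pkey, node = [], key
    for i in reversed(range(BITS)):
        bit = (x >> i) & 1
        pkey.append((i, 1 - bit, prg(node, 1 - bit)))
        node = prg(node, bit)
    return pkey

def eval_punctured(pkey, y: int) -> bytes:
    """Evaluate at any y != x by starting from the sibling where y leaves x's path."""
    for level, bit, node in pkey:
        if (y >> level) & 1 == bit:
            for i in reversed(range(level)):
                node = prg(node, (y >> i) & 1)
            return node
    raise ValueError("cannot evaluate at the punctured point")

k = hashlib.sha256(b"master key").digest()
pk = puncture(k, 12345)
assert eval_punctured(pk, 54321) == eval_prf(k, 54321)
```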

    Programmable Distributed Point Functions

    Get PDF
    A distributed point function (DPF) is a cryptographic primitive that enables compressed additive sharing of a secret unit vector across two or more parties. Despite growing ubiquity within applications and notable research efforts, the best 2-party DPF construction to date remains the tree-based construction of (Boyle et al., CCS '16), with no significantly new approaches since. We present a new framework for 2-party DPF construction, which applies in the setting of feasible (polynomial-size) domains. This captures in particular all DPF applications in which the keys are expanded to the full domain. Our approach is motivated by a strengthened notion we put forth, of a programmable DPF (PDPF), in which a short, input-independent offline key can be reused for sharing many point functions.
    * PDPF from OWF: We construct a PDPF for feasible domains from the minimal assumption that one-way functions exist, where the second, online key is of size polylogarithmic in the domain size N. Our approach offers multiple new efficiency features and applications:
    * Privately puncturable PRFs: Our PDPF gives the first OWF-based privately puncturable PRFs (for feasible domains) with sublinear keys.
    * O(1)-round distributed DPF Gen: We obtain a (standard) DPF with polylog-size keys that admits an analog of Doerner-shelat (CCS '17) distributed key generation, requiring only O(1) rounds (versus log N).
    * PCG with one short key: Compressing useful correlations for secure computation, where one key is of minimal size. This provides up to exponential communication savings in some application scenarios.
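    As a point of reference for the compression that DPFs provide, the sketch below gives the naive, uncompressed analogue for a small explicit domain (illustrative only, not this paper's construction): Gen simply splits the point function f_{alpha,beta} into two full-domain additive shares, so each key has size N, whereas real DPFs achieve the same correctness with far shorter keys.

```python
import secrets

P = 2**61 - 1        # modulus for additive sharing (illustrative choice)

def gen(alpha: int, beta: int, n: int):
    """Naive DPF key generation: full-domain additive shares of f_{alpha,beta}."""
    k0 = [secrets.randbelow(P) for _ in range(n)]
    k1 = [(-v) % P for v in k0]
    k1[alpha] = (k1[alpha] + beta) % P
    return k0, k1

def eval_share(key, x: int) -> int:
    """Each party's evaluation; individually each share looks uniformly random."""
    return key[x]

N = 1 << 10
k0, k1 = gen(alpha=42, beta=7, n=N)
assert (eval_share(k0, 42) + eval_share(k1, 42)) % P == 7
assert all((eval_share(k0, x) + eval_share(k1, x)) % P == 0 for x in (0, 1, 999))
```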