
    Exponential Lower Bound for 2-Query Locally Decodable Codes via a Quantum Argument

    A locally decodable code encodes n-bit strings x in m-bit codewords C(x), in such a way that one can recover any bit x_i from a corrupted codeword by querying only a few bits of that word. We use a quantum argument to prove that LDCs with 2 classical queries need exponential length: m = 2^{Omega(n)}. Previously this was known only for linear codes (Goldreich et al. 02). Our proof shows that a 2-query LDC can be decoded with only 1 quantum query, and then proves an exponential lower bound for such 1-query locally quantum-decodable codes. We also show that q quantum queries allow more succinct LDCs than the best known LDCs with q classical queries. Finally, we give new classical lower bounds and quantum upper bounds for the setting of private information retrieval. In particular, we exhibit a quantum 2-server PIR scheme with O(n^{3/10}) qubits of communication, improving upon the O(n^{1/3}) bits of communication of the best known classical 2-server PIR.
    Comment: 16 pages, LaTeX. 2nd version: title changed, large parts rewritten, some results added or improved.
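    The standard example of a 2-query LDC is the Hadamard code, which has length m = 2^n and thus matches the lower bound above. A minimal sketch (illustrative, not from the paper):

        import random

        def encode(x):
            """Hadamard encoding: x (list of n bits) -> codeword of length 2^n,
            where position a holds the inner product <a, x> mod 2."""
            n = len(x)
            return [sum(((a >> j) & 1) & x[j] for j in range(n)) % 2
                    for a in range(2 ** n)]

        def decode_bit(codeword, n, i, num_trials=51):
            """Recover x_i with 2 queries per trial: C(a) XOR C(a ^ e_i) = x_i.
            A majority vote over trials tolerates a small fraction of errors."""
            votes = 0
            for _ in range(num_trials):
                a = random.randrange(2 ** n)   # first query: uniformly random
                b = a ^ (1 << i)               # second query: a XOR e_i
                votes += codeword[a] ^ codeword[b]
            return int(votes * 2 > num_trials)

        # Usage: encode, corrupt one position, still decode every bit.
        x = [1, 0, 1, 1]
        y = encode(x)
        y[random.randrange(len(y))] ^= 1       # adversarial corruption
        assert [decode_bit(y, len(x), i) for i in range(len(x))] == x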

    Constructions of Batch Codes via Finite Geometry

    A primitive k-batch code encodes a string x of length n into a string y of length N, such that each multiset of k symbols from x has k mutually disjoint recovering sets from y. We develop new explicit and random coding constructions of linear primitive batch codes based on finite geometry. In some parameter regimes, our proposed codes have lower redundancy than previously known batch codes.
    Comment: 7 pages, 1 figure, 1 table.
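    For intuition, the trivial primitive k-batch code is k-fold replication, with N = k*n: any multiset of k requested symbols can be served from disjoint copies. A toy sketch of this baseline (far weaker than the finite-geometry constructions in the paper):

        def replicate_encode(x, k):
            """Trivial primitive k-batch code: y is k concatenated copies of x."""
            return x * k

        def serve_batch(y, n, k, requests):
            """Serve a multiset of k requested indices of x with mutually
            disjoint recovery sets: request t is read from copy t of x."""
            assert len(requests) == k
            recovery_sets = [{t * n + i} for t, i in enumerate(requests)]
            # Disjointness holds because each set lives in a different copy.
            for idx, s1 in enumerate(recovery_sets):
                assert all(s1.isdisjoint(s2) for s2 in recovery_sets[idx + 1:])
            return [y[t * n + i] for t, i in enumerate(requests)]

        # Usage: the same index may be requested several times and still be served.
        x = [0, 1, 1, 0, 1]
        y = replicate_encode(x, k=3)
        print(serve_batch(y, n=5, k=3, requests=[2, 2, 4]))  # -> [1, 1, 1]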

    Servicing the federation: the case for metadata harvesting

    The paper presents a comparative analysis of data harvesting and distributed computing as complementary models of service delivery within large-scale federated digital libraries. Informed by the requirements of flexibility and scalability of federated services, the analysis focuses on the identification and assessment of model invariants. In particular, it abstracts over application domains, services, and protocol implementations. The analytical evidence produced shows that the harvesting model offers stronger guarantees of satisfying the identified requirements. In addition, it suggests a first characterisation of services based on their suitability to either model, and thus indicates how they could be integrated in the context of a single federated digital library.
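    As a concrete illustration of the harvesting model (the paper deliberately abstracts over concrete protocols), a federation service might periodically pull metadata from member repositories over OAI-PMH, a widely used harvesting protocol in digital libraries. A minimal sketch; the endpoint URL is a placeholder, not from the paper:

        import urllib.request
        import xml.etree.ElementTree as ET

        OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

        def harvest_identifiers(base_url, metadata_prefix="oai_dc"):
            """Pull record identifiers, following resumption tokens page by page."""
            params = f"verb=ListIdentifiers&metadataPrefix={metadata_prefix}"
            while params:
                with urllib.request.urlopen(f"{base_url}?{params}") as resp:
                    root = ET.parse(resp).getroot()
                for header in root.iter(OAI_NS + "header"):
                    yield header.findtext(OAI_NS + "identifier")
                token = root.findtext(f".//{OAI_NS}resumptionToken")
                params = f"verb=ListIdentifiers&resumptionToken={token}" if token else None

        # Usage (hypothetical repository endpoint):
        # for oai_id in harvest_identifiers("https://repository.example.org/oai"):
        #     print(oai_id)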

    On Improving Communication Complexity in Cryptography

    Cryptography grew to be much more than "the study of secret writing". Modern cryptography is concerned with establishing properties such as privacy, integrity, and authenticity in protocols for secure communication and computation. This comes at a price: cryptographic tools usually introduce an overhead, both in terms of communication complexity (the number and size of messages transmitted) and computational efficiency (the time and memory required). Since in many settings communication between the parties involved is the bottleneck, this thesis is concerned with improving communication complexity in cryptographic protocols.

    One direction towards this goal is scalable cryptography: in many cryptographic schemes currently deployed, the security degrades linearly with the number of instances (e.g. encrypted messages) in the system. As this number can be huge in contexts like cloud computing, the parameters of the scheme have to be chosen considerably larger - in particular, depending on the expected number of instances in the system - to maintain security guarantees. We advance the state of the art in scalable cryptography by constructing schemes whose security guarantees are independent of the number of instances, which allows choosing smaller parameters even when the expected number of instances is immense.
    - We construct the first scalable encryption scheme with security against active adversaries that has both compact public keys and compact ciphertexts. In particular, we significantly reduce the size of the public key, to only about 3% of the key size of the previously most efficient scalable encryption scheme. (Gay, Hofheinz, and Kohl, CRYPTO, 2017)
    - We present a scalable structure-preserving signature scheme which improves on the previously best construction in both public-key and signature size, to about 40% and 56% of those sizes, respectively. (Gay, Hofheinz, Kohl, and Pan, EUROCRYPT, 2018)

    Another important area of cryptography is secure multi-party computation, where the goal is to jointly evaluate some function while keeping each party's input private. In traditional approaches to secure multi-party computation, either the communication complexity scales linearly in the size of the function, or the computational efficiency is poor. To overcome this issue, Boyle, Gilboa, and Ishai (CRYPTO, 2016) introduced the notion of homomorphic secret sharing: inputs are shared between parties such that no party learns anything about the input, yet the parties can locally evaluate functions on the shares. Homomorphic secret sharing implies secure computation in which the communication complexity depends only on the size of the inputs, which is typically much smaller than the size of the function.

    A different approach towards efficient secure computation is to split the protocol into an input-independent preprocessing phase, where long correlated strings are generated, and a very efficient online phase. One example of a useful correlation are authenticated Beaver triples, which allow performing efficient multiplications in the online phase such that privacy of the inputs is preserved and parties deviating from the protocol can be detected. The currently most efficient protocols implementing the preprocessing phase require communication linear in the number of triples to be generated. This typically results in high communication costs, as the online phase requires at least one authenticated Beaver triple per multiplication.
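    To make the role of Beaver triples concrete, here is a minimal two-party sketch of the online multiplication step over a prime field (illustrative only; real protocols authenticate the shares with MACs so that deviating parties are caught):

        import random

        P = 2**61 - 1  # prime modulus for additive secret sharing

        def share(v):
            """Additively share v between two parties: v = s0 + s1 (mod P)."""
            s0 = random.randrange(P)
            return s0, (v - s0) % P

        def beaver_multiply(x_sh, y_sh, triple_sh):
            """Online phase: given shares of x, y and of a triple (a, b, c = a*b),
            compute shares of x*y by opening only d = x - a and e = y - b."""
            a_sh, b_sh, c_sh = triple_sh
            # The openings reveal nothing about x, y because a, b are
            # uniformly random masks consumed by this one multiplication.
            d = (x_sh[0] - a_sh[0] + x_sh[1] - a_sh[1]) % P
            e = (y_sh[0] - b_sh[0] + y_sh[1] - b_sh[1]) % P
            z0 = (c_sh[0] + d * b_sh[0] + e * a_sh[0] + d * e) % P
            z1 = (c_sh[1] + d * b_sh[1] + e * a_sh[1]) % P
            return z0, z1

        # Usage: preprocessing produces one triple, the online phase multiplies.
        a, b = random.randrange(P), random.randrange(P)
        triple = (share(a), share(b), share(a * b % P))
        z0, z1 = beaver_multiply(share(6), share(7), triple)
        assert (z0 + z1) % P == 42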
    We advance the state of the art in efficient protocols for secure computation with low communication complexity as follows.
    - We construct the first homomorphic secret sharing scheme for computing arbitrary functions in NC^1 (that is, functions computable by circuits of logarithmic depth) which supports message spaces of arbitrary size, has only negligible correctness error, and does not require expensive multiplications on ciphertexts. (Boyle, Kohl, and Scholl, EUROCRYPT, 2019)
    - We introduce the notion of a pseudorandom correlation generator for general correlations. Pseudorandom correlation generators allow parties to locally extend short correlated seeds into long pseudorandom correlated strings. We show that pseudorandom correlation generators can replace the preprocessing phase in many protocols, leading to a preprocessing phase with sublinear communication complexity. We show connections to homomorphic secret sharing schemes and give the first instantiation of pseudorandom correlation generators for authenticated Beaver triples at reasonable computational efficiency. (Boyle, Couteau, Gilboa, Ishai, Kohl, and Scholl, CRYPTO, 2019)
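    For flavor, plain additive secret sharing is already homomorphic for linear functions: each party evaluates locally, with no interaction, so communication depends only on input and output size. The scheme above handles all of NC^1; this toy sketch covers only the linear case:

        import random

        P = 2**61 - 1

        def share(v):
            """Additively share v between two parties: v = s0 + s1 (mod P)."""
            s0 = random.randrange(P)
            return s0, (v - s0) % P

        def eval_linear_local(coeffs, my_shares):
            """Each party applies the public linear map f(x) = sum c_i * x_i
            to its own shares; the result is an additive share of f(x)."""
            return sum(c * s for c, s in zip(coeffs, my_shares)) % P

        # Usage: f(x) = 3*x0 + 5*x1 evaluated under sharing, locally per party.
        coeffs = [3, 5]
        x0_sh, x1_sh = share(10), share(4)
        y0 = eval_linear_local(coeffs, [x0_sh[0], x1_sh[0]])  # party 0
        y1 = eval_linear_local(coeffs, [x0_sh[1], x1_sh[1]])  # party 1
        assert (y0 + y1) % P == 3 * 10 + 5 * 4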

    Complexity Theory

    Computational Complexity Theory is the mathematical study of the intrinsic power and limitations of computational resources like time, space, or randomness. The current workshop focused on recent developments in various sub-areas including arithmetic complexity, Boolean complexity, communication complexity, cryptography, probabilistic proof systems, pseudorandomness, and randomness extraction. Many of these developments are related to diverse mathematical fields such as algebraic geometry, combinatorial number theory, probability theory, representation theory, and the theory of error-correcting codes.