
    Minimum distance of error correcting codes versus encoding complexity, symmetry, and pseudorandomness

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2003. Includes bibliographical references (leaves 207-214). This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections.

    We study the minimum distance of binary error correcting codes from the following perspectives:
    * The problem of deriving bounds on the minimum distance of a code given constraints on the computational complexity of its encoder.
    * The minimum distance of linear codes that are symmetric in the sense of being invariant under the action of a group on the bits of the codewords.
    * The derandomization capabilities of probability measures on the Hamming cube based on binary linear codes with good distance properties, and their variations.

    Highlights of our results include:
    * A general theorem asserting that if the encoder uses linear time and sub-linear memory in the general binary branching program model, then the minimum distance of the code cannot grow linearly with the block length when the rate is nonvanishing.
    * New upper bounds on the minimum distance of various types of Turbo-like codes.
    * The first ensemble of asymptotically good Turbo-like codes: we prove that depth-three serially concatenated Turbo codes can be asymptotically good.
    * The first ensemble of asymptotically good codes that are ideals in the group algebra of a group: we argue that, for infinitely many block lengths, a random ideal in the group algebra of the dihedral group is an asymptotically good rate-half code with high probability.
    * An explicit rate-half code whose codewords are in one-to-one correspondence with special hyperelliptic curves over a finite field of prime order, where the number of zeros of a codeword corresponds to the number of rational points.
    * A sharp O(k^(-1/2)) upper bound on the probability that a random binary string generated according to a k-wise independent probability measure has any given weight.
    * A result showing that any sufficiently log-wise independent probability measure looks random to all polynomially small read-once DNF formulas.
    * An elaborate study of the problem of derandomizability of AC₀ by any sufficiently polylog-wise independent probability measure.
    * An elaborate study of the problem of approximability of high-degree parity functions on binary linear codes by low-degree polynomials with coefficients in fields of odd characteristic.

    by Louay M.J. Bazzi. Ph.D.
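
    As a minimal illustration of the central quantity studied above, the following sketch computes the minimum distance of a small binary linear code by enumerating all nonzero codewords; the [7,4] Hamming generator matrix is a standard textbook example, not a construction from the thesis.

```python
# Illustrative sketch: brute-force minimum distance of a small binary linear code.
# The [7,4] Hamming generator matrix below is a textbook example, not a code from
# the thesis; the approach is only feasible for very small dimensions k.
import itertools

import numpy as np

G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
], dtype=int)


def minimum_distance(generator: np.ndarray) -> int:
    """Smallest Hamming weight over all nonzero codewords of the spanned code."""
    k, n = generator.shape
    best = n  # a codeword's weight can never exceed the block length
    for message in itertools.product([0, 1], repeat=k):
        if not any(message):
            continue  # skip the all-zero message (zero codeword)
        codeword = np.mod(np.array(message) @ generator, 2)
        best = min(best, int(codeword.sum()))
    return best


print(minimum_distance(G))  # prints 3 for the [7,4] Hamming code
```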

    Quantum Information at High and Low Energies

    In this thesis, we take a look at how quantum information theory can be used to study physical systems at both high and low energies.

    In the first part of this thesis, we examine the structure of the low-energy subspaces of quantum many-body systems. We show that the existence of error-correcting properties in low-energy subspaces is a generic feature of quantum systems. Using the formalism of matrix product states, we construct explicit quantum error-detecting codes formed from the momentum eigenstates of a quantum many-body system. We also examine how topological order can persist past the ground state space into the low-energy subspace of excited states by studying the No Low-Energy Trivial States (NLTS) conjecture. We prove a version of the NLTS conjecture under the assumption of symmetry protection. Moreover, we show that our symmetric NLTS result has implications for the performance of quantum variational optimization algorithms by using it to prove a bound on the Quantum Approximate Optimization Algorithm (QAOA).

    In the second part of this thesis, we examine problems related to bulk reconstruction in holography and the black hole firewall paradox. Using the formalism of the tensor Radon transform, we devise and implement a numerical algorithm for reconstructing (perturbatively in AdS₃/CFT₂) the bulk metric tensor from a given boundary entropy profile. We finally examine the black hole firewall problem from the perspective of quantum error-correction and quantum computational complexity. We argue that the state of the Hawking radiation has the special property of being computationally pseudorandom, meaning that it cannot be distinguished from the maximally mixed state by any efficient quantum computation. We show that this implies that each black hole has a natural structure as a quantum error-correcting code.
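
    As a concrete illustration of error detection in a small code space (a generic textbook example, not the matrix-product-state construction developed in the thesis), the sketch below verifies that a single-qubit bit-flip or phase-flip error on a [[4,2,2]] codeword flips the measured eigenvalue of one of the stabilizers XXXX or ZZZZ from +1 to -1, and is therefore detectable.

```python
# Illustrative sketch: error detection with the [[4,2,2]] code.
# This is a generic textbook example, not the matrix-product-state codes
# constructed in the thesis.
from functools import reduce

import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])


def kron_all(ops):
    return reduce(np.kron, ops)


XXXX = kron_all([X] * 4)
ZZZZ = kron_all([Z] * 4)


def basis_state(bits):
    v = np.zeros(2 ** len(bits))
    v[int("".join(map(str, bits)), 2)] = 1.0
    return v


def expectation(state, operator):
    return float(state @ operator @ state)  # all amplitudes here are real


# One logical codeword of the [[4,2,2]] code: (|0000> + |1111>) / sqrt(2).
psi = (basis_state([0, 0, 0, 0]) + basis_state([1, 1, 1, 1])) / np.sqrt(2)
assert np.isclose(expectation(psi, XXXX), 1.0)
assert np.isclose(expectation(psi, ZZZZ), 1.0)

# A bit flip on qubit 0 anticommutes with ZZZZ; a phase flip anticommutes with XXXX.
X0 = kron_all([X, I2, I2, I2])
Z0 = kron_all([Z, I2, I2, I2])
print(expectation(X0 @ psi, ZZZZ))  # -1.0: error detected
print(expectation(Z0 @ psi, XXXX))  # -1.0: error detected
```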

    Decryption Failure Attacks on Post-Quantum Cryptography

    This dissertation discusses mainly new cryptanalytical results related to issues of securely implementing the next generation of asymmetric cryptography, or Public-Key Cryptography (PKC). PKC, as it has been deployed until today, depends heavily on the integer factorization and discrete logarithm problems. Unfortunately, it has been well known since the mid-90s that these mathematical problems can be solved in polynomial time on a quantum computer using Peter Shor's algorithm. The recently accelerated pace of R&D towards quantum computers, eventually of sufficient size and power to threaten cryptography, has led the crypto research community towards a major shift of focus. A project towards standardization of Post-quantum Cryptography (PQC) was launched by the US-based standardization organization NIST. PQC is the name given to algorithms designed to run on classical hardware/software whilst being resistant to attacks from quantum computers, which makes PQC well suited for replacing the current asymmetric schemes. A primary motivation for the project is to guide publicly available research toward the singular goal of finding weaknesses in the proposed next generation of PKC. For public-key encryption (PKE) or digital signature (DS) schemes to be considered secure, they must be shown to rely heavily on well-known mathematical problems, with theoretical proofs of security under established models such as indistinguishability under chosen-ciphertext attack (IND-CCA). They must also withstand serious attack attempts by well-renowned cryptographers, concerning both theoretical security and the actual software/hardware instantiations. It is well known that security models such as IND-CCA are not designed to capture the intricacies of inner-state leakages. Such leakages are named side-channels, currently a major topic of interest in the NIST PQC project.

    This dissertation focuses on two questions: 1) how does the low but non-zero probability of decryption failures affect the cryptanalysis of these new PQC candidates? And 2) how might side-channel vulnerabilities inadvertently be introduced when going from theory to the practice of software/hardware implementations? Of main concern are PQC algorithms based on lattice theory and coding theory. The primary contributions are the discovery of novel decryption-failure side-channel attacks, improvements on existing attacks, an alternative implementation of a part of a PQC scheme, and some more theoretical cryptanalytical results.
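
    As a toy illustration of where decryption failures come from in lattice-based schemes, the sketch below estimates by Monte Carlo the probability that the accumulated noise in a simplified, Regev-style decryption exceeds the decoding threshold of roughly q/4. The parameters are deliberately weak so that failures show up in a short simulation; they do not correspond to any real NIST candidate, where this probability is pushed to cryptographically negligible levels.

```python
# Toy Monte Carlo estimate of a decryption-failure probability in a simplified
# LWE-style scheme. All parameters are illustrative and deliberately weak;
# they do not correspond to any real NIST PQC candidate.
import numpy as np

rng = np.random.default_rng(0)

q = 257           # modulus (illustrative)
n = 128           # dimension of secret/noise vectors (illustrative)
eta = 2           # noise coefficients drawn uniformly from {-eta, ..., eta}
trials = 100_000


def noise(size):
    return rng.integers(-eta, eta + 1, size=size)


failures = 0
for _ in range(trials):
    # In Regev-style encryption of a single bit encoded at q/2, decryption
    # returns the wrong bit when the accumulated noise <s, e1> - <e, r> + e2
    # has absolute value larger than about q/4.
    s, e = noise(n), noise(n)
    r, e1 = noise(n), noise(n)
    e2 = int(noise(1)[0])
    accumulated = int(s @ e1) - int(e @ r) + e2
    if abs(accumulated) > q // 4:
        failures += 1

print(f"estimated decryption-failure probability: {failures / trials:.3f}")
```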

    Uncertainty relations for multiple measurements with applications

    Uncertainty relations express the fundamental incompatibility of certain observables in quantum mechanics. Far from just being puzzling constraints on our ability to know the state of a quantum system, uncertainty relations are at the heart of why some classically impossible cryptographic primitives become possible when quantum communication is allowed. This thesis is concerned with strong notions of uncertainty relations and their applications in quantum information theory. One operational manifestation of such uncertainty relations is a purely quantum effect referred to as information locking. A locking scheme can be viewed as a cryptographic protocol in which a uniformly random n-bit message is encoded in a quantum system using a classical key of size much smaller than n. Without the key, no measurement of this quantum state can extract more than a negligible amount of information about the message, in which case the message is said to be "locked". Furthermore, knowing the key, it is possible to recover, that is "unlock", the message. We give new efficient constructions of bases satisfying strong uncertainty relations, leading to the first explicit construction of an information locking scheme. We also give several other applications of our uncertainty relations to both cryptographic and communication tasks. In addition, we define objects called QC-extractors, which can be seen as strong uncertainty relations that hold against quantum adversaries. We provide several constructions of QC-extractors, and use them to prove the security of cryptographic protocols for two-party computation based on the sole assumption that the parties' storage device is limited in transmitting quantum information. In doing so, we resolve a central question in the so-called noisy-storage model by relating security to the quantum capacity of storage devices.

    Comment: PhD Thesis, McGill University, School of Computer Science, 158 pages. Contains arXiv:1010.3007 and arXiv:1111.2026 with some small additions.
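
    As a minimal numerical illustration of the kind of uncertainty relation involved, the sketch below checks the textbook Maassen-Uffink bound H(A) + H(B) >= log2(d) for the computational basis and the discrete Fourier basis, which are mutually unbiased; this is only a sanity check of a known relation, not one of the constructions from the thesis.

```python
# Sanity check of the Maassen-Uffink entropic uncertainty relation
# H(computational) + H(Fourier) >= log2(d) on random pure states.
# This checks a textbook bound, not the constructions from the thesis.
import numpy as np

rng = np.random.default_rng(1)
d = 8

# Discrete Fourier basis: mutually unbiased with the computational basis,
# since every overlap between the two bases has magnitude 1/sqrt(d).
F = np.array([[np.exp(2j * np.pi * j * k / d) for k in range(d)]
              for j in range(d)]) / np.sqrt(d)


def random_pure_state(dim):
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)


def shannon_entropy(p):
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())


for _ in range(1000):
    psi = random_pure_state(d)
    p_computational = np.abs(psi) ** 2
    p_fourier = np.abs(F.conj() @ psi) ** 2  # outcome probabilities in the Fourier basis
    total = shannon_entropy(p_computational) + shannon_entropy(p_fourier)
    assert total >= np.log2(d) - 1e-9

print(f"Maassen-Uffink bound H(A) + H(B) >= {np.log2(d):.0f} verified on 1000 random states")
```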