1,086 research outputs found

    Analysing and reducing the limitations of continuous-variable quantum cryptography and quantum networks

    Due to a recent influx of attention, the field of quantum information is rapidly progressing towards the point at which quantum technologies move from the laboratory to widespread community use. However, several difficulties must be overcome before this milestone can be achieved, two of which are addressed in this thesis. The first is the ever-growing security threat posed by quantum computers to existing cryptographic protocols; the second is the missing knowledge regarding the performance differences between quantum and classical communications over various existing network topologies. Continuous-variable (CV) quantum key distribution (QKD) offers a practical solution to the security risks implied by the advancement of quantum information theory, with the promise of provably secure communications. Unfortunately, the maximum range of many CV-QKD protocols is limited. Here, this limitation is addressed by the application of post-selection: first, to a scenario in which two parties communicate using terahertz-frequency radiation in the atmosphere, and second, to measurement-device-independent QKD, in which two parties communicate through the medium of an untrusted relay. In both cases, the introduction of post-selection enables security over distances substantially exceeding those of equivalent existing protocols. The second difficulty is addressed by a comparison of the quantum and classical networking regimes of the butterfly network and a group of networks constructed from butterfly blocks. By computing the achievable classical rates and upper bounds for quantum communication, the performance difference between the two regimes is quantified, and a range of conditions is established under which classical networking outperforms its quantum counterpart. This provides guidance on which network structures should be avoided when constructing a quantum internet.

    Two-Server Oblivious Transfer for Quantum Messages

    Oblivious transfer is considered a cryptographic primitive for quantum information processing over quantum networks. Although it is possible with two servers, all existing protocols work only with classical messages. We propose two-server oblivious transfer protocols for quantum messages.

    Introductory Chapter: Quantum Computing and Communications


    Sharing Classical Secrets with Continuous-Variable Entanglement: Composable Security and Network Coding Advantage

    Secret sharing is a multiparty cryptographic primitive that can be applied to a network of partially distrustful parties for encrypting data that is both sensitive (it must remain secure) and important (it must not be lost or destroyed). When sharing classical secrets (as opposed to quantum states), one can distinguish between protocols that leverage bipartite quantum key distribution (QKD) and those that exploit multipartite entanglement. The latter class is known to be vulnerable to so-called participant attacks and, while progress has been made recently, there is currently no analysis that quantifies their performance in the composable, finite-size regime, which has become the gold standard for QKD security. Given this, and the fact that distributing multipartite entanglement is typically challenging, one might well ask: is there any virtue in pursuing multipartite entanglement-based schemes? Here, we answer this question in the affirmative for a class of secret-sharing protocols based on continuous-variable graph states. We establish security in a composable framework and identify a network topology, specifically a bottleneck network of lossy channels, and parameter regimes within the reach of present-day experiments, for which a multipartite scheme outperforms the corresponding QKD-based method in both the asymptotic and finite-size settings. Finally, we establish experimental parameters under which the multipartite schemes outperform any possible QKD-based protocol. This is one of the first concrete, compelling examples of multipartite entangled resources achieving a genuine advantage over point-to-point protocols for quantum communication, and it represents a rigorous, operational benchmark for assessing the usefulness of such resources.
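The abstract above concerns secret sharing with continuous-variable graph states; as background, the classical threshold primitive it generalizes can be sketched with Shamir's scheme, in which the secret is the constant term of a random polynomial over a finite field and any `threshold` shares reconstruct it. This is a generic illustration, not the paper's protocol; the field prime and parameter values are illustrative choices.

```python
import random

PRIME = 2_147_483_647  # illustrative prime modulus; all arithmetic is in GF(PRIME)

def make_shares(secret, threshold, n_shares):
    """Hide `secret` as the constant term of a random polynomial of degree
    threshold-1 and hand out its values at x = 1..n_shares as shares."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any
    `threshold` (or more) shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME-2, PRIME) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(42, threshold=2, n_shares=3)
print(reconstruct(shares[:2]))  # → 42 (any two of the three shares suffice)
```

Fewer than `threshold` shares reveal nothing about the secret, which is the "sensitive and important" property the abstract describes.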

    A Tutorial on Clique Problems in Communications and Signal Processing

    Since its first use by Euler on the problem of the seven bridges of Königsberg, graph theory has shown excellent abilities in solving and unveiling the properties of multiple discrete optimization problems. The study of the structure of some integer programs reveals equivalence with graph theory problems, making a large body of the literature readily available for solving and characterizing the complexity of these problems. This tutorial presents a framework for utilizing a particular graph theory problem, known as the clique problem, for solving communications and signal processing problems. In particular, the paper aims to illustrate the structural properties of integer programs that can be formulated as clique problems through multiple examples in communications and signal processing. To that end, the first part of the tutorial provides various optimal and heuristic solutions for the maximum clique, maximum weight clique, and k-clique problems. The tutorial further illustrates the use of the clique formulation through numerous contemporary examples in communications and signal processing, mainly in maximum access for non-orthogonal multiple access networks, throughput maximization using index and instantly decodable network coding, collision-free radio frequency identification networks, and resource allocation in cloud-radio access networks. Finally, the tutorial sheds light on the recent advances of such applications and provides technical insights on ways of dealing with mixed discrete-continuous optimization problems.
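As a minimal illustration of the maximum clique problem the tutorial is built around, the brute-force search below finds a largest clique in a small graph given as adjacency sets. The toy graph is a hypothetical example; since maximum clique is NP-hard, real instances call for the heuristic and optimal methods the tutorial surveys rather than exhaustive search.

```python
from itertools import combinations

def is_clique(adj, nodes):
    """True iff every pair of vertices in `nodes` is adjacent in `adj`."""
    return all(v in adj[u] for u, v in combinations(nodes, 2))

def max_clique(adj):
    """Exhaustive search for a maximum clique: try vertex subsets from
    largest to smallest and return the first clique found."""
    vertices = list(adj)
    for k in range(len(vertices), 0, -1):
        for cand in combinations(vertices, k):
            if is_clique(adj, cand):
                return list(cand)
    return []

# Toy graph: a triangle {0, 1, 2} with a pendant edge 2-3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(max_clique(adj))  # → [0, 1, 2]
```

The integer-program view the tutorial develops corresponds to maximizing the number of selected vertices subject to the pairwise-adjacency constraints checked in `is_clique`.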

    On entanglement spreading from holography

    A global quench is an interesting setting where we can study thermalization of subsystems in a pure state. We investigate entanglement entropy (EE) growth in global quenches in holographic field theories and relate some of its aspects to quantities characterizing chaos. More specifically, we obtain four key results: 1. We prove holographic bounds on the entanglement velocity v_E and the butterfly effect speed v_B that arises in the study of chaos. 2. We obtain the EE as a function of time for large spherical entangling surfaces analytically. We show that the EE is insensitive to the details of the initial state or quench protocol. 3. In a thermofield double state, we determine analytically the two-sided mutual information between two large concentric spheres separated in time. 4. We derive a bound on the rate of growth of EE for arbitrary shapes, and develop an expansion for EE at early times. In a companion paper, arXiv:1608.05101, we put these results in the broader context of EE growth in chaotic systems: we relate EE growth to the chaotic spreading of operators, derive bounds on EE at a given time, and compare the holographic results to spin chain numerics and toy models. In this paper, we perform holographic calculations that provide the basis of arguments presented in that paper. Comment: v2: presentation improved, typos fixed, 54 pages, 17 figures; v1: 53 pages, 16 figures.

    Quantum Computing and Communications

    This book explains the concepts and basic mathematics of quantum computing and communication. Chapters cover such topics as quantum algorithms, photonic implementations of discrete-time quantum walks, how to build a quantum computer, and quantum key distribution and teleportation, among others.

    Optimization of a new digital image compression algorithm based on nonlinear dynamical systems

    In this paper, we discuss the formulation, research, and development of an optimization process for a new compression algorithm known as DYNAMAC, which has its basis in nonlinear systems theory. We establish that by increasing the measure of randomness of the signal, the peak signal-to-noise ratio, and in turn the quality of compression, can be improved to a great extent. Through exhaustive testing, this measure, entropy, is linked to the peak signal-to-noise ratio (PSNR, a measure of quality), and through various discussions and inferences we establish that this measure can independently drive the compression process towards optimization. We also introduce an Adaptive Huffman algorithm to add to the compression ratio of the current algorithm without incurring any losses during transmission (Huffman coding being a lossless scheme).
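The two quantities the abstract links, the Shannon entropy of the signal and PSNR, can be computed as below. This is a generic sketch of those standard measures, not of DYNAMAC itself, and the sample signals are hypothetical.

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy in bits per symbol of a sequence (the randomness
    measure the paper links to compression quality)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def psnr(original, reconstructed, peak=255):
    """Peak signal-to-noise ratio in dB between two equal-length signals:
    10 * log10(peak^2 / MSE)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical signals: no noise
    return 10 * math.log10(peak ** 2 / mse)

print(shannon_entropy(b"abab"))                        # → 1.0 (two equiprobable symbols)
print(round(psnr([10, 20, 30, 40], [11, 19, 30, 42]), 2))  # → 46.37
```

A lossless stage such as Adaptive Huffman coding leaves the reconstructed signal identical to its input, so it improves the compression ratio without lowering PSNR.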