85 research outputs found

    Quantum Entanglement Distribution in Next-Generation Wireless Communication Systems

    In this work we analyze the distribution of quantum entanglement over communication channels in the millimeter-wave regime. The motivation for such a study is the possibility for next-generation wireless networks (beyond 5G) to accommodate such distribution directly, without the need to integrate additional optical communication hardware into the transceivers. Future wireless communication systems are bound to require some level of quantum communications capability. We find that direct quantum-entanglement distribution in the millimeter-wave regime is indeed possible, but that its implementation will be very demanding from both a system-design perspective and a channel-requirement perspective. Comment: 6 pages, 4 figures
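
    The channel-requirement challenge can be made concrete with a back-of-the-envelope number: the mean thermal photon occupation per mode, n_bar = 1/(exp(hf/kT) - 1), is orders of magnitude larger at millimeter-wave frequencies than at optical ones, so directly distributed entanglement must survive a strongly thermal channel. A minimal sketch of that comparison follows (standard Bose-Einstein statistics at an assumed 290 K environment temperature; the two frequencies are illustrative choices, not values taken from the paper):

```python
import math

H = 6.62607015e-34  # Planck constant, J*s
K = 1.380649e-23    # Boltzmann constant, J/K

def thermal_occupation(freq_hz: float, temp_k: float = 290.0) -> float:
    """Mean thermal photon number per mode (Bose-Einstein distribution)."""
    return 1.0 / math.expm1(H * freq_hz / (K * temp_k))

# mmWave carriers sit deep in the thermal regime; optical carriers do not.
for label, f in [("60 GHz (mmWave)", 60e9), ("193 THz (1550 nm optical)", 1.93e14)]:
    print(f"{label}: n_bar ~ {thermal_occupation(f):.3e}")
```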

    Blind Reconciliation

    Information reconciliation is a crucial procedure in the classical post-processing of quantum key distribution (QKD). Poor reconciliation efficiency, revealing more information than strictly needed, may compromise the maximum attainable distance, while poor performance of the algorithm limits the practical throughput in a QKD device. Historically, reconciliation has mainly been done using heavily interactive procedures with close-to-minimal information disclosure, like Cascade, or using less efficient but also less interactive procedures (just one message is exchanged), like the ones based on low-density parity-check (LDPC) codes. The price to pay in the LDPC case is that good efficiency is only attained for very long codes and in a very narrow range centered around the quantum bit error rate (QBER) that the code was designed to reconcile, thus forcing implementations to keep several codes if a broad range of QBER needs to be catered for. Real-world implementations of these methods are thus very demanding, either on computational or communication resources or both, to the extent that the latest generation of GHz-clocked QKD systems is encountering a bottleneck in the classical part. In order to produce compact, high-performance and reliable QKD systems it would be highly desirable to remove these problems. Here we analyse the use of short-length LDPC codes in the information reconciliation context using a low-interactivity, blind protocol that avoids an a priori error rate estimation. We demonstrate that LDPC codes of length 2x10^3 bits are suitable for blind reconciliation. Such codes are of high interest in practice, since they can be used for hardware implementations with very high throughput. Comment: 22 pages, 8 figures
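
    As a minimal illustration of the syndrome-only disclosure that underlies LDPC reconciliation, the sketch below uses a toy Hamming(7,4) code in place of the long LDPC codes and belief-propagation decoding a real QKD stack would use; the blind, rate-adaptive part of the protocol is not modelled, and all parameters are illustrative:

```python
import numpy as np

# Parity-check matrix of the Hamming(7,4) code: column j is the binary
# expansion of j+1 (row 0 = least significant bit), so the syndrome of a
# single-bit error directly encodes the error position.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def reconcile(alice, bob):
    """Alice discloses only her 3-bit syndrome; Bob compares it with his
    own syndrome and corrects at most one discrepant bit in his copy."""
    diff = (H @ alice + H @ bob) % 2         # syndrome of the error pattern
    pos = int(diff[0] + 2 * diff[1] + 4 * diff[2])
    corrected = bob.copy()
    if pos:                                   # nonzero syndrome: flip bit pos-1
        corrected[pos - 1] ^= 1
    return corrected

rng = np.random.default_rng(7)
alice = rng.integers(0, 2, size=7)
bob = alice.copy()
bob[4] ^= 1                                   # one quantum-channel error
print(np.array_equal(reconcile(alice, bob), alice))  # True
```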

    A QKD Protocol Extendable to Support Entanglement and Reduce Unauthorized Information Gain by Randomizing the Bases Lists with Key Values and Invalidate Explicit Privacy Amplification

    This paper suggests an improvement to the BB84 scheme in quantum key distribution. The original scheme is weakened by the quantifiable information an eavesdropper gains during the public announcement of unencrypted bases lists, and the security of the secret key comes at the expense of the final key length. We aim to exploit the randomness of the preparation (measurement) bases and the bit values encoded (observed), so as to randomize the bases lists before they are communicated over the public channel. A proof of security is given for our scheme, and we show that our protocol results in less information gain by Eve in comparison with BB84 and its other extensions. Moreover, we analyse the feasibility of our proposal as such and its ability to support entanglement-based QKD. The performance of our protocol is compared in terms of the upper and lower bounds on the tolerable bit error rate. We also quantify the information gain (by Eve) mathematically using the familiar approach of Shannon entropy. The paper models the attack by Eve as interference in a multi-access quantum channel. This paper also hints at the invalidation of a separate privacy amplification step in "prepare-and-measure" protocols in general. Comment: 13 pages, 1 figure, submitted for review to the USENIX 200
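
    For context, the public step this work targets is the basis-list announcement of standard BB84. The sketch below implements ordinary BB84 sifting (not the randomized variant proposed here) to show exactly which lists travel in the clear; the kept-position information is what an eavesdropper on the classical channel obtains for free:

```python
import random

def bb84_sift(n=16, seed=0):
    """Toy BB84 sifting: bases are compared in the clear, so anyone
    listening learns which positions contribute to the sifted key."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]  # + rectilinear, x diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n)]
    # Ideal channel, no Eve: Bob's result equals Alice's bit iff bases match,
    # otherwise his measurement outcome is uniformly random.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    kept = [i for i in range(n) if alice_bases[i] == bob_bases[i]]  # public step
    return [alice_bits[i] for i in kept], [bob_bits[i] for i in kept]

a_key, b_key = bb84_sift()
print(a_key == b_key)  # True on a noiseless channel
```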

    A Proof of Entropy Minimization for Outputs in Deletion Channels via Hidden Word Statistics

    From the output produced by a memoryless deletion channel from a uniformly random input of known length n, one obtains a posterior distribution on the channel input. The difference between the Shannon entropy of this distribution and that of the uniform prior measures the amount of information about the channel input which is conveyed by the output of length m, and it is natural to ask for which outputs this is extremized. This question was posed in a previous work, where it was conjectured on the basis of experimental data that the entropy of the posterior is minimized and maximized by the constant strings 000... and 111... and the alternating strings 0101... and 1010..., respectively. In the present work we confirm the minimization conjecture in the asymptotic limit using results from hidden word statistics. We show how the analytic-combinatorial methods of Flajolet, Szpankowski and Vallée for dealing with the hidden pattern matching problem can be applied to resolve the case of fixed output length and n → ∞, by obtaining estimates for the entropy in terms of the moments of the posterior distribution and establishing its minimization via a measure of autocorrelation. Comment: 11 pages, 2 figures
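
    The posterior in question is easy to compute exactly for small n: for a fixed output y of length m and deletion probability d, P(y|x) = N(y; x) * d^(n-m) * (1-d)^m, where N(y; x) counts the embeddings of y in x as a subsequence, so the posterior over inputs is proportional to N(y; x) alone. A brute-force sketch (illustrative parameters) that compares a constant output with an alternating one, which per the conjecture should give the smaller and larger entropy respectively:

```python
from itertools import product
from math import log2

def embeddings(y, x):
    """Number of occurrences of y in x as a subsequence (standard DP)."""
    dp = [1] + [0] * len(y)            # dp[j] = embeddings of y[:j] seen so far
    for xi in x:
        for j in range(len(y), 0, -1):
            if y[j - 1] == xi:
                dp[j] += dp[j - 1]
    return dp[len(y)]

def posterior_entropy(y, n):
    """Shannon entropy of the posterior over uniform length-n binary inputs,
    given output y; the posterior weight of input x is proportional to the
    number of embeddings of y in x."""
    weights = [embeddings(y, x) for x in product("01", repeat=n)]
    total = sum(weights)
    return -sum(w / total * log2(w / total) for w in weights if w)

n = 10
for y in ["0000", "0101"]:
    print(y, round(posterior_entropy(y, n), 4))
```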

    Quantum Communication, Sensing and Measurement in Space

    The main theme of the conclusions drawn for classical communication systems operating at optical or higher frequencies is that there is a well-understood performance gain in photon efficiency (bits/photon) and spectral efficiency (bits/s/Hz) by pursuing coherent-state transmitters (classical ideal laser light) coupled with novel quantum receiver systems operating near the Holevo limit (e.g., joint detection receivers). However, recent research indicates that these receivers will require nonlinear and nonclassical optical processes and components at the receiver. Consequently, the implementation complexity of Holevo-capacity-approaching receivers is not yet fully ascertained. Nonetheless, because the potential gain is significant (e.g., the projected photon efficiency and data rate of MIT Lincoln Laboratory's Lunar Lasercom Demonstration (LLCD) could be achieved with a factor-of-20 reduction in the modulation bandwidth requirement), focused research activities on ground-receiver architectures that approach the Holevo limit in space-communication links would be beneficial.

    The potential gains resulting from quantum-enhanced sensing systems in space applications have not been laid out as concretely as some of the other areas addressed in our study. In particular, while the study period has produced several interesting high-risk and high-payoff avenues of research, more detailed seedling-level investigations are required to fully delineate the potential return relative to the state of the art. Two prominent examples are (1) improvements to pointing, acquisition and tracking systems (e.g., for optical communication systems) by way of quantum measurements, and (2) possible weak-valued measurement techniques to attain high-accuracy sensing systems for in situ or remote-sensing instruments. While these concepts are technically sound and have very promising bench-top demonstrations in a lab environment, they are not mature enough to realistically evaluate their performance in a space-based application. Therefore, it is recommended that future work follow small focused efforts towards incorporating practical constraints imposed by a space environment.

    The space platform has been well recognized as a nearly ideal environment for some of the most precise tests of fundamental physics, and the ensuing potential of scientific advances enabled by quantum technologies is evident in our report. For example, an exciting concept that has emerged for gravity-wave detection is that the intermediate frequency band spanning 0.01 to 10 Hz, which is inaccessible from the ground, could be accessed at unprecedented sensitivity with a space-based interferometer that uses shorter arms relative to the state of the art to keep the diffraction losses low, and employs frequency-dependent squeezed light to surpass the standard quantum limit sensitivity. This offers the potential to open up a new window into the universe, revealing the behavior of compact astrophysical objects and pulsars. As another set of examples, research accomplishments in the atomic and optics fields in recent years have ushered in a number of novel clocks and sensors that can achieve unprecedented measurement precisions. These emerging technologies promise new possibilities in fundamental physics, examples of which are tests of relativistic gravity theory, universality of free fall, frame-dragging precession, the gravitational inverse-square law at micron scale, and new ways of gravitational wave detection with atomic inertial sensors.

    While the relevant technologies and their discovery potentials have been well demonstrated on the ground, there exists a large gap to space-based systems. To bridge this gap and to advance fundamental-physics exploration in space, focused investments that further mature promising technologies, such as space-based atomic clocks and quantum sensors based on atom-wave interferometers, are recommended. Bringing a group of experts from diverse technical backgrounds together in a productive interactive environment spurred some unanticipated innovative concepts. One promising concept is the possibility of utilizing a space-based interferometer as a frequency reference for terrestrial precision measurements. Space-based gravitational wave detectors depend on extraordinarily low noise in the separation between spacecraft, resulting in an ultra-stable frequency reference that is several orders of magnitude better than the state of the art of frequency references using terrestrial technology. The next steps in developing this promising new concept are simulations and measurement of atmospheric effects that may limit performance due to non-reciprocal phase fluctuations.

    In summary, this report covers a broad spectrum of possible new opportunities in space science, as well as enhancements in the performance of communication and sensing technologies, based on observing, manipulating and exploiting the quantum-mechanical nature of our universe. In our study we identified a range of exciting new opportunities to capture the revolutionary capabilities resulting from quantum enhancements. We believe that pursuing these opportunities has the potential to positively impact the NASA mission in both the near term and in the long term. In this report we lay out the research and development paths that we believe are necessary to realize these opportunities and capitalize on the gains quantum technologies can offer.
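
    To make the Holevo-limit gain concrete: for a pure-loss bosonic channel with mean received photon number n_bar per mode, the Holevo capacity is g(n_bar) = (n_bar+1)log2(n_bar+1) - n_bar*log2(n_bar) bits per mode (Giovannetti et al.), versus log2(1+n_bar) for an ideal heterodyne receiver. The sketch below is a standard textbook comparison, not a calculation from the report; it shows the photon-efficiency (bits/photon) advantage growing as n_bar shrinks:

```python
from math import log2

def g(x):
    """von Neumann entropy of a thermal state with mean photon number x, in bits;
    also the Holevo capacity of a pure-loss channel at that received flux."""
    return (x + 1) * log2(x + 1) - x * log2(x) if x > 0 else 0.0

def heterodyne_bits_per_mode(x):
    """Capacity of coherent states with an ideal heterodyne receiver."""
    return log2(1 + x)

for nbar in (0.01, 0.1, 1.0):
    c_hol, c_het = g(nbar), heterodyne_bits_per_mode(nbar)
    print(f"n_bar={nbar}: Holevo {c_hol:.3f} b/mode ({c_hol/nbar:.1f} b/photon), "
          f"heterodyne {c_het:.3f} b/mode ({c_het/nbar:.1f} b/photon)")
```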

    An Analysis of Error Reconciliation Protocols for use in Quantum Key Distribution

    Quantum Key Distribution (QKD) is a method for transmitting a cryptographic key between a sender and receiver in a theoretically unconditionally secure way. Unfortunately, the present state of technology prohibits the flawless quantum transmission required to make QKD a reality. For this reason, error reconciliation protocols have been developed which preserve security while allowing a sender and receiver to reconcile the errors in their respective keys. The most famous of these protocols is Brassard and Salvail's Cascade, which is effective, but suffers from a high communication complexity and therefore results in low throughput. Another popular option is Buttler's Winnow protocol, which reduces the communication complexity over Cascade, but has the added detriment of introducing errors, and has been shown to be less effective than Cascade. Finally, Gallager's Low Density Parity Check (LDPC) codes have recently been shown to reconcile errors at rates higher than those of Cascade and Winnow with a large reduction in communication, but with greater computational complexity. This research seeks to evaluate the effectiveness of these LDPC codes in a QKD setting, while comparing real-world parameters such as runtime, throughput and communication complexity empirically with the well-known Cascade and Winnow algorithms. Additionally, the effects of inaccurate error estimation, non-uniform error distribution and varying key length on all three protocols are evaluated for identical input key strings. Analyses are performed on the results in order to characterize the performance of all three protocols and determine the strengths and weaknesses of each.
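
    The communication complexity attributed to Cascade comes from its interactive binary parity searches. A minimal sketch of that step follows, assuming a single error in the block (real Cascade adds multiple shuffled passes and backtracking across passes):

```python
import random

def parity(bits):
    return sum(bits) % 2

def binary_locate(alice, bob, lo, hi, msgs):
    """Locate one error in [lo, hi) by exchanging block parities, as in a
    single Cascade pass; msgs counts the parity disclosures (leaked bits)."""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        msgs[0] += 1
        if parity(alice[lo:mid]) != parity(bob[lo:mid]):
            hi = mid                      # error is in the left half
        else:
            lo = mid                      # error is in the right half
    return lo

rng = random.Random(3)
alice = [rng.randint(0, 1) for _ in range(64)]
bob = alice.copy()
bob[41] ^= 1                              # a single error in the block
msgs = [1]                                # top-level block parity already compared
pos = binary_locate(alice, bob, 0, 64, msgs)
bob[pos] ^= 1
print(pos, "corrected with", msgs[0], "parity disclosures")
```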

    Cybersecurity and Quantum Computing: friends or foes?

    The abstract is in the attachment.
