Quantum Entanglement Distribution in Next-Generation Wireless Communication Systems
In this work we analyze the distribution of quantum entanglement over
communication channels in the millimeter-wave regime. The motivation for such a
study is the possibility for next-generation wireless networks (beyond 5G) to
accommodate such a distribution directly - without the need to integrate
additional optical communication hardware into the transceivers. Future
wireless communication systems are bound to require some level of quantum
communications capability. We find that direct quantum-entanglement
distribution in the millimeter-wave regime is indeed possible, but that its
implementation will be very demanding from both a system-design perspective and
a channel-requirement perspective.
Comment: 6 pages, 4 figures
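As a rough illustration of why the channel requirements are severe (not an analysis from the paper itself), the sketch below evaluates the Bose-Einstein mean thermal photon occupancy per mode: at millimeter-wave frequencies a room-temperature background contributes on the order of a hundred thermal photons per mode, whereas at optical frequencies the contribution is negligible. The 60 GHz carrier, 300 K temperature and 1550 nm comparison point are assumptions made only for this example.

```python
import math

def mean_thermal_photons(freq_hz: float, temp_k: float) -> float:
    """Bose-Einstein mean thermal photon occupancy per mode, 1 / (exp(hf/kT) - 1)."""
    h = 6.62607015e-34   # Planck constant, J*s
    k = 1.380649e-23     # Boltzmann constant, J/K
    return 1.0 / math.expm1(h * freq_hz / (k * temp_k))

# Assumed, illustrative operating points (not values from the paper):
for label, f in (("60 GHz (millimeter-wave)", 60e9), ("193 THz (1550 nm optical)", 1.93e14)):
    print(f"{label}: ~{mean_thermal_photons(f, 300.0):.3g} thermal photons per mode at 300 K")
```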
Blind Reconciliation
Information reconciliation is a crucial procedure in the classical
post-processing of quantum key distribution (QKD). Poor reconciliation
efficiency, revealing more information than strictly needed, may compromise the
maximum attainable distance, while poor performance of the algorithm limits the
practical throughput in a QKD device. Historically, reconciliation has mainly
been done either with procedures that disclose close to the minimal amount of
information but are heavily interactive, like Cascade, or with less efficient
but also less interactive procedures (just one message is exchanged), like the
ones based on low-density parity-check (LDPC) codes. The price to pay in the
LDPC case is that good efficiency is only attained for very long codes and in a
very narrow range centered around the quantum bit error rate (QBER) that the
code was designed to reconcile, thus forcing implementations to keep several
codes if a broad range of QBER needs to be catered for. Real-world
implementations of these methods are thus very demanding on computational
resources, communication resources or both, to the extent that the latest
generation of GHz-clocked QKD systems is finding a bottleneck in the classical
part. In order to produce compact, high-performance and reliable QKD systems it
would be highly desirable to remove these problems. Here we analyse the use of
short-length LDPC codes for information reconciliation using a
low-interactivity, blind protocol that avoids an a priori error rate
estimation. We demonstrate that LDPC codes of length 2x10^3 bits are suitable
for blind reconciliation. Such codes are of high interest in practice, since
they can be used for hardware implementations with very high throughput.
Comment: 22 pages, 8 figures
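As a back-of-the-envelope companion to the efficiency discussion above, the sketch below computes the reconciliation efficiency f = (1 - R)/h(QBER) of a single fixed-rate code across a range of error rates; the rate-0.8 code and the QBER values are assumed purely for illustration and are not taken from the paper. Values of f well above 1 mean excess disclosure, and f below 1 means the code's rate is too high to reconcile that QBER at all, which is the narrow-range problem that motivates the blind, rate-adaptive approach.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon binary entropy h(p) in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def reconciliation_efficiency(code_rate: float, qber: float) -> float:
    """f = (1 - R) / h(QBER): syndrome bits disclosed per bit of the
    Slepian-Wolf minimum h(QBER). f = 1 is ideal; f >> 1 means excess
    disclosure; f < 1 means the code rate is too high to succeed."""
    return (1.0 - code_rate) / binary_entropy(qber)

# A single rate-0.8 code, nominally matched to a QBER of about 3% (assumed values):
for qber in (0.01, 0.02, 0.03, 0.05):
    f = reconciliation_efficiency(0.8, qber)
    print(f"QBER = {qber:.0%}: efficiency f = {f:.2f}")
```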
A QKD Protocol Extendable to Support Entanglement and Reduce Unauthorized Information Gain by Randomizing the Bases Lists with Key Values and Invalidate Explicit Privacy Amplification
This paper suggests an improvement to the BB84 scheme for quantum key
distribution. The original scheme has a weakness in that an eavesdropper gains
quantifiably more information during the public announcement of unencrypted
bases lists, and the security of the secret key then comes at the expense of
the final key length. We aim at exploiting the randomness of the preparation
(measurement) bases and the bit values encoded (observed) so as to randomize
the bases lists before they are communicated over the public channel. A proof
of security is given for our scheme, showing that our protocol results in less
information gain by Eve in comparison with BB84 and its other extensions.
Moreover, an analysis is made of the feasibility of our proposal as such and of
its extension to support entanglement-based QKD. The performance of our
protocol is compared in terms of the upper and lower bounds on the tolerable
bit error rate. We also quantify the information gain by Eve mathematically
using the familiar concept of Shannon entropy. The paper models the attack by
Eve as interference in a multi-access quantum channel. Furthermore, this paper
hints at the invalidation of a separate privacy amplification step in
"prepare-and-measure" protocols in general.
Comment: 13 pages, 1 figure, submitted for review to the USENIX 200
A Proof of Entropy Minimization for Outputs in Deletion Channels via Hidden Word Statistics
From the output produced by a memoryless deletion channel acting on a uniformly
random input of known length n, one obtains a posterior distribution on the
channel input. The difference between the Shannon entropy of this distribution
and that of the uniform prior measures the amount of information about the
channel input which is conveyed by the output of length m, and it is natural
to ask for which outputs this is extremized. This question was posed in a
previous work, where it was conjectured on the basis of experimental data that
the entropy of the posterior is minimized and maximized by the constant strings
000... and 111... and the alternating strings 0101... and 1010...,
respectively. In the present work we confirm the minimization conjecture in the
asymptotic limit using results from hidden word statistics. We show how the
analytic-combinatorial methods of Flajolet, Szpankowski and Vallée for dealing
with the hidden pattern matching problem can be applied to resolve the case of
fixed output length and n tending to infinity, by obtaining estimates for the
entropy in terms of the moments of the posterior distribution and establishing
its minimization via a measure of autocorrelation.
Comment: 11 pages, 2 figures
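For small lengths the posterior in question can be computed exactly by brute force. The sketch below (illustrative only; the input length n = 8 and output length 4 are arbitrary choices, not values from the paper) uses the fact that, for a fixed output length, the i.i.d. deletion probabilities cancel, so the posterior weight of each candidate input is proportional to the number of embeddings of the output as a subsequence of that input.

```python
from itertools import product
from math import log2

def subsequence_count(x: str, y: str) -> int:
    """Number of ways y can be obtained from x by deletions, i.e. the number of
    embeddings of y as a subsequence of x."""
    dp = [0] * (len(y) + 1)   # dp[j] = embeddings of y[:j] seen so far
    dp[0] = 1
    for c in x:
        for j in range(len(y), 0, -1):
            if y[j - 1] == c:
                dp[j] += dp[j - 1]
    return dp[len(y)]

def posterior_entropy(y: str, n: int) -> float:
    """Shannon entropy (bits) of the posterior over uniformly random binary
    inputs of length n, given that a memoryless deletion channel produced
    output y. For fixed output length the i.i.d. deletion probabilities cancel,
    so the posterior weight of an input is its number of embeddings of y."""
    weights = [subsequence_count("".join(x), y) for x in product("01", repeat=n)]
    total = sum(weights)
    return -sum((w / total) * log2(w / total) for w in weights if w > 0)

n, m = 8, 4
for y in ("0" * m, "01" * (m // 2)):   # constant versus alternating output
    print(f"output {y}: posterior entropy = {posterior_entropy(y, n):.3f} bits "
          f"(uniform prior entropy = {n} bits)")
```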
Quantum Communication, Sensing and Measurement in Space
The main theme of the conclusions drawn for classical communication systems
operating at optical or higher frequencies is that there is a well-understood
performance gain in photon efficiency (bits/photon) and spectral efficiency
(bits/s/Hz) by pursuing coherent-state transmitters (classical ideal laser light)
coupled with novel quantum receiver systems operating near the Holevo limit (e.g.,
joint detection receivers). However, recent research indicates that these receivers
will require nonlinear and nonclassical optical processes and components at the
receiver. Consequently, the implementation complexity of Holevo-capacity-approaching
receivers is not yet fully ascertained. Nonetheless, because the
potential gain is significant (e.g., the projected photon efficiency and data rate of
MIT Lincoln Laboratory's Lunar Lasercom Demonstration (LLCD) could be achieved
with a factor-of-20 reduction in the modulation bandwidth requirement), focused
research activities on ground-receiver architectures that approach the Holevo limit
in space-communication links would be beneficial.
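The photon-efficiency versus spectral-efficiency trade-off behind the Holevo-limit discussion can be illustrated with the standard pure-loss bosonic channel formula, C = g(n) bits per mode with g(n) = (n+1) log2(n+1) - n log2 n, where n is the mean received photon number. The sketch below is illustrative only and is not taken from the report: it shows how bits per photon grow as the photon flux per mode shrinks.

```python
from math import log2

def g(x: float) -> float:
    """Von Neumann entropy of a thermal state with mean photon number x, in bits;
    for the pure-loss bosonic channel this is also the Holevo capacity in
    bits per mode at mean received photon number x."""
    if x <= 0.0:
        return 0.0
    return (x + 1.0) * log2(x + 1.0) - x * log2(x)

# Photon information efficiency (bits/photon) rises as the flux per mode drops,
# while spectral efficiency (bits/mode, hence bits/s/Hz) falls (illustrative values):
for n_bar in (10.0, 1.0, 0.1, 0.01):
    capacity = g(n_bar)
    print(f"mean received photons/mode = {n_bar:g}: "
          f"{capacity:.3f} bits/mode, {capacity / n_bar:.2f} bits/photon")
```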
The potential gains resulting from quantum-enhanced sensing systems in space
applications have not been laid out as concretely as some of the other areas
addressed in our study. In particular, while the study period has produced several
interesting high-risk and high-payoff avenues of research, more detailed
seedling-level investigations are required to fully delineate the potential return
relative to the state-of-the-art. Two prominent examples are (1) improvements to
pointing, acquisition and tracking systems (e.g., for optical communication
systems) by way of quantum measurements, and (2) possible weak-valued measurement
techniques to attain high-accuracy sensing systems for in situ or remote-sensing
instruments. While these concepts are technically sound and have very promising
bench-top demonstrations in a lab environment, they are not mature enough to
realistically evaluate their performance in a space-based application. Therefore,
it is recommended that future work follow small focused efforts towards
incorporating practical constraints imposed by a space environment.
The space platform has been well recognized as a nearly ideal environment for some
of the most precise tests of fundamental physics, and the ensuing potential of
scientific advances enabled by quantum technologies is evident in our report. For
example, an exciting concept that has emerged for gravity-wave detection is that
the intermediate frequency band spanning 0.01 to 10 Hz, which is inaccessible from
the ground, could be accessed at unprecedented sensitivity with a space-based
interferometer that uses shorter arms relative to the state-of-the-art to keep the
diffraction losses low, and employs frequency-dependent squeezed light to surpass
the standard quantum limit sensitivity. This offers the potential to open up a new
window into the universe, revealing the behavior of compact astrophysical objects
and pulsars. As another set of examples, research accomplishments in the atomic
and optics fields in recent years have ushered in a number of novel clocks and
sensors that can achieve unprecedented measurement precision. These emerging
technologies promise new possibilities in fundamental physics, examples of which
are tests of relativistic gravity theory, universality of free fall, frame-dragging
precession, the gravitational inverse-square law at micron scale, and new ways of
gravitational wave detection with atomic inertial sensors. While the relevant
technologies and their discovery potentials have been well demonstrated on the
ground, there exists a large gap to space-based systems. To bridge this gap and to
advance fundamental-physics exploration in space, focused investments that further
mature promising technologies, such as space-based atomic clocks and quantum
sensors based on atom-wave interferometers, are recommended.
Bringing a group of experts from diverse technical backgrounds together in a
productive interactive environment spurred some unanticipated innovative
concepts. One promising concept is the possibility of utilizing a space-based
interferometer as a frequency reference for terrestrial precision measurements.
Space-based gravitational wave detectors depend on extraordinarily low noise in
the separation between spacecraft, resulting in an ultra-stable frequency
reference that is several orders of magnitude better than the state of the art of
frequency references using terrestrial technology. The next steps in developing
this promising new concept are simulations and measurement of atmospheric effects
that may limit performance due to non-reciprocal phase fluctuations.
In summary, this report covers a broad spectrum of possible new opportunities in
space science, as well as enhancements in the performance of communication and
sensing technologies, based on observing, manipulating and exploiting the
quantum-mechanical nature of our universe. In our study we identified a range of
exciting new opportunities to capture the revolutionary capabilities resulting
from quantum enhancements. We believe that pursuing these opportunities has the
potential to positively impact the NASA mission in both the near term and the
long term. In this report we lay out the research and development paths that we
believe are necessary to realize these opportunities and capitalize on the gains
quantum technologies can offer.
An Analysis of Error Reconciliation Protocols for use in Quantum Key Distribution
Quantum Key Distribution (QKD) is a method for transmitting a cryptographic key between a sender and receiver in a theoretically unconditionally secure way. Unfortunately, the present state of technology prohibits the flawless quantum transmission required to make QKD a reality. For this reason, error reconciliation protocols have been developed which preserve security while allowing a sender and receiver to reconcile the errors in their respective keys. The most famous of these protocols is Brassard and Salvail's Cascade, which is effective, but suffers from a high communication complexity and therefore results in low throughput. Another popular option is Buttler's Winnow protocol, which reduces the communication complexity over Cascade, but has the added detriment of introducing errors, and has been shown to be less effective than Cascade. Finally, Gallager's Low Density Parity Check (LDPC) codes have recently been shown to reconcile errors at rates higher than those of Cascade and Winnow with a large reduction in communication, but with greater computational complexity. This research seeks to evaluate the effectiveness of these LDPC codes in a QKD setting, while comparing real-world parameters such as runtime, throughput and communication complexity empirically with the well-known Cascade and Winnow algorithms. Additionally, the effects of inaccurate error estimation, non-uniform error distribution and varying key length on all three protocols are evaluated for identical input key strings. Analyses are performed on the results in order to characterize the performance of all three protocols and determine the strengths and weaknesses of each.
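To illustrate where Cascade's communication complexity comes from, here is a minimal sketch (not the implementation evaluated in this research) of the BINARY subroutine at Cascade's core: once two blocks are known to have mismatched parity, a binary search over sub-block parities locates one differing bit, costing about log2 of the block length in round-trip parity exchanges; Cascade repeats this over many blocks and passes.

```python
def parity(bits):
    return sum(bits) % 2

def binary_locate_error(alice, bob):
    """One Cascade-style BINARY step: given two equal-length blocks whose
    parities differ, repeatedly halve the search interval, comparing parities
    of the left halves, until a single differing position is found.
    Returns (index, number_of_parity_exchanges)."""
    lo, hi = 0, len(alice)          # search within [lo, hi)
    exchanges = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        exchanges += 1              # Alice sends one parity bit, Bob compares
        if parity(alice[lo:mid]) != parity(bob[lo:mid]):
            hi = mid                # the error lies in the left half
        else:
            lo = mid                # the error lies in the right half
    return lo, exchanges

# Illustrative run: a single flipped bit in a 16-bit block (toy values).
alice = [0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1]
bob = alice.copy()
bob[9] ^= 1
print(binary_locate_error(alice, bob))   # -> (9, 4): log2(16) = 4 parity exchanges
```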
Cybersecurity and Quantum Computing: friends or foes?
The abstract is in the attachment.