Security of two-way quantum key distribution
Quantum key distribution protocols typically make use of a one-way quantum
channel to distribute a shared secret string to two distant users. However,
protocols exploiting a two-way quantum channel have been proposed as an
alternative route to the same goal, with the potential advantage of
outperforming one-way protocols. Here we provide a strategy to prove security
for two-way quantum key distribution protocols against the most general quantum
attack possible by an eavesdropper. We utilize an entropic uncertainty
relation, and only a few assumptions need to be made about the devices used in
the protocol. We also show that a two-way protocol can outperform comparable
one-way protocols.
Comment: 10 pages, 5 figures
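The entropic-uncertainty approach mentioned above is usually based on a relation of the following form (smooth-entropy notation assumed here for illustration; the paper's exact statement may differ):

\[
H_{\min}^{\varepsilon}(X \mid E) + H_{\max}^{\varepsilon}(Z \mid B) \;\ge\; \log_2 \frac{1}{c},
\qquad
c = \max_{x,z} \bigl|\langle \phi_x \mid \psi_z \rangle\bigr|^2 ,
\]

where \(X\) and \(Z\) are outcomes of two incompatible measurements, \(B\) is the honest receiver's system, and \(E\) is the eavesdropper's. Bounding \(H_{\max}^{\varepsilon}(Z \mid B)\) from observed correlations then lower-bounds the eavesdropper's uncertainty \(H_{\min}^{\varepsilon}(X \mid E)\), from which a secret-key rate follows.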
Assumptions in Quantum Cryptography
Quantum cryptography uses techniques and ideas from physics and computer
science. The combination of these ideas makes the security proofs of quantum
cryptography a complicated task. To prove that a quantum-cryptography protocol
is secure, assumptions are made about the protocol and its devices. If these
assumptions are not justified in an implementation then an eavesdropper may
break the security of the protocol. Therefore, security is crucially dependent
on which assumptions are made and how justified the assumptions are in an
implementation of the protocol.
This thesis is primarily a review that analyzes and clarifies the connection
between the security proofs of quantum-cryptography protocols and their
experimental implementations. In particular, we focus on quantum key
distribution: the task of distributing a secret random key between two parties.
We provide a comprehensive introduction to several concepts: quantum mechanics
using the density operator formalism, quantum cryptography, and quantum key
distribution. We define security for quantum key distribution and outline
several mathematical techniques that can either be used to prove security or
simplify security proofs. In addition, we analyze the assumptions made in
quantum cryptography and how they may or may not be justified in
implementations.
Along with the review, we propose a framework that decomposes
quantum-key-distribution protocols and their assumptions into several classes.
Protocol classes can be used to clarify which proof techniques apply to which
kinds of protocols. Assumption classes can be used to specify which assumptions
are justified in implementations and which could be exploited by an
eavesdropper. Two contributions of the author are discussed: the security
proofs of two two-way quantum-key-distribution protocols and an intuitive proof
of the data-processing inequality.
Comment: PhD Thesis, 221 pages
The origin of the E+ transition in GaAsN alloys
Optical properties of the GaAsN system with nitrogen concentrations in the range
of 0.9-3.7% are studied by the full-potential LAPW method in a supercell approach.
The E+ transition is identified by calculating the imaginary part of the
dielectric function. The evolution of the energy of this transition with
nitrogen concentration is studied and the origin of this transition is
identified by analyzing the contributions to the dielectric function from
different band combinations. The L_1c-derived states are shown to play an
important role in the formation of the E+ transition, which was also suggested
by recent experiments. At the same time, the nitrogen-induced modifications of
the first conduction band of the host compound are also found to contribute
significantly to the E+ transition. Further, the study of several model
supercells demonstrated the significant influence of the nitrogen potential on
the optical properties of the GaAsN system.
Comment: 5 pages, 3 figures
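The quantity underlying this analysis is the imaginary part of the dielectric function; in the standard independent-particle picture (notation assumed here for illustration, not taken from the paper) it reads

\[
\varepsilon_2(\omega) \;\propto\; \frac{1}{\omega^2}
\sum_{c,v} \int_{\mathrm{BZ}}
\bigl|\langle c\,\mathbf{k} \mid \hat{\mathbf{e}} \cdot \mathbf{p} \mid v\,\mathbf{k} \rangle\bigr|^2\,
\delta\!\left(E_{c\mathbf{k}} - E_{v\mathbf{k}} - \hbar\omega\right)\,\mathrm{d}^3k ,
\]

so restricting the sum to particular band pairs \((c,v)\) isolates their contribution to a given transition such as \(E_+\).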
Entanglement verification with realistic measurement devices via squashing operations
Many protocols and experiments in quantum information science are described
in terms of simple measurements on qubits. However, in a real implementation,
the exact description is more difficult, and more complicated observables are
used. The question arises whether a claim of entanglement in the simplified
description still holds, if the difference between the realistic and simplified
models is taken into account. We show that a positive entanglement statement
remains valid if a certain positive linear map connecting the two
descriptions--a so-called squashing operation--exists; then lower bounds on the
amount of entanglement are also possible. We apply our results to polarization
measurements of photons using only threshold detectors, and derive procedures
under which multi-photon events can be neglected.
Comment: 12 pages, 2 figures
Squashing Models for Optical Measurements in Quantum Communication
Measurements with photodetectors necessarily need to be described in the
infinite dimensional Fock space of one or several modes. For some measurements
a model has been postulated which describes the full mode measurement as a
composition of a mapping (squashing) of the signal into a small dimensional
Hilbert space followed by a specified target measurement. We present a
formalism to investigate whether a given measurement pair of mode and target
measurements can be connected by a squashing model. We show that the
measurements used in the BB84 protocol do allow a squashing description,
whereas the six-state protocol does not. As a result, security proofs for the
BB84 protocol can be based on the assumption that the eavesdropper forwards at
most one photon, while the same does not hold for the six-state protocol.
Comment: 4 pages, 2 figures. Fixed a typographical error. Replaced the
six-state protocol counter-example. Conclusions of the paper are unchanged
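In the Heisenberg picture, a squashing map \(\Lambda\) must reproduce the full-mode statistics, \(\Lambda^{\dagger}(G_i) = F_i\) for each target POVM element \(G_i\) and mode element \(F_i\), while being completely positive. A minimal numerical sketch of this check (the identity-channel toy example and all names are illustrative assumptions, not the paper's construction):

```python
import numpy as np

def choi(channel, d_in):
    """Unnormalized Choi matrix J = sum_{k,l} |k><l| (x) channel(|k><l|)."""
    d_out = channel(np.zeros((d_in, d_in), dtype=complex)).shape[0]
    J = np.zeros((d_in * d_out, d_in * d_out), dtype=complex)
    for k in range(d_in):
        for l in range(d_in):
            E = np.zeros((d_in, d_in), dtype=complex)
            E[k, l] = 1.0
            J += np.kron(E, channel(E))
    return J

def adjoint_on(J, G, d_in, d_out):
    """Heisenberg-picture action Lambda^dagger(G) recovered from the Choi matrix:
    Lambda^dagger(G) = (Tr_out[J (I (x) G)])^T."""
    JG = J @ np.kron(np.eye(d_in), G)
    X = JG.reshape(d_in, d_out, d_in, d_out)
    return np.einsum('ajbj->ab', X).T  # partial trace over the output factor

# Toy check: "mode" space is itself a qubit, the squashing map is the
# identity channel, and the target POVM consists of computational-basis
# projectors (all assumptions chosen for illustration).
identity_channel = lambda rho: rho
d = 2
J = choi(identity_channel, d)
G0, G1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
F0, F1 = G0, G1  # the full-mode POVM the map must reproduce

cp_ok = bool(np.all(np.linalg.eigvalsh(J) > -1e-12))   # complete positivity
povm_ok = (np.allclose(adjoint_on(J, G0, d, d), F0)
           and np.allclose(adjoint_on(J, G1, d, d), F1))
```

For a genuine mode measurement the input dimension would be larger than the output qubit, but the same two conditions (positive Choi matrix, POVM reproduction) constitute the test described in the abstract.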
An intuitive proof of the data processing inequality
The data processing inequality (DPI) is a fundamental feature of information
theory. Informally, it states that one cannot increase the information content
of a quantum system by acting on it with a local physical operation. When the
smooth min-entropy is used as the relevant information measure, then the DPI
follows immediately from the definition of the entropy. The DPI for the von
Neumann entropy is then obtained by specializing the DPI for the smooth
min-entropy by using the quantum asymptotic equipartition property (QAEP). We
provide a new, simplified proof of the QAEP and therefore obtain a
self-contained proof of the DPI for the von Neumann entropy.
Comment: 10 pages
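In smooth-entropy form, the statement sketched above can be written (standard notation assumed here) as: for any CPTP map \(\mathcal{E}\) acting on \(B\),

\[
H_{\min}^{\varepsilon}(A \mid B)_{\rho} \;\le\; H_{\min}^{\varepsilon}(A \mid B')_{(\mathcal{I}_A \otimes \mathcal{E})(\rho)},
\]

i.e. local processing of \(B\) can only increase the uncertainty about \(A\). Combining this with the QAEP,

\[
\lim_{n \to \infty} \frac{1}{n}\, H_{\min}^{\varepsilon}\!\left(A^n \mid B^n\right)_{\rho^{\otimes n}} = H(A \mid B)_{\rho}
\qquad (\varepsilon \to 0),
\]

yields the von Neumann version \(H(A \mid B)_{\rho} \le H(A \mid B')\).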
Motor Vehicle Accidents: The Most Common Cause of Traumatic Vertebrobasilar Ischemia
Background: Recent media exposure of strokes from chiropractic manipulation has focused attention on traumatic vertebrobasilar ischemia. However, chiropractic manipulation, while the easiest cause to recognize, is probably not the most common cause of this condition.
Methods: We reviewed all consecutive cases of traumatic vertebrobasilar ischemia referred to a single neurovascular practice over 20 years, from the office files and hospital records.
Results: There were 80 patients whose vertebrobasilar ischemia was attributed to neck trauma. Five were diagnosed as due to chiropractic manipulation, but the commonest attributed cause was motor vehicle accidents (MVAs), which accounted for 70 cases; one was a sports injury, and five were industrial accidents. In some cases neck pain from an MVA led to chiropractic manipulation, so the cause may have been compounded. In most vehicular cases the diagnosis had been missed, even denied, by the neurologists and neurosurgeons initially involved. The longest delay between the injury and the onset of delayed symptoms was five years.
Conclusions: Traumatic vertebrobasilar ischemia is most often due to MVAs; the diagnosis is often missed, in part because of the delay between injury and onset of symptoms and, in part, we hypothesize, because of doctors' reluctance to be involved in medicolegal cases.
Correcting for differential recruitment in respondent-driven sampling data using ego-network information
Respondent-driven sampling (RDS) is a sampling method devised to overcome challenges with sampling hard-to-reach human populations. The sampling starts with a limited number of individuals who are asked to recruit a small number of their contacts. Every surveyed individual is subsequently given the same opportunity to recruit additional members of the target population until a pre-established sample size is achieved. The recruitment process consequently implies that the survey respondents are responsible for deciding who enters the study. Most RDS prevalence estimators assume that participants select among their contacts completely at random. The main objective of this work is to correct the inference for departures from this assumption, such as systematic recruitment based on the characteristics of the individuals or on the nature of relationships. To accomplish this, we introduce three forms of non-random recruitment, provide estimators for these recruitment behaviors, and extend three estimators and their associated variance procedures. The proposed methodology is assessed through a simulation study capturing various sampling and network features. Finally, the proposed methods are applied to a public health setting.
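A common baseline that such corrections adjust is the inverse-degree weighted (Volz-Heckathorn-type) RDS prevalence estimator; a minimal sketch with invented toy data (the estimator choice, the function name, and the numbers are illustrative assumptions, not taken from this work):

```python
import numpy as np

def rds_vh_estimate(outcomes, degrees):
    """Inverse-degree weighted prevalence estimate:
    p_hat = sum(y_i / d_i) / sum(1 / d_i).
    Down-weights high-degree respondents, who are over-sampled
    under the random-recruitment assumption."""
    y = np.asarray(outcomes, dtype=float)
    w = 1.0 / np.asarray(degrees, dtype=float)  # weight = reciprocal network degree
    return float(np.sum(w * y) / np.sum(w))

# Toy sample: binary trait y_i and self-reported degree d_i per respondent.
y = [1, 0, 1, 1, 0, 0]
d = [10, 2, 5, 20, 4, 2]
p_hat = rds_vh_estimate(y, d)  # = 0.35 / 1.6 = 0.21875
```

The non-random-recruitment corrections described above replace the uniform-recruitment assumption baked into these weights with recruitment-behavior-specific ones.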