235 research outputs found
Protecting Quantum Information with Entanglement and Noisy Optical Modes
We incorporate active and passive quantum error-correcting techniques to
protect a set of optical information modes of a continuous-variable quantum
information system. Our method uses ancilla modes, entangled modes, and gauge
modes (modes in a mixed state) to help correct errors on a set of information
modes. A linear-optical encoding circuit consisting of offline squeezers,
passive optical devices, feedforward control, conditional modulation, and
homodyne measurements performs the encoding. The result is that we extend the
entanglement-assisted operator stabilizer formalism for discrete variables to
continuous-variable quantum information processing. Comment: 7 pages, 1 figure
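The passive optical devices in such an encoding circuit act on the mode quadratures as symplectic transformations, which is exactly the property that preserves the canonical commutation relations. A minimal numerical sketch (illustrative only, not the paper's encoding circuit) checks this for a two-mode beamsplitter:

```python
import numpy as np

# Quadrature ordering (x1, p1, x2, p2); Omega encodes the [x, p] commutators.
Omega = np.array([[0, 1, 0, 0],
                  [-1, 0, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, -1, 0]])

def beamsplitter(theta):
    """Two-mode beamsplitter of mixing angle theta, acting on quadratures."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s, 0],
                     [0, c, 0, s],
                     [-s, 0, c, 0],
                     [0, -s, 0, c]])

S = beamsplitter(np.pi / 4)  # 50:50 beamsplitter
# Symplectic condition: S^T Omega S = Omega, so commutators are preserved.
print(np.allclose(S.T @ Omega @ S, Omega))  # True
```

Offline squeezers are also symplectic (diagonal scalings of x and p), so the whole linear-optical circuit stays within this group.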
Quantum Convolutional Coding with Shared Entanglement: General Structure
We present a general theory of entanglement-assisted quantum convolutional
coding. The codes have a convolutional or memory structure, they assume that
the sender and receiver share noiseless entanglement prior to quantum
communication, and they are not restricted to possess the
Calderbank-Shor-Steane structure as in previous work. We provide two
significant advances for quantum convolutional coding theory. We first show how
to "expand" a given set of quantum convolutional generators. This expansion
step acts as a preprocessor for a polynomial symplectic Gram-Schmidt
orthogonalization procedure that simplifies the commutation relations of the
expanded generators to be the same as those of entangled Bell states (ebits)
and ancilla qubits. The above two steps produce a set of generators with
equivalent error-correcting properties to those of the original generators. We
then demonstrate how to perform online encoding and decoding for a stream of
information qubits, halves of ebits, and ancilla qubits. The upshot of our
theory is that the quantum code designer can engineer quantum convolutional
codes with desirable error-correcting properties without having to worry about
the commutation relations of these generators. Comment: 23 pages, replaced with final published version
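The commutation relations that the symplectic Gram-Schmidt step manipulates come from the standard binary representation of Pauli generators: each n-qubit Pauli is a length-2n binary vector (x-part | z-part), and two Paulis commute exactly when their symplectic product vanishes mod 2. A small sketch of that check (the paper's procedure works over binary polynomials, which this toy version does not attempt):

```python
import numpy as np

def symplectic_product(g, h, n):
    """Return 0 if two n-qubit Paulis commute, 1 if they anticommute.
    g and h are length-2n binary vectors in (x-part | z-part) form."""
    return int(g[:n] @ h[n:] + g[n:] @ h[:n]) % 2

# Stabilizers of a Bell state (one ebit): X X and Z Z commute...
XX = np.array([1, 1, 0, 0])
ZZ = np.array([0, 0, 1, 1])
# ...while X I and Z I form the canonical anticommuting pair.
XI = np.array([1, 0, 0, 0])
ZI = np.array([0, 0, 1, 0])

print(symplectic_product(XX, ZZ, 2))  # 0: commute
print(symplectic_product(XI, ZI, 2))  # 1: anticommute
```

Bringing an arbitrary generator set into pairs with these two patterns is what reduces the commutation relations to those of ebits and ancilla qubits.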
Addressing the clumsiness loophole in a Leggett-Garg test of macrorealism
The rise of quantum information theory has lent new relevance to experimental
tests for non-classicality, particularly in controversial cases such as
adiabatic quantum computing with superconducting circuits. The Leggett-Garg
inequality is a "Bell inequality in time" designed to indicate whether a single
quantum system behaves in a macrorealistic fashion. Unfortunately, a violation
of the inequality can only show that the system is either (i)
non-macrorealistic or (ii) macrorealistic but subjected to a measurement
technique that happens to disturb the system. The "clumsiness" loophole (ii)
provides reliable refuge for the stubborn macrorealist, who can invoke it to
brand recent experimental and theoretical work on the Leggett-Garg test
inconclusive. Here, we present a revised Leggett-Garg protocol that permits one
to conclude that a system is either (i) non-macrorealistic or (ii)
macrorealistic but with the property that two seemingly non-invasive
measurements can somehow collude and strongly disturb the system. By providing
an explicit check of the invasiveness of the measurements, the protocol
replaces the clumsiness loophole with a significantly smaller "collusion"
loophole. Comment: 7 pages, 3 figures
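For a concrete sense of what is being tested, the standard Leggett-Garg combination of two-time correlators, $K = C_{12} + C_{23} - C_{13}$, obeys $K \le 1$ under macrorealism, while a qubit under unitary precession reaches $3/2$. A sketch under the textbook assumption $C(t_a, t_b) = \cos\omega(t_b - t_a)$ for projective $\sigma_z$ measurements (not the revised protocol of this paper, which adds explicit invasiveness checks):

```python
import numpy as np

def lg_correlator(omega, ta, tb):
    """Two-time correlator <Q(ta) Q(tb)> for a qubit precessing at
    frequency omega, measured projectively in the Q = sigma_z basis."""
    return np.cos(omega * (tb - ta))

def K(omega, t1, t2, t3):
    """Leggett-Garg combination; macrorealism demands K <= 1."""
    C = lambda ta, tb: lg_correlator(omega, ta, tb)
    return C(t1, t2) + C(t2, t3) - C(t1, t3)

tau = np.pi / 3  # equal spacings with omega * tau = pi/3
print(K(1.0, 0.0, tau, 2 * tau))  # 1.5, the maximal qubit violation
```

The clumsiness loophole is that a macrorealist can attribute any measured K > 1 to disturbance by the intermediate measurement rather than to non-macrorealism.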
Applicability of LCA tool for building materials produced from construction and demolition waste : case of Tanzania
It is estimated that about 10 million tonnes of construction and demolition (C&D) waste are generated annually in Tanzania. This amount is expected to grow further with population growth, urbanization, industrialization and commercialization, which also drive greater utilization of natural resources. The stock of material resources (raw materials) is decreasing worldwide, and excessive material extraction puts pressure on natural resources, including the ecosystems that depend on them for survival. This justifies research into appropriate technologies for the recovery (reusing, recycling and upcycling) of C&D waste, which would alleviate the excessive extraction and utilization of natural resources. The research aims at solutions for using C&D waste to produce building materials, for example concrete blocks, a commonly used building material in Tanzania. To ensure the sustainability of such building materials, their environmental, social and economic parameters have to be assessed using a Life Cycle Assessment (LCA) tool. Life cycle assessment is a technique for assessing the environmental impacts associated with a product over its life cycle. Most LCA tools currently in use, such as SimaPro (Netherlands), INVEST (United Kingdom) and BEES (United States), were developed in developed countries; their direct applicability is limited because they were built for different environmental conditions and economic circumstances. This paper therefore discusses the identification of an appropriate LCA tool that can be applied to determine the extent to which building products made from construction and demolition waste are sustainable in an ecological sense.
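Whatever tool is chosen, the core arithmetic of an LCA impact calculation is an inventory-times-impact-factor sum over life-cycle stages. A toy sketch with entirely hypothetical impact factors (placeholders, not Tanzanian data) shows the shape of the calculation:

```python
# Hypothetical life-cycle inventory for one recycled-aggregate concrete
# block; all kg-CO2e factors below are illustrative placeholders only.
impact_factors = {            # kg CO2e per unit of activity
    "crushing C&D waste (per kg aggregate)": 0.002,
    "transport (per t-km)": 0.10,
    "cement (per kg)": 0.90,
    "block moulding energy (per block)": 0.05,
}
inventory = {                 # activity amounts for one block
    "crushing C&D waste (per kg aggregate)": 10.0,
    "transport (per t-km)": 0.5,
    "cement (per kg)": 2.0,
    "block moulding energy (per block)": 1.0,
}

# Total impact = sum over stages of (amount x factor).
total = sum(impact_factors[k] * inventory[k] for k in inventory)
print(f"{total:.3f} kg CO2e per block")  # 1.920 kg CO2e per block
```

A locally applicable tool is essentially this sum backed by impact factors that reflect Tanzanian energy mixes, transport distances and production practices.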
Precise evaluation of leaked information with universal$_2$ privacy amplification in the presence of quantum attacker
We treat secret key extraction when the eavesdropper has correlated quantum
states. We propose quantum privacy amplification theorems different from
Renner's, which are based on the quantum conditional Rényi entropy of order
$1+s$. Using those theorems, we derive an exponentially decreasing rate for the
leaked information and the asymptotic equivocation rate, which had not been
derived hitherto in the quantum setting.
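Privacy amplification of this kind is implemented by applying a universal$_2$ hash family to the raw key; the random Toeplitz-matrix family is a standard choice. An illustrative sketch of that construction (the paper's contribution is the security analysis, not this particular hash):

```python
import numpy as np

def toeplitz_hash(raw_bits, m, seed_bits):
    """Compress n raw key bits to m bits with a random Toeplitz matrix,
    a universal_2 family; seed_bits must have length n + m - 1."""
    n = len(raw_bits)
    assert len(seed_bits) == n + m - 1
    # T[i, j] depends only on i - j: the defining Toeplitz property,
    # which is what makes the seed short (n + m - 1 bits, not n * m).
    T = np.array([[seed_bits[i - j + n - 1] for j in range(n)]
                  for i in range(m)], dtype=np.uint8)
    return T @ raw_bits % 2

rng = np.random.default_rng(0)
raw = rng.integers(0, 2, 64, dtype=np.uint8)        # partially secret raw key
seed = rng.integers(0, 2, 64 + 16 - 1, dtype=np.uint8)  # public random seed
key = toeplitz_hash(raw, 16, seed)
print(len(key))  # 16 output bits, nearly uniform given bounded leakage
```

Results like the exponential decrease of leaked information quantify how fast the output of such a hash approaches uniformity as the compression rate is reduced.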
Anisotropic flow of charged hadrons, pions and (anti-)protons measured at high transverse momentum in Pb-Pb collisions at $\sqrt{s_{\rm NN}}=2.76$ TeV
The elliptic, $v_2$, triangular, $v_3$, and quadrangular, $v_4$, azimuthal
anisotropic flow coefficients are measured for unidentified charged particles,
pions and (anti-)protons in Pb-Pb collisions at $\sqrt{s_{\rm NN}}=2.76$ TeV
with the ALICE detector at the Large Hadron Collider. Results obtained with the
event plane and four-particle cumulant methods are reported for the
pseudo-rapidity range $|\eta|<0.8$ at different collision centralities and as a
function of transverse momentum, $p_{\rm T}$, out to $p_{\rm T}=20$ GeV/$c$.
The observed non-zero elliptic and triangular flow depends only weakly on
transverse momentum for $p_{\rm T}>8$ GeV/$c$. The small $p_{\rm T}$ dependence
of the difference between elliptic flow results obtained from the event plane
and four-particle cumulant methods suggests a common origin of flow
fluctuations up to $p_{\rm T}=8$ GeV/$c$. The magnitude of the (anti-)proton
elliptic and triangular flow is larger than that of pions out to at least
$p_{\rm T}=8$ GeV/$c$, indicating that the particle type dependence persists out
to high $p_{\rm T}$. Comment: 16 pages, 5 captioned figures, authors from page 11, published
version, figures at http://aliceinfo.cern.ch/ArtSubmission/node/186
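The event-plane method named in the abstract estimates $v_2$ as $\langle\cos 2(\varphi - \Psi_2)\rangle$, with $\Psi_2$ reconstructed from the second-harmonic Q-vector of the event. A toy Monte Carlo sketch on one artificially high-multiplicity event (ignoring the event-plane resolution correction and sub-event splitting a real analysis applies):

```python
import numpy as np

rng = np.random.default_rng(1)
v2_true, psi_true = 0.10, 0.3

# Sample azimuthal angles from dN/dphi ~ 1 + 2 v2 cos(2(phi - psi))
# by accept-reject against the distribution's maximum.
phi = rng.uniform(0, 2 * np.pi, 200_000)
pdf = 1 + 2 * v2_true * np.cos(2 * (phi - psi_true))
phi = phi[rng.uniform(0, 1 + 2 * v2_true, phi.size) < pdf]

# Second-harmonic Q-vector gives the event-plane angle Psi_2...
psi2 = np.angle(np.exp(2j * phi).sum()) / 2
# ...and v2 is the mean cosine of 2(phi - Psi_2).
v2_obs = np.cos(2 * (phi - psi2)).mean()
print(round(v2_obs, 3))  # close to the input v2 = 0.10
```

The four-particle cumulant method mentioned alongside it suppresses non-flow correlations differently, which is why comparing the two probes the flow fluctuations discussed in the abstract.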
Centrality dependence of charged particle production at large transverse momentum in Pb-Pb collisions at $\sqrt{s_{\rm NN}}=2.76$ TeV
The inclusive transverse momentum ($p_{\rm T}$) distributions of primary
charged particles are measured in the pseudo-rapidity range $|\eta|<0.8$ as a
function of event centrality in Pb-Pb collisions at $\sqrt{s_{\rm NN}}=2.76$
TeV with ALICE at the LHC. The data are presented in the range
$0.15<p_{\rm T}<50$ GeV/$c$ for nine centrality intervals from 70-80% to 0-5%.
The Pb-Pb spectra are presented in terms of the nuclear modification factor
$R_{\rm AA}$ using a pp reference spectrum measured at the same collision
energy. We observe that the suppression of high-$p_{\rm T}$ particles strongly
depends on event centrality. In central collisions (0-5%) the yield is most
suppressed with $R_{\rm AA}\approx 0.13$ at $p_{\rm T}=6$-7 GeV/$c$. Above
$p_{\rm T}=7$ GeV/$c$, there is a significant rise in the nuclear modification
factor, which reaches $R_{\rm AA}\approx 0.4$ for $p_{\rm T}>30$ GeV/$c$. In
peripheral collisions (70-80%), the suppression is weaker with $R_{\rm AA}\approx 0.7$,
almost independently of $p_{\rm T}$. The measured nuclear
modification factors are compared to other measurements and model calculations. Comment: 17 pages, 4 captioned figures, 2 tables, authors from page 12,
published version, figures at
http://aliceinfo.cern.ch/ArtSubmission/node/284
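Per $p_{\rm T}$ bin, the nuclear modification factor is the Pb-Pb yield divided by the binary-collision-scaled pp reference. A minimal sketch with made-up yields (the $\langle N_{\rm coll}\rangle$ value below is an illustrative placeholder, not ALICE's):

```python
import numpy as np

def nuclear_modification(yield_pbpb, yield_pp, n_coll):
    """R_AA per pT bin: Pb-Pb per-event yield over the N_coll-scaled
    pp reference. R_AA = 1 means no medium effect; R_AA < 1 means
    suppression of high-pT particle production."""
    return yield_pbpb / (n_coll * yield_pp)

# Hypothetical per-event yields in three pT bins (arbitrary units).
pp = np.array([1.0e-2, 1.0e-3, 1.0e-4])
pbpb = np.array([2.0, 0.15, 0.03])
raa = nuclear_modification(pbpb, pp, n_coll=1500.0)  # central-like scaling
print(raa)  # suppression strongest in the middle bin of this toy input
```

The centrality dependence reported in the abstract enters through both the measured Pb-Pb yields and the centrality-dependent $\langle N_{\rm coll}\rangle$ in the denominator.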
Measurement of charm production at central rapidity in proton-proton collisions at $\sqrt{s}=2.76$ TeV
The $p_{\rm T}$-differential production cross sections of the prompt (B
feed-down subtracted) charmed mesons $D^0$, $D^+$, and $D^{*+}$ in the rapidity
range $|y|<0.5$, and for transverse momentum $1<p_{\rm T}<12$ GeV/$c$, were
measured in proton-proton collisions at $\sqrt{s}=2.76$ TeV with the ALICE
detector at the Large Hadron Collider. The analysis exploited the hadronic
decays $D^0\to K^-\pi^+$, $D^+\to K^-\pi^+\pi^+$, $D^{*+}\to D^0\pi^+$, and
their charge conjugates, and was performed on a $L_{\rm int}=1.1$ nb$^{-1}$
event sample collected in 2011 with a
minimum-bias trigger. The total charm production cross section at
$\sqrt{s}=2.76$ TeV and at 7 TeV was evaluated by extrapolating to the full
phase space the $p_{\rm T}$-differential production cross sections at
$\sqrt{s}=2.76$ TeV and our previous measurements at $\sqrt{s}=7$ TeV. The
results were compared to existing measurements and to perturbative-QCD
calculations. The fraction of $c\bar{d}$ D mesons produced in a vector state
was also determined. Comment: 20 pages, 5 captioned figures, 4 tables, authors from page 15,
published version, figures at
http://aliceinfo.cern.ch/ArtSubmission/node/307
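Schematically, each $p_{\rm T}$-differential cross-section point in an analysis of this kind is the corrected raw yield divided by efficiency, branching ratio, bin width and integrated luminosity. A sketch of that arithmetic with hypothetical inputs (the counts and efficiency below are placeholders, not ALICE's corrections, which also include acceptance and the prompt fraction):

```python
def dsigma_dpt(n_raw, efficiency, branching_ratio, lumi_nb, dpt_gev):
    """pT-differential cross section, in nb per GeV/c, for one bin:
    raw signal counts corrected for selection efficiency and decay
    branching ratio, normalised to luminosity and bin width."""
    return n_raw / (efficiency * branching_ratio * lumi_nb * dpt_gev)

# Hypothetical D0 -> K- pi+ bin: 50 signal counts, 10% efficiency,
# BR = 0.0395, a 1.1 nb^-1 sample, and a 1 GeV/c wide bin.
x = dsigma_dpt(50, 0.10, 0.0395, 1.1, 1.0)
print(f"{x:.0f} nb per GeV/c in this toy bin")
```

Summing such bins over $p_{\rm T}$ and extrapolating to the unmeasured phase space is what yields the total charm cross section quoted in the abstract.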