Extra Shared Entanglement Reduces Memory Demand in Quantum Convolutional Coding
We show how extra entanglement shared between sender and receiver reduces the
memory requirements for a general entanglement-assisted quantum convolutional
code. We construct quantum convolutional codes with good error-correcting
properties by exploiting the error-correcting properties of an arbitrary basic
set of Pauli generators. The main benefit of this particular construction is
that there is no need to increase the frame size of the code when extra shared
entanglement is available. Consequently, there is no need to increase the memory
requirements or circuit complexity of the code, because the frame size is
directly related to these two properties. Another benefit, similar
to results of previous work in entanglement-assisted convolutional coding, is
that we can import an arbitrary classical quaternary code for use as an
entanglement-assisted quantum convolutional code. The rate and error-correcting
properties of the imported classical code translate to the quantum code. We
provide an example that illustrates how to import a classical quaternary code
for use as an entanglement-assisted quantum convolutional code. We finally show
how to "piggyback" classical information to make use of the extra shared
entanglement in the code.
Comment: 7 pages, 1 figure, accepted for publication in Physical Review
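As background for the import construction mentioned above, the translation from a quaternary code to Pauli sequences rests on the standard correspondence between GF(4) = \{0, 1, \omega, \bar{\omega}\} and the single-qubit Pauli operators (one common convention; the paper's own convention may differ):

  0 \leftrightarrow I, \quad 1 \leftrightarrow X, \quad \omega \leftrightarrow Z, \quad \bar{\omega} \leftrightarrow Y.

Under this map, GF(4)-linearity of the classical code carries over to the resulting Pauli sequences, and the trace inner product between quaternary codewords tracks whether the corresponding Pauli operators commute; the role of the shared entanglement is precisely to lift the usual self-orthogonality requirement on the imported code.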
Joint source-channel coding for a quantum multiple access channel
Suppose that two senders each obtain one share of the output of a classical,
bivariate, correlated information source. They would like to transmit the
correlated source to a receiver using a quantum multiple access channel. In
prior work, Cover, El Gamal, and Salehi provided a combined source-channel
coding strategy for a classical multiple access channel which outperforms the
simpler "separation" strategy where separate codebooks are used for the source
coding and the channel coding tasks. In the present paper, we prove that a
coding strategy similar to the Cover-El Gamal-Salehi strategy and a
corresponding quantum simultaneous decoder allow for the reliable transmission
of a source over a quantum multiple access channel, as long as a set of
information inequalities involving the Holevo quantity hold.
Comment: 21 pages, v2: minor changes, accepted into Journal of Physics
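As a rough guide to the form of these conditions (this mirrors the classical Cover-El Gamal-Salehi conditions, with the quantum conditions stated in terms of Holevo information quantities; the exact statement and notation are in the paper), sufficient conditions for reliable transmission read

  H(U|V) \le I(X;B|Y,V)_\rho, \quad H(V|U) \le I(Y;B|X,U)_\rho, \quad H(U,V) \le I(XY;B)_\rho,

where U and V are the two shares of the classical source, X and Y the corresponding channel inputs, B the receiver's output quantum system, and the right-hand sides are Holevo (quantum mutual) information quantities evaluated on the code state \rho.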
Quantum discord and classical correlation can tighten the uncertainty principle in the presence of quantum memory
Uncertainty relations capture the essence of the inevitable randomness
associated with the outcomes of two incompatible quantum measurements.
Recently, Berta et al. have shown that the lower bound on the uncertainties of
the measurement outcomes depends on the correlations between the observed
system and an observer who possesses a quantum memory. If the system is
maximally entangled with its memory, the outcomes of two incompatible
measurements made on the system can be predicted precisely. Here, we obtain a
new uncertainty relation that tightens the lower bound of Berta et al., by
incorporating an additional term that depends on the quantum discord and the
classical correlations of the joint state of the observed system and the
quantum memory. We discuss several examples of states for which our new lower
bound is tighter than the bound of Berta et al. On the application side, we
discuss the relevance of our new inequality for the security of quantum key
distribution and show that it can be used to provide bounds on the distillable
common randomness and the entanglement of formation of bipartite quantum
states.
Comment: v1: LaTeX, 4 and a half pages, one figure; v2: 9 pages including 4-page appendix; v3: accepted into Physical Review A with minor changes
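For reference, the bound of Berta et al. that is being tightened reads, in standard notation,

  H(R|B) + H(S|B) \ge \log_2 \frac{1}{c} + H(A|B),

where R and S are the two incompatible observables measured on system A, H(\cdot|B) denotes the conditional von Neumann entropy with respect to the quantum memory B, and c = \max_{r,s} |\langle \psi_r | \phi_s \rangle|^2 is the maximal overlap between their eigenbases. The result described above adds to the right-hand side a non-negative correction built from the quantum discord and the classical correlations of the joint state of A and B, which can only raise, and hence tighten, the lower bound.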
Virtual Data in CMS Analysis
The use of virtual data for enhancing the collaboration between large groups
of scientists is explored in several ways:
- by defining ``virtual'' parameter spaces which can be searched and shared
in an organized way by a collaboration of scientists in the course of their
analysis;
- by providing a mechanism to log the provenance of results and the ability
to trace them back to the various stages in the analysis of real or simulated
data;
- by creating ``check points'' in the course of an analysis to permit
collaborators to explore their own analysis branches by refining selections,
improving the signal to background ratio, varying the estimation of parameters,
etc.;
- by facilitating the audit of an analysis and the reproduction of its
results by a different group, or in a peer review context.
We describe a prototype for the analysis of data from the CMS experiment
based on the virtual data system Chimera and the object-oriented data analysis
framework ROOT. The Chimera system is used to chain together several steps in
the analysis process including the Monte Carlo generation of data, the
simulation of detector response, the reconstruction of physics objects and
their subsequent analysis, histogramming and visualization using the ROOT
framework.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003, 9 pages, LaTeX, 7 eps figures. PSN TUAT010. V2 - references added
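The chaining-and-provenance idea described above can be illustrated with a small sketch (a hypothetical Python illustration of the concept only; it is not Chimera's actual Virtual Data Language, and all step names in it are made up):

from dataclasses import dataclass, field

@dataclass
class Derivation:
    name: str            # name of the virtual data product, e.g. "reco"
    transformation: str  # the program or analysis step that produces it
    inputs: list         # names of upstream derivations
    params: dict = field(default_factory=dict)

CATALOG = {}  # the "virtual data catalog": recipes, not materialized files

def declare(name, transformation, inputs=(), **params):
    """Register how a data product is derived, without producing it yet."""
    CATALOG[name] = Derivation(name, transformation, list(inputs), params)

def provenance(name, indent=0):
    """Trace a result back through every stage that produced it."""
    d = CATALOG[name]
    print("  " * indent + f"{d.name} <- {d.transformation}({d.params})")
    for parent in d.inputs:
        provenance(parent, indent + 1)

# Chain the stages mentioned above: generation -> detector simulation
# -> reconstruction -> analysis/histogramming (hypothetical step names).
declare("gen",  "generate_events", seed=42)
declare("sim",  "simulate_detector", ["gen"])
declare("reco", "reconstruct_objects", ["sim"])
declare("hist", "make_histograms", ["reco"], variable="dimuon_mass")

provenance("hist")  # prints the full derivation chain for the final result

A real virtual data catalog would additionally record code versions and parameter settings, so that collaborators can re-derive, refine, or audit any result from its recorded provenance.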
Work-distribution quantumness and irreversibility when crossing a quantum phase transition in finite time
The thermodynamic behavior of out-of-equilibrium quantum systems in finite-time dynamics encompasses the description of energy fluctuations, which dictate a range of the system's physical properties. In addition, strong interactions in many-body systems strikingly affect the energy-fluctuation statistics during nonequilibrium dynamics. By driving transient currents to oppose the precursor to the metal-Mott-insulator transition in a diversity of dynamical regimes, we show how increasing many-body interactions dramatically affect the statistics of energy fluctuations and, consequently, the extractable-work distribution of finite Hubbard chains. Statistical properties of these distributions, such as the skewness and its marked change across the transition, can be related to irreversibility and entropy production. Even for slow driving rates, the quasi quantum phase transition hinders equilibration, increasing the irreversibility of the process and inducing strong features in the work distribution. In the Mott-insulating phase, the work fluctuation-dissipation balance is modified, with the irreversible entropy production dominating over work fluctuations. Because of this, the effects of an interaction-driven quantum phase transition on thermodynamic quantities and on irreversibility must be considered when designing protocols for small-scale devices in quantum technology. Ultimately, such many-body effects can also be employed in work-extraction and refrigeration protocols at the quantum scale.
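For orientation, the standard finite-time relations behind these statements are (textbook nonequilibrium thermodynamics, not results of this paper)

  \langle \Sigma \rangle = \beta \left( \langle W \rangle - \Delta F \right) \ge 0,

for a driven system initially at inverse temperature \beta, together with the work fluctuation-dissipation relation valid for Gaussian (linear-response) work statistics,

  \langle W \rangle - \Delta F = \frac{\beta}{2} \, \sigma_W^2 .

The modified fluctuation-dissipation balance reported above corresponds to the entropy production outgrowing the \beta \sigma_W^2 / 2 term, a hallmark of strongly non-Gaussian work distributions.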
Fluctuations of the local density of states probe localized surface plasmons on disordered metal films
We measure the statistical distribution of the local density of optical
states (LDOS) on disordered semi-continuous metal films. We show that LDOS
fluctuations exhibit a maximum in a regime where fractal clusters dominate the
film surface. These large fluctuations are a signature of surface-plasmon
localization on the nanometer scale.
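The probed quantity is the electromagnetic local density of states, which at position \mathbf{r} and frequency \omega can be written in terms of the electric Green tensor as (standard expression, not specific to this work)

  \rho(\mathbf{r}, \omega) = \frac{2 \omega}{\pi c^2} \, \mathrm{Im} \, \mathrm{Tr} \, \mathbf{G}(\mathbf{r}, \mathbf{r}, \omega),

so that large spatial fluctuations of \rho across the film are a direct signature of spatially localized plasmonic eigenmodes.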
Quantum correlations in the temporal CHSH scenario
We consider a temporal version of the CHSH scenario using projective
measurements on a single quantum system. It is known that quantum correlations
in this scenario are fundamentally more general than correlations obtainable
with the assumptions of macroscopic realism and non-invasive measurements. In
this work, we also educe some fundamental limitations of these quantum
correlations. One result is that a set of correlators can appear in the
temporal CHSH scenario if and only if it can appear in the usual spatial CHSH
scenario. In particular, we derive the validity of the Tsirelson bound and the
impossibility of PR-box behavior. The strength of possible signaling also turns
out to be surprisingly limited, giving a maximal communication capacity of
approximately 0.32 bits. We also find a temporal version of Hardy's nonlocality
paradox with a maximal quantum value of 1/4.
Comment: corrected version
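Concretely, the CHSH combination of correlators considered here is

  S = \langle A_1 B_1 \rangle + \langle A_1 B_2 \rangle + \langle A_2 B_1 \rangle - \langle A_2 B_2 \rangle,

with |S| \le 2 under macroscopic realism and non-invasive measurability (the temporal analogue of local hidden variables), |S| \le 2\sqrt{2} for quantum correlations (the Tsirelson bound, which the result above shows carries over to the temporal scenario), and |S| = 4 for the hypothetical PR box whose temporal analogue is ruled out.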
Geothermal probabilistic cost study
A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.
Using Pilot Systems to Execute Many Task Workloads on Supercomputers
High performance computing systems have historically been designed to support
applications comprised of mostly monolithic, single-job workloads. Pilot
systems decouple workload specification, resource selection, and task execution
via job placeholders and late-binding. Pilot systems help to satisfy the
resource requirements of workloads comprised of multiple tasks. RADICAL-Pilot
(RP) is a modular and extensible Python-based pilot system. In this paper we
describe RP's design, architecture and implementation, and characterize its
performance. RP is capable of spawning more than 100 tasks/second and supports
the steady-state execution of up to 16K concurrent tasks. RP can be used
stand-alone, as well as integrated with other application-level tools as a
runtime system.
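The usage pattern described above looks roughly as follows (a sketch based on RADICAL-Pilot's documented getting-started examples; class and attribute names have changed between RP versions, so treat the details as assumptions rather than a definitive API reference):

import radical.pilot as rp

session = rp.Session()
try:
    pmgr = rp.PilotManager(session=session)
    tmgr = rp.TaskManager(session=session)

    # Acquire resources through a single pilot (the job placeholder).
    pdesc = rp.PilotDescription({'resource': 'local.localhost',
                                 'cores'   : 4,
                                 'runtime' : 10})   # minutes
    pilot = pmgr.submit_pilots(pdesc)
    tmgr.add_pilots(pilot)

    # Late-bind many small tasks onto the pilot's resources.
    tasks = [rp.TaskDescription({'executable': '/bin/date'})
             for _ in range(100)]
    tmgr.submit_tasks(tasks)
    tmgr.wait_tasks()
finally:
    session.close()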