Modelling and analysing user views of telecommunications services
User views of calls are modelled by behaviour trees, which are synchronised to form a network of users. High level presentations of the models are given using process algebra and an explicit theory of features, including precedences. These precedences abstractly encapsulate the possible state spaces which result from different combinations of features.
The high level presentation supports incremental development of features, as well as testing and experimentation through animation. Interactions that are not detected during the experimentation phase may be found through static analysis of the high level presentation, dynamic analysis of the underlying low level transition system, or verification of temporal properties by model-checking. In each case, interactions are resolved by manipulating the feature precedences.
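Precedence-based resolution can be sketched in a few lines; the feature names and precedence values below are hypothetical illustrations, not taken from the paper's telephony models:

```python
# Hypothetical feature precedences; a higher value wins when features interact.
PRECEDENCE = {
    "call_forwarding": 1,   # forward incoming calls to another number
    "call_waiting": 2,      # signal a second incoming call to a busy user
}

def resolve_interaction(active_features):
    """Resolve a feature interaction by letting the highest-precedence
    active feature determine the resulting behaviour."""
    return max(active_features, key=PRECEDENCE.__getitem__)
```

In the paper's setting the precedences abstractly constrain which combined state spaces are reachable; here they simply pick a winner among simultaneously triggered features.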
Purification-based metric to measure the distance between quantum states and processes
In this work we study the properties of a purification-based entropic metric
for measuring the distance between both quantum states and quantum processes.
This metric is defined as the square root of the entropy of the average of two
purifications of mixed quantum states which maximize the overlap between the
purified states. We analyze this metric and show that it satisfies many
appealing properties, which suggest this metric is an interesting proposal for
theoretical and experimental applications of quantum information.
Comment: 11 pages, 2 figures. arXiv admin note: text overlap with arXiv:quant-ph/0408063, arXiv:1107.1732 by other authors
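By Uhlmann's theorem, the maximal overlap between purifications of two mixed states equals their fidelity, and the average of two pure states with overlap F has eigenvalues (1 ± F)/2. Under that assumption the metric reduces to a closed form in the fidelity; a minimal NumPy sketch (base-2 entropy is a convention chosen here, and may differ from the paper's):

```python
import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho, sigma):
    """Uhlmann fidelity F(rho, sigma) = Tr sqrt(sqrt(rho) sigma sqrt(rho))."""
    s = sqrtm(rho)
    return float(np.real(np.trace(sqrtm(s @ sigma @ s))))

def binary_entropy(p):
    """Shannon entropy (base 2) of the distribution {p, 1 - p}."""
    return -sum(x * np.log2(x) for x in (p, 1 - p) if x > 0)

def purification_metric(rho, sigma):
    """Square root of the entropy of the average of two maximally
    overlapping purifications (overlap = fidelity, by Uhlmann)."""
    F = min(max(fidelity(rho, sigma), 0.0), 1.0)  # guard against numerical drift
    return np.sqrt(binary_entropy((1 + F) / 2))
```

For identical states F = 1, the averaged purification is pure, and the distance is 0; for orthogonal states F = 0 and the distance is sqrt(H(1/2)) = 1 in base-2 units.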
False discovery rate regression: an application to neural synchrony detection in primary visual cortex
Many approaches for multiple testing begin with the assumption that all tests
in a given study should be combined into a global false-discovery-rate
analysis. But this may be inappropriate for many of today's large-scale
screening problems, where auxiliary information about each test is often
available, and where a combined analysis can lead to poorly calibrated error
rates within different subsets of the experiment. To address this issue, we
introduce an approach called false-discovery-rate regression that directly uses
this auxiliary information to inform the outcome of each test. The method can
be motivated by a two-groups model in which covariates are allowed to influence
the local false discovery rate, or equivalently, the posterior probability that
a given observation is a signal. This poses many subtle issues at the interface
between inference and computation, and we investigate several variations of the
overall approach. Simulation evidence suggests that: (1) when covariate effects
are present, FDR regression improves power for a fixed false-discovery rate;
and (2) when covariate effects are absent, the method is robust, in the sense
that it does not lead to inflated error rates. We apply the method to neural
recordings from primary visual cortex. The goal is to detect pairs of neurons
that exhibit fine-time-scale interactions, in the sense that they fire together
more often than expected due to chance. Our method detects roughly 50% more
synchronous pairs versus a standard FDR-controlling analysis. The companion R
package FDRreg implements all methods described in the paper.
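The covariate-dependent two-groups model can be sketched directly. In the sketch below the signal probability, alternative distribution, and regression coefficients are illustrative choices, and the local false discovery rate is computed with the true (oracle) parameters rather than estimated from data as FDRreg does:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Simulate z-statistics whose prior signal probability depends on a covariate x.
n = 5000
x = rng.uniform(-2, 2, size=n)
w = sigmoid(-1.0 + 1.5 * x)                  # P(signal | x); illustrative coefficients
signal = rng.random(n) < w
z = np.where(signal, rng.normal(3.0, 1.0, n), rng.normal(0.0, 1.0, n))

def local_fdr(z, w, mu1=3.0, sd1=1.0):
    """Covariate-aware local fdr under the two-groups model:
    lfdr(z, x) = (1 - w(x)) f0(z) / [(1 - w(x)) f0(z) + w(x) f1(z)]."""
    f0 = norm.pdf(z)                          # null density N(0, 1)
    f1 = norm.pdf(z, loc=mu1, scale=sd1)      # alternative density
    return (1 - w) * f0 / ((1 - w) * f0 + w * f1)

lfdr = local_fdr(z, w)
```

Tests with a low local fdr are declared discoveries; letting w depend on x is what distinguishes FDR regression from a global two-groups analysis, where w is a single constant.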
Quantum entanglement
All our former experience with applications of quantum theory seems to say:
{\it what is predicted by quantum formalism must occur in the laboratory}. But the
essence of quantum formalism, entanglement, recognized by Einstein, Podolsky,
Rosen and Schr\"odinger, waited over 70 years to enter laboratories as a
new resource as real as energy.
This holistic property of compound quantum systems, which involves
nonclassical correlations between subsystems, is the potential behind many quantum
processes, including ``canonical'' ones: quantum cryptography, quantum
teleportation and dense coding. However, this new resource turned out to be
very complex and difficult to detect. While usually fragile to the environment, it
is robust against the conceptual and mathematical tools whose task is to
decipher its rich structure.
This article reviews basic aspects of entanglement including its
characterization, detection, distillation and quantifying. In particular, the
authors discuss various manifestations of entanglement via Bell inequalities,
entropic inequalities, entanglement witnesses, quantum cryptography and point
out some interrelations. They also discuss the basic role of entanglement in
quantum communication within the distant labs paradigm and stress some
peculiarities, such as the irreversibility of entanglement manipulations, including
its extremal form, the bound entanglement phenomenon. The basic role of entanglement
witnesses in the detection of entanglement is emphasized.
Comment: 110 pages, 3 figures, ReVTex4, improved (slightly extended) presentation, updated references, minor changes, submitted to Rev. Mod. Phys.
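As a concrete instance of entanglement detection, the Peres-Horodecki (PPT) criterion flags a state as entangled when its partial transpose has a negative eigenvalue; for 2x2 and 2x3 systems the test is both necessary and sufficient. A minimal NumPy sketch:

```python
import numpy as np

def partial_transpose(rho, dims=(2, 2)):
    """Partial transpose over the second subsystem of a bipartite state."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)          # indices [a, b, a', b']
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)

def is_ppt(rho, dims=(2, 2), tol=1e-9):
    """True if rho has a positive partial transpose (separable for 2x2, 2x3)."""
    eig = np.linalg.eigvalsh(partial_transpose(rho, dims))
    return bool(eig.min() >= -tol)

# Bell state |Phi+> = (|00> + |11>)/sqrt(2): its partial transpose has
# eigenvalue -1/2, so the PPT test correctly flags it as entangled.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
bell = np.outer(phi, phi)
```

Entanglement witnesses generalize this idea: a witness is an observable with nonnegative expectation on all separable states but a negative expectation on some entangled state.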