Bell inequalities from variable elimination methods
Tight Bell inequalities are facets of Pitowsky's correlation polytope and are
usually obtained from its extreme points by solving the hull problem. Here we
present an alternative method based on a combination of algebraic results on
extensions of measures and variable elimination methods, e.g., the
Fourier-Motzkin method. Our method is shown to overcome some of the
computational difficulties associated with the hull problem in some non-trivial
cases. Moreover, it explains why only a finite number of families of Bell
inequalities arise in measurement scenarios where one experimenter can choose
between an arbitrary number of different measurements.
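As a minimal illustration of the elimination step mentioned above (a toy Fourier-Motzkin pass over rational inequalities, not the authors' full method combining elimination with results on extensions of measures), one variable can be eliminated from a linear system as follows:

```python
from fractions import Fraction

def fm_eliminate(rows, k):
    """One Fourier-Motzkin step: eliminate variable k from a system of
    inequalities sum_j a[j]*x[j] <= b, where each row is a pair (a, b)."""
    pos, neg, zero = [], [], []
    for a, b in rows:
        (pos if a[k] > 0 else neg if a[k] < 0 else zero).append((a, b))
    out = list(zero)
    # Every positive/negative pair combines into one inequality free of x_k.
    for ap, bp in pos:
        for an, bn in neg:
            s, t = ap[k], -an[k]                  # both strictly positive
            a = [t * u + s * v for u, v in zip(ap, an)]
            out.append((a, t * bp + s * bn))
    return out

# x0 <= 1 and -x0 + x1 <= 0; eliminating x0 leaves x1 <= 1.
rows = [([Fraction(1), Fraction(0)], Fraction(1)),
        ([Fraction(-1), Fraction(1)], Fraction(0))]
print(fm_eliminate(rows, 0))
```

Exact rational arithmetic avoids the rounding issues of floating-point elimination; the well-known drawback is the quadratic growth in the number of inequalities per eliminated variable.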
A robust pseudo-inverse spectral filter applied to the Earth Radiation Budget Experiment (ERBE) scanning channels
Computer simulations of a least-squares estimator operating on the ERBE scanning channels are discussed. The estimator is designed to minimize the errors produced by nonideal spectral response to spectrally varying and uncertain radiant input. The three ERBE scanning channels cover a shortwave band, a longwave band, and a "total" band, from which the pseudo-inverse spectral filter estimates the radiance components in the shortwave and longwave bands. The radiance estimator draws on instantaneous field-of-view (IFOV) scene-type information supplied by another algorithm of the ERBE software, and on a priori probabilistic models of the responses of the scanning channels to the IFOV scene types for a given Sun-scene-spacecraft geometry. It is found that the pseudo-inverse spectral filter is stable, tolerant of errors in scene identification and in channel-response modeling, and, in the absence of such errors, yields minimum-variance and essentially unbiased radiance estimates.
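The least-squares estimate can be sketched with NumPy's pseudo-inverse; the 3x2 response matrix, radiance values, and noise level below are illustrative assumptions, not the actual ERBE channel models:

```python
import numpy as np

# Hypothetical spectral-response matrix: rows are the (shortwave, longwave,
# total) channels, columns the (shortwave, longwave) radiance components.
A = np.array([[0.95, 0.02],
              [0.03, 0.90],
              [0.98, 0.97]])

true_radiance = np.array([50.0, 80.0])        # illustrative, W m^-2 sr^-1
rng = np.random.default_rng(0)
measurements = A @ true_radiance + rng.normal(0.0, 0.5, size=3)

# Pseudo-inverse (least-squares) estimate of the two radiance components.
estimate = np.linalg.pinv(A) @ measurements
print(estimate)   # close to [50, 80]
```

Because the system is overdetermined (three channels, two unknowns), the pseudo-inverse averages out part of the channel noise, which is the sense in which the estimator is minimum-variance.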
Nonlocality as a Benchmark for Universal Quantum Computation in Ising Anyon Topological Quantum Computers
An obstacle affecting any proposal for a topological quantum computer based
on Ising anyons is that quasiparticle braiding can only implement a finite
(non-universal) set of quantum operations. The computational power of this
restricted set of operations (often called stabilizer operations) has been
studied in quantum information theory, and it is known that no
quantum-computational advantage can be obtained without the help of an
additional non-stabilizer operation. Similarly, a bipartite two-qubit system
based on Ising anyons cannot exhibit non-locality (in the sense of violating a
Bell inequality) when only topologically protected stabilizer operations are
performed. To produce correlations that cannot be described by a local hidden
variable model again requires the use of a non-stabilizer operation. Using
geometric techniques, we relate the sets of operations that enable universal
quantum computing (UQC) with those that enable violation of a Bell inequality.
Motivated by the fact that non-stabilizer operations are expected to be highly
imperfect, our aim is to provide a benchmark for identifying UQC-enabling
operations that is both experimentally practical and conceptually simple. We
show that any (noisy) single-qubit non-stabilizer operation that, together with
perfect stabilizer operations, enables violation of the simplest two-qubit Bell
inequality can also be used to enable UQC. This benchmarking requires finding
the expectation values of two distinct Pauli measurements on each qubit of a
bipartite system.
Comment: 12 pages, 2 figures
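Concretely, the benchmark amounts to estimating four correlators and combining them into the CHSH expression; the sketch below evaluates it for an ideal maximally entangled two-qubit state (a generic illustration, not the Ising-anyon implementation):

```python
import numpy as np

X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def obs(theta):
    """Dichotomic observable cos(theta)*Z + sin(theta)*X."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Maximally entangled state |phi+> = (|00> + |11>)/sqrt(2).
phi = np.array([1., 0., 0., 1.]) / np.sqrt(2)

def E(a, b):
    """Correlator <A(a) x B(b)> in the state |phi+>."""
    return phi @ np.kron(obs(a), obs(b)) @ phi

# CHSH at the optimal angles: S = 2*sqrt(2), violating the local bound 2.
a0, a1, b0, b1 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(S)   # ~2.828
```

Each observable here is a real combination of the Pauli operators Z and X, which is exactly the kind of two-Pauli expectation-value data the benchmark requires.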
On the Relationship between Convex Bodies Related to Correlation Experiments with Dichotomic Observables
In this paper we explore further the connections between convex bodies
related to quantum correlation experiments with dichotomic variables and
related bodies studied in combinatorial optimization, especially cut polyhedra.
Such a relationship was established in Avis, Imai, Ito and Sasaki (2005 J.
Phys. A: Math. Gen. 38 10971-87) with respect to Bell inequalities. We show
that several well known bodies related to cut polyhedra are equivalent to
bodies such as those defined by Tsirelson (1993 Hadronic J. S. 8 329-45) to
represent hidden deterministic behaviors, quantum behaviors, and no-signalling
behaviors. Among other things, our results allow a unique representation of
these bodies, give a necessary condition for vertices of the no-signalling
polytope, and give a method for bounding the quantum violation of Bell
inequalities by means of a body that contains the set of quantum behaviors.
Optimization over this latter body may be performed efficiently by semidefinite
programming. In the second part of the paper we apply these results to the
study of classical correlation functions. We provide a complete list of tight
inequalities for the two party case with (m,n) dichotomic observables when
m=4,n=4 and when min{m,n}<=3, and give a new general family of correlation
inequalities.
Comment: 17 pages, 2 figures
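For the smallest case, the classical body is simple enough to enumerate directly: the vertices of the (2,2) correlation polytope are the deterministic correlation vectors, and maximizing a Bell expression over them recovers its tight local bound. A brute-force sketch:

```python
from itertools import product

# Deterministic strategies: Alice assigns a_i = ±1 to each of her two
# settings, Bob assigns b_j = ±1; the correlation vector is E_ij = a_i * b_j.
vertices = {tuple(a[i] * b[j] for i in range(2) for j in range(2))
            for a in product([-1, 1], repeat=2)
            for b in product([-1, 1], repeat=2)}

# Local bound of the CHSH expression E00 + E01 + E10 - E11.
chsh_local = max(e[0] + e[1] + e[2] - e[3] for e in vertices)
print(len(vertices), chsh_local)   # 8 vertices, local bound 2
```

Only 8 of the 16 strategy pairs give distinct correlation vectors, since (a, b) and (-a, -b) produce the same products.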
Authenticity: How do counselling psychologists know who their clients really are?
Counselling psychology trainees are obliged to undertake a minimum of 40 hours of personal therapy as part of the course requirements. This qualitative study explores how trainee counselling psychologists experience mandatory personal therapy and how chartered counselling psychologists experience having trainee counselling psychologists as clients. Phenomenological methodology - specifically, Interpretative Phenomenological Analysis (IPA) - was employed to access the lived experience of both trainees and qualified psychologists. Analysis of the results suggests that as the therapeutic relationship develops, trainee counselling psychologists move from an 'inauthentic' to an 'authentic' self. They use mandatory personal therapy to learn and grow both professionally and personally. Whilst many trainees feel that therapy should remain a compulsory course requirement, they also highlight that it costs them both emotionally and financially. The qualified therapists notice a difference when working with trainee counselling psychologists, as opposed to their other clients. The therapists are aware of the mandatory nature of the therapy and of their own worries about being judged by the trainees. They find it difficult to maintain the 'role' of therapist. The therapists both empathise and sympathise with the trainees, which often results in concessions being made. There are four overarching categories common to the two groups: i. impact of mandatory therapy on the therapeutic process, ii. the therapeutic performance, iii. the value of therapy, and iv. boundaries. Despite both groups stating that the obligatory nature of the therapy initially impedes the process, neither trainees nor therapists communicate this belief within the relationship, often resulting in 'an elephant in the room'.
Recommendations are discussed, including the value of providing preparation for both trainees and qualified therapists before entering this unique therapeutic relationship, extra funding, and other personal-development ideas.
Noise Thresholds for Higher Dimensional Systems using the Discrete Wigner Function
For a quantum computer acting on d-dimensional systems, we analyze the
computational power of circuits wherein stabilizer operations are perfect and
we allow access to imperfect non-stabilizer states or operations. If the noise
rate affecting the non-stabilizer resource is sufficiently high, then these
states and operations can become simulable in the sense of the Gottesman-Knill
theorem, reducing the overall power of the circuit to no better than classical.
In this paper we find the depolarizing noise rate at which this happens, and
consequently the most robust non-stabilizer states and non-Clifford gates. In
doing so, we make use of the discrete Wigner function and derive facets of the
so-called qudit Clifford polytope, i.e., the inequalities defining the convex
hull of all qudit Clifford gates. Our results for robust states are provably
optimal. For robust gates we find a critical noise rate that, as dimension
increases, rapidly approaches the theoretical optimum of 100%. Some
connections with the question of qudit magic state distillation are discussed.
Comment: 14 pages, 1 table; Minor changes vs. version
Pruning Algorithms for Pretropisms of Newton Polytopes
Pretropisms are candidates for the leading exponents of Puiseux series that
represent solutions of polynomial systems. To find pretropisms, we propose an
exact gift wrapping algorithm to prune the tree of edges of a tuple of Newton
polytopes. We prefer exact arithmetic not only because of the exact input and
the degrees of the output, but because of the often unpredictable growth of the
coordinates in the face normals, even for polytopes in generic position. We
provide experimental results with our preliminary implementation in Sage that
compare favorably with the pruning method that relies only on cone
intersections.
Comment: exact, gift wrapping, Newton polytope, pretropism, tree pruning;
accepted for presentation at Computer Algebra in Scientific Computing, CASC
201
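In two variables the objects being enumerated are easy to picture: pretropism candidates for a single polynomial are the primitive inner normals of its Newton polygon's edges, and exact integer arithmetic sidesteps the rounding problems the abstract mentions. A toy sketch (the paper's algorithm handles tuples of polytopes in arbitrary dimension):

```python
from math import gcd

def pretropism_candidates(support):
    """Primitive inner normals v for which the minimum of <v, p> over the
    support is attained on at least two points, i.e. on an edge of the
    Newton polygon. Pure integer arithmetic, so no rounding can occur."""
    pts = sorted(set(support))
    normals = set()
    for i, p in enumerate(pts):
        for q in pts[i + 1:]:
            dx, dy = q[0] - p[0], q[1] - p[1]
            for n in ((-dy, dx), (dy, -dx)):      # the two normals of edge pq
                vals = [n[0] * r[0] + n[1] * r[1] for r in pts]
                m = min(vals)
                if n[0]*p[0] + n[1]*p[1] == m and n[0]*q[0] + n[1]*q[1] == m:
                    g = gcd(abs(n[0]), abs(n[1]))
                    normals.add((n[0] // g, n[1] // g))
    return normals

# Support (exponent vectors) of x^2 + x*y + y^2 + 1.
print(pretropism_candidates([(2, 0), (1, 1), (0, 2), (0, 0)]))
```

For a system, a pretropism candidate must be an edge normal of every Newton polytope simultaneously, which is what makes pruning the tree of edge tuples worthwhile.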
Polynomial Delay Algorithm for Listing Minimal Edge Dominating sets in Graphs
The Transversal problem, i.e., the enumeration of all the minimal transversals
of a hypergraph in output-polynomial time, i.e., in time polynomial in its size
and the total size of all its minimal transversals, is a fifty-year-old
open problem, and up to now there are few examples of hypergraph classes where
the problem is solved. A minimal dominating set in a graph is an
inclusion-minimal subset of its vertex set that has a non-empty intersection
with the closed neighborhood of every vertex.
every vertex. It is proved in [M. M. Kant\'e, V. Limouzy, A. Mary, L. Nourine,
On the Enumeration of Minimal Dominating Sets and Related Notions, In Revision
2014] that the enumeration of minimal dominating sets in graphs and the
enumeration of minimal transversals in hypergraphs are two equivalent problems.
Hoping that this equivalence can help to gain new insights into the Transversal
problem, it is natural to look at particular graph classes. It is proved independently
and with different techniques in [Golovach et al. - ICALP 2013] and [Kant\'e et
al. - ISAAC 2012] that minimal edge dominating sets in graphs (i.e., minimal
dominating sets in line graphs) can be enumerated in incremental
output-polynomial time. We provide the first polynomial delay and polynomial
space algorithm that lists all the minimal edge dominating sets in graphs,
answering an open problem of [Golovach et al. - ICALP 2013]. Beyond this
result, we hope that the techniques used, a mix of a modification of the
well-known Berge's algorithm and a strong use of the structure of line graphs,
are of independent interest and can be used to obtain new output-polynomial
time algorithms.
Comment: proofs simplified from previous version, 12 pages, 2 figures
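The enumerated objects can be stated by brute force for tiny graphs (exponential time, purely illustrative; the paper's contribution is doing this with polynomial delay and polynomial space):

```python
from itertools import combinations

def minimal_edge_dominating_sets(edges):
    """List all minimal edge dominating sets of a graph given as an edge
    list: D dominates if every edge shares an endpoint with some edge of D;
    D is minimal if no proper subset of D dominates."""
    def dominates(D):
        return all(any(set(e) & set(d) for d in D) for e in edges)
    found = []
    # Visiting subsets by increasing size and discarding supersets of sets
    # already listed guarantees every output is inclusion-minimal.
    for k in range(len(edges) + 1):
        for D in combinations(edges, k):
            if dominates(D) and not any(set(F) <= set(D) for F in found):
                found.append(D)
    return found

# Path on 4 vertices: edges ab, bc, cd.
P4 = [("a", "b"), ("b", "c"), ("c", "d")]
print(minimal_edge_dominating_sets(P4))   # [(bc,), (ab, cd)]
```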
Bulk versus boundary quantum states
An explicit holographic correspondence between bulk and boundary
quantum states is found in the form of a one-to-one mapping between scalar
field creation/annihilation operators. The mapping requires the introduction of
arbitrary energy scales and exhibits an ultraviolet-infrared duality: a small
regulating mass in the boundary theory corresponds to a large momentum cutoff
in the bulk. In the massless (conformal) limit of the boundary theory the
mapping covers the whole field spectrum of both theories. The mapping strongly
depends on the discretization of the field spectrum of the compactified space
in Poincaré coordinates.
Comment: Minor changes in the text. Typos corrected. References added.
Version to appear in Phys. Lett.