Twin inequality for fully contextual quantum correlations
Quantum mechanics exhibits a very peculiar form of contextuality. Identifying
and connecting the simplest scenarios in which more general theories can or
cannot be more contextual than quantum mechanics is a fundamental step in the
quest for the principle that singles out quantum contextuality. The former
scenario corresponds to the Klyachko-Can-Binicioglu-Shumovsky (KCBS)
inequality. Here we show that there is a simple tight inequality, twin to the
KCBS, for which quantum contextuality cannot be outperformed. In a sense, this
twin inequality is the simplest tool for recognizing fully contextual quantum
correlations.Comment: REVTeX4, 4 pages, 1 figur
On Hardness of the Joint Crossing Number
The Joint Crossing Number problem asks for a simultaneous embedding of two
disjoint graphs into one surface such that the number of edge crossings
(between the two graphs) is minimized. It was introduced by Negami in 2001 in
connection with diagonal flips in triangulations of surfaces, and subsequently
investigated in a general form for small-genus surfaces. We prove that all of
the commonly considered variants of this problem are NP-hard already in the
orientable surface of genus 6, by a reduction from a special variant of the
anchored crossing number problem of Cabello and Mohar.
Finite-precision measurement does not nullify the Kochen-Specker theorem
It is proven that any hidden variable theory of the type proposed by Meyer
[Phys. Rev. Lett. {\bf 83}, 3751 (1999)], Kent [{\em ibid.} {\bf 83}, 3755
(1999)], and Clifton and Kent [Proc. R. Soc. London, Ser. A {\bf 456}, 2101
(2000)] leads to experimentally testable predictions that are in contradiction
with those of quantum mechanics. Therefore, it is argued that the existence of
dense Kochen-Specker-colorable sets must not be interpreted as a nullification
of the physical impact of the Kochen-Specker theorem once the finite precision
of real measurements is taken into account.

Comment: REVTeX4, 5 pages
Quantum state-independent contextuality requires 13 rays
We show that, regardless of the dimension of the Hilbert space, there exists
no set of rays revealing state-independent contextuality with fewer than 13
rays. This implies that the set proposed by Yu and Oh in dimension three [Phys.
Rev. Lett. 108, 030402 (2012)] is actually the minimal set in quantum theory.
This contrasts with the case of Kochen-Specker sets, where the smallest set
occurs in dimension four.

Comment: 8 pages, 2 tables, 1 figure, v2: minor changes
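As a quick sanity check, the Yu-Oh rays (unnormalized integer representatives, as usually listed) and their orthogonality graph can be enumerated directly. This is an illustration added here, not material from the abstract:

```python
from itertools import combinations

# The 13 rays of the Yu-Oh state-independent contextuality set
# (unnormalized integer representatives).
RAYS = [
    (1, 0, 0), (0, 1, 0), (0, 0, 1),               # z_1, z_2, z_3
    (0, 1, -1), (1, 0, -1), (1, -1, 0),            # y_1^-, y_2^-, y_3^-
    (0, 1, 1), (1, 0, 1), (1, 1, 0),               # y_1^+, y_2^+, y_3^+
    (1, 1, 1), (-1, 1, 1), (1, -1, 1), (1, 1, -1)  # h_0, h_1, h_2, h_3
]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Edges of the orthogonality graph: pairs of mutually orthogonal rays.
edges = [(u, v) for u, v in combinations(RAYS, 2) if dot(u, v) == 0]

print(len(RAYS), len(edges))  # -> 13 24
```

The 24 orthogonality relations are what compatibility (joint measurability) of the corresponding projectors is read off from.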
Implications of quantum automata for contextuality
We construct zero-error quantum finite automata (QFAs) for promise problems
which cannot be solved by bounded-error probabilistic finite automata (PFAs).
Here is a summary of our results:
- There is a promise problem solvable by an exact two-way QFA in exponential
expected time, but not by any bounded-error sublogarithmic space probabilistic
Turing machine (PTM).
- There is a promise problem solvable by an exact two-way QFA in quadratic
expected time, but not by any bounded-error $o(\log \log n)$-space PTM in
polynomial expected time. The same problem can be solved by a one-way Las
Vegas (or exact two-way) QFA with quantum head in linear (expected) time.
- There is a promise problem solvable by a Las Vegas realtime QFA, but not by
any bounded-error realtime PFA. The same problem can be solved by an exact
two-way QFA in linear expected time but not by any exact two-way PFA.
- There is a family of promise problems such that each member can be solved
by a two-state exact realtime QFA, but there is no bound on the number of
states of realtime bounded-error PFAs solving the members of this family.
Our results imply that there exist zero-error quantum computational devices
with a \emph{single qubit} of memory that cannot be simulated by any finite
memory classical computational model. This provides a computational perspective
on results regarding ontological theories of quantum mechanics \cite{Hardy04},
\cite{Montina08}. As a consequence we find that classical automata-based
simulation models \cite{Kleinmann11}, \cite{Blasiak13} are not sufficiently
powerful to simulate quantum contextuality. We conclude by highlighting the
interplay between results from automata models and their application to
developing a general framework for quantum contextuality.

Comment: 22 pages
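The two-state family in the summary can be made concrete. Below is a minimal numpy sketch (the function name and the choice of promise family are ours, not the paper's notation) of the standard single-qubit rotation automaton for promise problems of the EVENODD$^k$ type: each input symbol rotates the qubit by $\pi/2^{k+1}$, so on a word of length $j \cdot 2^k$ the final state is exactly $|0\rangle$ when $j$ is even and exactly $|1\rangle$ when $j$ is odd.

```python
import numpy as np

def qfa_run(word_length: int, k: int) -> float:
    """Simulate a 2-state (single-qubit) realtime QFA on a unary word.

    Each input symbol applies a rotation by pi / 2**(k+1); the automaton
    answers "j is even" when the final measurement yields |0>.
    Returns the probability of measuring |0>.
    """
    theta = np.pi / 2 ** (k + 1)          # rotation per input symbol
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])     # real rotation = unitary evolution
    state = np.array([1.0, 0.0])          # start in |0>
    for _ in range(word_length):
        state = rot @ state
    return float(state[0] ** 2)           # Born rule: P(outcome 0)

# Promise inputs a^(j * 2^k): j even -> certainly |0>, j odd -> certainly |1>.
k = 3
print(qfa_run(2 * 2 ** k, k))  # j = 2 (even): probability 1 of outcome 0
print(qfa_run(3 * 2 ** k, k))  # j = 3 (odd): probability 0 of outcome 0
```

The point of the sketch is that one rotating qubit suffices for the whole family, whereas the number of states any bounded-error realtime PFA needs grows with $k$.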
Memory cost of quantum contextuality
The simulation of quantum effects requires certain classical resources, and
quantifying them is an important step in order to characterize the difference
between quantum and classical physics. For a simulation of the phenomenon of
state-independent quantum contextuality, we show that the minimal amount of
memory used by the simulation is the critical resource. We derive optimal
simulation strategies for important cases and prove that reproducing the
results of sequential measurements on a two-qubit system requires more memory
than the information carrying capacity of the system.

Comment: 18 pages, no figures, v2: revised for clarity
Minimal true-implies-false and true-implies-true sets of propositions in noncontextual hidden variable theories
An essential ingredient in many examples of the conflict between quantum
theory and noncontextual hidden variables (e.g., the proof of the
Kochen-Specker theorem and Hardy's proof of Bell's theorem) is a set of atomic
propositions about the outcomes of ideal measurements such that, when outcome
noncontextuality is assumed, if one proposition is true, then, due to
exclusiveness and completeness, another, nonexclusive proposition must be
false (true). We call such a set a {\em true-implies-false set} (TIFS) [{\em
true-implies-true set} (TITS)]. Here we identify all the minimal TIFSs and
TITSs in every dimension, i.e., the sets of each type having the
smallest number of propositions. These sets are important because each of them
leads to a proof of impossibility of noncontextual hidden variables and
corresponds to a simple situation with quantum vs classical advantage.
Moreover, the methods developed to identify them may be helpful to solve some
open problems regarding minimal Kochen-Specker sets.

Comment: 9 pages, 7 figures
Kochen-Specker set with seven contexts
The Kochen-Specker (KS) theorem is a central result in quantum theory and has
applications in quantum information. Its proof requires several yes-no tests
that can be grouped in contexts or subsets of jointly measurable tests.
Arguably, the best measure of simplicity of a KS set is the number of contexts.
The smaller this number is, the smaller the number of experiments needed to
reveal the conflict between quantum theory and noncontextual theories and to
get a quantum vs classical outperformance. The original KS set had 132
contexts. Here we introduce a KS set with seven contexts and prove that this is
the simplest KS set that admits a symmetric parity proof.

Comment: REVTeX4, 7 pages, 1 figure
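For reference, the parity argument behind such proofs can be sketched as follows (a standard argument, not detail taken from the abstract). A noncontextual assignment $f$ gives each ray a value in $\{0,1\}$ such that every context contains exactly one ray with value 1. Summing over the seven contexts,
$$\sum_{c} \sum_{v \in c} f(v) = 7,$$
which is odd. But if every ray belongs to an even number of contexts, each $f(v)$ is counted an even number of times, so the left-hand side is even. The contradiction rules out any such assignment $f$.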
Optimization of Convolutional Neural Network ensemble classifiers by Genetic Algorithms
Breast cancer has a high mortality rate and is the most invasive cancer in women. The disease can be predicted from an analysis of histopathological images, a task that computational image processing can support. In this work, a proposal that employs deep convolutional neural networks is presented. An ensemble of networks is then considered in order to improve the recognition performance of the system through the consensus of the networks in the ensemble. Finally, a genetic algorithm is used to choose the networks that belong to the ensemble. The proposal has been tested in several experiments on a set of benchmark images.

Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
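The selection step described above can be sketched in a few lines. All names here are hypothetical and the trained CNNs are stubbed by fixed prediction vectors on a validation set: a genetic algorithm evolves bitmasks over the candidate networks, with fitness equal to the majority-vote accuracy of the selected subset.

```python
import random

def majority_vote_accuracy(mask, preds, labels):
    """Accuracy of the majority vote of the networks selected by `mask`.

    `preds[i]` is the i-th network's predicted label for each validation
    sample; `mask[i]` is 1 if that network is in the ensemble.
    """
    chosen = [p for m, p in zip(mask, preds) if m]
    if not chosen:
        return 0.0
    correct = 0
    for i, y in enumerate(labels):
        votes = [p[i] for p in chosen]
        if max(set(votes), key=votes.count) == y:
            correct += 1
    return correct / len(labels)

def select_ensemble(preds, labels, pop_size=20, generations=30, seed=0):
    """Evolve a bitmask choosing which networks join the ensemble."""
    rng = random.Random(seed)
    n = len(preds)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: majority_vote_accuracy(m, preds, labels),
                 reverse=True)
        elite = pop[: pop_size // 2]       # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)      # one-point crossover
            child = a[:cut] + b[cut:]
            j = rng.randrange(n)           # point mutation
            child[j] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: majority_vote_accuracy(m, preds, labels))
```

Elitism keeps the best mask found so far, so the fitness of the returned ensemble never decreases across generations; in the real system the stub predictions would be replaced by each trained CNN's outputs on a held-out validation split.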