State-independent all-versus-nothing arguments
Contextuality is a key feature of quantum information that challenges
classical intuitions, providing the basis for constructing explicit proofs of
quantum advantage. While many demonstrations of quantum advantage rest on
contextuality arguments, the definition of contextuality differs from one work
to the next, which makes it difficult to connect their results directly. In
this report, we review the mathematical structure of
sheaf-theoretic contextuality and extend this framework to explain
Kochen-Specker type contextuality. We first cover the definitions in
contextuality with detailed examples. Then, we state the all-versus-nothing
(AvN) argument and define a state-independent AvN class. It is shown that
Kochen-Specker type contextuality, or contextuality in a partial closure, can
be translated into this framework by the partial closure of observables under
the multiplication of commuting measurements. Finally, we compare each case of
contextuality from an operator-side view, where the strict hierarchy of
contextuality classes seen from the state side appears to merge into the
state-independent AvN class under the partial closure formalism.
Overall, this report provides a unified interpretation of contextuality by
integrating Kochen-Specker type notions into the state-independent AvN
argument. The results present novel insights into contextuality, which pave the
way for a coherent approach to constructing proofs of quantum advantage.
Comment: 34 pages, 7 figures, master's thesis submitted to University College
London
The Problem of Confirmation in the Everett Interpretation
I argue that the Oxford school Everett interpretation is internally
incoherent, because we cannot claim that in an Everettian universe the kinds of
reasoning we have used to arrive at our beliefs about quantum mechanics would
lead us to form true beliefs. I show that in an Everettian context, the
experimental evidence that we have available could not provide empirical
confirmation for quantum mechanics, and moreover that we would not even be able
to establish reference to the theoretical entities of quantum mechanics. I then
consider a range of existing Everettian approaches to the probability problem
and show that they do not succeed in overcoming this incoherence.
Understanding Deutsch's probability in a deterministic multiverse
Difficulties over probability have often been considered fatal to the Everett
interpretation of quantum mechanics. Here I argue that the Everettian can have
everything she needs from `probability' without recourse to indeterminism,
ignorance, primitive identity over time or subjective uncertainty: all she
needs is a particular *rationality principle*.
The decision-theoretic approach recently developed by Deutsch and Wallace
claims to provide just such a principle. But, according to Wallace, decision
theory is itself applicable only if the correct attitude to a future Everettian
measurement outcome is subjective uncertainty. I argue that subjective
uncertainty is not to be had, but I offer an alternative interpretation that
enables the Everettian to live without uncertainty: we can justify Everettian
decision theory on the basis that an Everettian should *care about* all her
future branches. The probabilities appearing in the decision-theoretic
representation theorem can then be interpreted as the degrees to which the
rational agent cares about each future branch. This reinterpretation, however,
reduces the intuitive plausibility of one of the Deutsch-Wallace axioms
(Measurement Neutrality).
Comment: 34 pages (excluding bibliography); no figures. To appear in Studies
in the History and Philosophy of Modern Physics, September 2004. Replaced to
include changes made during referee and editorial review (abstract extended;
arrangement and presentation of material in sections 4.1, 5.3, 5.4 altered
significantly; minor changes elsewhere).
Nonlocal quantum information transfer without superluminal signalling and communication
It is frequently assumed that any quantum theory which posits hidden
superluminal influences must, via superluminal information transfers, exchange
superluminal signals capable of enabling communication.
However, does the presence of hidden superluminal influences automatically
imply superluminal signalling and communication? The non-signalling theorem
mediates the apparent conflict between quantum mechanics and the theory of
special relativity. However, as a 'no-go' theorem there exist two opposing
interpretations of the non-signalling constraint: foundational and operational.
Concerning Bell's theorem, we argue that Bell employed both interpretations at
different times. Bell finally pursued an explicitly operational position on
non-signalling which is often associated with ontological quantum theory, e.g.,
de Broglie-Bohm theory. This position we refer to as "effective
non-signalling". By contrast, associated with orthodox quantum mechanics is the
foundational position referred to here as "axiomatic non-signalling". In search
of a decisive communication-theoretic criterion for differentiating between
"axiomatic" and "effective" non-signalling, we employ the operational framework
offered by Shannon's mathematical theory of communication. We find that an
effective non-signalling theorem comprises two sub-theorems, which we call the
(1) non-transfer-control (NTC) theorem and (2) non-signification-control (NSC)
theorem. Employing the NTC and NSC theorems, we show that effective, rather
than axiomatic, non-signalling is entirely sufficient for prohibiting nonlocal
communication. An effective non-signalling theorem allows for nonlocal quantum
information transfer yet - at the same time - effectively denies superluminal
signalling and communication.
Comment: 21 pages, 5 figures; The article is published with open access in
Foundations of Physics (2016).
Measuring multivariate redundant information with pointwise common change in surprisal
The problem of how to properly quantify redundant information is an open question that has been the subject of much recent research. Redundant information refers to information about a target variable S that is common to two or more predictor variables Xi. It can be thought of as quantifying overlapping information content or similarities in the representation of S between the Xi. We present a new measure of redundancy which measures the common change in surprisal shared between variables at the local or pointwise level. We provide a game-theoretic operational definition of unique information, and use this to derive constraints which are used to obtain a maximum entropy distribution. Redundancy is then calculated from this maximum entropy distribution by counting only those local co-information terms which admit an unambiguous interpretation as redundant information. We show how this redundancy measure can be used within the framework of the Partial Information Decomposition (PID) to give an intuitive decomposition of the multivariate mutual information into redundant, unique and synergistic contributions. We compare our new measure to existing approaches over a range of example systems, including continuous Gaussian variables. Matlab code for the measure is provided, including all considered examples.
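To make the co-information terms mentioned above concrete, here is a minimal, hypothetical Python sketch (not the paper's Matlab implementation) that computes the three-variable co-information I(X1;X2;S) = I(X1;S) + I(X2;S) - I(X1,X2;S) for a "copy" system in which both predictors duplicate the target, the canonical fully redundant example. Note the paper's measure is more refined: it counts only those pointwise co-information terms with an unambiguous redundant interpretation, whereas plain co-information can conflate redundancy and synergy.

```python
from math import log2

# Toy joint distribution p(x1, x2, s) for the "copy" system X1 = X2 = S.
# The distribution and variable names are illustrative assumptions.
joint = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

def marg(idx):
    """Marginal distribution over the coordinates listed in idx."""
    out = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def mi(a_idx, b_idx):
    """Mutual information I(A;B) in bits, as a sum of pointwise terms."""
    pa, pb, pab = marg(a_idx), marg(b_idx), marg(a_idx + b_idx)
    return sum(p * log2(p / (pa[k[:len(a_idx)]] * pb[k[len(a_idx):]]))
               for k, p in pab.items())

# Co-information I(X1;X2;S) = I(X1;S) + I(X2;S) - I(X1,X2;S).
# Positive values indicate redundancy; here the full 1 bit is redundant.
coI = mi((0,), (2,)) + mi((1,), (2,)) - mi((0, 1), (2,))
print(coI)  # 1.0
```

For the copy system every pointwise term agrees in sign, so plain co-information and the paper's redundancy measure coincide at 1 bit; systems mixing redundancy and synergy (e.g. XOR components) are where they diverge.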
Quantum Probability from Subjective Likelihood: improving on Deutsch's proof of the probability rule
I present a proof of the quantum probability rule from decision-theoretic
assumptions, in the context of the Everett interpretation. The basic ideas
behind the proof are those presented in Deutsch's recent proof of the
probability rule, but the proof is simpler and proceeds from weaker
decision-theoretic assumptions. This makes it easier to discuss the conceptual
ideas involved in the proof, and to show that they are defensible.
Comment: 23 pages. This is a modified version of my 2003 paper, which
incorporates a completely rewritten and substantially improved proof of
Equivalence as well as a few other more minor changes.
The Problem of Ignorance
Holly Smith (2014) contends that subjective deontological theories – those that hold that our moral duties are sensitive to our beliefs about our situation – cannot correctly determine whether one ought to gather more information before acting. Against this contention, I argue that deontological theories can use a decision-theoretic approach to evaluating the moral importance of information. I then argue that this approach compares favourably with an alternative approach proposed by Philip Swenson (2016).
Information-theoretic temporal Bell inequality and quantum computation
An information-theoretic temporal Bell inequality is formulated to contrast
classical and quantum computations. Any classical algorithm satisfies the
inequality, while quantum ones can violate it. Therefore, the violation of the
inequality is an immediate consequence of the quantumness in the computation.
Furthermore, this approach suggests a notion of temporal nonlocality in quantum
computation.
Comment: v2: 5 pages, references added, discussion slightly revised, main
result unchanged. v3: typos corrected.