Approximating incompatible von Neumann measurements simultaneously
We study the problem of performing orthogonal qubit measurements
simultaneously. Since these measurements are incompatible, one has to accept
additional imprecision. An optimal joint measurement is the one with the least
possible imprecision. All earlier considerations of this problem have concerned
only joint measurability of observables, while in this work we also take into
account conditional state transformations (i.e., instruments). We characterize
the optimal joint instrument for two orthogonal von Neumann instruments as
being the Lüders instrument of the optimal joint observable.
Comment: 9 pages, 4 figures; v2 has a more extensive introduction and other minor corrections
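For orientation, a standard construction from the joint-measurability literature (not spelled out in the abstract above): an unsharp version of a sharp qubit observable such as \sigma_z has the effects
  E_\pm = \tfrac{1}{2}\,(\mathbb{1} \pm \lambda\,\sigma_z), \qquad 0 \le \lambda \le 1,
and two such smearings of the orthogonal observables \sigma_z and \sigma_x, with sharpness parameters \lambda and \mu, admit a joint observable exactly when \lambda^2 + \mu^2 \le 1; the symmetric optimum is \lambda = \mu = 1/\sqrt{2}. The Lüders instrument of an observable with effects \{E_i\} acts as
  \mathcal{I}_i(\rho) = \sqrt{E_i}\;\rho\;\sqrt{E_i},
which is the kind of object the abstract identifies as the optimal joint instrument.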
The structure of classical extensions of quantum probability theory
On the basis of a suggestive definition of a classical extension of quantum mechanics in terms of statistical models, we prove that every such classical extension is essentially given by the so-called Misra–Bugajski reduction map. We consider how this map enables one to understand quantum mechanics as a reduced classical statistical theory on the projective Hilbert space as phase space and discuss features of the induced hidden-variable model. Moreover, some relevant technical results on the topology and Borel structure of the projective Hilbert space are reviewed.
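As usually presented (a sketch for orientation, not reproduced in the abstract), the Misra–Bugajski reduction map sends a probability measure \mu on the projective Hilbert space P(\mathcal{H}) to its barycentre density operator,
  R(\mu) = \int_{P(\mathcal{H})} |\psi\rangle\langle\psi| \; \mu(d\psi),
so that quantum expectation values arise as classical averages, \mathrm{tr}[R(\mu)\,A] = \int \langle\psi|A|\psi\rangle \, \mu(d\psi); the rays then play the role of the phase-space points of the induced hidden-variable model.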
Quantum Mechanics as a Framework for Dealing with Uncertainty
Quantum uncertainty is described here in two guises: indeterminacy with its
concomitant indeterminism of measurement outcomes, and fuzziness, or
unsharpness. Both features were long seen as obstructions of experimental
possibilities that were available in the realm of classical physics. The birth
of quantum information science was due to the realization that such
obstructions can be turned into powerful resources. Here we review how the
utilization of quantum fuzziness makes room for a notion of approximate joint
measurement of noncommuting observables. We also show how from a classical
perspective quantum uncertainty is due to a limitation of measurability
reflected in a fuzzy event structure -- all quantum events are fundamentally
unsharp.
Comment: Plenary Lecture, Central European Workshop on Quantum Optics, Turku 2009
The Standard Model of Quantum Measurement Theory: History and Applications
The standard model of the quantum theory of measurement is based on an
interaction Hamiltonian in which the observable-to-be-measured is multiplied
with some observable of a probe system. This simple Ansatz has proved extremely
fruitful in the development of the foundations of quantum mechanics. While the
ensuing type of models has often been argued to be rather artificial, recent
advances in quantum optics have demonstrated their principal and practical
feasibility. A brief historical review of the standard model together with an
outline of its virtues and limitations is presented as an illustration of the
mutual inspiration that has always taken place between foundational and
experimental research in quantum physics.
Comment: 22 pages, to appear in Found. Phys. 199
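The Ansatz referred to has a familiar textbook form, written here only for concreteness: the measurement coupling is
  H_{\mathrm{int}} = g\, A \otimes P,
where A is the observable to be measured, P is a probe observable conjugate to the pointer observable Q, and g is the coupling strength. In the impulsive limit the evolution is U = \exp(-\mathrm{i}\,\lambda\, A \otimes P), with \lambda the integrated coupling, and U shifts the pointer by an amount proportional to the value of A, so that reading the pointer records the measured observable.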
Unsharp Quantum Reality
The positive operator (valued) measures (POMs) allow one to generalize the notion of observable beyond the traditional one based on projection valued measures (PVMs). Here, we argue that this generalized conception of observable enables a consistent notion of unsharp reality and with it an adequate concept of joint properties. A sharp or unsharp property manifests itself as an element of sharp or unsharp reality by its tendency to become actual or to actualize a specific measurement outcome. This actualization tendency, or potentiality, of a property is quantified by the associated quantum probability. The resulting single-case interpretation of probability as a degree of reality will be explained in detail and its role in addressing the tensions between quantum and classical accounts of the physical world will be elucidated. It will be shown that potentiality can be viewed as a causal agency that evolves in a well-defined way.
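A minimal illustration of the contrast (an editorial gloss, not part of the abstract): a POM element, or effect, is any operator E with 0 \le E \le \mathbb{1}, whereas a PVM element is additionally a projection, E^2 = E. In a state \rho the number \mathrm{tr}[\rho E] is the outcome probability, and on the reading advocated here it also quantifies the degree of reality, or actualization tendency, of the corresponding (generally unsharp) property; the sharp case E^2 = E recovers the traditional notion.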
Decomposition and primary crystallization in undercooled Zr41.2Ti13.8Cu12.5Ni10.0Be22.5 melts
Zr41.2Ti13.8Cu12.5Ni10.0Be22.5 bulk metallic glasses were prepared by cooling the melt at a rate of about 10 K/s and investigated with respect to their chemical and structural homogeneity by atom probe field ion microscopy and transmission electron microscopy. The measurements on these slowly cooled samples reveal that the alloy exhibits phase separation in the undercooled liquid state. Significant composition fluctuations are found in the Be and Zr concentrations but not in the Ti, Cu, and Ni concentrations. The decomposed microstructure is compared with the microstructure obtained upon primary crystallization, suggesting that nucleation during primary crystallization of this bulk glass former is triggered by the preceding diffusion-controlled decomposition in the undercooled liquid state.