The conditions for quantum violation of macroscopic realism
Why do we not experience a violation of macroscopic realism in everyday life? Normally, no violation can be seen, either because of decoherence or the
restriction of coarse-grained measurements, transforming the time evolution of
any quantum state into a classical time evolution of a statistical mixture. We
find a sufficient condition for such classical evolutions for spin systems under coarse-grained measurements. Then we demonstrate that there exist
"non-classical" Hamiltonians whose time evolution cannot be understood
classically, although at every instant of time the quantum spin state appears
as a classical mixture. We suggest that such Hamiltonians are unlikely to be
realized in nature because of their high computational complexity.
Comment: 4 pages, 2 figures, revised version, journal reference added
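A minimal numerical sketch of the effect described above, assuming a concrete "classical" Hamiltonian H = w*Jx and a binned Jz measurement (our own toy construction; the spin size j = 50, the slot width, and all parameter values are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.linalg import expm

# A large spin j precessing under the "classical" Hamiltonian H = w*Jx.
# Under coarse-grained Jz measurements (bins much wider than the quantum
# spread ~ sqrt(j)), the outcome statistics track one classical spin vector.
j = 50
m = np.arange(-j, j + 1.0)                          # Jz eigenvalues, ascending
cp = np.sqrt(j * (j + 1) - m[:-1] * (m[:-1] + 1))   # <m+1|J+|m>
Jx = (np.diag(cp, -1) + np.diag(cp, 1)) / 2.0       # (J+ + J-)/2

psi0 = np.zeros(m.size); psi0[-1] = 1.0             # spin coherent state along +z
w, t = 1.0, 0.4                                     # frequency, time (illustrative)
psi_t = expm(-1j * w * t * Jx) @ psi0
p_m = np.abs(psi_t) ** 2                            # sharp Jz outcome distribution

# Coarse-grained measurement: slots of width 20 >> sqrt(j) ~ 7
bins = np.arange(-j, j + 21, 20)
p_slot, _ = np.histogram(m, bins=bins, weights=p_m)
print("slot probabilities:", np.round(p_slot, 3))
print("quantum <Jz> =", round(p_m @ m, 2),
      " classical j*cos(wt) =", round(j * np.cos(w * t), 2))
```

With the wide bins, the slot distribution stays sharply peaked and follows the classical precession j*cos(wt) at every time, matching the abstract's claim that "classical" Hamiltonians yield apparently classical mixtures under coarse-graining.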
Addressing the clumsiness loophole in a Leggett-Garg test of macrorealism
The rise of quantum information theory has lent new relevance to experimental
tests for non-classicality, particularly in controversial cases such as
adiabatic quantum computing with superconducting circuits. The Leggett-Garg
inequality is a "Bell inequality in time" designed to indicate whether a single
quantum system behaves in a macrorealistic fashion. Unfortunately, a violation
of the inequality can only show that the system is either (i)
non-macrorealistic or (ii) macrorealistic but subjected to a measurement
technique that happens to disturb the system. The "clumsiness" loophole (ii)
provides reliable refuge for the stubborn macrorealist, who can invoke it to
brand recent experimental and theoretical work on the Leggett-Garg test
inconclusive. Here, we present a revised Leggett-Garg protocol that permits one
to conclude that a system is either (i) non-macrorealistic or (ii)
macrorealistic but with the property that two seemingly non-invasive
measurements can somehow collude and strongly disturb the system. By providing
an explicit check of the invasiveness of the measurements, the protocol
replaces the clumsiness loophole with a significantly smaller "collusion"
loophole.
Comment: 7 pages, 3 figures
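For reference, the standard three-time Leggett-Garg argument that the protocol above builds on can be stated in a few lines. The following sketch assumes the textbook qubit example (precession under H = (w/2)*sigma_x with dichotomic sigma_z measurements), not the revised protocol of the paper:

```python
import numpy as np

# Three-time Leggett-Garg test on a qubit.  For projective sigma_z
# measurements on a maximally mixed qubit precessing under H = (w/2)*sigma_x,
# the two-time correlator is C(ti, tj) = cos(w*(tj - ti)).  Macrorealism
# requires K = C12 + C23 - C13 <= 1; quantum mechanics reaches 1.5.
w = 1.0
tau = np.pi / (3 * w)          # equal spacing that maximizes the violation
C12 = C23 = np.cos(w * tau)
C13 = np.cos(2 * w * tau)
K = C12 + C23 - C13
print(f"K = {K:.3f} (macrorealist bound: 1, quantum maximum: 1.5)")
```

At w*tau = pi/3 this gives K = 2*cos(60°) - cos(120°) = 1.5, the maximal qubit violation; the clumsiness loophole concerns how a macrorealist can still attribute such a value to measurement disturbance.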
Classical world arising out of quantum physics under the restriction of coarse-grained measurements
Conceptually different from the decoherence program, we present a novel
theoretical approach to macroscopic realism and classical physics within
quantum theory. It focuses on the limits of observability of quantum effects of
macroscopic objects, i.e., on the required precision of our measurement
apparatuses such that quantum phenomena can still be observed. First, we
demonstrate that for unrestricted measurement accuracy no classical description
is possible for arbitrarily large systems. Then we show, for a certain time evolution, that under coarse-grained measurements not only macrorealism but even the classical Newtonian laws emerge from the Schrödinger equation and the projection postulate.
Comment: 4 pages, 1 figure, second revised and published version
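A standard related statement (Ehrenfest's theorem; it is not the paper's coarse-graining argument, but it shows where Newtonian equations sit inside quantum theory) reads:

```latex
\frac{d\langle\hat{x}\rangle}{dt} = \frac{\langle\hat{p}\rangle}{m},
\qquad
\frac{d\langle\hat{p}\rangle}{dt} = -\left\langle V'(\hat{x})\right\rangle
\approx -V'(\langle\hat{x}\rangle)
```

The final approximation holds for wave packets that remain narrow on the scale over which V' varies, which is roughly the level of description that coarse-grained measurements single out.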
Logical independence and quantum randomness
We propose a link between logical independence and quantum physics. We
demonstrate that quantum systems in the eigenstates of Pauli group operators
are capable of encoding mathematical axioms and show that Pauli group quantum
measurements are capable of revealing whether or not a given proposition is
logically dependent on the axiomatic system. Whenever a mathematical
proposition is logically independent of the axioms encoded in the measured
state, the measurement associated with the proposition gives random outcomes.
This allows for an experimental test of logical independence. Conversely, it
also allows for an explanation of the probabilities of random outcomes observed
in Pauli group measurements from logical independence without invoking quantum
theory. The axiomatic systems we study can be completed and are therefore not
subject to Gödel's incompleteness theorem.
Comment: 9 pages, 4 figures, published version plus additional experimental appendix
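The simplest instance of the scheme can be made concrete with a single qubit (a toy sketch of our own, borrowing the abstract's terminology; the paper's construction uses eigenstates of larger Pauli group operators):

```python
import numpy as np

# The state |0> encodes the one-bit "axiom" Z = +1.  A Pauli measurement
# whose outcome is fixed by that axiom (Z) is deterministic; a logically
# independent one (X) gives uniformly random outcomes.
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
psi = np.array([1, 0], dtype=complex)              # |0>, eigenstate of Z

def outcome_probs(pauli, state):
    """Probabilities of the +1/-1 outcomes of a projective Pauli measurement."""
    vals, vecs = np.linalg.eigh(pauli)
    p = np.abs(vecs.conj().T @ state) ** 2
    return dict(zip(np.round(vals).astype(int), np.round(p, 3)))

print("Z (dependent on the axiom):  ", outcome_probs(Z, psi))  # {-1: 0, 1: 1}
print("X (independent of the axiom):", outcome_probs(X, psi))  # {-1: 0.5, 1: 0.5}
```

Repeating the X measurement on many copies of the state yields the uniformly random bits that the abstract attributes to logical independence.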
Entanglement between smeared field operators in the Klein-Gordon vacuum
Quantum field theory is the application of quantum physics to fields. It
provides a theoretical framework widely used in particle physics and condensed
matter physics. One of the most distinct features of quantum physics with
respect to classical physics is entanglement or the existence of strong
correlations between subsystems that can even be spacelike separated. In
quantum fields, observables restricted to a region of space define a subsystem.
While there are proofs of the existence of local observables that would allow a violation of Bell's inequalities in the vacuum states of quantum fields, as well as some explicit but technically demanding schemes requiring extreme fine-tuning of the interaction between the fields and detectors, an experimentally accessible entanglement witness for quantum fields is still missing. Here we introduce smeared field operators that allow one to reduce the vacuum to a system of two effective bosonic modes. The introduction of such collective observables is motivated by the fact that no physical probe has access to the field at single (mathematical) points in space; any probe couples to the field smeared over a finite volume. We first give explicit collective observables whose
correlations reveal vacuum entanglement in the Klein-Gordon field. We then show
that the critical distance between the two regions of space above which two
effective bosonic modes become separable is of the order of the Compton
wavelength of the particle corresponding to the massive Klein-Gordon field.
Comment: 21 pages, 11 figures
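The smearing referred to above is the standard construction in which the field and its conjugate momentum are averaged against real test functions f and g supported in a finite region (schematic notation of our own; the paper's precise choice of test functions differs):

```latex
\hat{\Phi}(f) = \int d^3x\, f(\mathbf{x})\, \hat{\phi}(\mathbf{x}),
\qquad
\hat{\Pi}(g) = \int d^3x\, g(\mathbf{x})\, \hat{\pi}(\mathbf{x})
```

Choosing one such pair per region of space defines one effective bosonic mode per region; the separability statement then concerns the two-mode Gaussian state obtained by restricting the vacuum to these collective observables.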
The late flowering of invasive species contributes to the increase of Artemisia allergenic pollen in autumn: an analysis of 25 years of aerobiological data (1995–2019) in Trentino-Alto Adige (Northern Italy)
Artemisia pollen is an important aeroallergen in late summer, especially in central and eastern Europe, where distinct anemophilous Artemisia spp. produce large amounts of pollen grains. The study aims to: (i) analyze the temporal pattern of, and changes in, the Artemisia spp. pollen season; (ii) identify the Artemisia species responsible for the local airborne pollen load.
Daily pollen concentrations of Artemisia spp. were analyzed at two sites (BZ and SM) in Trentino-Alto Adige, Northern Italy, from 1995 to 2019.
The analysis of airborne Artemisia pollen concentrations reveals a bimodal curve with two peaks, in August and September, respectively. The magnitude of the peak concentrations varies across the studied time span at both sites: the maximum concentration at the September peak increases significantly at both the BZ (p < 0.05) and SM (p < 0.001) sites. The first peak in the pollen calendar is attributable to native Artemisia species, with A. vulgaris the most abundant; the second peak is mostly represented by the invasive species A. annua and A. verlotiorum (in constant proportion over the years), which have caused a considerable increase in pollen concentration in the late pollen season in recent years. The spread of these species can affect human health, increasing the length and severity of allergenic pollen exposure in autumn, as well as plant biodiversity in both natural and cultivated areas, with negative impacts on, e.g., Natura 2000 protected sites and crops.
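A minimal sketch of the kind of trend test reported above (our own version on synthetic data; the numbers and the use of an ordinary linear regression are illustrative, and the paper's actual statistical test may differ):

```python
import numpy as np
from scipy.stats import linregress

# For each year, take the maximum daily September Artemisia concentration
# and regress it on the year; a significant positive slope corresponds to
# the reported increase of the second (invasive-species) peak.
rng = np.random.default_rng(0)
years = np.arange(1995, 2020)
# synthetic September peak maxima (grains/m^3) with an upward trend + noise
sept_peak = 40 + 1.5 * (years - 1995) + rng.normal(0, 8, years.size)

res = linregress(years, sept_peak)
print(f"trend: {res.slope:+.2f} grains/m^3 per year, p = {res.pvalue:.4g}")
```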
Neural networks-based regularization for large-scale medical image reconstruction
In this paper we present a generalized Deep Learning-based approach for solving ill-posed large-scale inverse problems occurring in medical image reconstruction. Recently, Deep Learning methods using iterative neural networks (NNs) and cascaded NNs have been reported to achieve state-of-the-art results with respect to various quantitative quality measures such as PSNR, NRMSE and SSIM across different imaging modalities. However, because these approaches apply the forward and adjoint operators repeatedly within the network architecture, the network has to process whole images or volumes at once, which for some applications is computationally infeasible. In this work, we follow a different reconstruction strategy by strictly separating the application of the NN, the regularization of the solution and the consistency with the measured data. The regularization is given in the form of an image prior obtained as the output of a previously trained NN, which is used in a Tikhonov regularization framework. By doing so, more complex and sophisticated network architectures can be used for the removal of artefacts or noise than is usually the case in iterative NNs. Due to the large scale of the considered problems and the resulting computational complexity of the employed networks, the priors are obtained by processing the images or volumes as patches or slices. We evaluated the method for 3D cone-beam low-dose CT and undersampled 2D radial cine MRI and compared it to a total-variation-minimization-based reconstruction algorithm as well as to a method with regularization based on learned overcomplete dictionaries. The proposed method outperformed all the reported methods with respect to all chosen quantitative measures and further accelerates the regularization step in the reconstruction by several orders of magnitude.
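The strategy described amounts to using the network output x_NN as a Tikhonov prior: x* = argmin_x ||Ax - y||^2 + lam*||x - x_NN||^2, whose normal equations (A^T A + lam*I) x = A^T y + lam*x_NN can be solved by conjugate gradients without the network ever processing the full volume. A toy sketch of this step (our own; the random matrix A stands in for the CT/MRI forward operator and x_nn for the trained network's output):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Solve (A^T A + lam I) x = A^T y + lam x_nn with conjugate gradients.
rng = np.random.default_rng(0)
n, m_meas = 256, 128                       # image size, number of measurements
A = rng.normal(size=(m_meas, n)) / np.sqrt(m_meas)
x_true = rng.normal(size=n)
y = A @ x_true + 0.01 * rng.normal(size=m_meas)
x_nn = x_true + 0.05 * rng.normal(size=n)  # stand-in for the NN prior image
lam = 0.5

normal_op = LinearOperator((n, n), matvec=lambda v: A.T @ (A @ v) + lam * v)
rhs = A.T @ y + lam * x_nn
x_rec, info = cg(normal_op, rhs)
print("CG converged:", info == 0,
      " rel. error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```

Because the data-consistency solve only needs matrix-vector products with A and A^T, the NN prior can be computed patch- or slice-wise beforehand, which is what makes the approach tractable at scale.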