High-severity wildfire leads to multi-decadal impacts on soil biogeochemistry in mixed-conifer forests.
During the past century, systematic wildfire suppression has decreased fire frequency and increased fire severity in the western United States of America. While this has resulted in large ecological changes aboveground, such as altered tree species composition and increased forest density, little is known about the long-term, belowground implications of altered, ecologically novel fire regimes, especially on soil biological processes. To better understand the long-term implications of ecologically novel, high-severity fire, we used a 44-yr high-severity fire chronosequence in the Sierra Nevada, where forests were historically adapted to frequent, low-severity fire but were fire-suppressed for at least 70 yr. High-severity fire in the Sierra Nevada resulted in a long-term (44+ yr) decrease (>50%, P < 0.05) in soil extracellular enzyme activities, basal microbial respiration (56-72%, P < 0.05), and organic carbon (C) (>50%, P < 0.05) in the upper 5 cm compared to sites that had not been burned for at least 115 yr. However, nitrogen (N) processes were only affected in the most recent fire site (4 yr post-fire). Net nitrification increased by over 600% in the most recent fire site (P < 0.001), but returned to levels similar to the unburned control in the 13-yr site. Contrary to previous studies, we did not find a consistent effect of plant cover type on soil biogeochemical processes in mid-successional (10-50 yr) forest soils. Rather, the 44-yr reduction in soil organic C quantity correlated positively with dampened C cycling processes. Our results show the drastic and long-term implications of ecologically novel, high-severity fire on soil biogeochemistry and underscore the need for long-term fire ecology experiments.
Discrete Wigner functions and quantum computational speedup
In [Phys. Rev. A 70, 062101 (2004)] Gibbons et al. defined a class of
discrete Wigner functions W to represent quantum states in a finite Hilbert
space dimension d. I characterize a set C_d of states having non-negative W
simultaneously in all definitions of W in this class. For d<6 I show C_d is the
convex hull of stabilizer states. This supports the conjecture that negativity
of W is necessary for exponential speedup in pure-state quantum computation.
Comment: 7 pages, 2 figures, RevTeX. v2: clarified discussion on dynamics, added refs., published version.
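As context for the abstract above (the construction itself is given in the cited Gibbons et al. paper, not restated here), a minimal sketch of the phase-point-operator form these discrete Wigner functions take; the notation is a paraphrase, not a quotation:

  W_\rho(\alpha) = \frac{1}{d}\,\mathrm{Tr}\!\left[\rho\,\hat{A}(\alpha)\right],
  \qquad \sum_{\alpha} W_\rho(\alpha) = 1,
  \qquad \sum_{\alpha\in\lambda} W_\rho(\alpha) = \mathrm{Tr}\!\left[\rho\,P_\lambda\right],

where \alpha runs over the d x d points of the discrete phase space, the \hat{A}(\alpha) are phase-point operators fixed by a choice of "quantum net", \lambda is any line in that phase space, and P_\lambda is the projector assigned to it. "Negative W" then means W_\rho(\alpha) < 0 at some point; the set C_d consists of the states for which no such point exists under any allowed choice of \hat{A}.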
Realization of the Optimal Universal Quantum Entangler
We present the first experimental demonstration of the "optimal" and
"universal" quantum entangling process involving qubits encoded in the
polarization of single photons. The structure of the "quantum entangling
machine" consists of the quantum-injected optical parametric amplifier by
which the contextual realization of the 1->2 universal quantum cloning and of
the universal NOT (U-NOT) gate has also been achieved.
Comment: 10 pages, 3 figures, to appear in Physical Review
Quantum Preferred Frame: Does It Really Exist?
The idea of a preferred frame as a remedy for the difficulties of
relativistic quantum mechanics in describing non-local quantum phenomena was
taken up by such physicists as J. S. Bell and D. Bohm. The
possibility of the existence of a preferred frame was also seriously considered by
P. A. M. Dirac. In this paper, we propose an Einstein-Podolsky-Rosen-type
experiment for testing the possible existence of a quantum preferred frame. Our
analysis suggests that to verify whether a preferred frame of reference exists
in the quantum world, it is enough to perform an EPR-type experiment with a
pair of observers staying in the same inertial frame, using a massive EPR
pair of spin-one-half or spin-one particles.
Comment: 5 pages, 6 figures
From Einstein's Theorem to Bell's Theorem: A History of Quantum Nonlocality
In this Einstein Year of Physics it seems appropriate to look at an important
aspect of Einstein's work that is often down-played: his contribution to the
debate on the interpretation of quantum mechanics. Contrary to popular opinion,
Bohr had no defence against Einstein's 1935 attack (the EPR paper) on the
claimed completeness of orthodox quantum mechanics. I suggest that Einstein's
argument, as stated most clearly in 1946, could justly be called Einstein's
reality-locality-completeness theorem, since it proves that one of these three
must be false. Einstein's instinct was that completeness of orthodox quantum
mechanics was the falsehood, but he failed in his quest to find a more complete
theory that respected reality and locality. Einstein's theorem, and possibly
Einstein's failure, inspired John Bell in 1964 to prove his reality-locality
theorem. This strengthened Einstein's theorem (but showed the futility of his
quest) by demonstrating that either reality or locality is a falsehood. This
revealed the full nonlocality of the quantum world for the first time.
Comment: 18 pages. To be published in Contemporary Physics. (Minor changes; references and author info added.)
Fundamental Speed Limits on Quantum Coherence and Correlation Decay
The study and control of coherence in quantum systems is one of the most
exciting recent developments in physics. Quantum coherence plays a crucial role
in emerging quantum technologies as well as fundamental experiments. A major
obstacle to the utilization of quantum effects is decoherence, primarily in the
form of dephasing that destroys quantum coherence, and leads to effective
classical behaviour. We show that there are universal relationships governing
dephasing, which constrain the relative rates at which quantum correlations can
disappear. These effectively lead to speed limits which become especially
important in multipartite systems.
Exploring the extended density-dependent Skyrme effective forces for normal and isospin-rich nuclei to neutron stars
We parameterize the recently proposed generalized Skyrme effective force
(GSEF) containing extended density dependence. The parameters of the GSEF are
determined by the fit to several properties of the normal and isospin-rich
nuclei. We also include in our fit a realistic equation of state for the pure
neutron matter up to high densities so that the resulting Skyrme parameters can
be suitably used to model the neutron star with the "canonical" mass. For an appropriate comparison, we generate a parameter set for the
standard Skyrme effective force (SSEF) using exactly the same set of the data
as employed to determine the parameters of the GSEF. We find that the GSEF
yields larger values for the neutron skin thickness which are closer to the
recent predictions based on the isospin diffusion data. The Skyrme parameters
so obtained are employed to compute the strength function for the isoscalar
giant monopole, dipole and quadrupole resonances. It is found that in the case
of GSEF, due to the larger value of the nucleon effective mass, the values
of centroid energies for the isoscalar giant resonances are in better agreement
with the corresponding experimental data in comparison to those obtained using
the SSEF. We also present results for some of the key properties associated
with the neutron star of "canonical" mass and for the one with the maximum mass.
Comment: 45 pages, 16 figures
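The abstract does not spell out the functional form of the extended density dependence; purely as background, the density-dependent part of a standard Skyrme force is conventionally written as

  \frac{t_3}{6}\,(1 + x_3 P_\sigma)\,\rho^{\alpha}\!\left(\tfrac{\mathbf{r}_1+\mathbf{r}_2}{2}\right)\,\delta(\mathbf{r}_1-\mathbf{r}_2),

with P_\sigma the spin-exchange operator and t_3, x_3, \alpha fit parameters. A "generalized" force with extended density dependence is assumed here to augment this with further terms of the same type carrying different powers of the density; that reading of the GSEF is an assumption, not a statement taken from the paper.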
A New Measurement of Cosmic Ray Composition at the Knee
The Dual Imaging Cerenkov Experiment (DICE) was designed and operated for
making elemental composition measurements of cosmic rays near the knee of the
spectrum at several PeV. Here we present the first results using this
experiment from the measurement of the average location of the depth of shower
maximum, X_max, in the atmosphere as a function of particle energy. The value
of X_max near the instrument threshold of ~0.1 PeV is consistent with
expectations from previous direct measurements. At higher energies there is
little change in composition up to ~5 PeV. Above this energy X_max is deeper
than expected for a constant elemental composition implying the overall
elemental composition is becoming lighter above the knee region. These results
disagree with the idea that cosmic rays should become on average heavier above
the knee. Instead they suggest a transition to a qualitatively different
population of particles above 5 PeV.
Comment: 7 pages, LaTeX, two eps figures, aas2pp4.sty and epsf.sty included, accepted by Ap.J. Lett.
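The step from a deeper X_max to a lighter composition rests on standard air-shower phenomenology rather than on anything derived in the abstract; a rough sketch, assuming the usual superposition picture in which a nucleus of mass A and energy E develops like A proton showers of energy E/A:

  \langle X_{\max}\rangle(E, A) \approx \langle X_{\max}^{\,p}\rangle(E/A) \approx X_0 + D\,\log_{10}(E/A),

where D is the (hadronic-model-dependent) elongation rate and X_0 a normalization. At fixed energy, a heavier primary reaches shower maximum higher in the atmosphere (smaller X_max), so an X_max deeper than expected for a fixed mixture points to a lighter average composition.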
Perfect Test of Entanglement for Two-level Systems
A 3-setting Bell-type inequality enforced by the indeterminacy relation of
complementary local observables is proposed as an experimental test of the
2-qubit entanglement. The proposed inequality has the advantage of being a
necessary and sufficient criterion of separability; therefore, no entangled
2-qubit state can escape detection by this kind of test. It turns out
that the orientation of the local testing observables plays a crucial role in
our perfect detection of entanglement.
Comment: 4 pages, RevTeX
Partial Core Transformer for Energization of High Voltage Arc-Signs
A high voltage partial core resonating transformer has been designed and constructed such that its magnetising reactance is matched to the capacitive reactance of an arc-sign, so that the magnetising current cancels the reactive current drawn by the sign. The supply only provides the real power losses of the transformer plus any reactive power mismatch between the magnetising reactance and the capacitance of the arc-sign. A mathematical model of the transformer is developed using a reverse design modelling technique. The model is then used to design a 50 Hz, 8 kVA, 230 V/80 kV partial core transformer to meet the required electrical demand of the load. The transformer was constructed and tested. It successfully resonated with the load and provided 68 VAr of compensation when operating at 10 kV while being supplied from a domestic 230 V, 10 A power outlet. The completed transformer has a finished weight of 69 kg and has been successfully used for powering an arc-sign at an exhibition of electric sculptures.
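A minimal numerical sketch of the matching condition described above, in Python; the arc-sign capacitance is a hypothetical placeholder chosen only so that the compensated reactive power lands near the quoted 68 VAr, and none of the component values come from the paper:

  import math

  # Sketch of the resonance-matching idea: choose the magnetising inductance so
  # that its reactance equals the capacitive reactance of the arc-sign load.
  f = 50.0          # supply frequency in Hz (from the abstract)
  V_out = 10e3      # operating voltage in volts (from the abstract)
  C_sign = 2.2e-9   # assumed arc-sign capacitance in farads (hypothetical)

  omega = 2 * math.pi * f
  # Parallel resonance: omega * L_m = 1 / (omega * C_sign)
  L_m = 1.0 / (omega**2 * C_sign)
  # Reactive power drawn by the capacitive load at V_out; at resonance the
  # magnetising branch supplies (compensates) the same amount.
  Q_cap = V_out**2 * omega * C_sign

  print(f"Required magnetising inductance: {L_m:.0f} H")
  print(f"Reactive power compensated at {V_out/1e3:.0f} kV: {Q_cap:.0f} VAr")

With these placeholder numbers the compensation works out to roughly 69 VAr, the same order as the 68 VAr reported; any remaining mismatch between the magnetising reactance and the load capacitance is what the 230 V supply has to make up.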
