Fundamental Speed Limits on Quantum Coherence and Correlation Decay
The study and control of coherence in quantum systems is one of the most
exciting recent developments in physics. Quantum coherence plays a crucial role
in emerging quantum technologies as well as fundamental experiments. A major
obstacle to the utilization of quantum effects is decoherence, primarily in the
form of dephasing, which destroys quantum coherence and leads to effectively
classical behaviour. We show that there are universal relationships governing
dephasing, which constrain the relative rates at which quantum correlations can
disappear. These relationships effectively impose speed limits that become
especially important in multi-partite systems.
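As a concrete illustration of the setting (a standard single-qubit textbook case, not the paper's general result): under pure dephasing at rate \gamma, populations are preserved while off-diagonal coherences decay,

    \rho(t) = \begin{pmatrix} \rho_{00} & \rho_{01}\, e^{-\gamma t} \\ \rho_{10}\, e^{-\gamma t} & \rho_{11} \end{pmatrix},

and for two qubits dephasing independently at rate \gamma each, a joint coherence such as \langle 00|\rho|11\rangle decays at the combined rate 2\gamma. Constraints between such relative rates are what the universal relationships above generalize to multi-partite systems.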
Perfect Test of Entanglement for Two-level Systems
A 3-setting Bell-type inequality enforced by the indeterminacy relation of
complementary local observables is proposed as an experimental test of 2-qubit
entanglement. The proposed inequality has the advantage of being a necessary
and sufficient criterion for separability; therefore no entangled 2-qubit state
can escape detection by tests of this kind. It turns out that the orientation
of the local testing observables plays a crucial role in this perfect detection
of entanglement.
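For comparison (the standard 2-setting case, not the 3-setting inequality proposed here): the CHSH inequality bounds local-realistic correlations of settings a, a' and b, b' by

    S = E(a,b) + E(a,b') + E(a',b) - E(a',b') \le 2,

while quantum mechanics allows S up to 2\sqrt{2}. Unlike the proposed test, violation of CHSH is sufficient but not necessary for 2-qubit entanglement, so some entangled states escape detection by it.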
High-severity wildfire leads to multi-decadal impacts on soil biogeochemistry in mixed-conifer forests.
During the past century, systematic wildfire suppression has decreased fire frequency and increased fire severity in the western United States of America. While this has resulted in large ecological changes aboveground, such as altered tree species composition and increased forest density, little is known about the long-term, belowground implications of altered, ecologically novel fire regimes, especially for soil biological processes. To better understand the long-term implications of ecologically novel, high-severity fire, we used a 44-yr high-severity fire chronosequence in the Sierra Nevada, where forests were historically adapted to frequent, low-severity fire but had been fire suppressed for at least 70 yr. High-severity fire in the Sierra Nevada resulted in a long-term (44+ yr) decrease (>50%, P < 0.05) in soil extracellular enzyme activities, basal microbial respiration (56-72%, P < 0.05), and organic carbon (C; >50%, P < 0.05) in the upper 5 cm compared to sites that had not been burned for at least 115 yr. However, nitrogen (N) processes were only affected in the most recent fire site (4 yr post-fire). Net nitrification increased by over 600% in the most recent fire site (P < 0.001), but returned to levels similar to the unburned control in the 13-yr site. Contrary to previous studies, we did not find a consistent effect of plant cover type on soil biogeochemical processes in mid-successional (10-50 yr) forest soils. Rather, the 44-yr reduction in soil organic C quantity correlated positively with dampened C cycling processes. Our results show the drastic and long-term implications of ecologically novel, high-severity fire on soil biogeochemistry and underscore the need for long-term fire ecology experiments.
Discrete Wigner functions and quantum computational speedup
In [Phys. Rev. A 70, 062101 (2004)] Gibbons et al. defined a class of
discrete Wigner functions W to represent quantum states in a Hilbert space of
finite dimension d. I characterize the set C_d of states having non-negative W
simultaneously in all definitions of W in this class. For d<6 I show that C_d is the
convex hull of stabilizer states. This supports the conjecture that negativity
of W is necessary for exponential speedup in pure-state quantum computation.
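For orientation, a sketch of the common form of such constructions (standard definitions, not specific to any one member of the class): a discrete Wigner function assigns to each point \alpha of a d x d phase space the value

    W(\alpha) = \frac{1}{d}\,\mathrm{Tr}[\rho\, A(\alpha)],

where the A(\alpha) are Hermitian phase-point operators fixed by the chosen definition. A state then belongs to C_d precisely when W(\alpha) \ge 0 for every \alpha under every admissible choice of \{A(\alpha)\}.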
Analysis of surface moisture variations within large field sites
A statistical analysis was made of ground soils to define the general relationship and ranges of values of the field moisture relative to both the variance and the coefficient of variation for a given test site and depth increment. The results of the variability study show that: (1) moisture variations within any given large field area are inherent and can neither be controlled nor reduced; (2) neither a single value of the standard deviation nor the coefficient of variation uniquely defines the variability over the complete range of mean field moisture contents examined; and (3) using an upper-bound standard deviation parameter clearly defines the maximum range of anticipated moisture variability. For these intensively sampled fields, 87 percent of all large-field moisture content standard deviations were less than 3 percent, and about 96 percent of all the computed values fell below an upper bound of sigma = 4 percent. The limit-of-accuracy curves of mean soil moisture measurements for large field sites relative to the required number of samples were also determined.
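A minimal sketch of the kind of calculation behind such limit-of-accuracy curves (illustrative only: the sample values, the 95 percent confidence level, and the variable names are assumptions, not taken from the study):

    import math
    import statistics

    # Gravimetric moisture contents (percent) sampled across one field site
    samples = [18.2, 21.5, 19.8, 23.1, 20.4, 17.9, 22.6, 19.3]

    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)   # sample standard deviation
    cv = 100.0 * sd / mean           # coefficient of variation, percent

    # Samples needed to estimate the field mean to within +/- E percent
    # moisture at ~95% confidence (z = 1.96), using the upper-bound
    # standard deviation sigma = 4 percent quoted above.
    z, sigma_upper, E = 1.96, 4.0, 1.0
    n_required = math.ceil((z * sigma_upper / E) ** 2)

    print(f"mean = {mean:.1f}%, sd = {sd:.2f}%, CV = {cv:.1f}%")
    print(f"samples for +/-{E}% accuracy: {n_required}")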
Macroscopic modelling of the surface tension of polymer-surfactant systems
Polymer-surfactant mixtures are increasingly being used in a wide range of applications. Weakly-interacting systems, such as SDS/PEO and SDS/PVP, comprise ionic surfactants and neutral polymers, while strongly-interacting systems, such as SDS/POLYDMDAAC and C12TAB/NaPSS, comprise ionic surfactants and oppositely charged ionic polymers. The complex nature of interactions in the mixtures leads to interesting and surprising surface tension profiles as the concentrations of polymer and surfactant are varied. The purpose of our research has been to develop a model to explain these surface tension profiles and to understand how they relate to the formation of different complexes in the bulk solution. In this paper we show how an existing model based on the law of mass action can be extended to model the surface tension of weakly-interacting systems, and we extend it further to produce a model for the surface tension of strongly-interacting systems. Applying the model to a variety of strongly-interacting systems gives remarkable agreement with the experimental results. The model provides a sound theoretical basis for comparing and contrasting the behaviour of different systems and greatly enhances our understanding of the features observed.
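A minimal sketch of the mass-action starting point (the stoichiometry n and the notation are illustrative, not the paper's full model): if free surfactant S binds cooperatively to polymer P,

    P + nS \rightleftharpoons PS_n, \qquad K = \frac{[PS_n]}{[P]\,[S]^n},

then the equilibrium fixes the free monomer concentration [S], and the surface tension is determined by [S] (rather than by the total surfactant added) once complexation and micellisation are accounted for.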
From Einstein's Theorem to Bell's Theorem: A History of Quantum Nonlocality
In this Einstein Year of Physics it seems appropriate to look at an important
aspect of Einstein's work that is often down-played: his contribution to the
debate on the interpretation of quantum mechanics. Contrary to popular opinion,
Bohr had no defence against Einstein's 1935 attack (the EPR paper) on the
claimed completeness of orthodox quantum mechanics. I suggest that Einstein's
argument, as stated most clearly in 1946, could justly be called Einstein's
reality-locality-completeness theorem, since it proves that one of these three
must be false. Einstein's instinct was that completeness of orthodox quantum
mechanics was the falsehood, but he failed in his quest to find a more complete
theory that respected reality and locality. Einstein's theorem, and possibly
Einstein's failure, inspired John Bell in 1964 to prove his reality-locality
theorem. This strengthened Einstein's theorem (but showed the futility of his
quest) by demonstrating that either reality or locality is a falsehood. This
revealed the full nonlocality of the quantum world for the first time.
Realization of the Optimal Universal Quantum Entangler
We present the first experimental demonstration of the "optimal" and
"universal" quantum entangling process involving qubits encoded in the
polarization of single photons. The structure of the "quantum entangling
machine" consists of the quantum-injected optical parametric amplifier by
which the contextual realization of the 1->2 universal quantum cloning and of
the universal NOT (U-NOT) gate has also been achieved.
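For reference, the optimality benchmarks involved are well established: a universal 1->2 quantum cloner can copy an unknown qubit with fidelity at most

    F_{\mathrm{clone}} = \frac{5}{6} \approx 0.833,

and the optimal universal NOT gate approximates the orthogonal of an unknown qubit with fidelity F = 2/3.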
Risk and reliability assessment of future power systems
Liberalisation of electricity markets, changing patterns in the generation and use of electricity, and new technologies are some of the factors that result in increased uncertainty about the future operating requirements of an electric power system. In this context, planning for future investments in a power system requires careful consideration of risk and reliability, and of the metrics with which these are measured. This paper highlights the need to consider a broader class of approaches to risk and reliability that have hitherto tended not to be an explicit part of the system development process in the electricity industry. We discuss a high-level conceptual model that shows sources of uncertainty and modes of control for system operators and planners and offers a broad-brush approach to highlighting risks at the planning stage. We argue that there is a need for new risk-informed criteria to help evaluate the necessary investments in electricity transmission systems. We further argue that the risk models developed for this purpose need to take better account of overall societal impact than is captured by traditional measures such as loss of load probability and loss of load expectation; societal impact should take account of the frequencies of events with different levels of consequences, distinguishing, for example, between multiple small events and a single large event. This leads to discussion of a “disutility criterion”, previously studied in a health and safety context, which distinguishes between risk aversion and disaster aversion. This approach is new in the context of power systems.
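A minimal sketch of the traditional measures mentioned above, which the paper argues are insufficient on their own (the generator data, single demand level, and two-state outage model are illustrative assumptions):

    import itertools

    # Generating units: (capacity in MW, forced outage rate)
    units = [(400, 0.05), (400, 0.05), (300, 0.08)]
    demand = 900.0  # MW, a single demand level for simplicity

    # Enumerate all up/down combinations (a capacity outage table)
    lolp = 0.0
    for state in itertools.product([True, False], repeat=len(units)):
        prob, cap = 1.0, 0.0
        for (c, rate), up in zip(units, state):
            prob *= (1.0 - rate) if up else rate
            cap += c if up else 0.0
        if cap < demand:
            lolp += prob  # probability that available capacity misses demand

    lole = lolp * 8760.0  # expected hours of shortfall per year
    print(f"LOLP = {lolp:.4f}, LOLE = {lole:.1f} h/yr")

Note that these indices say nothing about the size of each shortfall, which is precisely the gap a disutility-style criterion is intended to close.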
OATS: Optimisation and Analysis Toolbox for power Systems
The Optimisation and Analysis Toolbox for power Systems (OATS) is an open-source simulation tool for steady-state analyses of power systems problems, distributed under the GNU General Public License (GPLv3). It contains implementations of classical steady-state problems, e.g. load flow, optimal power flow (OPF) and unit commitment, as well as enhancements to these classical models relative to the features available in widely used open-source tools. Enhancements implemented in the current release of OATS include: a model of voltage-regulating on-load tap-changing transformers; load shedding in OPF; allowing a user to build a contingency list in the security-constrained OPF analysis; implementation of a distributed slack bus; and the ability to model zonal transfer limits in unit commitment. The mathematical optimisation models are written in an open-source algebraic modelling language, which offers high-level symbolic syntax for describing optimisation problems. The flexibility offered by OATS makes it an ideal tool for teaching and academic research. This paper presents novel aspects of OATS and discusses, through demonstrative examples, how OATS can be extended to new problem classes in the area of steady-state power systems analysis.
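As a generic illustration of the algebraic-modelling style on which OATS builds (a sketch only: this is not the OATS API, and Pyomo, the toy data, and the GLPK solver are assumptions), a two-zone economic dispatch with a zonal transfer limit might read:

    from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                               NonNegativeReals, SolverFactory, minimize)

    m = ConcreteModel()
    m.g1 = Var(domain=NonNegativeReals, bounds=(0, 600))  # zone-1 generation, MW
    m.g2 = Var(domain=NonNegativeReals, bounds=(0, 500))  # zone-2 generation, MW
    m.flow = Var(bounds=(-200, 200))  # zone 1 -> zone 2 zonal transfer limit

    d1, d2 = 300.0, 450.0  # zonal demands, MW
    c1, c2 = 20.0, 35.0    # marginal costs per MWh

    m.balance1 = Constraint(expr=m.g1 - m.flow == d1)  # zone-1 power balance
    m.balance2 = Constraint(expr=m.g2 + m.flow == d2)  # zone-2 power balance
    m.cost = Objective(expr=c1 * m.g1 + c2 * m.g2, sense=minimize)

    SolverFactory('glpk').solve(m)  # assumes GLPK is installed
    print(f"g1 = {m.g1():.0f} MW, g2 = {m.g2():.0f} MW, flow = {m.flow():.0f} MW")

The point is the high-level symbolic syntax: balance constraints and the objective are stated as algebra, which is what makes extensions such as contingency lists or transfer limits straightforward to add.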