A Supercooled Spin Liquid State in the Frustrated Pyrochlore Dy2Ti2O7
A "supercooled" liquid develops when a fluid does not crystallize upon
cooling below its ordering temperature. Instead, the microscopic relaxation
times diverge so rapidly that, upon further cooling, equilibration eventually
becomes impossible and glass formation occurs. Classic supercooled liquids
exhibit specific identifiers including microscopic relaxation times diverging
on a Vogel-Tammann-Fulcher (VTF) trajectory, a Havriliak-Negami (HN) form for
the dielectric function, and a general Kohlrausch-Williams-Watts (KWW) form for
time-domain relaxation. Recently, the pyrochlore Dy2Ti2O7 has become of
interest because its frustrated magnetic interactions may, in theory, lead to
highly exotic magnetic fluids. However, its true magnetic state at low
temperatures has proven very difficult to identify unambiguously. Here we
introduce high-precision, boundary-free magnetization transport techniques
based upon toroidal geometries and gain a fundamentally new understanding of
the time- and frequency-dependent magnetization dynamics of Dy2Ti2O7. We
demonstrate a virtually universal HN form for the magnetic susceptibility, a
general KWW form for the real-time magnetic relaxation, and a divergence of the
microscopic magnetic relaxation times along precisely the VTF trajectory. Low-temperature Dy2Ti2O7 therefore exhibits the characteristics of a supercooled
magnetic liquid; the consequent implication is that this translationally
invariant lattice of strongly correlated spins is evolving towards an
unprecedented magnetic glass state, perhaps due to many-body localization of
spin.
Comment: Version 2 updates: added legend for data in Figures 4A and 4B; corrected equation reference in caption for Figure 4.
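For reference, the standard forms of the three signatures invoked above (textbook definitions, not reproduced from the paper itself) are

  \tau(T) = \tau_0 \exp\!\left[ \frac{D\,T_0}{T - T_0} \right]                                        % Vogel-Tammann-Fulcher (VTF) relaxation time
  \chi(\omega) = \chi_\infty + \frac{\Delta\chi}{\left[ 1 + (i\omega\tau)^{\alpha} \right]^{\gamma}}  % Havriliak-Negami (HN) susceptibility
  \Phi(t) = \exp\!\left[ -(t/\tau)^{\beta} \right]                                                    % Kohlrausch-Williams-Watts (KWW) relaxation

where T_0 is the temperature at which \tau formally diverges, 0 < \alpha, \gamma <= 1 are the HN shape exponents, and 0 < \beta <= 1 is the KWW stretching exponent; the simple Debye and exponential limits are recovered for \alpha = \gamma = 1 and \beta = 1.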
Enhanced and continuous electrostatic carrier doping on the SrTiO3 surface
Paraelectric tuning of a charge carrier density as high as 10^14 cm^-2, in the presence of a high electronic carrier mobility, on the delicate surfaces of correlated oxides is a key to the technological breakthrough of a field-effect transistor (FET) utilising the metal-nonmetal transition. Here we introduce the Parylene-C/Ta2O5 hybrid gate insulator and fabricate FET devices on single-crystalline SrTiO3, which has been regarded as a bedrock material for oxide electronics. The gate insulator accumulates up to 10^14 cm^-2 carriers, while the field-effect mobility is kept at 10 cm^2/Vs even at room temperature. Beyond the exceptional performance of our devices, the enhanced compatibility of high carrier density and high mobility reveals the mechanism behind a long-standing puzzle, the distribution of electrostatically doped carriers on the SrTiO3 surface: the formation and continuous evolution of field domains and current filaments.
Comment: Supplementary Information: http://www.nature.com/srep/2013/130424/srep01721/extref/srep01721-s1.pdf
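As a rough consistency check on the densities quoted above (a back-of-the-envelope estimate with assumed, not reported, parameters), the sheet carrier density accumulated by a gate insulator follows the parallel-plate relation

  n_{2D} = \frac{C_i V_g}{e} = \frac{\varepsilon_0 \varepsilon_r}{d} \cdot \frac{V_g}{e}.

Taking, for illustration, \varepsilon_r ≈ 25 (typical of Ta2O5), d = 100 nm and V_g = 10 V gives C_i ≈ 2.2 x 10^-7 F/cm^2 and n_{2D} ≈ 1.4 x 10^13 cm^-2, which shows why a thin, high-permittivity hybrid insulator is needed to push the accumulated density towards the 10^14 cm^-2 range.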
Statistical mechanics of budget-constrained auctions
Finding the optimal assignment in budget-constrained auctions is a
combinatorial optimization problem with many important applications, a notable
example being the sale of advertisement space by search engines (in this
context the problem is often referred to as the off-line AdWords problem).
Based on the cavity method of statistical mechanics, we introduce a message-passing algorithm capable of efficiently solving random instances of the problem drawn from a natural distribution, and we derive the phase diagram of the problem from its properties. As the control parameter (the average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.
Comment: Minor revision
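For concreteness, a standard statement of the off-line AdWords problem referred to above (our notation; not necessarily the exact random ensemble studied in the paper) is the integer program

  \max_{x_{qa} \in \{0,1\}} \; \sum_{a} \min\!\left( B_a,\; \sum_{q} b_{qa}\, x_{qa} \right)
  \quad \text{subject to} \quad \sum_{a} x_{qa} \le 1 \;\; \forall q,

where b_{qa} >= 0 is advertiser a's bid for query q, B_a is advertiser a's budget, and x_{qa} = 1 if query q is assigned to advertiser a. The cap at B_a is what couples the otherwise independent assignment variables and makes the problem non-trivial; the average value of the budgets plays the role of the control parameter mentioned above.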
Nonfatal reinfarction as an independent risk factor for subsequent mortality in post-myocardial infarction patients
Pericas, Enric; Ordóñez, Estrell
Global Ultrasound Elastography Using Convolutional Neural Network
Displacement estimation is crucial in ultrasound elastography, and failing to estimate displacement correctly results in a failure to generate strain images. Because conventional ultrasound elastography techniques suffer from decorrelation noise, they are prone to failure when estimating displacement between echo signals obtained during tissue distortion. This study proposes a novel elastography technique that addresses this decorrelation in estimating the displacement field. We call our method GLUENet (GLobal Ultrasound Elastography Network); it uses a deep Convolutional Neural Network (CNN) to obtain a coarse time-delay estimate between two ultrasound images. This coarse displacement is then used to formulate a nonlinear cost function that incorporates the similarity of RF-data intensity and prior information about the estimated displacement. By optimizing this cost function, we calculate a finer displacement, exploiting the information from all samples of the RF data simultaneously. The contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of the strain images from our technique are very close to those of the strain images from GLUE. While most elastography algorithms are sensitive to parameter tuning, our algorithm is substantially less sensitive to parameter tuning.
Comment: 4 pages, 4 figures; added acknowledgment section, submission type late
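As an illustration of the coarse-to-fine idea described in the abstract, the sketch below (a minimal example with hypothetical function and parameter names, not the authors' GLUE/GLUENet code) refines a coarse displacement estimate, such as one produced by a CNN, by minimizing a cost that combines RF-data similarity, closeness to the coarse estimate, and spatial continuity:

import numpy as np
from scipy.optimize import minimize

def refine_displacement(rf_pre, rf_post, d_coarse, alpha=0.1, beta=0.1):
    """Refine a coarse axial displacement field (in samples) for one RF line.

    rf_pre, rf_post : 1-D arrays of pre- and post-deformation RF samples.
    d_coarse        : 1-D array, coarse per-sample displacement (e.g. from a CNN).
    """
    n = len(rf_pre)
    grid = np.arange(n, dtype=float)

    def cost(d):
        warped = np.interp(grid + d, grid, rf_post)   # resample post-RF at displaced positions
        data = np.sum((rf_pre - warped) ** 2)         # RF-intensity similarity term
        prior = alpha * np.sum((d - d_coarse) ** 2)   # stay close to the coarse (CNN) estimate
        smooth = beta * np.sum(np.diff(d) ** 2)       # encourage a spatially continuous field
        return data + prior + smooth

    return minimize(cost, d_coarse, method="L-BFGS-B").x

# Toy usage: recover a constant 2-sample shift from an imperfect coarse guess.
rng = np.random.default_rng(0)
rf_pre = rng.standard_normal(256)
grid = np.arange(256, dtype=float)
rf_post = np.interp(grid - 2.0, grid, rf_pre)   # post-RF is pre-RF shifted by 2 samples
d = refine_displacement(rf_pre, rf_post, np.full(256, 1.5))
print("mean refined displacement:", d.mean())

The prior term is what makes such a refinement robust to decorrelation noise: wherever the data term is unreliable, the solution falls back towards the coarse estimate, while the continuity term plays the role of the spatial regularization used in global methods such as GLUE.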
Managing Climate Risk
At the heart of the traditional approach to strategy in the climate change dilemma lies the assumption that the global community, by applying a set of powerful analytical tools, can predict the future of climate change accurately enough to choose a clear strategic direction for it. We claim that this approach might involve underestimating uncertainty in order to lay out a vision of future events precise enough to be captured in a discounted cost flow analysis in integrated assessment models. However, since the future of climate change is truly uncertain, this approach might at best be marginally helpful and at worst downright dangerous: underestimating uncertainty can lead to strategies that do not defend the world against unexpected and sometimes even catastrophic threats. Another danger lies at the other extreme: if the global community cannot find a strategy that works under traditional analysis, or if uncertainties are so large that clear messages are absent, it may abandon the analytical rigor of its planning process altogether and base its decisions on instinct and on whatever consensus about the future is easiest to agree upon.
In this paper, we try to outline a system to derive strategic decisions under uncertainty for the climate change dilemma. What follows is a framework for determining the level of uncertainty surrounding strategic decisions and for tailoring strategy to that uncertainty.
Our core argument is that a robust strategy towards climate change involves building a technological portfolio of mitigation and adaptation measures that includes sufficient technological positions opposite to the underlying baseline emission scenarios, given the uncertainties of the entire physical and socioeconomic system in place. In the case of mitigation, the opposite technological positions with the highest leverage are particular types of sinks. A robust climate risk management portfolio can only work when the opposite technological positions are readily available when needed, and they therefore have to be prepared in advance. It is precisely the flexibility of these technological options that has to be quantified in view of the uncertain nature of the underlying system and compared to the cost of creating these options, rather than comparing their cost with expected losses in a net-present-value type of analysis. We conclude that climate policy - especially under consideration of the precautionary principle - would look very different if uncertainties were taken explicitly into account.
Why Are People's Decisions Sometimes Worse with Computer Support?
In many applications of computerised decision support, a recognised source of undesired outcomes is operators' apparent over-reliance on automation. For instance, an operator may fail to react to a potentially dangerous situation because a computer fails to generate an alarm. However, the very use of terms like "over-reliance" betrays possible misunderstandings of these phenomena and their causes, which may lead to ineffective corrective action (e.g. training or procedures that do not counteract all the causes of the apparently "over-reliant" behaviour). We review relevant literature in the area of "automation bias" and describe the diverse mechanisms that may be involved in human errors when using computer support. We discuss these mechanisms, with reference to errors of omission when using "alerting systems", with the help of examples of novel counterintuitive findings we obtained from a case study in a health care application, as well as other examples from the literature.
