Steady-state dynamics and effective temperatures of quantum criticality in an open system
We study the thermal and non-thermal steady-state scaling functions and the
steady-state dynamics of a model of local quantum criticality. The model we
consider, i.e. the pseudogap Kondo model, allows us to study the concept of
effective temperatures near fully interacting as well as weak-coupling fixed
points. In the vicinity of each fixed point we establish the existence of an
effective temperature, different at each fixed point, such that the
equilibrium fluctuation-dissipation theorem is recovered. Most notably,
steady-state scaling functions in terms of the effective temperatures coincide
with the equilibrium scaling functions. This result extends to higher
correlation functions as is explicitly demonstrated for the Kondo singlet
strength. The non-linear charge transport is also studied and analyzed in terms
of the effective temperature.
Comment: 5 pages, 4 figures; Supplementary Material (7 pages, 1 figure)
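For orientation, the effective temperature referred to here is conventionally the temperature at which the equilibrium fluctuation-dissipation theorem is restored in the steady state. In standard notation (a generic statement, not the paper's specific observables), the symmetrized correlation function C(ω) and the dissipative part of the response χ''(ω) of a given operator obey

\[ C(\omega) \;=\; \coth\!\left(\frac{\omega}{2\,T_{\mathrm{eff}}}\right)\,\chi''(\omega), \qquad \hbar = k_B = 1, \]

so T_eff is read off from the ratio of correlation to response. The abstract's claim is that near each fixed point a single such T_eff (generally different at different fixed points) makes this relation, and the corresponding scaling functions, coincide with their equilibrium forms.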
Pinatubo eruption winter climate effects: Model versus observations
Large volcanic eruptions, in addition to the well-known effect of producing global cooling for a year or two, have been observed to produce shorter-term responses in the climate system involving non-linear dynamical processes. In this paper, we use the ECHAM2 general circulation model forced with stratospheric aerosols to test some of these ideas. Run in a perpetual-January mode, with tropical stratospheric heating from volcanic aerosols typical of the 1982 El Chichon eruption or the 1991 Pinatubo eruption, we find a dynamical response with a strengthened polar night jet in the Northern Hemisphere (NH) and stronger zonal winds that extend down into the troposphere. The Azores High shifts northward, with increased tropospheric westerlies at 60N and increased easterlies at 30N. Surface temperatures are higher in both northern Eurasia and North America, in agreement with observations for the NH winters of 1982-83 and 1991-92, as well as the winters following the other 10 largest volcanic eruptions since 1883.
Coherence of femtosecond single electrons exceeds biomolecular dimensions
Time-resolved diffraction and microscopy with femtosecond electron pulses provide four-dimensional recordings of atomic motion in space and time. However, the limited coherence of electron pulses, reported in the range of 2–3 nm, has so far prevented the study of complex organic molecules with relevance to chemistry and biology. Here we characterize the coherence of femtosecond single-electron pulses that are generated by laser photoemission. We show how the absence of space charge and the minimization of the source size allow the transverse coherence to be extended to 20 nm at the sample position while maintaining a useful beam diameter. The extraordinary coherence is experimentally demonstrated by recording single-electron diffraction snapshots from a complex organic molecular crystal and identifying more than 80 sharp Bragg reflections. Further optimization holds promise for coherences of 100 nm. These advances will allow time-resolved imaging of functional dynamics in biological systems, uniting picometre and femtosecond resolutions in compact, table-top instrumentation.
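As a rough, generic estimate (not a formula quoted from this abstract), the transverse coherence of a partially coherent electron source observed at a distance L follows a van Cittert-Zernike-type relation,

\[ \xi_\perp \;\sim\; \frac{\lambda_{\mathrm{dB}}\, L}{2\pi\, \sigma_{\mathrm{src}}}, \]

with λ_dB the electron de Broglie wavelength and σ_src the effective source radius. This is why suppressing space charge (single electrons per pulse) and minimizing the source size are the two levers used to push ξ_⊥ from a few nanometres toward the ~20 nm reported at the sample.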
Generalized Parton Distributions of ^3He
A realistic microscopic calculation of the unpolarized quark Generalized Parton Distribution (GPD) of the nucleus ^3He is presented. In Impulse Approximation, the nuclear GPD is obtained as a convolution between the GPD of the internal nucleon and the non-diagonal spectral function, properly describing Fermi motion and binding effects. The proposed scheme is valid at low values of the momentum transfer to the target, the most relevant kinematical region for the coherent channel of hard exclusive processes. The obtained formula has the correct forward limit, corresponding to the standard deep inelastic nuclear parton distributions, and the correct first moment, giving the charge form factor of ^3He. Nuclear effects, evaluated with a modern realistic potential, are found to be larger than in the forward case. In particular, they increase with increasing momentum transfer when the asymmetry of the process is kept fixed, and they increase with the asymmetry at fixed momentum transfer. Another relevant feature of the obtained results is that the nuclear GPD cannot be factorized into a term depending on the momentum transfer and a term independent of it, as suggested in prescriptions proposed for finite nuclei. The size of nuclear effects reaches 8% even in the most important part of the kinematical range under scrutiny. The relevance of the obtained results to studying the feasibility of experiments is addressed.
Comment: 23 pages, 8 figures; Discussion in Section II enlarged; discussion in Section IV shortened. Final version accepted by Phys. Rev.
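In impulse-approximation treatments of this kind, the nuclear GPD is built schematically as a convolution of a nucleon light-cone distribution in the nucleus, derived from the non-diagonal spectral function, with the GPD of the bound nucleon. A generic sketch (illustrative notation, not the paper's exact formula) reads

\[ H_q^{A}(x,\Delta^2,Q^2) \;\simeq\; \sum_{N} \int_x^{1} \frac{dz}{z}\; h_N^{A}(z,\Delta^2)\; H_q^{N}\!\left(\frac{x}{z},\Delta^2,Q^2\right), \]

where h_N^A encodes Fermi motion and binding. In the forward limit it reduces to the usual light-cone momentum distribution, recovering the standard nuclear parton distributions mentioned above, and its lowest moment is tied to the nuclear charge form factor.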
Double Ionization of Helium by Highly-Charged-Ion Impact Analyzed within the Frozen-Correlation Approximation
We apply the frozen-correlation approximation (FCA) to analyze double ionization of helium by energetic highly charged ions. In this model the double ionization amplitude is represented in terms of single ionization amplitudes, which we evaluate within the continuum distorted wave-eikonal initial state (CDW-EIS) approach. Correlation effects are incorporated in the initial and final states but are neglected during the collision itself. We implement the FCA using the Monte Carlo event generator technique, which allows us to generate theoretical event files and to compare theory and experiment using the same analysis tools. The comparison with previous theoretical results and with experimental data demonstrates, on the one hand, the validity of our earlier simple models in accounting for higher-order mechanisms and, on the other hand, the robustness of the FCA.
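Loosely, the frozen-correlation approximation treats the electron-electron correlation as static on the short time scale of the collision, so the two-electron transition amplitude reduces to a product of single-ionization amplitudes for the two emitted electrons, with correlation entering only through the initial- and final-state wave functions. A purely schematic form (not the paper's exact expression) is

\[ A_{\mathrm{DI}}(\mathbf{k}_1,\mathbf{k}_2) \;\approx\; a_{\mathrm{SI}}(\mathbf{k}_1)\, a_{\mathrm{SI}}(\mathbf{k}_2), \]

with each single-ionization amplitude a_SI computed here within CDW-EIS.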
Reaction Dynamics in Double Ionization of Helium by Electron Impact
We present theoretical fully differential cross sections (FDCS) for double ionization of helium by 500 eV and 2 keV electron impact. Contributions from various reaction mechanisms to the FDCS were calculated separately and compared to experimental data. Our theoretical methods are based on the first Born approximation. Higher-order effects are incorporated using the Monte Carlo event generator technique. Earlier, we successfully applied this approach to double ionization by ion impact; in the work reported here it is extended to electron impact. We demonstrate that at 500 eV impact energy, double ionization is dominated by higher-order mechanisms. Even at 2 keV, double ionization does not predominantly proceed through a pure first-order process.
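A Monte Carlo event generator of the kind referred to here draws final-state kinematics with probability proportional to a theoretical differential cross section and writes the sampled events to a file, so that the same histogramming and acceptance cuts can be applied to theory and to the measured event list. A minimal, self-contained sketch in C, using rejection sampling of a hypothetical one-dimensional model cross section (none of this is the authors' actual code):

```c
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical model: differential cross section versus emission angle theta.
 * It stands in for a tabulated FDCS; the shape is illustrative only. */
static double model_dcs(double theta) {
    return 1.0 + 0.8 * cos(theta) + 0.5 * cos(2.0 * theta);
}

int main(void) {
    const double pi = acos(-1.0);
    const int n_events = 100000;
    const double dcs_max = 2.5;            /* upper bound for rejection sampling */
    FILE *out = fopen("events.dat", "w");  /* theoretical "event file" */
    if (!out) return 1;

    srand(12345);
    int accepted = 0;
    while (accepted < n_events) {
        /* propose an angle uniformly in [0, pi) and accept with prob ~ dcs(theta) */
        double theta = pi * rand() / (double)RAND_MAX;
        double u = dcs_max * rand() / (double)RAND_MAX;
        if (u < model_dcs(theta)) {
            fprintf(out, "%f\n", theta);
            accepted++;
        }
    }
    fclose(out);
    return 0;
}
```

The resulting event file can then be run through exactly the same analysis chain as the experimental data, which is the point of the technique.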
Deductive Verification of Unmodified Linux Kernel Library Functions
This paper presents results from the development and evaluation of a
deductive verification benchmark consisting of 26 unmodified Linux kernel
library functions implementing conventional memory and string operations. The
formal contract of the functions was extracted from their source code and was
represented in the form of preconditions and postconditions. The correctness of
23 functions was completely proved using the AstraVer toolset, though for 11 of them success required two new specification language constructs. Two more functions were proved after a minor modification of their source code, while the final one could not be completely proved using the existing memory model. The benchmark can be used for testing and evaluating deductive verification tools and as a starting point for verifying other parts of the Linux kernel.
Comment: 18 pages, 2 tables, 6 listings. Accepted to the ISoLA 2018 conference, Evaluating Tools for Software Verification track.
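Tool chains like AstraVer build on Frama-C and ACSL, where the extracted preconditions and postconditions are written as special comments next to the C code and then discharged by automated provers. As a minimal sketch of what such a contract looks like, here is a memset-like helper written for this illustration; it is not one of the 26 benchmark functions:

```c
#include <stddef.h>

/* Illustrative ACSL-style contract for a memset-like helper. The contract and
 * the function are written for this sketch, not copied from the benchmark. */

/*@ requires \valid((char *)dst + (0 .. n - 1));
  @ assigns ((char *)dst)[0 .. n - 1];
  @ ensures \forall integer i; 0 <= i < n ==> ((char *)dst)[i] == (char)c;
  @ ensures \result == dst;
  @*/
void *my_memset(void *dst, int c, size_t n)
{
    char *p = dst;
    size_t i = 0;
    /*@ loop invariant 0 <= i <= n;
      @ loop invariant \forall integer j; 0 <= j < i ==> p[j] == (char)c;
      @ loop assigns i, p[0 .. n - 1];
      @ loop variant n - i;
      @*/
    for (; i < n; i++)
        p[i] = (char)c;
    return dst;
}
```

The loop annotations (invariant, assigns, variant) are typically what allows a deductive verifier to prove such functions automatically; the harder cases mentioned above are those where the standard memory model or specification language is not expressive enough.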
Memory Gaps in the American Time Use Survey. Investigating the Role of Retrieval Cues and Respondents’ Level of Effort
Unaccounted respondent memory gaps, i.e., activity gaps that are attributed by interviewers to respondents' memory failure, have serious implications for data quality. We contribute to the existing literature by investigating interviewing dynamics using paradata, distinguishing temporary memory gaps, which can be resolved during the interview, from enduring memory gaps, which cannot be resolved. We investigate factors that are associated with both kinds of memory gaps and how different response strategies are associated with data quality. We investigate two hypotheses that are associated with temporary and enduring memory gaps. The motivated cuing hypothesis posits that respondents who display more behaviors related to the presence and use of retrieval cues throughout the survey will resolve temporary memory gaps more successfully compared to respondents displaying fewer such behaviors. This should result in overall lower levels of enduring memory gaps. The lack-of-effort hypothesis suggests that respondents who are less eager to participate in the survey will expend less cognitive effort to resolve temporary memory gaps compared to more motivated respondents. This should then result in a positive association with enduring memory gaps and no association with temporary memory gaps. Using survey and paradata from the 2010 ATUS, our analyses indicate that, as hypothesized, behaviors indicating the use of retrieval cues are positively associated with temporary memory gaps and negatively associated with enduring memory gaps. Motivated respondents experiencing memory difficulties overcome what otherwise would result in enduring memory gaps more successfully than other respondents. Indicators of lack of effort, such as whether or not the respondent initially refused to participate in the survey, are positively associated with enduring memory gaps, suggesting that reluctant respondents do not resolve memory gaps. The paper concludes with a discussion of implications for survey research.