Transition metal oxides using quantum Monte Carlo
The transition metal-oxygen bond appears prominently throughout chemistry and
solid-state physics. Many materials, from biomolecules to ferroelectrics to the
components of supernova remnants, contain this bond in some form. Many of these
materials' properties strongly depend on fine details of the TM-O bond and
intricate correlation effects, which make accurate calculations of their
properties very challenging. We present quantum Monte Carlo, an explicitly
correlated class of methods, to improve the accuracy of electronic structure
calculations over more traditional methods like density functional theory. We
find that unlike s-p type bonding, the amount of hybridization of the d-p bond
in TM-O materials is strongly dependent on electronic correlation.
Comment: 20 pages, 4 figures, to appear as a topical review in J. Physics:
Condensed Matter
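The abstract above contrasts quantum Monte Carlo with density functional theory. As a purely illustrative sketch (not the authors' code), the simplest member of the QMC family, variational Monte Carlo, can be demonstrated on the hydrogen atom with the trial wavefunction psi(r) = exp(-alpha*r); all names and parameters here are assumptions for the toy example:

```python
import numpy as np

rng = np.random.default_rng(1)

def local_energy(r, alpha):
    # E_L = -alpha^2/2 + (alpha - 1)/r in atomic units (hartree)
    return -0.5 * alpha**2 + (alpha - 1.0) / r

def vmc_energy(alpha, n_steps=20000, step=0.5):
    """Estimate <E> by Metropolis sampling of |psi|^2 = exp(-2*alpha*r)."""
    pos = np.array([1.0, 0.0, 0.0])
    energies = []
    for i in range(n_steps):
        trial = pos + rng.uniform(-step, step, 3)
        r_old, r_new = np.linalg.norm(pos), np.linalg.norm(trial)
        # Metropolis acceptance ratio |psi(trial)|^2 / |psi(pos)|^2
        if rng.random() < np.exp(-2.0 * alpha * (r_new - r_old)):
            pos = trial
        if i > n_steps // 10:  # discard equilibration steps
            energies.append(local_energy(np.linalg.norm(pos), alpha))
    return np.mean(energies)

# At alpha = 1 the trial function is exact: E = -0.5 hartree with zero variance.
```

Real QMC studies of transition-metal oxides use far more elaborate many-body trial wavefunctions, but the sample-the-square-of-the-wavefunction structure is the same.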
The NWRA Classification Infrastructure: Description and Extension to the Discriminant Analysis Flare Forecasting System (DAFFS)
A classification infrastructure built upon Discriminant Analysis has been
developed at NorthWest Research Associates for examining the statistical
differences between samples of two known populations. Originating to examine
the physical differences between flare-quiet and flare-imminent solar active
regions, we describe herein some details of the infrastructure including:
parametrization of large datasets, schemes for handling "null" and "bad" data
in multi-parameter analysis, application of non-parametric multi-dimensional
Discriminant Analysis, an extension through Bayes' theorem to probabilistic
classification, and methods invoked for evaluating classifier success. The
classifier infrastructure is applicable to a wide range of scientific questions
in solar physics. We demonstrate its application to the question of
distinguishing flare-imminent from flare-quiet solar active regions, updating
results from the original publications that were based on different data and
much smaller sample sizes. Finally, as a demonstration of "Research to
Operations" efforts in the space-weather forecasting context, we present the
Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time
operationally-running solar flare forecasting tool that was developed from the
research-directed infrastructure.
Comment: J. Space Weather Space Climate: Accepted / in press; access
supplementary materials through journal; some figures are less than full
resolution for arXiv
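The pipeline described above—two known populations, fitted class-conditional densities, and an extension through Bayes' theorem to a probabilistic forecast—can be sketched in a few lines. This is a generic two-class illustration with synthetic data and an assumed parametric (Gaussian) density, not the NWRA/DAFFS implementation, which also supports non-parametric multi-dimensional estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D discriminating parameter for "flare-quiet" (class 0)
# and "flare-imminent" (class 1) active regions.
quiet = rng.normal(0.0, 1.0, 500)
imminent = rng.normal(2.0, 1.0, 100)

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Fit class-conditional densities from the two training samples.
mu0, s0 = quiet.mean(), quiet.std()
mu1, s1 = imminent.mean(), imminent.std()

# Prior from class frequencies (the climatological event rate).
prior1 = len(imminent) / (len(quiet) + len(imminent))

def flare_probability(x):
    """Posterior P(imminent | x) via Bayes' theorem."""
    p0 = gaussian_pdf(x, mu0, s0) * (1.0 - prior1)
    p1 = gaussian_pdf(x, mu1, s1) * prior1
    return p1 / (p0 + p1)
```

Thresholding the posterior recovers a plain discriminant classifier; reporting the posterior itself gives the probabilistic forecast that an operational system such as DAFFS issues.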
Exploring the nucleon structure through GPDs and TDAs in hard exclusive processes
Generalized Parton Distributions (GPDs) offer a new way to access the quark
and gluon nucleon structure. We review recent progress in this domain,
emphasizing the need to supplement the experimental study of deeply virtual
Compton scattering by its crossed version, timelike Compton scattering. We also
describe the extension of the GPD concept to three quark operators and the
relevance of their nucleon to meson matrix elements, namely the transition
distribution amplitudes (TDAs) which factorize in backward meson
electroproduction and related processes. We discuss the main properties of the
TDAs.
Comment: 8 pages; to be published in the proceedings of the conference "PHOTON
2011, International Conference on the Structure and the Interactions of the
Photon", Spa, Belgium, 22-27 May 2011
Cosmic-ray induced background intercomparison with actively shielded HPGe detectors at underground locations
The main background above 3 MeV for in-beam nuclear astrophysics studies
with γ-ray detectors is caused by cosmic-ray induced secondaries. The
two commonly used suppression methods, active and passive shielding, against
this kind of background were formerly considered only as alternatives in
nuclear astrophysics experiments. In this work the study of the effects of
active shielding against cosmic-ray induced events at a medium deep location is
performed. Background spectra were recorded with two actively shielded HPGe
detectors. The experiment was located at 148 m below the surface of the Earth
in the Reiche Zeche mine in Freiberg, Germany. The results are compared to data
with the same detectors at the Earth's surface, and at depths of 45 m and
1400 m, respectively.
Comment: Minor errors corrected; final version
New results in exclusive hard reactions
Generalized Parton Distributions offer a new way to access the quark and
gluon nucleon structure. We review recent progress in this domain, emphasizing
the need to supplement the experimental study of DVCS by its crossed version,
timelike Compton scattering (TCS), where data at high energy should appear
thanks to the study of ultraperipheral collisions at the LHC. This will open
the access to very low skewness quark and gluon GPDs. Our leading order
estimates show that the factorization scale dependence of the amplitudes is
quite high. This fact demands the understanding of higher order contributions
with the hope that they will stabilize this scale dependence. The magnitudes of
the NLO coefficient functions are not small and neither is the difference of
the coefficient functions appearing respectively in the DVCS and TCS
amplitudes. The conclusion is that extracting the universal GPDs from both TCS
and DVCS reactions requires much care. We also describe the extension of the
GPD concept to three quark operators and the relevance of their nucleon to
meson matrix elements, namely the transition distribution amplitudes (TDAs)
which factorize in hard exclusive pion electroproduction off a nucleon in the
backward region and baryon-antibaryon annihilation into a pion and a lepton
pair. We discuss the main properties of the TDAs.
Comment: 4 pages, to be published in the proceedings of the 2011 Europhysics
Conference on High Energy Physics-HEP 2011, July 21-27, 2011, Grenoble,
Rhone-Alpes, France
Quantum Analogue Computing
We briefly review what a quantum computer is, what it promises to do for us,
and why it is so hard to build one. Among the first applications anticipated to
bear fruit is quantum simulation of quantum systems. While most quantum
computation is an extension of classical digital computation, quantum
simulation differs fundamentally in how the data is encoded in the quantum
computer. To perform a quantum simulation, the Hilbert space of the system to
be simulated is mapped directly onto the Hilbert space of the (logical) qubits
in the quantum computer. This type of direct correspondence is how data is
encoded in a classical analogue computer. There is no binary encoding, and
increasing precision becomes exponentially costly: an extra bit of precision
doubles the size of the computer. This has important consequences for both the
precision and error correction requirements of quantum simulation, and
significant open questions remain about its practicality. It also means that
the quantum version of analogue computers, continuous variable quantum
computers (CVQC), becomes an equally efficient architecture for quantum
simulation. Lessons from past use of classical analogue computers can help us
to build better quantum simulators in future.
Comment: 10 pages, to appear in the Visions 2010 issue of Phil. Trans. Roy.
Soc.
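The abstract's central point—that quantum simulation encodes data like a classical analogue computer, mapping the simulated system's Hilbert space directly onto the qubit register—can be made concrete with a toy amplitude encoding. This is an assumed illustration, not from the paper: a 1-D wavefunction sampled on a 2^n-point grid is stored in the amplitudes of n qubits, so one extra bit of grid precision doubles the state vector:

```python
import numpy as np

def encode_wavefunction(psi_grid):
    """Map a sampled wavefunction directly onto n-qubit amplitudes.

    The grid must have 2**n points; normalization makes it a valid
    quantum state. One more qubit doubles the representable grid,
    mirroring the analogue-computer cost of extra precision.
    """
    n = int(np.log2(len(psi_grid)))
    assert len(psi_grid) == 2 ** n, "grid size must be a power of two"
    amps = np.asarray(psi_grid, dtype=complex)
    return amps / np.linalg.norm(amps)

x = np.linspace(-5.0, 5.0, 2 ** 6)        # 6 qubits -> 64 grid points
psi = encode_wavefunction(np.exp(-x**2))  # Gaussian wave packet
```

There is no binary encoding of numerical values here, which is exactly why standard digital error-correction arguments do not transfer directly to this mode of quantum simulation.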