Propagation of Coherent Light Pulses with PHASE
The current status of the software package PHASE for the propagation of coherent light pulses along a synchrotron radiation beamline is presented. PHASE is based on an asymptotic expansion (stationary-phase approximation) of the Fresnel-Kirchhoff integral, which is usually truncated at 2nd order. The limits of this approximation, as well as possible extensions to higher orders, are discussed. The accuracy is benchmarked against a direct integration of the Fresnel-Kirchhoff integral. Long-range slope errors of optical elements can be included by means of 8th-order polynomials in the optical-element coordinates w and l. Recently, a method for the description of short-range slope errors has been implemented; its accuracy is evaluated and examples for realistic slope errors are given. PHASE can be run either from a built-in graphical user interface or from any scripting language, the latter providing substantial flexibility. Optical elements, including apertures, can be combined, and complete wave packets can be propagated as well. Fourier propagators are included in the package, so the user may choose among a variety of propagators. Several means to speed up the computation were tested, among them parallelization in a multi-core environment and parallelization on a cluster.
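The direct integration used above as a benchmark can be sketched in a few lines. The following is a minimal, illustrative 1D Fresnel-Kirchhoff propagator by brute-force quadrature; the wavelength, grids, and beam width are hypothetical choices, and this is not PHASE's implementation or API:

```python
import numpy as np

wavelength = 1e-9          # 1 nm, soft-X-ray regime (illustrative)
k = 2 * np.pi / wavelength
z = 10.0                   # propagation distance in metres

# Source plane: Gaussian field sampled on a fine grid
x_src = np.linspace(-5e-4, 5e-4, 2000)
field_src = np.exp(-(x_src / 1e-4) ** 2)
dx = x_src[1] - x_src[0]

# Target plane (coarser grid is enough for the propagated field)
x_tgt = np.linspace(-5e-4, 5e-4, 200)

# Direct evaluation of the 1D Fresnel integral:
#   E(x') = sqrt(k / (2*pi*i*z)) * \int E(x) exp(i k (x'-x)^2 / (2z)) dx
phase = np.exp(1j * k * (x_tgt[:, None] - x_src[None, :]) ** 2 / (2 * z))
field_tgt = np.sqrt(k / (2j * np.pi * z)) * (phase @ field_src) * dx
```

The quadratic phase must be sampled finely enough (well under pi per source-grid step) for the quadrature to converge; the stationary-phase expansion that PHASE uses avoids exactly this cost.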
Reliability of regional climate model simulations of extremes and of long-term climate
We present two case studies that demonstrate how a common evaluation methodology can be used to assess the reliability of regional climate model simulations from different fields of research. In Case I, we focus on the agricultural yield-loss risk for maize in Northeastern Brazil during a drought linked to an El Niño event. In Case II, the present-day regional climatic conditions in Europe for a 10-year period are simulated. To comprehensively evaluate the model results for both kinds of investigations, we developed a general methodology and, on its basis, elaborated and implemented modules to assess the quality of model results using both advanced visualization techniques and statistical algorithms. Besides univariate approaches for individual near-surface parameters, we used multivariate statistics to investigate multiple near-surface parameters of interest together; for the latter case, we defined generalized quality measures to quantify the model's accuracy. Furthermore, we elaborated a diagnosis tool, applicable to atmospheric variables, to assess the model's accuracy in representing the physical processes above the surface under various aspects. By means of this evaluation approach, we demonstrated in Case Study I that the accuracy of the applied regional climate model is at the same level as that found for another regional model and a global model; excessive precipitation during the rainy season in coastal regions was identified as a major contribution to this result. In Case Study II, we likewise found the accuracy of the investigated mean characteristics for near-surface temperature and precipitation to be comparable to another regional model; here, an artificial modulation of the initial and boundary data used during preprocessing was identified as the major source of error in the simulation.
Altogether, the achieved results indicate the potential of our methodology to serve as a common test bed for different fields of research in regional climate modeling.
Effective theories for real-time correlations in hot plasmas
We discuss the sequence of effective theories needed to understand the
qualitative, and quantitative, behavior of real-time correlators of the
form <A(t) A(0)> in ultra-relativistic plasmas. We analyze in detail the case where A is a
gauge-invariant conserved current. This case is of interest because it includes
a correlation recently measured in lattice simulations of classical, hot,
SU(2)-Higgs gauge theory. We find that simple perturbation theory, free kinetic
theory, linearized kinetic theory, and hydrodynamics are all needed to
understand the correlation for different ranges of time. We emphasize how
correlations generically have power-law decays at very large times due to
non-linear couplings to long-lived hydrodynamic modes.
Comment: 28 pages, LaTeX, uses revtex, epsf macro packages. [Revised version: t -> sqrt{t} in a few typos on p. 10.]
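The origin of such power-law tails can be illustrated numerically: a correlator built from diffusive hydrodynamic modes, C(t) proportional to the integral of exp(-2 D k^2 t) over d^3 k, decays as t^(-3/2). A minimal sketch, where the diffusion constant and momentum cutoff are illustrative values rather than anything from the paper:

```python
import numpy as np

def correlator(t, D=1.0, kmax=50.0, n=20000):
    """C(t) ~ \int_0^kmax dk k^2 exp(-2 D k^2 t), the diffusive-mode
    contribution to a conserved-current correlator (illustrative units)."""
    k = np.linspace(1e-6, kmax, n)
    integrand = k ** 2 * np.exp(-2 * D * k ** 2 * t)
    # trapezoidal rule over the momentum grid
    return np.sum((integrand[1:] + integrand[:-1]) * np.diff(k)) / 2

# Check the t^{-3/2} scaling: C(4t)/C(t) should approach 4**(-1.5) = 1/8
ratio = correlator(4.0) / correlator(1.0)
```

The power law emerges because the only scale in the k-integral is 1/sqrt(D t), so C(t) scales as t^(-d/2) in d spatial dimensions, which is the generic hydrodynamic long-time tail.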
Bayesian calibration, comparison and averaging of six forest models, using data from Scots pine stands across Europe
Forest management requires prediction of forest growth, but there is no general agreement about which
models best predict growth, how to quantify model parameters, and how to assess the uncertainty of
model predictions. In this paper, we show how Bayesian calibration (BC), Bayesian model comparison
(BMC) and Bayesian model averaging (BMA) can help address these issues.
We used six models, ranging from simple parameter-sparse models to complex process-based models:
3PG, 4C, ANAFORE, BASFOR, BRIDGING and FORMIND. For each model, the initial degree of uncertainty
about parameter values was expressed in a prior probability distribution. Inventory data for Scots pine
on tree height and diameter, with estimates of measurement uncertainty, were assembled for twelve
sites, from four countries: Austria, Belgium, Estonia and Finland. From each country, we used data from
two sites of the National Forest Inventories (NFIs), and one Permanent Sample Plot (PSP). The models
were calibrated using the NFI-data and tested against the PSP-data. Calibration was done both per country
and for all countries simultaneously, thus yielding country-specific and generic parameter distributions.
We assessed model performance by sampling from prior and posterior distributions and
comparing the growth predictions of these samples to the observations at the PSPs.
We found that BC reduced uncertainties strongly in all but the most complex model. Surprisingly,
country-specific BC did not lead to clearly better within-country predictions than generic BC. BMC identified
the BRIDGING model, which is of intermediate complexity, as the most plausible model before calibration,
with 4C taking its place after calibration. In this BMC, model plausibility was quantified as the
relative probability of a model being correct given the information in the PSP-data. We discuss how the
method of model initialisation affects model performance. Finally, we show how BMA affords a robust
way of predicting forest growth that accounts for both parametric and model structural uncertainty.
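At its core, the BMC/BMA machinery described above weights each model by its marginal likelihood (evidence) and mixes the models' predictive distributions with those weights. A toy sketch with made-up numbers for three hypothetical models; none of the figures come from the paper:

```python
import numpy as np

# Log marginal likelihoods (evidences) of three candidate models
# under the calibration data (hypothetical values).
log_evidence = np.array([-120.3, -118.9, -125.0])

# Equal prior model probabilities; posterior model weights via Bayes' theorem.
# Subtracting the max stabilises the exponentiation.
log_w = log_evidence - log_evidence.max()
weights = np.exp(log_w) / np.exp(log_w).sum()

# Each model's predictive mean and variance for, say, stand height in metres
# (again hypothetical numbers).
means = np.array([18.2, 19.1, 17.5])
variances = np.array([1.0, 0.8, 1.5])

# BMA predictive mean and variance (mixture moments): the variance combines
# within-model (parametric) uncertainty and between-model (structural) spread.
bma_mean = np.sum(weights * means)
bma_var = np.sum(weights * (variances + (means - bma_mean) ** 2))
```

The between-model term in `bma_var` is exactly why BMA "accounts for both parametric and model structural uncertainty": a single best model would discard the spread across models.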
On the Quasiparticle Description of Lattice QCD Thermodynamics
We propose a novel quasiparticle interpretation of the equation of state of
deconfined QCD at finite temperature. Using appropriate thermal masses, we
introduce a phenomenological parametrization of the onset of confinement in the
vicinity of the predicted phase transition. Lattice results of the energy
density, the pressure and the interaction measure of pure SU(3) gauge theory
are excellently reproduced. We find a relationship between the thermal energy
density of the Yang-Mills vacuum and the chromomagnetic condensate <B^2>_T.
Finally, an extension to QCD with dynamical quarks is discussed. Good agreement
with lattice data for 2, 2+1 and 3 flavour QCD is obtained. We also present the
QCD equation of state for realistic quark masses.
Comment: 20 pages, 10 eps figures.
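The quasiparticle picture amounts to evaluating ideal-gas integrals with temperature-dependent masses. A minimal numerical sketch for a gluon gas with thermal mass m(T) = g*T, where the coupling value and parametrisation are illustrative, not the paper's fitted effective coupling, and the phenomenological confinement modelling near the phase transition is omitted:

```python
import numpy as np

def energy_density(T, g=2.0, n_deg=16, kmax_over_T=30.0, n=4000):
    """Energy density of Bose quasiparticles with thermal mass m = g*T
    (natural units, hbar = c = k_B = 1; n_deg = 16 gluon degrees of freedom)."""
    m = g * T
    k = np.linspace(1e-6, kmax_over_T * T, n)
    omega = np.sqrt(k ** 2 + m ** 2)
    integrand = k ** 2 * omega / np.expm1(omega / T)
    # trapezoidal rule over the momentum grid
    integral = np.sum((integrand[1:] + integrand[:-1]) * np.diff(k)) / 2
    return n_deg / (2 * np.pi ** 2) * integral

T = 1.0
eps_sb = 16 * np.pi ** 2 / 30 * T ** 4   # Stefan-Boltzmann (massless) limit
ratio = energy_density(T) / eps_sb       # < 1: the thermal mass suppresses epsilon
```

Fitting lattice data then reduces to choosing the effective coupling g(T); the suppression of the energy density below the Stefan-Boltzmann limit is what the thermal masses encode.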
The Thermal Renormalization Group for Fermions, Universality, and the Chiral Phase-Transition
We formulate the thermal renormalization group, an implementation of the
Wilsonian RG in the real-time (CTP) formulation of finite temperature field
theory, for fermionic fields. Using a model with scalar and fermionic degrees
of freedom which should describe the two-flavor chiral phase-transition, we
discuss the mechanism behind fermion decoupling and universality at second
order transitions. It turns out that an effective mass-like term in the fermion
propagator which is due to thermal fluctuations and does not break chiral
symmetry is necessary for fermion decoupling to work. This situation is in
contrast to the high-temperature limit, where the dominance of scalar over
fermionic degrees of freedom is due to the different behavior of the
distribution functions. The mass-like contribution is the leading thermal
effect in the fermionic sector and is missed if a derivative expansion of the
fermionic propagator is performed. We also discuss results on the
phase-transition of the model considered where we find good agreement with
results from other methods.
Comment: References added, minor typos corrected.
Soft Electromagnetic Radiations From Equilibrating Quark-Gluon Plasma
We evaluate the bremsstrahlung production of low mass dileptons and soft
photons from equilibrating and transversely expanding quark gluon plasma which
may be created in the wake of relativistic heavy ion collisions. We use initial
conditions obtained from the self screened parton cascade model. We consider a
boost invariant longitudinal and cylindrically symmetric transverse expansion
of the parton plasma and find that, for low-mass dileptons and soft photons, the bremsstrahlung contribution is rather large compared to the annihilation process at both RHIC and LHC energies. We also find an increase by a factor of 15-20 in the low-mass dilepton and soft photon yields as one goes from RHIC to LHC energies.
Comment: 8 pages, including 7 figures. To appear in Phys. Rev.
Approximately self-consistent resummations for the thermodynamics of the quark-gluon plasma. I. Entropy and density
We propose a gauge-invariant and manifestly UV finite resummation of the
physics of hard thermal/dense loops (HTL/HDL) in the thermodynamics of the
quark-gluon plasma. The starting point is a simple, effectively one-loop
expression for the entropy or the quark density which is derived from the fully
self-consistent two-loop skeleton approximation to the free energy, but subject
to further approximations, whose quality is tested in a scalar toy model. In
contrast to the direct HTL/HDL-resummation of the one-loop free energy, in our
approach both the leading-order (LO) and the next-to-leading order (NLO)
effects of interactions are correctly reproduced and arise from kinematical
regimes where the HTL/HDL are justifiable approximations. The LO effects are
entirely due to the (asymptotic) thermal masses of the hard particles. The NLO
ones receive contributions both from soft excitations, as described by the
HTL/HDL propagators, and from corrections to the dispersion relation of the
hard excitations, as given by HTL/HDL perturbation theory. The numerical
evaluations of our final expressions show very good agreement with lattice data
for zero-density QCD, for temperatures above twice the transition temperature.
Comment: 62 pages REVTEX, 14 figures; v2: numerous clarifications, sect. 2C shortened, new material in sect. 3C; v3: more clarifications, one appendix removed, alternative implementation of the NLO effects, corrected eq. (5.16).
Thermal variational principle and gauge fields
A Feynman-Jensen version of the thermal variational principle is applied to
hot gauge fields, Abelian as well as non-Abelian: scalar electrodynamics
(without scalar self-coupling) and the gluon plasma. The perturbatively known
self-energies are shown to derive by variation from a free quadratic
("Gaussian") trial Lagrangian. Independence of the covariant gauge-fixing
parameter is reached (within the order studied) after a reformulation of
the partition function such that it depends on only even powers of the gauge
field. Also static properties (Debye screening) are reproduced this way. But
because of the present need to expand the variational functional, the method
falls short of its potential nonperturbative power.
Comment: 36 pages, LaTeX, no figures. Updated version: new title, section on static properties and some references added.
ZyFISH: A Simple, Rapid and Reliable Zygosity Assay for Transgenic Mice
Microinjection of DNA constructs into fertilized mouse oocytes typically results in random transgene integration at a single genomic locus. The resulting transgenic founders can be used to establish hemizygous transgenic mouse lines. However, practical and experimental reasons often require that such lines be bred to homozygosity. Transgene zygosity can be determined by progeny testing assays which are expensive and time-consuming, by quantitative Southern blotting which is labor-intensive, or by quantitative PCR (qPCR) which requires transgene-specific design. Here, we describe a zygosity assessment procedure based on fluorescent in situ hybridization (zyFISH). The zyFISH protocol entails the detection of transgenic loci by FISH and the concomitant assignment of homozygosity using a concise and unbiased scoring system. The method requires small volumes of blood, is scalable to at least 40 determinations per assay, and produces results entirely consistent with the progeny testing assay. This combination of reliability, simplicity and cost-effectiveness makes zyFISH a method of choice for transgenic mouse zygosity determinations
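To make the scoring idea concrete, here is a sketch of the kind of per-nucleus spot-count classifier such an assay could use: hemizygotes carry the transgene on one chromosome (mostly one FISH signal per nucleus), homozygotes on both homologues (mostly two). The function, thresholds, and labels are hypothetical illustrations, not the published zyFISH scoring system:

```python
def call_zygosity(spot_counts, min_nuclei=20, hom_fraction=0.7):
    """Classify a sample from per-nucleus transgene FISH spot counts.

    spot_counts  -- list of spot counts, one per scored nucleus
    min_nuclei   -- minimum informative nuclei required for a call
    hom_fraction -- fraction of 2-spot nuclei needed to call homozygous
    """
    # Nuclei with 0 or >2 spots are treated as uninformative
    # (hybridisation failure, overlapping nuclei, etc.)
    informative = [c for c in spot_counts if c in (1, 2)]
    if len(informative) < min_nuclei:
        return "inconclusive"
    frac_two = sum(1 for c in informative if c == 2) / len(informative)
    if frac_two >= hom_fraction:
        return "homozygous"
    if frac_two <= 1 - hom_fraction:
        return "hemizygous"
    return "inconclusive"
```

Requiring a minimum number of informative nuclei and a symmetric decision band keeps the call unbiased, mirroring the "concise and unbiased scoring system" the abstract describes.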