Heavy-Quarkonium Production in High Energy Proton-Proton Collisions at RHIC
We update the study of the total Psi and Upsilon production cross section in
proton-proton collisions at RHIC energies using the QCD-based Color-Singlet
(CS) Model, including next-to-leading order partonic matrix elements. We also
include charm-quark initiated processes which appear at leading order in
alpha_s, but which have so far been overlooked in such studies. Contrary to
earlier claims, we show that the CS yield is consistent with measurements over
a broad range of J/Psi rapidities. We also find that charm-quark initiated
processes, including both intrinsic and sea-like charm components, typically
contribute at least 20% of the direct J/Psi yield, improving the agreement with
data both for the integrated cross section and its rapidity dependence. The key
signature for such processes is the observation of a charm-quark jet opposite
in azimuthal angle phi to the detected J/Psi. Our results have impact on the
proper interpretation of heavy-quarkonium production in heavy-ion collisions
and its use as a probe for the quark-gluon plasma.

Comment: 5 pages, 11 figures, LaTeX; version to appear as a Rapid Communication in Phys. Rev.
Book Reviews
Review of the following books: Chez Nous: The St. John Valley by Guy F. Dubay; Stagecoach East: Stagecoach Days in the East from the Colonial Period to the Civil War by Oliver W. Holmes and Peter Rohrbach; Coastal New England by William F. Robinson
The absorption principle and E-type anaphora
The Absorption Principle is a principle of situation theory which restricts the kinds of parametric information which is available. In particular it rules out abstraction over variable occurrences in parametric restrictions (unless the parameter itself is included). In "Anaphora and Quantification in Situation Semantics", Gawron and Peters showed that the Absorption Principle has intuitively correct consequences in applications to quantificational and anaphoric semantics, but Sem, Saebo, Verne and Vestre (1990) point out cases of incorrect consequences. The present paper provides an analysis of the problematic cases in which the Absorption Principle is maintained. A key part of the analysis is the postulation that anaphors may have quantified NPs as antecedents, a position which has been vigorously advocated by Evans (1980). As a consequence, anaphors of this type are called 'E-Type'. We argue that the pronoun 'it' in the following discourse must be analyzed as E-Type:
Tom has exactly one car. It is red. We provide an analysis of E-Type anaphora with the following properties: (i) the type of the anaphor is derived from the conservative scope of its antecedent; (ii) its semantics is provided by a choice function; and (iii) there is a pragmatic condition that the choice function not be controlled either by speaker or hearer in the discourse. We demonstrate how this accounts for a wide range of facts, including apparently varying quantificational force
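The choice-function treatment in (i)–(iii) can be illustrated with a minimal computational sketch. The function names and the toy antecedent set below are hypothetical illustrations, not part of the paper's formalism:

```python
def e_type_pronoun(antecedent_scope, choice):
    """Interpret an E-type pronoun by applying a choice function to the
    set derived from the conservative scope of its antecedent."""
    return choice(antecedent_scope)

# 'Tom has exactly one car. It is red.'
# The antecedent 'exactly one car' yields a singleton scope set, so every
# choice function returns the same unique witness -- matching the intuition
# that 'it' unambiguously denotes Tom's one car.
cars_tom_has = {"toms_car"}
it = e_type_pronoun(cars_tom_has, choice=lambda s: next(iter(s)))
```

With a singleton antecedent the pragmatic condition (iii) is vacuous; it becomes substantive only when the scope set has more than one member.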
High-Resolution Fourier-Transform Ultraviolet-Visible Spectrometer for the Measurement of Atmospheric Trace Species: Application to OH
A compact, high-resolution Fourier-transform spectrometer for atmospheric near-ultraviolet spectroscopy has been installed at the Jet Propulsion Laboratory's Table Mountain Facility (34.4°N, 117.7°W, elevation 2290 m). This instrument is designed with an unapodized resolving power near 500,000 at 300 nm to provide high-resolution spectra from 290 to 675 nm for the quantification of column abundances of trace atmospheric species. The measurement technique used is spectral analysis of molecular absorptions of solar radiation. The instrument, accompanying system designs, and results of the atmospheric hydroxyl column observations are described.
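As a back-of-the-envelope check on the quoted figures (not a calculation taken from the paper), the resolving power R = λ/Δλ fixes the smallest resolvable wavelength interval at the design wavelength:

```python
# Resolving power R = lambda / delta_lambda.
wavelength_nm = 300.0       # near-UV design wavelength
resolving_power = 5.0e5     # quoted unapodized resolving power
delta_lambda_nm = wavelength_nm / resolving_power
# ~6e-4 nm, i.e. about 0.6 pm -- fine enough to resolve individual
# OH absorption lines against the solar background.
```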
LineStacker: A spectral line stacking tool for interferometric data
LineStacker is a new open access and open source tool for stacking of
spectral lines in interferometric data. LineStacker is an ensemble of CASA
tasks, and can stack both 3D cubes or already extracted spectra. The algorithm
is tested on increasingly complex simulated data sets, mimicking Atacama Large
Millimeter/submillimeter Array and Karl G. Jansky Very Large Array observations
of [CII] and CO(3-2) emission lines from galaxies, respectively. We find that
the algorithm is very robust, successfully retrieving the input parameters of
the stacked lines in all cases. However, we identify some specific situations
that showcase the intrinsic limitations of the method: mainly, that large
uncertainties on the redshifts can lead to poor signal-to-noise ratio
improvement, due to lines being stacked on shifted central
frequencies. Additionally we give an extensive description of the embedded
statistical tools included in LineStacker: mainly bootstrapping, rebinning and
subsampling. Velocity rebinning is applied to the data before stacking and
proves necessary when studying line profiles, in order to avoid artificial
spectral features in the stack. Subsampling is useful for sorting the stacked
sources, making it possible to find a subsample that maximizes the searched
parameters, while bootstrapping allows the detection of inhomogeneities in the
stacked sample.
LineStacker is a useful tool for extracting the most from spectral observations
of various types.

Comment: Resubmitted to MNRAS after referee report
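The core stacking step described above (de-redshifting each spectrum to its rest frequency, interpolating onto a common grid, and averaging) can be sketched as follows. This is a hedged illustration of the general technique, not LineStacker's actual CASA implementation, and all names are hypothetical:

```python
import numpy as np

def stack_spectra(spectra, redshifts, freq_axes, out_freq, weights=None):
    """Shift each spectrum to its rest frequency using the source redshift,
    interpolate onto a common frequency grid, and form a weighted average."""
    if weights is None:
        weights = np.ones(len(spectra))
    total = np.zeros_like(out_freq, dtype=float)
    wsum = np.zeros_like(out_freq, dtype=float)
    for spec, z, freqs, w in zip(spectra, redshifts, freq_axes, weights):
        rest = freqs * (1.0 + z)        # de-redshift the frequency axis
        interp = np.interp(out_freq, rest, spec, left=np.nan, right=np.nan)
        ok = ~np.isnan(interp)          # ignore out-of-coverage channels
        total[ok] += w * interp[ok]
        wsum[ok] += w
    return total / np.maximum(wsum, 1e-30)
```

A redshift error shifts `rest` for that source, misaligning its line with the others on `out_freq`; this is exactly the degradation of the stacked signal-to-noise ratio that the abstract flags.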
Learning spatio-temporal trajectories from manifold-valued longitudinal data
We propose a Bayesian mixed-effects model to learn typical scenarios of change from longitudinal manifold-valued data, namely repeated measurements of the same objects or individuals at several points in time. The model estimates a group-average trajectory in the space of measurements. Random variations of this trajectory result from spatiotemporal transformations, which allow changes in the direction of the trajectory and in the pace at which trajectories are followed. The tools of Riemannian geometry allow us to derive a generic algorithm for any kind of data with smooth constraints, which therefore lie on a Riemannian manifold. A stochastic approximation of the Expectation-Maximization algorithm is used to estimate the model parameters in this highly non-linear setting. The method is used to estimate a data-driven model of the progressive impairment of cognitive functions during the onset of Alzheimer's disease. Experimental results show that the model correctly puts into correspondence the age at which each individual was diagnosed with the disease, thus validating that it effectively estimated a normative scenario of disease progression. Random effects provide unique insights into the variations in the ordering and timing of the succession of cognitive impairments across different individuals
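The time-reparameterization idea underlying the random effects (each individual follows the group-average curve at their own pace and onset time) can be sketched minimally as follows. The logistic group curve and all names are hypothetical illustrations, not the paper's actual model, which lives on a general Riemannian manifold:

```python
import numpy as np

def individual_trajectory(group_curve, t, accel, time_shift, t0=0.0):
    """Evaluate one individual's trajectory by warping time in the
    group-average curve: psi(t) = accel * (t - t0 - time_shift) + t0."""
    warped = accel * (np.asarray(t, dtype=float) - t0 - time_shift) + t0
    return group_curve(warped)

# Hypothetical group-average progression curve (logistic impairment score).
group = lambda t: 1.0 / (1.0 + np.exp(-t))

t = np.linspace(-5.0, 5.0, 101)
# An individual progressing twice as fast, with onset one time unit earlier:
traj = individual_trajectory(group, t, accel=2.0, time_shift=-1.0)
```

Here `accel` captures the pace of progression and `time_shift` the individual onset age; fitting these per subject is what lets the model align diagnosis ages across individuals.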