Triggering on hard probes in heavy ion collisions with CMS
We present a study of the CMS trigger system in heavy-ion collisions.
Concentrating on two physics channels, dimuons from decays of quarkonia and
single jets, we evaluate a possible trigger strategy for Pb+Pb running that
relies on event selection solely in the High-Level Trigger (HLT). The study is
based on measurements of the timing performance of the offline algorithms and
event-size distributions using full simulations. Using a trigger simulation
chain, we compare the physics reach for the jet and dimuon channels using
online selection in the HLT to minimum bias running. The results demonstrate
the crucial role the HLT will play for CMS heavy-ion physics. Comment: 4 pages, 4 figures, contribution to the QM'06 conference.
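As an illustration of the bandwidth argument behind such a trigger strategy, the following back-of-the-envelope sketch compares the signal yield written to storage for minimum-bias recording and for HLT-based selection. All rates, event sizes, rejection factors and efficiencies below are placeholder assumptions chosen for illustration, not CMS heavy-ion parameters.

# Hedged back-of-the-envelope comparison of recorded signal yield for
# minimum-bias recording versus HLT-based selection.  All numbers below
# are illustrative placeholders, NOT official CMS heavy-ion parameters.

def recorded_signal_rate(collision_rate_hz, signal_fraction,
                         bandwidth_mb_s, event_size_mb,
                         hlt_efficiency=1.0, hlt_rejection=1.0):
    """Signal events written to storage per second.

    hlt_rejection = 1.0 corresponds to minimum-bias recording
    (every event is a candidate for storage); larger values mean
    the HLT discards that factor of background before storage.
    """
    max_storage_rate = bandwidth_mb_s / event_size_mb        # events/s to tape
    accepted_rate = collision_rate_hz / hlt_rejection        # events passing HLT
    written_rate = min(accepted_rate, max_storage_rate)      # storage-limited
    # Fraction of written events that contain the signal of interest:
    purity = min(1.0, signal_fraction * hlt_rejection) * hlt_efficiency
    return written_rate * purity

# Illustrative numbers only:
minbias = recorded_signal_rate(3000.0, 1e-4, 225.0, 2.5)
with_hlt = recorded_signal_rate(3000.0, 1e-4, 225.0, 2.5,
                                hlt_efficiency=0.8, hlt_rejection=500.0)
print(f"min-bias: {minbias:.3f} signal ev/s, with HLT: {with_hlt:.3f} signal ev/s")

With these toy numbers the HLT selection enhances the recorded signal yield by more than an order of magnitude; the study quantifies this kind of gain with measured algorithm timings and full simulations.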
Medium information from anisotropic flow and jet quenching in relativistic heavy ion collisions
Within a multiphase transport (AMPT) model, where the initial conditions are
obtained from the recently updated HIJING 2.0 model, the recent anisotropic
flow and suppression data for charged hadrons in Pb+Pb collisions at the LHC
center of mass energy of 2.76 TeV are explored to constrain the properties of
the partonic medium formed. In contrast to RHIC, the measured centrality
dependence of the charged hadron multiplicity dN_ch/deta at the LHC provides a
severe constraint on the largely uncertain gluon shadowing parameter s_g. We
find that final-state parton scatterings considerably reduce the hadron yield
at midrapidity and enforce a smaller s_g to remain consistent with the
dN_ch/deta data at the LHC. With
the parton shadowing so constrained, hadron production and flow over a wide
transverse momentum range are investigated in AMPT. The model calculations for
the elliptic and triangular flow are found to be in excellent agreement with
the RHIC data, and predictions for the flow coefficients v_n(p_T, cent) at LHC
are given. The magnitude and pattern of suppression of the hadrons in AMPT are
found to be consistent with the measurements at RHIC. However, the suppression is
distinctly overpredicted in Pb+Pb collisions at the LHC energy. Reducing the
QCD coupling constant alpha_s by ~30% in the higher-temperature plasma formed
at the LHC reproduces the measured hadron suppression. Comment: Talk given by
Subrata Pal at the 11th International Conference on Nucleus-Nucleus Collisions
(NN2012), San Antonio, Texas, USA, May 27-June 1, 2012. To appear in the NN2012
Proceedings in Journal of Physics: Conference Series (JPCS).
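For reference, the anisotropic flow coefficients v_n discussed above are extracted from the azimuthal distribution of final-state particles. The sketch below implements a generic two-particle Q-cumulant estimator of v_n and tests it on a toy sample with a built-in v_2 of 0.1; it is a textbook estimator for illustration, not the AMPT analysis code.

# Minimal sketch of the two-particle Q-cumulant estimate of the flow
# coefficients v_n.  Generic textbook estimator, not the AMPT analysis code.

import numpy as np

def vn_two_particle(events, n):
    """events: list of 1-D arrays of particle azimuthal angles (radians).
    Returns the two-particle cumulant estimate v_n{2}."""
    num, den = 0.0, 0.0
    for phis in events:
        m = len(phis)
        if m < 2:
            continue
        qn = np.sum(np.exp(1j * n * phis))           # flow vector Q_n
        two = (np.abs(qn) ** 2 - m) / (m * (m - 1))  # single-event <2>
        w = m * (m - 1)                              # multiplicity weight
        num += w * two
        den += w
    c2 = num / den
    return np.sqrt(c2) if c2 > 0 else 0.0

# Toy usage: events with a built-in v_2 = 0.1 modulation.
rng = np.random.default_rng(0)
toy = []
for _ in range(200):
    psi = rng.uniform(0, 2 * np.pi)                  # random event-plane angle
    phi = rng.uniform(0, 2 * np.pi, 500)
    keep = rng.uniform(0, 1, 500) < (1 + 2 * 0.1 * np.cos(2 * (phi - psi))) / 1.2
    toy.append(phi[keep])
print("v2{2} ~", vn_two_particle(toy, 2))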
RooStatsCms: a tool for analyses modelling, combination and statistical studies
The RooStatsCms (RSC) software framework allows analysis modelling and
combination as well as statistical studies, together with access to
sophisticated graphics routines for the visualisation of results. The goal of
the project is to complement existing analyses by means of their combination
and accurate statistical studies. Comment: Proceedings of the 11th Topical
Seminar on Innovative Particle and Radiation Detectors. 4 pages and 5 figures.
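To make the notion of analysis combination concrete, the following toy sketch combines two Poisson counting channels through a shared signal-strength parameter mu and scans the negative log-likelihood. It is a hand-rolled illustration of the underlying statistical idea under assumed toy inputs, not the RooStatsCms API.

# Generic illustration of channel combination: two Poisson counting
# channels sharing one signal-strength parameter mu.  Hand-rolled sketch,
# not the RooStatsCms interface.

import math

def nll(mu, channels):
    """Negative log-likelihood of a common mu over several channels.
    Each channel is (observed, expected_signal, expected_background)."""
    total = 0.0
    for n_obs, s, b in channels:
        lam = mu * s + b
        total += lam - n_obs * math.log(lam) + math.lgamma(n_obs + 1)
    return total

channels = [(12, 5.0, 8.0), (4, 2.0, 1.5)]          # toy inputs
scan = [(mu / 100.0, nll(mu / 100.0, channels)) for mu in range(0, 301)]
best_mu, best_nll = min(scan, key=lambda p: p[1])
# Approximate 68% interval from Delta(NLL) <= 0.5:
interval = [mu for mu, v in scan if v - best_nll <= 0.5]
print(f"mu_hat = {best_mu:.2f}, 68% interval ~ [{interval[0]:.2f}, {interval[-1]:.2f}]")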
Toward particle-level filtering of individual collision events at the Large Hadron Collider and beyond
Low-energy strong interactions are a major source of background at hadron colliders, and methods of subtracting the associated energy flow are well established in the field. Traditional approaches treat the contamination as diffuse, and estimate background energy levels either by averaging over large data sets or by restricting to given kinematic regions inside individual collision events. On the other hand, more recent techniques take into account the discrete nature of the background, most notably by exploiting the presence of substructure inside hard jets, i.e. inside collections of particles originating from scattered hard quarks and gluons. However, none of the existing methods subtract background at the level of individual particles inside events. We illustrate the use of an algorithm that will allow particle-by-particle background discrimination at the Large Hadron Collider, and we envisage this as the basis for a novel event filtering procedure upstream of the official reconstruction chains. Our hope is that this new technique will improve physics analysis when used in combination with state-of-the-art algorithms in high-luminosity hadron collider environments.
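As a simplified picture of what particle-by-particle discrimination can look like, the sketch below classifies every particle in a toy event as signal-like or background-like by comparing its transverse momentum to a threshold derived from the median of the leading pT in coarse eta-phi cells (a grid-median criterion in the spirit of SoftKiller-type methods). It is a stand-in illustration under those assumptions, not the algorithm referred to in the abstract.

# Illustrative particle-level filtering on a single toy event using a
# grid-median pT threshold.  NOT the algorithm of the abstract above.

import numpy as np

def filter_particles(pt, eta, phi, cell=0.4, eta_max=2.5):
    """Return a boolean mask marking particles kept as 'signal-like'."""
    n_eta = int(2 * eta_max / cell)
    n_phi = int(2 * np.pi / cell)
    grid = np.zeros((n_eta, n_phi))
    ieta = np.clip(((eta + eta_max) / cell).astype(int), 0, n_eta - 1)
    iphi = (phi % (2 * np.pi) / cell).astype(int) % n_phi
    # Record the leading-particle pT in each eta-phi cell:
    for p, i, j in zip(pt, ieta, iphi):
        grid[i, j] = max(grid[i, j], p)
    threshold = np.median(grid)        # typical soft-background scale
    return pt > threshold

# Toy event: 1000 soft particles plus a handful of hard ones.
rng = np.random.default_rng(1)
pt = np.concatenate([rng.exponential(0.5, 1000), rng.uniform(5, 20, 10)])
eta = rng.uniform(-2.5, 2.5, 1010)
phi = rng.uniform(0, 2 * np.pi, 1010)
mask = filter_particles(pt, eta, phi)
print(f"kept {mask.sum()} of {len(pt)} particles")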
First experience in operating the population of the condition databases for the CMS experiment
Reliable population of the condition databases is critical for the correct
operation of the online selection as well as of the offline reconstruction and
analysis of data. We will describe here the system put in place in the CMS
experiment to populate the database and make condition data promptly available
both online for the high-level trigger and offline for reconstruction. The
system, designed for high flexibility to cope with very different data sources,
uses POOL-ORA technology in order to store data in an object format that best
matches the object-oriented paradigm of the C++ programming language used in
the CMS offline software. In order to ensure consistency among the various
subdetectors, a dedicated package, PopCon (Populator of Condition Objects), is
used to store data online. The data are then automatically streamed to the
offline database and are hence immediately accessible offline worldwide. This
mechanism was used intensively during 2008 in the test runs with cosmic rays.
The experience of these first months of operation will be discussed in detail.
Comment: 15 pages, submitted to JOP, CHEP0
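To illustrate the interval-of-validity (IOV) bookkeeping that such condition-database population relies on, the toy sketch below stores payloads tagged by the first run for which they are valid and retrieves the payload covering a requested run. It is a schematic model of the concept only, not the PopCon or POOL-ORA interfaces used in the CMS software; the class and payload names are hypothetical.

# Schematic illustration of interval-of-validity (IOV) bookkeeping for
# condition data.  Toy model, not the PopCon / POOL-ORA API.

import bisect

class ConditionTag:
    """One tag: payloads valid from a given run number onwards (hypothetical)."""
    def __init__(self):
        self._since = []     # sorted list of first-valid run numbers
        self._payloads = []  # payload valid from the corresponding run

    def append(self, since_run, payload):
        # Online population is append-only: new IOVs must not rewrite history.
        if self._since and since_run <= self._since[-1]:
            raise ValueError("IOV must start after the last stored interval")
        self._since.append(since_run)
        self._payloads.append(payload)

    def payload_for_run(self, run):
        """Payload whose IOV covers the requested run (for HLT or reconstruction)."""
        i = bisect.bisect_right(self._since, run) - 1
        if i < 0:
            raise KeyError(f"no condition payload valid for run {run}")
        return self._payloads[i]

# Toy usage: pedestal constants for a hypothetical subdetector.
tag = ConditionTag()
tag.append(1, {"pedestal": 2.1})
tag.append(100, {"pedestal": 2.4})
print(tag.payload_for_run(57))   # -> {'pedestal': 2.1}
print(tag.payload_for_run(150))  # -> {'pedestal': 2.4}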