CMS Data Analysis: Current Status and Future Strategy
We present the current status of CMS data analysis architecture and describe
work on future Grid-based distributed analysis prototypes. CMS has two main
software frameworks related to data analysis: COBRA, the main framework, and
IGUANA, the interactive visualisation framework. Software using these
frameworks is used today in the world-wide production and analysis of CMS data.
We describe their overall design and present examples of their current use with
emphasis on interactive analysis. CMS is currently developing remote analysis
prototypes, including one based on Clarens, a Grid-enabled client-server tool.
Use of the prototypes by CMS physicists will guide us in forming a
Grid-enriched analysis strategy. The status of this work is presented, as is an
outline of how we plan to leverage the power of our existing frameworks in the
migration of CMS software to the Grid. Comment: 4 pages, 3 figures, contribution to the CHEP'03 conference
Heterogeneous reconstruction of tracks and primary vertices with the CMS pixel tracker
The High-Luminosity upgrade of the LHC will see the accelerator reach an
instantaneous luminosity of 7.5 × 10^34 cm^-2 s^-1 with an average
pileup of 200 proton-proton collisions. These conditions will pose an
unprecedented challenge to the online and offline reconstruction software
developed by the experiments. The computational complexity will exceed by far
the expected increase in processing power for conventional CPUs, demanding an
alternative approach. Industry and High-Performance Computing (HPC) centres are
successfully using heterogeneous computing platforms to achieve higher
throughput and better energy efficiency by matching each job to the most
appropriate architecture. In this paper we describe the results of a
heterogeneous implementation of the pixel track and vertex reconstruction chain
on Graphics Processing Units (GPUs). The framework has been designed and
developed to be integrated into the CMS reconstruction software, CMSSW. The
speed-up achieved by leveraging GPUs allows more complex algorithms to be
executed, obtaining better physics output and a higher throughput.
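As an illustration of the dispatch idea behind such heterogeneous frameworks (and not of the actual CMSSW or Patatrack code, whose interfaces are not shown in the abstract), the following C++ sketch binds a reconstruction step to either a GPU-backed or a CPU implementation at run time; all type and function names, including gpuAvailable(), are hypothetical.

```cpp
// Illustrative sketch only: a run-time choice between a CPU and an
// accelerator implementation of the same reconstruction step.
// All names here (Hit, Track, buildTracksGPU, gpuAvailable, ...) are
// hypothetical and do not correspond to the real CMSSW interfaces.
#include <functional>
#include <iostream>
#include <vector>

struct Hit { float x, y, z; };
struct Track { std::vector<int> hitIndices; };

// CPU reference implementation of the pixel-track building step.
std::vector<Track> buildTracksCPU(const std::vector<Hit>& hits) {
    std::vector<Track> tracks;
    // ... real pattern recognition would go here ...
    (void)hits;
    return tracks;
}

// Stand-in for a GPU-offloaded implementation (e.g. device kernels
// launched behind this call); here it simply falls back to the CPU code.
std::vector<Track> buildTracksGPU(const std::vector<Hit>& hits) {
    return buildTracksCPU(hits);
}

// Hypothetical capability check; a real framework would query the
// available accelerators once at configuration time.
bool gpuAvailable() { return false; }

int main() {
    std::vector<Hit> hits = {{0.1f, 0.2f, 0.3f}, {1.0f, 1.1f, 1.2f}};

    // Bind the step to the most appropriate architecture, mirroring the
    // "match each job to the best backend" idea described in the abstract.
    std::function<std::vector<Track>(const std::vector<Hit>&)> buildTracks =
        gpuAvailable() ? buildTracksGPU : buildTracksCPU;

    auto tracks = buildTracks(hits);
    std::cout << "reconstructed " << tracks.size() << " tracks\n";
}
```

In a real framework the backend choice would typically be made once at configuration time, and the GPU path would launch device kernels rather than falling back to the CPU code.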
First experience in operating the population of the condition databases for the CMS experiment
Reliable population of the condition databases is critical for the correct
operation of the online selection as well as of the offline reconstruction and
analysis of data. We will describe here the system put in place in the CMS
experiment to populate the database and make condition data promptly available
both online for the high-level trigger and offline for reconstruction. The
system, designed for high flexibility to cope with very different data sources,
uses POOL-ORA technology in order to store data in an object format that best
matches the object-oriented paradigm of the C++ programming language used
in the CMS offline software. In order to ensure consistency among the various
subdetectors, a dedicated package, PopCon (Populator of Condition Objects), is
used to store data online. The data are then automatically streamed to the
offline database and are hence immediately accessible offline worldwide. This mechanism
was intensively used during 2008 in the test-runs with cosmic rays. The
experience of these first months of operation will be discussed in detail. Comment: 15 pages, submitted to JOP, CHEP0
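To make the data flow sketched in the abstract more concrete, the following C++ example models a condition payload stored under an interval of validity (IOV) and later retrieved by run number, mimicking the online population and offline access steps; the class and method names are invented for illustration and are not the real PopCon or POOL-ORA interfaces.

```cpp
// Illustrative sketch of the online-to-offline condition flow:
// a payload object is stored under an interval of validity (IOV)
// and later retrieved by run number. Names are hypothetical and
// unrelated to the real PopCon / POOL-ORA APIs.
#include <iostream>
#include <iterator>
#include <map>
#include <utility>

// Example condition payload: per-channel pedestals for one subdetector.
struct Pedestals {
    std::map<int, float> valuePerChannel;
};

// A tag maps the first run of each IOV to a payload; the payload is
// valid from that run until the next stored IOV begins.
class ConditionTag {
public:
    void append(unsigned firstRun, Pedestals payload) {
        iovs_[firstRun] = std::move(payload);
    }
    // Retrieve the payload whose IOV covers the given run.
    const Pedestals* get(unsigned run) const {
        auto it = iovs_.upper_bound(run);
        if (it == iovs_.begin()) return nullptr;  // run precedes first IOV
        return &std::prev(it)->second;
    }
private:
    std::map<unsigned, Pedestals> iovs_;
};

int main() {
    ConditionTag pedestalTag;

    // "Online" population step: write a payload valid from run 100.
    Pedestals p;
    p.valuePerChannel[42] = 3.1f;
    pedestalTag.append(100, std::move(p));

    // "Offline" access step: reconstruction of run 150 picks up the
    // payload streamed from the online database.
    if (const Pedestals* pl = pedestalTag.get(150))
        std::cout << "pedestal(ch 42) = " << pl->valuePerChannel.at(42) << "\n";
}
```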
Long-Term Outcome of Patients with Complete Pathologic Response after Neoadjuvant Chemoradiation for cT3 Rectal Cancer: Implications for Local Excision Surgical Strategies
Neoadjuvant chemoradiotherapy (CRT) followed by radical surgery including total mesorectal excision (TME) is the standard treatment for patients with locally advanced rectal cancer. Emerging data indicate that patients with complete pathologic response (ypCR) after CRT have a favorable outcome, suggesting the possibility of less invasive surgical treatment. We analyzed the long-term outcome of cT3 rectal cancer treated by neoadjuvant CRT in relation to ypCR and type of surgery. The study population comprised 139 patients (93 men, 46 women; median age 62 years) with cT3N0-1M0 mid and distal rectal adenocarcinoma treated by CRT and surgery (110 TME and 29 local excision) at our institution between 1996 and 2008. At pathology, ypCR was defined as no residual cancer cells in the primary tumor. Tumors of 42 patients (30.2%) were classified as ypCR. After a median follow-up of 55.4 months, comparing patients with ypCR to patients without ypCR, 5-year disease-specific survival was 95.8% versus 78.0% (P = 0.004), and 5-year disease-free survival was 90.1% versus 64.0% (P = 0.004). In patients with ypCR, no statistically significant outcome difference was observed between TME and local excision. In patients treated by local excision, comparing patients with ypCR to patients without ypCR, 5-year disease-free survival was 100% versus 65.5% (P = 0.024), and 5-year local recurrence-free survival was 92.9% versus 66.7% (P = 0.047). Within the limitations of a retrospective analysis, our data confirm the favorable long-term outcome of cT3 rectal cancer with ypCR after CRT and warrant clinical trials exploring local excision surgical strategies.
TrackML high-energy physics tracking challenge on Kaggle
The High-Luminosity LHC (HL-LHC) is expected to reach unprecedented collision intensities, which in turn will greatly increase the complexity of tracking within the event reconstruction. To reach out to computer science specialists, a tracking machine learning challenge (TrackML) was set up on Kaggle by a team of ATLAS, CMS, and LHCb physicists, tracking experts, and computer scientists, building on the experience of the successful Higgs Machine Learning challenge in 2014. A training dataset based on a simulation of a generic HL-LHC experiment tracker has been created, listing for each event the measured 3D points and the list of 3D points associated with each true track. The participants in the challenge are asked to find the tracks in the test dataset, that is, to build the list of 3D points belonging to each track. The emphasis is on exposing innovative approaches, rather than hyper-optimising known approaches. A metric reflecting the accuracy of a model at finding the associations that matter most to physics analysis will allow good candidates to be selected to augment or replace existing algorithms.
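The dataset structure described above, a set of measured 3D points per event together with the list of points belonging to each true track, can be pictured with the short C++ sketch below, which simply groups hits by a ground-truth particle identifier; the field names and the treatment of noise hits are illustrative assumptions, not the official TrackML file format.

```cpp
// Illustrative sketch: group measured 3D points by particle id to form
// the per-track point lists described in the TrackML dataset. Field
// names are invented; they do not follow the official file format.
#include <cstdint>
#include <iostream>
#include <unordered_map>
#include <vector>

struct Hit {
    std::uint64_t hitId;
    double x, y, z;            // measured 3D point
    std::uint64_t particleId;  // ground-truth association (0 = noise)
};

int main() {
    std::vector<Hit> hits = {
        {1, 1.0, 0.1, -2.0, 11}, {2, 2.1, 0.3, -4.1, 11},
        {3, 0.5, 1.7,  0.2, 27}, {4, 3.0, 0.4, -6.0, 11},
        {5, 9.9, 9.9,  9.9, 0},  // noise hit, not part of any track
    };

    // Build the list of 3D points associated with each true track.
    std::unordered_map<std::uint64_t, std::vector<std::uint64_t>> tracks;
    for (const auto& h : hits)
        if (h.particleId != 0) tracks[h.particleId].push_back(h.hitId);

    for (const auto& [pid, hitIds] : tracks) {
        std::cout << "track " << pid << ":";
        for (auto id : hitIds) std::cout << ' ' << id;
        std::cout << '\n';
    }
}
```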
Track reconstruction at LHC as a collaborative data challenge use case with RAMP
Charged particle track reconstruction is a major component of data processing in high-energy physics experiments such as those at the Large Hadron Collider (LHC), and is foreseen to become more and more challenging with higher collision rates. A simplified two-dimensional version of the track reconstruction problem was set up on a collaborative platform, RAMP, in order for developers to prototype and test new ideas. A small-scale competition was held during the Connecting The Dots / Intelligent Trackers 2017 (CTDWIT 2017) workshop. Despite the short time scale, a number of different approaches were developed and compared using a single score metric, kept generic enough to summarise performance in terms of both efficiency and fake rate.
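As a rough illustration of how a single score can summarise both efficiency and fake rate, the sketch below matches reconstructed tracks to true tracks by majority vote over their hits and derives the two quantities; the majority threshold and the toy data are assumptions made for the example and do not reproduce the actual RAMP metric.

```cpp
// Illustrative sketch: efficiency and fake rate from a simple hit-level
// matching. A reconstructed track is "matched" if more than half of its
// hits come from one true track. The threshold and the toy inputs are
// assumed for the example and are not the actual challenge metric.
#include <algorithm>
#include <iostream>
#include <map>
#include <set>
#include <vector>

using TrackHits = std::vector<int>;  // hit indices belonging to a track

// Return the true track providing a strict majority of the hits, or -1.
int dominantTruth(const TrackHits& reco, const std::map<int, int>& hitToTruth) {
    std::map<int, int> counts;
    for (int hit : reco) counts[hitToTruth.at(hit)]++;
    auto best = std::max_element(counts.begin(), counts.end(),
        [](auto& a, auto& b) { return a.second < b.second; });
    return (2 * best->second > static_cast<int>(reco.size())) ? best->first : -1;
}

int main() {
    // Toy truth: hit index -> true track id (two true tracks).
    std::map<int, int> hitToTruth = {{0,1},{1,1},{2,1},{3,2},{4,2},{5,2}};
    std::vector<TrackHits> recoTracks = {{0,1,2}, {3,4,5}, {0,3}};
    int nTrue = 2;

    std::set<int> matchedTruth;
    int nFake = 0;
    for (const auto& t : recoTracks) {
        int truth = dominantTruth(t, hitToTruth);
        if (truth >= 0) matchedTruth.insert(truth); else ++nFake;
    }

    double efficiency = double(matchedTruth.size()) / nTrue;
    double fakeRate   = double(nFake) / recoTracks.size();
    std::cout << "efficiency = " << efficiency
              << ", fake rate = " << fakeRate << '\n';
}
```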
HEPScore: A new CPU benchmark for the WLCG
HEPScore is a new CPU benchmark created to replace the HEPSPEC06 benchmark
that is currently used by the WLCG for procurement, computing resource pledges
and performance studies. The development of the new benchmark, based on HEP
applications or workloads, has involved many contributions from software
developers, data analysts, experts from the experiments, representatives of
several WLCG computing centres, as well as the WLCG HEPScore Deployment Task
Force. In this contribution, we review the selection of workloads and the
validation of the new HEPScore benchmark. Comment: Paper submitted to the proceedings of the Computing in HEP Conference
2023, Norfolk
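As an illustration of how a composite CPU score of this kind can be assembled, the sketch below normalizes each workload's throughput to a reference machine and combines the results with a geometric mean; the workload names, reference values, and the choice of a geometric mean are assumptions made for this example rather than the official HEPScore prescription.

```cpp
// Minimal sketch: combine normalized per-workload throughputs into a
// single benchmark figure using a geometric mean. The workload names,
// reference values, and the use of a geometric mean are assumptions
// made for this illustration, not the official HEPScore definition.
#include <cmath>
#include <iostream>
#include <string>
#include <vector>

struct WorkloadResult {
    std::string name;
    double throughput;  // e.g. events processed per second on this machine
    double reference;   // throughput of the same workload on a reference machine
};

double compositeScore(const std::vector<WorkloadResult>& results) {
    double logSum = 0.0;
    for (const auto& r : results)
        logSum += std::log(r.throughput / r.reference);  // normalized score
    return std::exp(logSum / results.size());            // geometric mean
}

int main() {
    std::vector<WorkloadResult> results = {
        {"gen-sim workload", 12.0, 10.0},
        {"digi-reco workload", 4.4, 4.0},
        {"analysis workload", 21.0, 20.0},
    };
    std::cout << "composite score = " << compositeScore(results) << '\n';
}
```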
Performance of the CMS Cathode Strip Chambers with Cosmic Rays
The Cathode Strip Chambers (CSCs) constitute the primary muon tracking device
in the CMS endcaps. Their performance has been evaluated using data taken
during a cosmic ray run in fall 2008. Measured noise levels are low, with the
number of noisy channels well below 1%. Coordinate resolutions were measured for
all types of chambers and fall in the range of 47 microns to 243 microns. The
efficiencies for local charged-track triggers, hit reconstruction, and segment
reconstruction were measured and are above 99%. The timing resolution per
layer is approximately 5 ns.