Primary Numbers Database for ATLAS Detector Description Parameters
We present the design and the status of the database for detector description
parameters in the ATLAS experiment. The ATLAS Primary Numbers are the parameters
defining the detector geometry and digitization in simulations, as well as
certain reconstruction parameters. Since the detailed ATLAS detector
description needs more than 10,000 such parameters, a preferred solution is to
have a single verified source for all these data. The database stores the data
dictionary for each parameter collection object, providing schema evolution
support for object-based retrieval of parameters. The same Primary Numbers are
served to many different clients accessing the database: the ATLAS software
framework Athena, the Geant3 heritage framework Atlsim, the Geant4 developers
framework FADS/Goofy, the generator of XML output for detector description, and
several end-user clients for interactive data navigation, including web-based
browsers and ROOT. The choice of the MySQL database product for the
implementation provides additional benefits: the Primary Numbers database can
be used on a developer's laptop when disconnected (using the MySQL embedded
server technology), with data being updated when the laptop is connected (using
the MySQL database replication).
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics
(CHEP03), La Jolla, Ca, USA, March 2003, 6 pages, 5 figures, pdf. PSN MOKT00
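The data-dictionary idea described above, where each parameter collection is
stored together with a versioned schema so clients can rebuild the object, can
be sketched as follows. This is an illustrative toy only: the abstract gives no
actual schema, so the table names, columns, and the use of Python's built-in
SQLite in place of MySQL are all assumptions.

```python
# Illustrative sketch of object-based parameter retrieval with a versioned
# data dictionary. Table layout and names are hypothetical; SQLite stands in
# for the MySQL server used in the real system.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Data dictionary: one row per field of a parameter-collection object,
# versioned so older object layouts remain reconstructable (schema evolution).
cur.execute("""CREATE TABLE data_dict (
    collection TEXT, version INTEGER, field TEXT, field_type TEXT)""")
cur.execute("""CREATE TABLE primary_numbers (
    collection TEXT, version INTEGER, field TEXT, value TEXT)""")

cur.executemany("INSERT INTO data_dict VALUES (?,?,?,?)", [
    ("PixelBarrel", 2, "layer_radius_mm", "float"),
    ("PixelBarrel", 2, "n_layers", "int"),
])
cur.executemany("INSERT INTO primary_numbers VALUES (?,?,?,?)", [
    ("PixelBarrel", 2, "layer_radius_mm", "50.5"),
    ("PixelBarrel", 2, "n_layers", "3"),
])

def fetch_collection(cur, collection, version):
    """Rebuild a parameter object using the dictionary for that version."""
    casts = {"int": int, "float": float, "str": str}
    cur.execute("SELECT field, field_type FROM data_dict "
                "WHERE collection=? AND version=?", (collection, version))
    schema = dict(cur.fetchall())
    cur.execute("SELECT field, value FROM primary_numbers "
                "WHERE collection=? AND version=?", (collection, version))
    return {f: casts[schema[f]](v) for f, v in cur.fetchall()}

params = fetch_collection(cur, "PixelBarrel", 2)
print(params)  # {'layer_radius_mm': 50.5, 'n_layers': 3}
```

Because every client goes through the dictionary, Athena, Atlsim, an XML
generator, or a browser can all reconstruct the same typed object from one
verified source.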
Prototyping Virtual Data Technologies in ATLAS Data Challenge 1 Production
For efficiency of the large production tasks distributed worldwide, it is
essential to provide shared production management tools composed of
integratable and interoperable services. To enhance the ATLAS DC1 production
toolkit, we introduced and tested a Virtual Data services component. For each
major data transformation step identified in the ATLAS data processing pipeline
(event generation, detector simulation, background pile-up and digitization,
etc) the Virtual Data Cookbook (VDC) catalogue encapsulates the specific data
transformation knowledge and the validated parameter settings that must be
provided before the data transformation invocation. To provide for local-remote
transparency during DC1 production, the VDC database server delivered in a
controlled way both the validated production parameters and the templated
production recipes for thousands of the event generation and detector
simulation jobs around the world, simplifying the production management
solutions.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics
(CHEP03), La Jolla, Ca, USA, March 2003, 5 pages, 3 figures, pdf. PSN TUCP01
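The "templated production recipes plus validated parameters" pattern above can
be sketched in a few lines. The catalogue layout, the recipe format, and every
name below are hypothetical illustrations, not the actual VDC design.

```python
# Hypothetical sketch of a Virtual Data Cookbook entry: a templated recipe per
# transformation step, plus validated default parameters that sites may
# override. Names and command format are invented for illustration.
from string import Template

VDC = {
    "detector_simulation": {
        "recipe": Template("atlsim -dataset $dataset -nevents $nevents -seed $seed"),
        "validated_params": {"nevents": 1000},   # centrally validated default
    },
}

def build_job(step, **site_params):
    """Merge validated defaults with site-specific values, then render."""
    entry = VDC[step]
    params = {**entry["validated_params"], **site_params}
    return entry["recipe"].substitute(params)

cmd = build_job("detector_simulation", dataset="dc1.002000", seed=42)
print(cmd)  # atlsim -dataset dc1.002000 -nevents 1000 -seed 42
```

Serving the recipe and the validated parameters from one database is what
gives the local-remote transparency: every site renders the same command from
the same controlled source.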
ATLAS Data Challenge 1
In 2002 the ATLAS experiment started a series of Data Challenges (DC) whose
goals are to validate the Computing Model, the complete software suite, and
the data model, and to ensure the correctness of the technical choices to be
made. A major feature of the first Data Challenge (DC1)
was the preparation and the deployment of the software required for the
production of large event samples for the High Level Trigger (HLT) and physics
communities, and the production of those samples as a world-wide distributed
activity. The first phase of DC1 was run during summer 2002, and involved 39
institutes in 18 countries. More than 10 million physics events and 30 million
single particle events were fully simulated. Over a period of about 40 calendar
days 71000 CPU-days were used producing 30 Tbytes of data in about 35000
partitions. In the second phase the next processing step was performed with the
participation of 56 institutes in 21 countries (~ 4000 processors used in
parallel). The basic elements of the ATLAS Monte Carlo production system are
described. We also present how the software suite was validated and the
participating sites were certified. These productions were already partly
performed using different flavours of Grid middleware at ~ 20 sites.
Comment: 10 pages; 3 figures; CHEP03 Conference, San Diego; Reference MOCT00
SModelS v1.0: a short user guide
SModelS is a tool for the automatic interpretation of simplified-model
results from the LHC. Version 1.0 of the code is now publicly available. This
document provides a quick user guide for installing and running SModelS v1.0.
Comment: The code is available for download at http://smodels.hephy.at
Towards a public analysis database for LHC new physics searches using MadAnalysis 5
We present the implementation, in the MadAnalysis 5 framework, of several
ATLAS and CMS searches for supersymmetry in data recorded during the first run
of the LHC. We provide extensive details on the validation of our
implementations and propose to create a public analysis database within this
framework.
Comment: 20 pages, 15 figures, 5 recast codes; version accepted by EPJC (Dec
22, 2014) including a new section with guidelines for the experimental
collaborations as well as for potential contributors to the PAD;
complementary information can be found at
http://madanalysis.irmp.ucl.ac.be/wiki/PhysicsAnalysisDatabas
Sensitivity of IceCube-DeepCore to neutralino dark matter in the MSSM-25
We analyse the sensitivity of IceCube-DeepCore to annihilation of neutralino
dark matter in the solar core, generated within a 25 parameter version of the
minimally supersymmetric standard model (MSSM-25). We explore the
25-dimensional parameter space using scanning methods based on importance
sampling and using DarkSUSY 5.0.6 to calculate observables. Our scans produced
a database of 6.02 million parameter space points with neutralino dark matter
consistent with the relic density implied by WMAP 7-year data, as well as with
accelerator searches. We performed a model exclusion analysis upon these points
using the expected capabilities of the IceCube-DeepCore Neutrino Telescope. We
show that IceCube-DeepCore will be sensitive to a number of models that are
not accessible to direct detection experiments such as SIMPLE, COUPP and
XENON100, to indirect detection using Fermi-LAT observations of dwarf
spheroidal galaxies, or to current LHC searches.
Comment: 15 pages, 13 figures. V2: Additional comparisons are made to limits
from Fermi-LAT observations of dwarf spheroidal galaxies and to the 125 GeV
Higgs signal from the LHC. The spectral hardness section has been removed.
Matches version accepted for publication in JCAP. V3: Typos corrected
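The importance-sampling idea behind the parameter-space scan can be shown in a
one-dimensional toy. The target density and proposal below are invented purely
for illustration and have nothing to do with DarkSUSY or the MSSM-25 scan
itself.

```python
# Toy importance sampling: draw points from an easy proposal distribution and
# weight them by target/proposal, so weighted averages estimate expectations
# under the target. Target and proposal are made up for this example.
import math, random

random.seed(1)

def target(x):
    # Unnormalised density we want to explore (a Gaussian centred at 2).
    return math.exp(-0.5 * (x - 2.0) ** 2)

def proposal_sample():
    # Easy-to-sample proposal: uniform on [-5, 5].
    return random.uniform(-5.0, 5.0)

proposal_pdf = 1.0 / 10.0  # constant density of the uniform proposal

points = [proposal_sample() for _ in range(200_000)]
weights = [target(x) / proposal_pdf for x in points]

# Weighted mean estimates the mean of the target (true value: 2.0).
mean_est = sum(w * x for w, x in zip(weights, points)) / sum(weights)
print(mean_est)  # close to 2.0
```

In a real scan the weights steer sampling density toward the interesting
regions of a 25-dimensional space rather than a line, but the bookkeeping is
the same.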
CheckMATE 2: From the model to the limit
We present the latest developments to the CheckMATE program that allows
models of new physics to be easily tested against the recent LHC data. To
achieve this goal, the core of CheckMATE now contains over 60 LHC analyses of
which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now
integrates the Monte Carlo event generation via Madgraph and Pythia 8. This
allows users to go directly from a SLHA file or UFO model to the result of
whether a model is allowed or not. In addition, the integration of the event
generation leads to a significant increase in the speed of the program. Many
other improvements have also been made, including the possibility to now
combine signal regions to give a total likelihood for a model.
Comment: 53 pages, 6 figures; references updated, instructions slightly
changed
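Combining signal regions into a total likelihood can be sketched, in its
simplest form, as a product of independent per-region Poisson terms. This is a
common simplified approach, not necessarily CheckMATE's exact statistical
model, and the event counts below are invented.

```python
# Simplified sketch: total log-likelihood as a sum of independent Poisson
# log-likelihoods, one per signal region. Counts are invented, and the
# correlations/systematics a real analysis needs are ignored.
import math

def poisson_logl(observed, expected):
    """Log of the Poisson probability of `observed` events given `expected`."""
    return observed * math.log(expected) - expected - math.lgamma(observed + 1)

# (observed events, background expectation, signal expectation) per region
signal_regions = [
    (12, 10.0, 1.5),
    (3, 2.5, 0.8),
    (7, 9.0, 2.0),
]

def total_logl(regions, signal_strength):
    """Sum per-region log-likelihoods for a given signal strength mu."""
    return sum(poisson_logl(n, b + signal_strength * s)
               for n, b, s in regions)

# Log-likelihood ratio of signal+background vs background-only:
llr = total_logl(signal_regions, 1.0) - total_logl(signal_regions, 0.0)
print(llr)  # negative here: background-only fits these toy counts better
```

The benefit of combining is that several modestly sensitive regions can
together exclude a model that no single region could.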
Measurement of the flavour composition of dijet events in pp collisions at √s = 7 TeV with the ATLAS detector
This paper describes a measurement of the flavour composition of dijet events
produced in pp collisions at √s = 7 TeV using the ATLAS detector. The
measurement uses the full 2010 data sample, corresponding to an integrated
luminosity of 39 pb⁻¹. Six possible combinations of light, charm and bottom
jets are identified in the dijet events, where the jet flavour is defined by
the presence of bottom, charm or solely light flavour hadrons in the jet.
Kinematic variables, based on the properties of displaced decay vertices and
optimised for jet flavour identification, are used in a multidimensional
template fit to measure the fractions of these dijet flavour states as
functions of the leading jet transverse momentum in the range 40 GeV to
500 GeV and jet rapidity |y| < 2.1. The fit results agree with the predictions
of leading- and next-to-leading-order calculations, with the exception of the
dijet fraction composed of bottom and light flavour jets, which is
underestimated by all models at large transverse jet momenta. The ability to
identify jets containing two b-hadrons, originating from e.g. gluon splitting,
is demonstrated. The difference between bottom jet production rates in leading
and subleading jets is consistent with the next-to-leading-order predictions.
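The template-fit idea above reduces, in a one-dimensional toy with two
templates, to finding the mixture fraction that best matches the data
histogram. All numbers below are invented; the real fit uses many templates,
several variables, and a proper minimiser.

```python
# Toy template fit: find the fraction f of template A (with 1-f of template B)
# that best matches a data histogram, by a simple chi-square scan.
# Histograms are invented; data is built from f = 0.55 by construction.

template_a = [0.5, 0.3, 0.2]   # normalised shape for flavour state A
template_b = [0.1, 0.3, 0.6]   # normalised shape for flavour state B
data = [320.0, 300.0, 380.0]   # observed counts, total 1000

n_total = sum(data)

def chi2(f):
    """Chi-square between data and the f*A + (1-f)*B mixture prediction."""
    out = 0.0
    for d, a, b in zip(data, template_a, template_b):
        pred = n_total * (f * a + (1.0 - f) * b)
        out += (d - pred) ** 2 / pred
    return out

# Scan the fraction on a fine grid (a real fit would use a minimiser).
best_f = min((i / 1000.0 for i in range(1001)), key=chi2)
print(best_f)  # 0.55, the fraction the toy data was built from
```

Measuring such fractions as functions of jet transverse momentum is what the
analysis then compares against the leading- and next-to-leading-order
predictions.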