
    Experimental Tests of Particle Flow Calorimetry

    Precision physics at future colliders requires highly granular calorimeters to support the Particle Flow Approach to event reconstruction. This article reviews some 10-15 years of R&D, conducted mainly within the CALICE collaboration, on this novel type of detector. The performance of large-scale prototypes in beam tests validates the technical concept of particle flow calorimeters. The comparison of test beam data with simulation, e.g. of hadronic showers, supports full detector studies and gives deeper insight into the structure of hadronic cascades than was previously possible. Comment: 55 pages, 83 figures, to appear in Reviews of Modern Physics
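    To make the particle-flow idea concrete, the jet energy resolution can be budgeted over the jet constituents: charged hadrons measured in the tracker, photons in the ECAL, neutral hadrons in the HCAL, plus a "confusion" term for wrong track-cluster assignments. The sketch below is a minimal illustration of that budget; the energy fractions, stochastic terms, and confusion term are assumed typical values, not numbers from the article.

        import math

        def jet_energy_resolution(E_jet,
                                  f_photon=0.27, f_neutral=0.11,
                                  ecal_stoch=0.15, hcal_stoch=0.55,
                                  confusion=0.025):
            """Illustrative particle-flow jet resolution budget (assumed values).

            Charged hadrons are taken from the tracker (negligible resolution
            here), photons from the ECAL, neutral hadrons from the HCAL; a
            term proportional to E_jet models track-cluster confusion.
            """
            sigma_photon = ecal_stoch * math.sqrt(f_photon * E_jet)    # GeV
            sigma_neutral = hcal_stoch * math.sqrt(f_neutral * E_jet)  # GeV
            sigma_confusion = confusion * E_jet                        # GeV
            sigma = math.sqrt(sigma_photon**2 + sigma_neutral**2
                              + sigma_confusion**2)
            return sigma / E_jet  # relative resolution

        for E in (45, 100, 250):
            print(f"E_jet = {E:3d} GeV -> sigma_E/E ~ {jet_energy_resolution(E):.1%}")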

    Collaborative yet independent: Information practices in the physical sciences

    In many ways, the physical sciences are at the forefront of using digital tools and methods to work with information and data. However, the fields and disciplines that make up the physical sciences are by no means uniform, and physical scientists find, use, and disseminate information in a variety of ways. This report examines information practices in the physical sciences across seven case studies, demonstrating the richly varied ways in which physical scientists work, collaborate, and share information and data. For each case, qualitative interviews and focus groups were used to understand the domain. Quantitative data gathered from a survey of participants highlight different information strategies employed across the cases and identify important software used for research. Finally, conclusions are drawn from across the cases and recommendations are made. This report is the third in a series commissioned by the Research Information Network (RIN), each looking at information practices in a specific domain (life sciences, humanities, and physical sciences). The aim is to understand how researchers within a range of disciplines find and use information, and in particular how that has changed with the introduction of new technologies.

    Particle Physics and Cosmology

    Today, both particle physics and cosmology are described by Standard Models with few parameters, i.e. it is possible to deduce consequences of particle physics in cosmology and vice versa. The former is examined in this lecture, in light of the recent systematic exploration of the electroweak scale by the LHC experiments. The two main results of the first phase of the LHC, the discovery of a Higgs-like particle and the absence so far of new particles predicted by "natural" theories beyond the Standard Model (supersymmetry, extra dimensions and composite Higgs), are put in a historical context to highlight their importance and then presented extensively. For completeness, a short review of neutrino physics, which cannot be probed at the LHC, is also given. The ability of all these results to resolve the three fundamental questions of cosmology, namely the nature of dark energy, the nature of dark matter, and the origin of the matter-antimatter asymmetry, is discussed in each case. Comment: 32 pages, 47 figures, proceedings of the 100th Les Houches Summer School on Post-Planck Cosmology, July 8th - August 2nd 2013. Updated with recently published ATLAS/CMS 8 TeV results

    Distributed computing and farm management with application to the search for heavy gauge bosons using the ATLAS experiment at the LHC (CERN)

    The Standard Model of particle physics describes the strong, weak, and electromagnetic forces between the fundamental particles of ordinary matter. However, it presents several problems and leaves some questions unanswered, so it cannot be considered a complete theory of fundamental interactions. Many extensions have been proposed to address these problems; among the important recent ones are theories with Extra Dimensions. In the context of some models with Extra Dimensions of size about 1 TeV⁻¹, in particular the ADD model with only fermions confined to a D-brane, heavy Kaluza-Klein excitations are expected, with the same properties as the SM gauge bosons but more massive. In this work, three hadronic decay modes of such massive gauge bosons, Z* and W*, are investigated using the ATLAS experiment at the Large Hadron Collider (LHC), presently under construction at CERN. These hadronic modes are more difficult to detect than the leptonic ones, but they should allow a measurement of the couplings between heavy gauge bosons and quarks. The events were generated using the ATLAS fast simulation and reconstruction MC program Atlfast coupled to the Monte Carlo generator PYTHIA. We found that for an integrated luminosity of 3 × 10⁵ pb⁻¹ and a heavy gauge boson mass of 2 TeV, the channels Z*->bb and Z*->tt would be difficult to detect because the signal would be very small compared with the expected background, although the significance in the case of Z*->tt is larger. In the channel W*->tb, the decay might yield a signal separable from the background and a significance larger than 5, so we conclude that it would be possible to detect this particular mode at the LHC. The analysis was also performed for masses of 1 TeV, and we conclude that the observability decreases with increasing mass. In particular, a significance higher than 5 may be achieved below approximately 1.4, 1.9 and 2.2 TeV for Z*->bb, Z*->tt and W*->tb respectively. The LHC will start to operate in 2008 and collect data in 2009. It will produce roughly 15 Petabytes of data per year. Access to this experimental data has to be provided for some 5,000 scientists working in 500 research institutes and universities. In addition, all data need to be available over the estimated 15-year lifetime of the LHC. The analysis of the data, including comparison with theoretical simulations, requires enormous computing power. The computing challenges that scientists have to face are the huge amounts of data, calculations to perform, and collaborators. The Grid has been proposed as a solution to these challenges. The LHC Computing Grid project (LCG) is the Grid used by ATLAS and the other LHC experiments, and it is analysed in depth with the aim of studying its possible complementary use with another Grid project: the Berkeley Open Infrastructure for Network Computing middleware (BOINC), developed for the SETI@home project, a Grid specialised in high CPU requirements and in using volunteer computing resources. Several important physics software packages used by ATLAS and other LHC experiments have been successfully adapted/ported to this platform with the aim of integrating them into the LHC@home project at CERN: Atlfast, PYTHIA, Geant4 and Garfield. The events used in our physics analysis with Atlfast were reproduced using BOINC, obtaining exactly the same results.
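    The discovery criterion used above (a significance larger than 5) is conventionally approximated, for large backgrounds, by S/√B, or more carefully by the Poisson likelihood ratio. A minimal sketch of that arithmetic follows; the signal and background yields are placeholders, since the abstract does not quote the actual event counts.

        import math

        def significance_sqrtb(S, B):
            """Counting significance S/sqrt(B); adequate for large B."""
            return S / math.sqrt(B)

        def significance_llr(S, B):
            """Poisson likelihood-ratio (Asimov) significance."""
            return math.sqrt(2.0 * ((S + B) * math.log(1.0 + S / B) - S))

        # Placeholder yields for 3 x 10^5 pb^-1; illustrative only.
        S, B = 60.0, 100.0
        print(f"S/sqrt(B)          = {significance_sqrtb(S, B):.2f}")
        print(f"likelihood-ratio Z = {significance_llr(S, B):.2f}")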
The LCG software, in particular SEAL, ROOT and the external software, was also ported to the Solaris/SPARC platform to study its portability in general. A testbed was set up comprising a large amount of heterogeneous hardware and software: a farm of 100 computers at CERN's computing center (lxboinc) together with 30 PCs from CIEMAT and 45 from schools in Extremadura (Spain). This required a preliminary study and the development of components of the Quattor software and configuration management tool to install and manage the lxboinc farm, and it also involved setting up a collaboration between the Spanish research centers and government and CERN. The testbed was successful, and 26,597 Grid jobs were delivered, executed and received successfully. We conclude that BOINC and LCG are complementary and useful kinds of Grid that can be used by ATLAS and the other LHC experiments. LCG has very good data distribution, management and storage capabilities that BOINC does not have. On the other hand, BOINC does not need high bandwidth or Internet speed, and it can provide a huge and inexpensive amount of computing power coming from volunteers. In addition, it is possible to send jobs from LCG to BOINC and vice versa. Possible complementary uses are therefore to use volunteer BOINC nodes when the LCG nodes have too many jobs to do, or to use BOINC for high-CPU tasks like event generation or reconstruction while concentrating LCG on data analysis
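    The complementary-use rule proposed above can be phrased as a simple routing policy: CPU-bound, low-I/O tasks (event generation, reconstruction) go to volunteer BOINC nodes, data-heavy analysis goes to LCG. A hypothetical sketch of such a dispatcher follows; the job attributes, threshold, and interface are invented for illustration, and neither Grid exposes this exact API.

        from dataclasses import dataclass

        @dataclass
        class Job:
            name: str
            cpu_hours: float  # estimated CPU time
            io_gb: float      # input plus output data volume

        def route(job: Job) -> str:
            """Invented routing rule: BOINC for CPU-bound, low-I/O work
            (volunteer nodes have slow links to the data stores), LCG
            otherwise. The 100 h/GB threshold is a placeholder."""
            return "BOINC" if job.cpu_hours / max(job.io_gb, 0.01) > 100.0 else "LCG"

        jobs = [
            Job("pythia_event_generation", cpu_hours=50.0, io_gb=0.2),
            Job("aod_data_analysis",       cpu_hours=2.0,  io_gb=500.0),
        ]
        for j in jobs:
            print(f"{j.name:26s} -> {route(j)}")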

    EGI user forum 2011: book of abstracts


    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade. Peer reviewed

    Top-squark pair production at the LHC: a complete analysis at next-to-leading order

    We present a complete next-to-leading order study of top-squark pair production at the LHC, including QCD and EW corrections. The calculation is performed within the Minimal Supersymmetric Standard Model, and numerical results are presented for parameter regions compatible with the observed Higgs boson. We employ the most recent parton distribution functions including QED corrections, and we find NLO EW corrections to the inclusive stop-pair production cross section of up to 25-30% compared to the leading-order prediction. Besides corrections to inclusive cross sections, important kinematic distributions are also investigated. Comment: 27 pages, 10 figures. Version published in JHEP. The numerical discussion in Section 3 has been extended. References have been added
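    To put the quoted 25-30% effect in perspective, the sketch below applies a relative correction of that size to a placeholder LO cross section and converts it into an event yield; the cross section, luminosity, and the positive sign of the correction are assumptions for illustration, not numbers from the paper.

        # Illustrative arithmetic only: the LO cross section and luminosity
        # are placeholders, and the correction is applied with a positive
        # sign purely to show the size of the effect.
        sigma_lo_pb = 0.50             # hypothetical LO stop-pair cross section [pb]
        lumi_fb = 100.0                # hypothetical integrated luminosity [fb^-1]

        for delta_ew in (0.25, 0.30):  # relative NLO EW correction size
            sigma_nlo_pb = sigma_lo_pb * (1.0 + delta_ew)
            n_events = sigma_nlo_pb * 1000.0 * lumi_fb  # 1 pb = 1000 fb
            print(f"delta_EW = {delta_ew:.0%}: sigma_NLO = {sigma_nlo_pb:.3f} pb, "
                  f"N = {n_events:.0f} events")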

    Trigger and Timing Distributions using the TTC-PON and GBT Bridge Connection in ALICE for the LHC Run 3 Upgrade

    The ALICE experiment at CERN is preparing for a major upgrade for the third data-taking period (Run 3), when the high-luminosity phase of the Large Hadron Collider (LHC) starts. The increase in beam luminosity will result in a high interaction rate, causing the data acquisition rate to exceed 3 TB/s. In order to acquire data for all events and to handle the increased data rate, a transition in the readout electronics architecture from a triggered to a trigger-less acquisition mode is required. In this new architecture, a dedicated electronics block called the Common Readout Unit (CRU) acts as a nodal communication point for detector data aggregation and as a distribution point for timing, trigger and control (TTC) information. TTC information in the upgraded trigger-less readout architecture uses two asynchronous high-speed serial link connections: the TTC-PON and the GBT. We have carried out a study to evaluate the quality of the embedded timing signals forwarded by the CRU to the connected electronics over the TTC-PON and GBT bridge connection. We used four performance metrics to characterise the communication bridge: (a) the latency added by the firmware logic, (b) the jitter-cleaning effect of the PLL on the timing signal, (c) a BER analysis for a quantitative measurement of signal quality, and (d) the effect of optical transceiver parameter settings on the signal strength. A reliability study of the bridge connection in maintaining the phase consistency of the timing signals was conducted by performing multiple iterations of power on/off cycles, firmware upgrades and reset assertion/de-assertion cycles (PFR cycles). The test results are presented and discussed concerning the performance of the TTC-PON and GBT bridge communication chain using the CRU prototype and its compliance with the ALICE timing requirements
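    For the BER analysis in metric (c), a standard estimate applies when a link is observed error-free: after N transmitted bits with zero errors, the BER is bounded above by -ln(1 - CL)/N at confidence level CL (the "rule of 3" at 95%). The sketch below shows this estimate with an assumed link rate and test duration, independent of the specific CRU measurement.

        import math

        def ber_upper_limit(bits_transmitted, cl=0.95):
            """Upper limit on the BER after observing zero errors in N bits.

            From P(0 errors) = exp(-BER * N) <= 1 - CL it follows that
            BER <= -ln(1 - CL) / N (about 3/N at 95% CL).
            """
            return -math.log(1.0 - cl) / bits_transmitted

        # Assumed test: one hour error-free at 10 Gb/s (illustrative numbers).
        n_bits = 10e9 * 3600
        print(f"BER < {ber_upper_limit(n_bits):.2e} at 95% CL")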

    The Hierarchy Solution to the LHC Inverse Problem

    Supersymmetric (SUSY) models, even those described by relatively few parameters, generically allow many possible SUSY particle (sparticle) mass hierarchies. As the sparticle mass hierarchy determines, to a great extent, the collider phenomenology of a model, the enumeration of these hierarchies is of the utmost importance. We therefore provide a readily generalizable procedure for determining the number of sparticle mass hierarchies in a given SUSY model. As an application, we analyze the gravity-mediated SUSY breaking scenario with various combinations of GUT-scale boundary conditions involving different levels of universality among the gaugino and scalar masses. For each of the eight considered models, we provide the complete list of forbidden hierarchies in a compact form. Our main result is that the complete (typically rather large) set of forbidden hierarchies among the eight sparticles considered in this analysis can be fully specified by just a few forbidden relations involving much smaller subsets of sparticles. Comment: 44 pages, 2 figures. Python code providing lists of allowed and forbidden hierarchies is included in an ancillary file
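    The enumeration procedure described above can be mimicked at toy level: generate all orderings of the sparticles and strike out those violating a set of forbidden relations. In the sketch below, the sparticle labels are generic and the forbidden relations are invented placeholders, not the paper's results.

        from itertools import permutations

        # Generic labels; the forbidden relations are invented placeholders.
        SPARTICLES = ("gluino", "chi1", "chi2", "chargino1",
                      "stop1", "sbottom1", "selectron", "sneutrino")

        # Each pair (a, b) forbids any hierarchy in which a is lighter than b.
        FORBIDDEN = [("chi2", "chi1"), ("gluino", "chi1")]

        def allowed(hierarchy):
            """hierarchy lists the sparticles from lightest to heaviest."""
            rank = {p: i for i, p in enumerate(hierarchy)}
            return all(rank[a] > rank[b] for a, b in FORBIDDEN)

        total = 0
        n_allowed = 0
        for h in permutations(SPARTICLES):
            total += 1
            n_allowed += allowed(h)
        print(f"{n_allowed} of {total} hierarchies survive")  # total = 8! = 40320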

    Search for Supersymmetry in Trilepton Final States with the ATLAS Detector and the Alignment of the ATLAS Silicon Tracker

    One of the main goals of the ATLAS detector at the LHC at CERN, a proton-proton collider with a nominal centre-of-mass energy of 14 TeV, is to search for New Physics beyond the Standard Model (BSM). A widely favoured BSM candidate is Supersymmetry (SUSY), which postulates a superpartner for each Standard Model particle. The first part of this thesis describes a strategy for an early discovery of SUSY using the trilepton signature, with a focus on gravity-mediated SUSY breaking, mSUGRA. The discovery potential for SUSY in the case where strongly interacting supersymmetric particles are very massive is critically investigated. A possible choice of triggers for L = 10³¹ cm⁻² s⁻¹ is suggested by optimising the event yield at intermediate and final selection stages. A novel method to measure the rate of leptons from heavy-flavour decays passing isolation requirements, by isolating tt events in data, is outlined. The task of the ATLAS silicon tracker is to track particles produced in proton-proton collisions at its centre, measuring their momenta and production vertices. Precise knowledge of the silicon tracker module positions and their orientation in space (alignment), down to a few microns and fractions of a milliradian in the critical coordinates, is of vital importance for large parts of the ambitious ATLAS physics programme. In the second part of the thesis, the alignment of the ATLAS silicon tracker using the Robust Alignment algorithm and particle tracks is described. The algorithm is applied to align end-cap A of the pixel detector using cosmic ray particle tracks recorded during its on-surface commissioning in 2006. Finally, about 2M cosmic ray tracks collected by ATLAS in situ in autumn 2008 are utilised to provide a coherent alignment of the entire silicon tracker with the Robust Alignment algorithm. Comment: 236 pages, Ph.D. thesis; http://cdsweb.cern.ch/record/123186
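    The core idea behind residual-based alignment algorithms such as Robust Alignment can be illustrated in one dimension: shift each module by the mean of its track-hit residuals and iterate until the corrections fall below the target precision. The toy below is a strong simplification (one coordinate, independent modules, invented numbers); the real algorithm handles six degrees of freedom per module and correlations between modules.

        import random

        random.seed(1)
        N_MODULES, N_TRACKS, NOISE_UM = 5, 200, 15.0

        # Unknown true misalignments in microns (toy values).
        true_shift = [random.uniform(-50.0, 50.0) for _ in range(N_MODULES)]
        correction = [0.0] * N_MODULES

        def residuals(m):
            """Track-hit residuals of module m after the current correction."""
            return [true_shift[m] - correction[m] + random.gauss(0.0, NOISE_UM)
                    for _ in range(N_TRACKS)]

        for it in range(4):
            deltas = []
            for m in range(N_MODULES):
                delta = sum(residuals(m)) / N_TRACKS  # mean residual
                correction[m] += delta                # shift module by it
                deltas.append(abs(delta))
            print(f"iteration {it}: max |delta| = {max(deltas):6.2f} um")
        # Corrections converge to the true shifts within the statistical
        # precision NOISE_UM / sqrt(N_TRACKS) ~ 1 um.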