
    Extending the Fermi-LAT Data Processing Pipeline to the Grid

    The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been used to fully automate the production of data quality monitoring quantities, the reconstruction and routine analysis of all data received from the satellite, and the delivery of science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses also place substantial loads on the pipeline and computing resources; unlike Level 1, these can run continuously for weeks or months at a time. The pipeline also receives heavy use for production Monte Carlo tasks. The software comprises web services that allow online monitoring and provide charts summarizing workflow aspects and performance information. The server supports communication with several batch systems such as LSF and BQS and, more recently, Sun Grid Engine and Condor. This is accomplished through dedicated job control services which, for Fermi, run at SLAC and at the other computing site involved in this large-scale framework, the IN2P3 computing centre in Lyon. Although the task logic differs, we are evaluating a separate interface to the DIRAC system in order to communicate with EGI sites and utilize Grid resources, using dedicated Grid-optimized systems rather than developing our own. (abstract abridged) Comment: This is an author-created, un-copyedited version of an article accepted for publication in Journal of Physics: Conference Series. IOP Publishing Ltd is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The Version of Record is available online at http://dx.doi.org/10.1088/1742-6596/396/3/03212
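
The dedicated job control services described above hide the differences between batch systems behind one interface. The following is a hypothetical sketch of that idea, not the actual Fermi pipeline code; all class and method names are invented for illustration.

```python
# Hypothetical sketch of a per-site batch-system abstraction, in the spirit
# of the pipeline's dedicated job control services. Names are invented.

import itertools
from abc import ABC, abstractmethod

_job_ids = itertools.count(1)  # deterministic fake job ids for the sketch

class BatchBackend(ABC):
    """Common interface the pipeline server talks to, whatever a site runs."""

    name = "generic"

    @abstractmethod
    def submit(self, command: str) -> str:
        """Submit a job and return a backend-specific job id."""

class LSFBackend(BatchBackend):
    name = "lsf"

    def submit(self, command: str) -> str:
        # A real backend would shell out to `bsub` here.
        return f"{self.name}-{next(_job_ids)}"

class CondorBackend(BatchBackend):
    name = "condor"

    def submit(self, command: str) -> str:
        # A real backend would write a submit file and call `condor_submit`.
        return f"{self.name}-{next(_job_ids)}"

def dispatch(backend: BatchBackend, command: str) -> str:
    """The server picks a backend per site (e.g. SLAC vs. the Lyon centre)."""
    return backend.submit(command)

job_id = dispatch(LSFBackend(), "run_level1_reco.sh")
print(job_id)  # lsf-1
```

Adding support for a new batch system (as was done for Sun Grid Engine and Condor) then amounts to adding one backend class, leaving the server logic untouched.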

    A national, multi-community DIRAC instance for France Grilles

    DIRAC [DIRAC] [TSA-08] is a software framework for building distributed computing systems. It was primarily designed for the needs of the LHCb [LHCb] Collaboration, and is now used by many other communities within EGI [EGI] as a primary way of accessing grid resources. In France, dedicated instances of the service have been deployed in different locations to answer specific needs. Building upon this existing expertise, France Grilles [FG] initiated last year a project to deploy a national, multi-community instance in order to share expertise and provide a consistent high-quality service. After describing DIRAC's main aims and functionalities, this paper presents the motivations for such a project, as well as the whole organizational and technical process that led to the establishment of a production instance that already serves 13 communities: astro.vo.eu-egee.org, biomed, esr, euasia, gilda, glast.org, prod.vo.eu-eela.eu, superbvo.org, vo.formation.idgrilles.fr, vo.france-asia.org, vo.france-grilles.fr, vo.msfg.fr and vo.mcia.fr

    The Cherenkov Telescope Array Observatory workflow management system

    The Cherenkov Telescope Array Observatory (CTAO) is the next-generation ground-based observatory for gamma-ray astronomy at very high energies. It is expected to produce about 2 PB of raw data each year and to manage a global data volume which will grow over the years to reach about 100 PB in 2030. In addition, CTAO will require a high computing capacity for data processing and Monte Carlo simulations, of the order of hundreds of millions of CPU HS06 hours per year. To meet these requirements, CTAO will adopt a distributed computing model using four academic data centres, and will use the DIRAC framework as its workload management system. In the past ten years, to optimize the instrument design and study its performance, CTAO has used the European Grid Infrastructure (EGI) to run massive Monte Carlo campaigns. In order to handle these campaigns and to automate simulation and data processing workflows, we have developed a production system prototype based on DIRAC. Recently, we have also developed a user interface allowing for the configuration and submission of complex workflows. In this contribution we present the production system prototype, its user interface for workflow management, as well as its application to CTAO workflows
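
The chaining of simulation and processing steps that such a production system automates can be sketched as a toy ordered workflow. Everything below (step names, payloads) is invented for illustration and is not CTAO or DIRAC code.

```python
# Toy workflow runner: each step consumes the previous step's output,
# mimicking a simulation -> calibration -> reduction chain. Illustrative only.

def run_workflow(steps):
    """Execute (name, func) steps in order, threading the data through."""
    data = None
    executed = []
    for name, func in steps:
        data = func(data)
        executed.append(name)
    return data, executed

workflow = [
    ("simulate",  lambda _: {"events": 1000}),            # produce raw MC
    ("calibrate", lambda d: {**d, "calibrated": True}),   # attach calibration
    ("reduce",    lambda d: {**d, "events": d["events"] // 10}),  # data reduction
]

result, executed = run_workflow(workflow)
print(result)  # {'events': 100, 'calibrated': True}
```

A real production system adds what this sketch omits: parallel jobs per step, failure recovery, and bookkeeping of every intermediate dataset.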

    Measurement of the forward Z boson production cross-section in pp collisions at $\sqrt{s} = 13$ TeV

    A measurement of the production cross-section of Z bosons in pp collisions at $\sqrt{s} = 13$ TeV is presented using dimuon and dielectron final states in LHCb data. The cross-section is measured for leptons with pseudorapidities in the range $2.0 < \eta < 4.5$, transverse momenta $p_\text{T} > 20$ GeV and dilepton invariant mass in the range $60 < m(\ell\ell) < 120$ GeV. The integrated cross-section from averaging the two final states is $\sigma_{\text{Z}}^{\ell\ell} = 194.3 \pm 0.9 \pm 3.3 \pm 7.6$ pb, where the first uncertainty is statistical, the second is due to systematic effects, and the third is due to the luminosity determination. In addition, differential cross-sections are measured as functions of the Z boson rapidity, transverse momentum and the angular variable $\phi^*_\eta$
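
As a quick sanity check, the three quoted uncertainties can be combined in quadrature (assuming they are independent, which the abstract does not state explicitly):

```python
import math

# 194.3 +- 0.9 (stat) +- 3.3 (syst) +- 7.6 (lumi) pb, from the abstract above.
stat, syst, lumi = 0.9, 3.3, 7.6
total = math.sqrt(stat**2 + syst**2 + lumi**2)
print(f"total uncertainty ~= {total:.1f} pb")  # total uncertainty ~= 8.3 pb
```

The luminosity term dominates, which is why it is quoted separately: it is fully correlated across the dimuon and dielectron channels and cancels in channel ratios.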

    Disciplinary law in the civil services: "unification", "harmonisation" or "distancing". On the law of 26 April 2016 on the ethics, rights and obligations of civil servants

    The production of $t\overline{t}$, $W+b\overline{b}$ and $W+c\overline{c}$ is studied in the forward region of proton–proton collisions collected at a centre-of-mass energy of 8 TeV by the LHCb experiment, corresponding to an integrated luminosity of $1.98 \pm 0.02\ \mathrm{fb}^{-1}$. The $W$ bosons are reconstructed in the decays $W \rightarrow \ell\nu$, where $\ell$ denotes a muon or an electron, while the $b$ and $c$ quarks are reconstructed as jets. All measured cross-sections are in agreement with next-to-leading-order Standard Model predictions

    DIRAC current, upcoming and planned capabilities and technologies

    DIRAC is the interware for building and operating large-scale distributed computing systems. It is adopted by multiple collaborations from various scientific domains for implementing their computing models. DIRAC provides a framework and a rich set of ready-to-use services for the Workload, Data and Production Management tasks of small, medium and large scientific communities with different computing requirements. The base functionality can be easily extended by custom components supporting community-specific workflows. At the same time, DIRAC is an aging project, and the new DiracX project is taking shape to replace it in the long term. This contribution will highlight DIRAC's current, upcoming and planned capabilities and technologies, and how the transition to DiracX will take place. Examples include, but are not limited to, the adoption of security tokens and interactions with Identity Provider services, the integration of Clouds and High Performance Computers, the interface with Rucio, and improved monitoring and deployment procedures

    LHCb upgrade software and computing : technical design report

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis

    Physics case for an LHCb Upgrade II - Opportunities in flavour physics, and beyond, in the HL-LHC era

    The LHCb Upgrade II will fully exploit the flavour-physics opportunities of the HL-LHC, and study additional physics topics that take advantage of the forward acceptance of the LHCb spectrometer. The LHCb Upgrade I will begin operation in 2020. Consolidation will occur, and modest enhancements of the Upgrade I detector will be installed, in Long Shutdown 3 of the LHC (2025), and these are discussed here. The main Upgrade II detector will be installed in Long Shutdown 4 of the LHC (2030) and will build on the strengths of the current LHCb experiment and the Upgrade I. It will operate at a luminosity up to 2×10³⁴ cm⁻²s⁻¹, ten times that of the Upgrade I detector. New detector components will improve the intrinsic performance of the experiment in certain key areas. An Expression of Interest proposing Upgrade II was submitted in February 2017. The physics case for the Upgrade II is presented here in more depth. CP-violating phases will be measured with precisions unattainable at any other envisaged facility. The experiment will probe b → sℓ⁺ℓ⁻ and b → dℓ⁺ℓ⁻ transitions in both muon and electron decays in modes not accessible at Upgrade I. Minimal flavour violation will be tested with a precision measurement of the ratio B(B⁰ → μ⁺μ⁻)/B(Bs → μ⁺μ⁻). Probing charm CP violation at the 10⁻⁵ level may result in its long-sought discovery. Major advances in hadron spectroscopy will be possible, which will be powerful probes of low-energy QCD. Upgrade II potentially will have the highest sensitivity of all the LHC experiments to the Higgs couplings to charm quarks. Generically, the new physics mass scale probed, for fixed couplings, will almost double compared with the pre-HL-LHC era; this extended reach for flavour physics is similar to that which would be achieved by the HE-LHC proposal for the energy frontier

    New algorithms for identifying the flavour of B⁰ mesons using pions and protons

    Two new algorithms for use in the analysis of pp collisions are developed to identify the flavour of B⁰ mesons at production, using pions and protons from the hadronization process. The algorithms are optimized and calibrated on data, using B⁰ → D⁻π⁺ decays from pp collision data collected by LHCb at centre-of-mass energies of 7 and 8 TeV. The tagging power of the new pion algorithm is 60% greater than that of the previously available one; the algorithm using protons to identify the flavour of a B⁰ meson is the first of its kind
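
The tagging power quoted above is conventionally the effective tagging efficiency ε_eff = ε_tag(1 − 2ω)², where ε_tag is the fraction of candidates that receive a tag and ω is the mistag probability. The sketch below uses this standard definition with purely illustrative numbers that are not taken from the paper.

```python
# Effective tagging efficiency ("tagging power"); the input numbers here are
# invented to illustrate a ~60% relative gain, not the paper's actual values.

def tagging_power(eps_tag: float, omega: float) -> float:
    """eps_eff = eps_tag * (1 - 2*omega)^2: a 50% mistag rate carries no info."""
    return eps_tag * (1.0 - 2.0 * omega) ** 2

old = tagging_power(eps_tag=0.14, omega=0.39)  # hypothetical baseline algorithm
new = tagging_power(eps_tag=0.16, omega=0.37)  # hypothetical improved algorithm
print(f"relative gain: {new / old - 1:.0%}")   # relative gain: 60%
```

The (1 − 2ω)² dilution factor explains why lowering the mistag rate matters more than raising the raw tagging efficiency: the statistical power of a tagged sample scales with ε_eff, not with ε_tag alone.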