
    The AliEn system, status and perspectives

    AliEn is a production environment that implements several components of the Grid paradigm needed to simulate, reconstruct and analyse HEP data in a distributed way. The system is built around Open Source components, uses the Web Services model and standard network protocols to implement the computing platform that is currently being used to produce and analyse Monte Carlo data at over 30 sites on four continents. The aim of this paper is to present the current AliEn architecture and outline its future developments in the light of emerging standards.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003, 10 pages, Word, 10 figures. PSN MOAT00
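
    The abstract does not spell out AliEn's actual service interfaces, so the sketch below is only a rough illustration of the Web Services model it refers to: a client posts a SOAP envelope to a job-submission service. The endpoint, operation name (SubmitJob), namespace and payload fields are all hypothetical, not AliEn's API.

    # Illustrative only: the service, operation and payload below are hypothetical,
    # showing the general shape of a SOAP-style web-service call for grid job submission.
    import urllib.request

    SOAP_ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <SubmitJob xmlns="urn:example:grid">
          <executable>aliroot</executable>
          <arguments>--simulate run042.conf</arguments>
        </SubmitJob>
      </soap:Body>
    </soap:Envelope>"""

    def submit_job(endpoint: str) -> bytes:
        """POST the SOAP request to a (hypothetical) job-submission service and return its reply."""
        request = urllib.request.Request(
            endpoint,
            data=SOAP_ENVELOPE.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8",
                     "SOAPAction": "urn:example:grid#SubmitJob"},
        )
        with urllib.request.urlopen(request) as response:
            return response.read()

    if __name__ == "__main__":
        # Print the envelope instead of contacting a real service.
        print(SOAP_ENVELOPE)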

    Kompics: a message-passing component model for building distributed systems

    The Kompics component model and programming framework were designed to simplify the development of increasingly complex distributed systems. Systems built with Kompics leverage multi-core machines out of the box and can be dynamically reconfigured to support hot software upgrades. A simulation framework enables deterministic debugging and reproducible performance evaluation of unmodified Kompics distributed systems. We describe the component model and show how to program and compose event-based distributed systems. We present the architectural patterns and abstractions that Kompics facilitates and we highlight a case study of a complex distributed middleware that we have built with Kompics. We show how our approach enables systematic development and evaluation of large-scale and dynamic distributed systems.
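
    Kompics itself is a Java framework and its API is not reproduced in the abstract; the Python sketch below only illustrates the general pattern of event-based components that communicate by publishing events on a shared channel and reacting through subscribed handlers. The Channel, Ping/Pong events and Pinger/Ponger components are hypothetical names, not Kompics constructs.

    # Generic event-driven component sketch; NOT the Kompics API.
    from collections import defaultdict
    from dataclasses import dataclass

    class Channel:
        """Connects components: events published here reach all subscribed handlers."""
        def __init__(self):
            self.handlers = defaultdict(list)   # event type -> handler callables

        def subscribe(self, event_type, handler):
            self.handlers[event_type].append(handler)

        def publish(self, event):
            for handler in self.handlers[type(event)]:
                handler(event)

    @dataclass
    class Ping:          # hypothetical event types
        seq: int

    @dataclass
    class Pong:
        seq: int

    class Pinger:
        def __init__(self, channel):
            self.channel = channel
            channel.subscribe(Pong, self.on_pong)

        def start(self):
            self.channel.publish(Ping(seq=1))

        def on_pong(self, event):
            print(f"Pinger received Pong {event.seq}")

    class Ponger:
        def __init__(self, channel):
            self.channel = channel
            channel.subscribe(Ping, self.on_ping)

        def on_ping(self, event):
            print(f"Ponger received Ping {event.seq}")
            self.channel.publish(Pong(seq=event.seq))

    if __name__ == "__main__":
        channel = Channel()
        Ponger(channel)
        Pinger(channel).start()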

    The Architecture of MEG Simulation and Analysis Software

    MEG (Mu to Electron Gamma) is an experiment dedicated to the search for the $\mu^+ \rightarrow e^+\gamma$ decay, which is strongly suppressed in the Standard Model but predicted at an accessible rate in several supersymmetric extensions of it. MEG is a small-size experiment ($\approx 50$-$60$ physicists at any time) with a life span of about 10 years. The limited human resources available, in particular in the core offline group, emphasized the importance of reusing software and exploiting existing expertise. Great care has been devoted to providing a simple system that hides implementation details from the average programmer. This allowed many members of the collaboration to contribute to the development of the experiment's software with limited programming skills. The offline software is based on two frameworks: REM, in FORTRAN 77, used for the event generation and the detector simulation package GEM, based on GEANT 3; and ROME, in C++, used in the readout simulation Bartender and in the reconstruction and analysis program Analyzer. Event display in the simulation is based on GEANT 3 graphics libraries and in the reconstruction on ROOT graphics libraries. Data are stored in different formats at various stages of the processing. The frameworks include utilities for input/output, database handling and format conversion that are transparent to the user.
    Comment: Presented at the IEEE NSS, Knoxville, 2010. Revised according to referee's remarks. Accepted by European Physical Journal Plus
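
    The REM and ROME frameworks themselves are not described in detail in the abstract; the sketch below only illustrates, under assumed file formats and function names, one generic way a framework can make storage-format differences transparent to the user: the caller asks for events and a reader is selected from the file's format.

    # Illustrative sketch only; the formats and helpers are hypothetical, not MEG utilities.
    import json
    import pickle
    from pathlib import Path

    def _read_json(path):
        with open(path) as f:
            return json.load(f)

    def _read_pickle(path):
        with open(path, "rb") as f:
            return pickle.load(f)

    READERS = {".json": _read_json, ".pkl": _read_pickle}   # format -> reader

    def read_events(path):
        """Return the events stored in `path`, whatever its on-disk format."""
        reader = READERS.get(Path(path).suffix)
        if reader is None:
            raise ValueError(f"unsupported event format: {path}")
        return reader(path)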

    Polish grid infrastructure for science and research

    The structure, functionality, parameters and organization of the computing Grid in Poland are described, mainly from the perspective of the high-energy particle physics community, currently its largest consumer and developer. It constitutes a distributed Tier-2 in the worldwide Grid infrastructure. It also provides services and resources for data-intensive applications in other sciences.
    Comment: Proceedings of IEEE Eurocon 2007, Warsaw, Poland, 9-12 Sep. 2007, p. 44