
    Dependable Distributed Computing for the International Telecommunication Union Regional Radio Conference RRC06

    The International Telecommunication Union (ITU) Regional Radio Conference (RRC06) established in 2006 a new frequency plan for the introduction of digital broadcasting in European, African, Arab, and CIS countries and in Iran. The preparation of the plan involved complex calculations under a short deadline and required dependable and efficient computing capability. The ITU designed and deployed in situ a dedicated PC farm; in parallel, the European Organization for Nuclear Research (CERN) provided and supported a system based on the EGEE Grid. Each planning cycle at the RRC06 required the execution of on the order of 200,000 short jobs, consuming several hundred CPU hours, within a period of less than 12 hours. The nature of the problem required dynamic workload balancing and low-latency access to the computing resources. We present the strategy and the key technical choices that delivered a reliable service to the RRC06.
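    The dynamic workload balancing described above can be sketched with a minimal pull model: many short jobs sit in a shared queue and workers fetch them as they become free, so faster nodes naturally absorb more work. This is only an illustrative sketch (the names and structure are hypothetical, not the actual ITU/EGEE implementation):

```python
import queue
import threading

def run_farm(jobs, n_workers=4):
    """Distribute many short jobs over workers via a shared queue (pull model)."""
    q = queue.Queue()
    for job in jobs:
        q.put(job)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                # Workers pull jobs on demand: faster nodes take more jobs,
                # which is the essence of dynamic workload balancing.
                job = q.get_nowait()
            except queue.Empty:
                return
            res = job()  # execute one short calculation
            with lock:
                results.append(res)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Example: 20 trivial "jobs"; completion order varies, the result set does not.
out = run_farm([(lambda i=i: i * i) for i in range(20)])
```

    A real deployment would replace the in-process queue with a networked job broker, but the pull-based design is what keeps latency low when job durations vary.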

    Sagittarius dwarf spheroidal galaxy observed by H.E.S.S.

    Dwarf spheroidal galaxies are characterized by a large measured mass-to-light ratio and are not expected to host high-luminosity non-thermal high-energy gamma-ray emission. They are therefore among the most promising candidates for indirect searches for dark matter particle annihilation signals in gamma rays. The Sagittarius dwarf spheroidal galaxy has been regularly observed by the High Energy Stereoscopic System (H.E.S.S.) of Cherenkov telescopes for more than 90 hours, searching for TeV gamma-ray emission from the annihilation of dark matter particles. In the absence of a significant signal, new constraints on the annihilation cross-section of dark matter particles, applicable to Majorana Weakly Interacting Massive Particles (WIMPs), are derived. Comment: In Proceedings of the 33rd International Cosmic Ray Conference (ICRC2013), Rio de Janeiro (Brazil).
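    For context, such cross-section limits typically follow from the standard expression for the expected annihilation gamma-ray flux; the form below is the conventional one for Majorana WIMPs (with the 1/(8π) prefactor) and is quoted here as general background, not reproduced from this paper:

```latex
\Phi_\gamma(E > E_{\mathrm{th}}, \Delta\Omega)
  = \frac{\langle \sigma v \rangle}{8\pi\, m_{\mathrm{DM}}^2}
    \int_{E_{\mathrm{th}}} \frac{dN_\gamma}{dE}\, dE
    \;\times\; \bar{J}(\Delta\Omega)\,\Delta\Omega ,
\qquad
\bar{J}(\Delta\Omega)
  = \frac{1}{\Delta\Omega}\int_{\Delta\Omega} d\Omega
    \int_{\mathrm{l.o.s.}} \rho_{\mathrm{DM}}^2(l)\, dl .
```

    With no significant excess, an upper limit on the flux translates directly into an upper limit on ⟨σv⟩ for a given dark matter mass and assumed halo profile (which enters through the astrophysical factor J̄).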

    An architecture to manage security services for cloud applications

    The uptake of virtualization and cloud technologies has pushed novel development and operation models for software, bringing more agility and automation. Unfortunately, cyber-security paradigms have not evolved at the same pace and are not yet able to effectively tackle the progressive disappearance of a sharp security perimeter. In this paper, we describe a novel cyber-security architecture for cloud-based distributed applications and network services. We propose a security orchestrator that controls pervasive, lightweight, and programmable security hooks embedded in the virtual functions that compose the cloud application, pursuing better visibility and more automation in this domain. Our approach improves on existing management practice for service orchestration by decoupling the management of the business logic from that of security. We also describe the current implementation stage of a programmable monitoring, inspection, and enforcement framework, which represents the foundational technology for the realization of the whole architecture.
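    The decoupling idea can be illustrated with a toy sketch: each virtual function embeds a small hook exposing inspection and enforcement operations, while a central orchestrator pushes security policy to the hooks without touching the functions' business logic. All class and method names here are hypothetical, chosen only to mirror the roles named in the abstract:

```python
class SecurityHook:
    """Lightweight, programmable hook embedded in one virtual function."""

    def __init__(self, vnf_name):
        self.vnf_name = vnf_name
        self.rules = []

    def enforce(self, rule):
        # Accept a rule pushed by the orchestrator.
        self.rules.append(rule)

    def inspect(self, event):
        # Report which installed rules match an observed event string.
        return [r for r in self.rules if r in event]


class SecurityOrchestrator:
    """Central controller: applies policy to hooks, decoupled from business logic."""

    def __init__(self):
        self.hooks = {}

    def register(self, hook):
        self.hooks[hook.vnf_name] = hook

    def apply_policy(self, policy):
        # policy maps a virtual-function name to the rules it must enforce.
        for name, rules in policy.items():
            for rule in rules:
                self.hooks[name].enforce(rule)


orch = SecurityOrchestrator()
hook = SecurityHook("web-frontend")
orch.register(hook)
orch.apply_policy({"web-frontend": ["deny:telnet"]})
```

    The point of the split is that the application's own orchestration (scaling, placement, upgrades) never has to know about the rules: security policy has its own control channel.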

    OGSA/Globus Evaluation for Data Intensive Applications

    We present the architecture of a Globus Toolkit 3 based testbed intended to evaluate the applicability of the Open Grid Services Architecture (OGSA) to data-intensive applications. Comment: To be published in the proceedings of the XIX International Symposium on Nuclear Electronics and Computing (NEC'2003), Varna, Bulgaria, 15-20 September 2003.

    Improved sensitivity of H.E.S.S.-II through the fifth telescope focus system

    An Imaging Atmospheric Cherenkov Telescope (IACT) works by imaging the very short flash of Cherenkov radiation generated by the cascade of relativistic charged particles produced when a TeV gamma ray strikes the atmosphere. This energetic air shower is initiated at an altitude of 10-30 km, depending on the energy and arrival direction of the primary gamma ray. Whether the best image of the shower is obtained by focusing the telescope at infinity and measuring the Cherenkov photon angles, or by focusing on the central region of the shower, is not an obvious question. This is particularly true for large-size IACTs, for which the depth of field is much smaller. We address this issue for the fifth telescope (CT5) of the High Energy Stereoscopic System (H.E.S.S.), a large-size telescope with a 28 m dish that recently entered operation and reaches an energy threshold of a few tens of GeV. CT5 is equipped with a focus system; its working principle and the expected effect of the focusing depth on the telescope's sensitivity at low energies (50-200 GeV) are discussed. Comment: In Proceedings of the 33rd International Cosmic Ray Conference (ICRC2013), Rio de Janeiro (Brazil).
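    The size of the camera displacement involved follows from the mirror equation 1/u + 1/v = 1/f: focusing on a shower at finite distance D rather than at infinity moves the image plane out by roughly f²/D. A small sketch, assuming a CT5-like focal length of f ≈ 36 m (an assumption for illustration, not a figure taken from the text):

```python
def focus_offset(f, distance):
    """Camera displacement from the infinity-focus position needed to image
    an object at a finite distance, from the mirror equation 1/u + 1/v = 1/f
    (f and distance in metres)."""
    v = f * distance / (distance - f)  # image distance for an object at `distance`
    return v - f                       # offset relative to infinity focus (v = f)

# Assumed 36 m focal length: a shower core at 10 km needs an offset of ~0.13 m,
# close to the thin-mirror approximation f**2 / distance.
offset = focus_offset(36.0, 10_000.0)
```

    The f² scaling is why the question matters specifically for large telescopes: a small IACT with f of a few metres would need a displacement of only a few millimetres, well within its depth of field.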

    A prototype large-angle photon veto detector for the P326 experiment at CERN

    The P326 experiment at the CERN SPS has been proposed with the purpose of measuring the branching ratio of the decay K^+ \to \pi^+ \nu \bar{\nu} to within 10%. The photon veto system must provide a rejection factor of 10^8 for \pi^0 decays. We have explored two designs for the large-angle veto detectors, one based on scintillating tiles and the other using scintillating fibers. We have constructed a prototype module based on the fiber solution and evaluated its performance using low-energy electron beams from the Frascati Beam-Test Facility. For comparison, we have also tested a tile prototype constructed for the CKM experiment, as well as lead-glass modules from the OPAL electromagnetic barrel calorimeter. We present results on the linearity, energy resolution, and time resolution obtained with the fiber prototype, and compare the detection efficiency for electrons obtained with all three instruments. Comment: 8 pages, 9 figures, 2 tables. Presented at the 2007 IEEE Nuclear Science Symposium, Honolulu, HI, USA, 28 October - 3 November 2007.
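    A crude back-of-envelope shows why a 10^8 π^0 rejection factor drives the per-photon requirement: a π^0 decays to two photons, and the event survives the veto only if both are missed. Under the simplifying assumptions that both photons reach veto detectors and are missed independently with the same inefficiency ε (ignoring geometry, energy dependence, and correlations), the π^0 survival probability is about ε²:

```python
import math

def pi0_survival(eps_gamma):
    """Probability that a pi0 escapes the veto if each of its two photons is
    independently missed with inefficiency eps_gamma (crude approximation)."""
    return eps_gamma ** 2

def required_photon_inefficiency(rejection):
    """Per-photon inefficiency implied by a given pi0 rejection factor
    under the same two-independent-photons assumption."""
    return math.sqrt(1.0 / rejection)

# Under these assumptions, a 1e8 rejection implies roughly 1e-4 per photon.
eps = required_photon_inefficiency(1e8)
```

    The real requirement is set by the full kinematics and detector coverage, so this estimate only indicates the order of magnitude the prototype studies aim at.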

    The COMPASS Event Store in 2002

    COMPASS, the fixed-target experiment at CERN studying the structure of the nucleon and spectroscopy, collected over 260 TB of data during the summer 2002 run. All these data, together with the reconstructed event information, were stored from the beginning in a database infrastructure based on Objectivity/DB and on the hierarchical storage manager CASTOR. The experience gained in using the database is reviewed and the evolution of the system is outlined. Comment: Talk from the 2003 conference "Computing in High Energy and Nuclear Physics" (CHEP03), La Jolla, CA, USA, March 2003; 6 pages. PSN MOKT01.