
    Challenging the challenge: handling data in the Gigabit/s range

    The ALICE experiment at CERN will pose unprecedented requirements for event building and data recording. New technologies will be adopted, as well as ad-hoc frameworks, covering the full chain from the acquisition of experimental data to its transfer onto permanent media and its later access. These issues justify careful, in-depth planning and preparation. The ALICE Data Challenge is a key step in this development process, in which simulated detector data is moved from dummy data sources to the recording media using processing elements and data paths that are as realistic as possible. We review herein the current status of past, present and future ALICE Data Challenges, with particular reference to the sessions held in 2002, when, for the first time, data streams equivalent to one week of ALICE data were recorded onto tape media at sustained rates exceeding 300 MB/s.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003, 9 pages, PDF. PSN MOGT00
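
    As a back-of-the-envelope illustration (a sketch added here, not a figure from the paper), the data volume implied by sustaining the quoted 300 MB/s over one week of continuous recording can be estimated as follows:

        # Rough estimate of the volume behind "one week at 300 MB/s sustained".
        # Decimal units (1 TB = 1e6 MB) are assumed; the paper may count differently.
        rate_mb_per_s = 300                  # sustained rate quoted for the 2002 sessions
        seconds_per_week = 7 * 24 * 3600     # 604 800 s

        total_tb = rate_mb_per_s * seconds_per_week / 1e6
        print(f"roughly {total_tb:.0f} TB written in one week")  # ~181 TB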

    Computing at the Petabyte scale with the WLCG

    The Worldwide LHC Computing Grid (WLCG) provides and operates the Grid infrastructure used by the experiments of the Large Hadron Collider at CERN for their data processing. The huge amount of data to be distributed and analysed, the number of collaborating centres and users, and the diversity of the underlying resources make the WLCG the largest and most complex research Grid currently in operation. In this paper we describe how the WLCG provides a scalable system for the LHC experiments and review its recent successes during data challenges and data taking.

    The CREAM-CE: First experiences, results and requirements of the four LHC experiments

    Within the gLite middleware, the LCG-CE currently used by the four LHC experiments is about to be deprecated. The new CREAM-CE (Computing Resource Execution And Management) service has been approved to replace it. CREAM-CE is a lightweight service created to handle job management operations at the CE level. It accepts requests both via the gLite WMS service and via direct submission, and passes them on to the local batch system. This flexible duality gives the experiments a large degree of freedom to adapt the service to their own computing models, but at the same time it requires careful follow-up of the experiments' requirements and tests to ensure that their needs are fulfilled before real data taking. In this paper we present the current testing results of the four LHC experiments concerning this new service. The operations procedures, elaborated together with the experiment support teams, are discussed. Finally, the experiments' requirements and expectations for both the sites and the service itself are presented in detail.
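
    To make the direct-submission path concrete, the following is a minimal sketch rather than an excerpt from the paper: it assumes a gLite User Interface with the glite-ce-job-submit client and a valid grid proxy, and the CE endpoint, batch queue and JDL contents below are purely illustrative placeholders.

        # Hypothetical sketch: direct submission of a trivial job to a CREAM-CE,
        # bypassing the gLite WMS. Endpoint and queue names are made up.
        import subprocess
        import textwrap

        # Minimal JDL (Job Description Language) file describing the job.
        jdl = textwrap.dedent("""\
            [
              Executable    = "/bin/hostname";
              StdOutput     = "out.txt";
              StdError      = "err.txt";
              OutputSandbox = {"out.txt", "err.txt"};
            ]
            """)
        with open("hello.jdl", "w") as f:
            f.write(jdl)

        # Direct submission; -a requests automatic proxy delegation and -r names
        # the CREAM resource as host:port/cream-<batch-system>-<queue>.
        subprocess.run(
            ["glite-ce-job-submit", "-a", "-r",
             "cream-ce.example.org:8443/cream-pbs-grid", "hello.jdl"],
            check=True,
        )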

    Crossing mice deficient in eNOS with placental-specific Igf2 knockout mice: A new model of fetal growth restriction

    We tested the hypothesis that crossing two mouse models of fetal growth restriction (FGR) of differing phenotype would induce more severe FGR than either model alone. Female endothelial nitric oxide synthase knockout mice (eNOS(−/−)) were mated with placental-specific Igf2 knockout males (P0). Resultant fetuses were no more growth restricted than those with the P0 deletion alone. However, P0 deletion attenuated the reduced placental system A amino acid transporter activity previously observed in eNOS(−/−) mice. Manipulating maternal and fetal genotypes provides a means to compare maternal and fetal regulation of fetal growth.