
    The CMS Integration Grid Testbed

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent-based MonALISA. Domain-specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two-month span in the fall of 2002, over 1 million official CMS GEANT-based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids. (Comment: CHEP 2003, MOCT01)
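    The submission chain the abstract describes (DAGMan driving Condor-G jobs through Globus gatekeepers) can be sketched as a minimal workflow description in current HTCondor syntax. The file names, executable, and gatekeeper contact below are hypothetical placeholders, not taken from the actual IGT configuration:

```text
# mc_prod.dag -- hypothetical two-node DAGMan workflow:
# generate Monte Carlo events, then stage the output back.
JOB  generate  generate.sub
JOB  stageout  stageout.sub
PARENT generate CHILD stageout

# generate.sub -- Condor-G submit description for the "generate" node,
# routing the job to a remote site through a Globus (GT2) gatekeeper.
universe      = grid
grid_resource = gt2 gatekeeper.example.edu/jobmanager-condor
executable    = run_cmsim.sh
output        = generate.out
error         = generate.err
log           = generate.log
queue
```

    Running `condor_submit_dag mc_prod.dag` hands the workflow to DAGMan, which submits each node through Condor-G as its dependencies complete and retries failed nodes according to the DAG's retry policy.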

    Association of Dental Caries and Anthropometric Measures among Primary School Children

    Aim: This study aimed to investigate the association between dental caries status and anthropometric measures in primary school children. Methods and Materials: An analytical cross-sectional study (n = 376) was conducted among primary school children (age range = 6–9 years) registered in private schools. Non-clinical data were gathered from parents of participating children through a self-administered structured questionnaire, as well as from the children through an interviewer-administered questionnaire. Clinical data included examination of dental caries using the dmft/DMFT index and anthropometric measures, including calculated z-scores of height-for-age (HAZ), weight-for-age (WAZ), and BMI-for-age (BAZ), and physical examination. Inferential statistics included the Kruskal-Wallis test and linear regression for univariate and multivariate analyses, respectively. Results: The proportion of dental caries in primary and secondary dentition was 67.6% and 8.2%, respectively. A significant association was observed between dental caries status and HAZ, WAZ, and BAZ (p < 0.001). An inverse relation was found between the low, medium, and high dental caries categories and the anthropometric measures. Conclusions: In the primary dentition, dental caries was significantly and inversely related to weight-for-age, height-for-age, and BMI-for-age. Hence, it can be concluded that among low-income populations dental caries is associated with lower anthropometric outcomes in children, and caries management should therefore be considered an approach that impacts overall health and quality of life.
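    The methods name a Kruskal-Wallis test as the univariate step. As a minimal sketch of that computation, the function below derives the H statistic from ranks directly; the dmft counts in the usage line are invented for illustration and are not the study's data:

```python
from itertools import chain

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples.

    Uses mid-ranks for ties but omits the tie-correction factor,
    so H is slightly conservative when many values repeat.
    """
    data = sorted(chain.from_iterable(groups))
    rank = {}                              # value -> average (mid) rank
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        rank[data[i]] = (i + 1 + j) / 2    # mean of ranks i+1 .. j
        i = j
    n = len(data)
    return 12 / (n * (n + 1)) * sum(
        sum(rank[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)

# Hypothetical dmft counts grouped by height-for-age (HAZ) tertile;
# H is compared against the chi-square critical value with k-1 = 2
# degrees of freedom (5.99 at alpha = 0.05).
h = kruskal_h([5, 6, 4, 7], [3, 4, 2], [1, 2, 1])
```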

    Distributed Analysis in CMS

    The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools, and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the user experience in current analysis activities.