
    The POOL Data Storage, Cache and Conversion Mechanism

    The POOL data storage mechanism is intended to satisfy the needs of the LHC experiments to store and analyse data from the detector response to particle collisions at the LHC proton-proton collider. Both the data rates and the data volumes will differ substantially from past experience. The POOL data storage mechanism is designed to cope with the experiments' requirements by applying a flexible, multi-technology data persistency mechanism. The technology-independent approach is flexible enough to adopt new technologies, takes advantage of existing schema evolution mechanisms and allows users to access data in a technology-independent way. The framework consists of several components, which can be individually adopted and integrated into existing experiment frameworks.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003, 5 pages, PDF, 6 figures. PSN MOKT00

    CMS Offline Conditions Framework and Services

    Non-event data describing detector conditions change with time and come from different data sources. They are accessible to physicists within the offline event-processing applications for precise calibration of reconstructed data as well as for data-quality control purposes. Over the past years CMS has developed and deployed a software system managing such data. Object-relational mapping and the relational abstraction layer of the LHC persistency framework are the foundation; the offline conditions framework updates and delivers C++ data objects according to their validity. A high-level tag versioning system allows production managers to organize data in a hierarchical view. A scripting API in Python, command-line tools and a web service serve physicists in their daily work. A mini-framework is available for handling data coming from external sources. Efficient data distribution over the worldwide network is guaranteed by a system of hierarchical web caches. The system has been tested and used in all major productions, test beams and cosmic runs.
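    The delivery of data objects "according to their validity" described above can be illustrated with a minimal sketch. All names here are hypothetical, not the actual CMS API: payloads are tagged with the first run for which they are valid, and the payload for a given run is found by bisection over the sorted validity boundaries.

    ```python
    import bisect

    class IOVSequence:
        """Minimal interval-of-validity (IOV) lookup sketch (hypothetical names,
        not the CMS conditions API). Each payload is valid from its 'since' run
        up to, but not including, the next entry's 'since' run."""

        def __init__(self):
            self._since = []     # sorted first-valid run numbers
            self._payloads = []

        def append(self, since, payload):
            # Conditions data are appended in increasing run order.
            if self._since and since <= self._since[-1]:
                raise ValueError("IOVs must be appended in increasing order")
            self._since.append(since)
            self._payloads.append(payload)

        def payload_for_run(self, run):
            # Rightmost IOV whose 'since' is <= run.
            i = bisect.bisect_right(self._since, run) - 1
            if i < 0:
                raise KeyError(f"no payload valid for run {run}")
            return self._payloads[i]

    # Example: calibration constants with three validity intervals.
    iovs = IOVSequence()
    iovs.append(1, {"pedestal": 2.10})
    iovs.append(100, {"pedestal": 2.15})
    iovs.append(500, {"pedestal": 2.08})
    print(iovs.payload_for_run(250))  # -> {'pedestal': 2.15}
    ```

    The bisection makes each lookup O(log n) in the number of intervals, which matches the access pattern of an event-processing job asking for the conditions valid at its current run.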

    Determining the Characteristics of Diluted Wastewater in the Lagoon

    “Drink a minimum of two litres or eight glasses of 8 liquid ounces daily” is a statement with which we are all familiar; it is the recommendation of the UK Food Standards Agency (FSA), corroborated by the US EPA (UKFSA 2002; USEPA, 1945). However, in Ghana the Chemu Lagoon remains a main source of drinking water while waste is discharged directly into it from the Tema Oil Refinery, with neither the management nor the government at large finding solutions to the health risks this poses. It is in this interest that this study aims to analyse the extent of the impurities and to design a filtration system that can be used for filtering diluted water of this type. Samples of the wastewater discharge as well as diluted lagoon water were analysed to determine their chlorine level, turbidity, etc. Having confirmed the presence of impurities in the water, a filtration system was designed to serve as a means of purifying it. A CFD/FloXpress analysis was carried out to determine the suitability of the filtration system, while analysis of the filtered water confirmed that production of the filters could help enhance the purification of the diluted Chemu Lagoon and similar water systems. Keywords: Chemu, Pollution, Turbidimeter

    Design of a Model Filtration System and Performing CFD/ Floxpress Analysis on It

    With regard to an earlier experiment conducted to determine the characteristics of diluted wastewater, confirmation was obtained from the levels indicated by litmus paper. This provided a basis for the design of the filtration system, which was simulated to ascertain its suitability. “Drink a minimum of two litres or eight glasses of 8 liquid ounces daily” is a statement with which we are all familiar; it is the recommendation of the UK Food Standards Agency (FSA), corroborated by the US EPA (UKFSA 2002; USEPA, 1945). In a high-pressure pipeline, the water's energy may dissipate after flowing through the channels, and the flow rate can be controlled to meet everyday water needs. To ensure the emitter's hydraulic performance before fabrication, computational fluid dynamics (CFD) is used to predict the emitter's flow rate and analyse its hydraulic performance under various water pressures. The quality of the emitter has an important effect on the reliability and lifespan of the drip irrigation system and on irrigation quality. Usually, the structure of the emitter channel is very complex and small in dimension. A CFD/FloXpress analysis was carried out to determine the suitability of the filtration system, while analysis of the filtered water confirmed that production of the filters could help enhance the purification of the diluted Chemu Lagoon and similar water systems (Govi and Gablah, 2015). Keywords: Carbon, Concentration, Filtration, Irrigation

    Persistent storage of non-event data in the CMS databases

    In the CMS experiment, the non-event data needed to set up the detector, or produced by it, and needed to calibrate the physical responses of the detector itself are stored in Oracle databases. The large amount of data to be stored, the number of clients involved and the performance requirements make the database system an essential service for the experiment to run. This note describes the CMS condition database architecture, the data flow and PopCon, the tool built to populate the offline databases. Finally, the first results obtained during the 2008 and 2009 cosmic data taking are presented.

    POOL development status and production experience

    The Pool Of persistent Objects for LHC (POOL) project, part of the Large Hadron Collider (LHC) Computing Grid (LCG), is now entering its third year of active development. POOL provides the baseline persistency framework for three LHC experiments. It is based on a strict component model, insulating experiment software from a variety of storage technologies. This paper gives a brief overview of the POOL architecture, its main design principles and the experience gained with integration into LHC experiment frameworks. It also presents recent developments in the POOL work areas of relational database abstraction and object storage into relational database management systems (RDBMS).

    Measurement of the Ratio Gamma(KL -> pi+ pi-)/Gamma(KL -> pi e nu) and Extraction of the CP Violation Parameter |eta+-|

    We present a measurement of the ratio of the decay rates Gamma(KL -> pi+ pi-)/Gamma(KL -> pi e nu), denoted as Gamma(K2pi)/Gamma(Ke3). The analysis is based on data taken during a dedicated run in 1999 by the NA48 experiment at the CERN SPS. Using a sample of 47000 K2pi and five million Ke3 decays, we find Gamma(K2pi)/Gamma(Ke3) = (4.835 +- 0.022(stat) +- 0.016(syst)) x 10^-3. From this we derive the branching ratio of the CP violating decay KL -> pi+ pi- and the CP violation parameter |eta+-|. Excluding the CP conserving direct photon emission component KL -> pi+ pi- gamma, we obtain the results BR(KL -> pi+ pi-) = (1.941 +- 0.019) x 10^-3 and |eta+-| = (2.223 +- 0.012) x 10^-3.
    Comment: 20 pages, 7 figures, accepted by Phys. Lett.
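    As a quick plausibility check (our arithmetic, not part of the paper's analysis), the quoted statistical uncertainty is consistent with the Poisson error expected from the quoted sample sizes, dominated by the smaller K2pi sample:

    ```python
    import math

    ratio = 4.835e-3      # measured Gamma(K2pi)/Gamma(Ke3)
    n_k2pi = 47_000       # K2pi candidates
    n_ke3 = 5_000_000     # Ke3 candidates

    # Relative Poisson uncertainty of a ratio of two independent counts:
    # add the relative variances 1/N of numerator and denominator.
    rel_err = math.sqrt(1.0 / n_k2pi + 1.0 / n_ke3)
    stat_err = ratio * rel_err
    print(f"{stat_err:.3e}")  # ~2.2e-5, matching the quoted 0.022 x 10^-3
    ```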

    Large genotype–phenotype study in carriers of D4Z4 borderline alleles provides guidance for facioscapulohumeral muscular dystrophy diagnosis

    Facioscapulohumeral muscular dystrophy (FSHD) is a myopathy with a prevalence of 1 in 20,000. Almost all patients affected by FSHD carry deletions of an integral number of tandem 3.3 kilobase repeats, termed D4Z4, located on chromosome 4q35. Assessment of the size of D4Z4 alleles is commonly used for FSHD diagnosis. However, extended molecular testing has expanded the spectrum of clinical phenotypes. In particular, D4Z4 alleles with 9–10 repeats have been found in healthy individuals, in subjects with FSHD and in subjects affected by other myopathies. These findings weakened the strict relationship between observed phenotypes and their underlying genotypes, complicating the interpretation of molecular findings for diagnosis and genetic counseling. In light of the wide clinical variability detected in carriers of D4Z4 alleles with 9–10 repeats, we applied a standardized methodology, the Comprehensive Clinical Evaluation Form (CCEF), to describe and characterize the phenotype of 244 individuals carrying D4Z4 alleles with 9–10 repeats (134 index cases and 110 relatives). The study shows that 54.5% of index cases display a classical FSHD phenotype with typical facial and scapular muscle weakness, whereas 20.1% present an incomplete phenotype with facial weakness or scapular girdle weakness, 6.7% display minor signs such as winged scapula or hyperCKemia, without functional motor impairment, and 18.7% of index cases show more complex phenotypes with atypical clinical features. Family studies revealed that 70.9% of relatives carrying 9–10 D4Z4 reduced alleles have no motor impairment, whereas a few relatives (10.0%) display a classical FSHD phenotype. Importantly, all relatives of index cases with no FSHD phenotype were healthy carriers. These data establish the low penetrance of D4Z4 alleles with 9–10 repeats.
We recommend the use of the CCEF for standardized clinical assessment, integrated with family studies and further molecular investigation, for appropriate diagnosis and genetic counseling. Especially in the presence of atypical phenotypes and/or in sporadic cases in which all relatives are healthy, it is not possible to make a conclusive diagnosis of FSHD; such cases need further study for a proper diagnosis, to search for novel causative genetic defects, or to investigate environmental factors or co-morbidities that may trigger the pathogenic process. This evidence is also fundamental for the stratification of patients eligible for clinical trials. Our work reinforces the value of large genotype–phenotype studies in defining criteria for clinical practice and genetic counseling in rare diseases.
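The quoted percentages are consistent with whole-number category counts out of the 134 index cases and 110 relatives. A short reconstruction (our arithmetic, not taken from the paper) makes the breakdown explicit:

```python
index_cases = 134
relatives = 110

# Whole-number counts that reproduce the quoted percentages (inferred).
categories = {
    "classical FSHD phenotype": 73,   # 73/134  = 54.5%
    "incomplete phenotype":     27,   # 27/134  = 20.1%
    "minor signs only":          9,   #  9/134  =  6.7%
    "complex/atypical":         25,   # 25/134  = 18.7%
}
assert sum(categories.values()) == index_cases  # categories partition the cohort
for name, n in categories.items():
    print(f"{name}: {100 * n / index_cases:.1f}%")

# Relatives: 78/110 = 70.9% with no motor impairment,
# 11/110 = 10.0% with a classical FSHD phenotype.
print(f"relatives with no motor impairment: {100 * 78 / relatives:.1f}%")
```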

    Measurement of the branching ratios of the decays Xi0 --> Sigma+ e- nubar and anti-Xi0 --> anti-Sigma+ e+ nu

    From 56 days of data taking in 2002, the NA48/1 experiment observed 6316 Xi0 --> Sigma+ e- nubar candidates (with the subsequent Sigma+ --> p pi0 decay) and 555 anti-Xi0 --> anti-Sigma+ e+ nu candidates, with background contamination of 215+-44 and 136+-8 events, respectively. From these samples, the branching ratios BR(Xi0 --> Sigma+ e- nubar) = (2.51+-0.03stat+-0.09syst)E(-4) and BR(anti-Xi0 --> anti-Sigma+ e+ nu) = (2.55+-0.14stat+-0.10syst)E(-4) were measured, allowing the determination of the CKM matrix element |Vus| = 0.209+0.023-0.028. Using the Particle Data Group average for |Vus| obtained in semileptonic kaon decays, we measured the ratio g1/f1 = 1.20+-0.05 of the axial-vector to vector form factors.
    Comment: 16 pages, 11 figures. Submitted to Phys. Lett.
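    Here too, the quoted statistical uncertainties are roughly what Poisson statistics predicts for the background-subtracted yields. A rough sanity check (our arithmetic, ignoring the background uncertainty and any acceptance corrections, so not the paper's actual error treatment):

    ```python
    import math

    def rel_stat_err(candidates, background):
        """Approximate relative statistical error on a background-subtracted
        signal yield: sqrt(N_candidates) / N_signal. Rough sanity check only;
        the background uncertainty is neglected here."""
        signal = candidates - background
        return math.sqrt(candidates) / signal

    # Xi0 -> Sigma+ e- nubar: 6316 candidates, 215 background events.
    print(f"{rel_stat_err(6316, 215):.3f}")  # ~0.013, vs quoted 0.03/2.51 = 0.012
    # anti-Xi0 -> anti-Sigma+ e+ nu: 555 candidates, 136 background events.
    print(f"{rel_stat_err(555, 136):.3f}")   # ~0.056, vs quoted 0.14/2.55 = 0.055
    ```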