CMS Offline Conditions Framework and Services
Non-event data describing detector conditions change with time and come from different data sources. They are accessible to physicists within the offline event-processing applications for precise calibration of reconstructed data as well as for data-quality control purposes. Over the past years CMS has developed and deployed a software system managing such data. Object-relational mapping and the relational abstraction layer of the LHC persistency framework are the foundation; the offline conditions framework updates and delivers C++ data objects according to their validity. A high-level tag versioning system allows production managers to organize data in hierarchical views. A scripting API in Python, command-line tools and a web service serve physicists in their daily work. A mini-framework is available for handling data coming from external sources. Efficient data distribution over the worldwide network is guaranteed by a system of hierarchical web caches. The system has been tested and used in all major productions, test beams and cosmic runs.
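As a rough illustration of the interval-of-validity mechanism this abstract describes, the following minimal Python sketch shows how a conditions service might resolve the payload valid for a given run; all names and the payload layout are hypothetical stand-ins, not the actual CMS API.

import bisect

class ConditionsTag:
    """Toy interval-of-validity (IOV) store: each payload is valid
    from its 'since' run until the next payload's 'since' run."""
    def __init__(self):
        self._since = []     # sorted first-valid run numbers
        self._payloads = []  # payload i is valid from _since[i]

    def append(self, since_run, payload):
        # IOVs are append-only, in increasing order of start run
        assert not self._since or since_run > self._since[-1]
        self._since.append(since_run)
        self._payloads.append(payload)

    def payload_for(self, run):
        """Return the payload whose IOV covers the given run."""
        i = bisect.bisect_right(self._since, run) - 1
        if i < 0:
            raise LookupError("no conditions valid for run %d" % run)
        return self._payloads[i]

# Example: a pedestal calibration versioned by run number
tag = ConditionsTag()
tag.append(1, {"pedestal": 2.10})
tag.append(500, {"pedestal": 2.15})
print(tag.payload_for(750))  # -> {'pedestal': 2.15}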
3D numerical study of neutral gas dynamics in the DTT particle exhaust using the DSMC method
Recently the design of the Divertor Tokamak Test (DTT) facility divertor has been modified and consolidated. The new divertor design presents significant geometrical differences compared to the previous ITER-like one, including a more flattened dome shape. This paper presents a complete 3D numerical analysis of the neutral gas dynamics inside the DTT subdivertor area for the latest divertor design. The analysis has been performed with the direct simulation Monte Carlo method using the DIVGAS simulator code. SOLEDGE2D-EIRENE plasma simulations have been performed for a deuterium plasma scenario at the maximum additional power in a partially detached condition achieved by neon impurity seeding, and the extracted information about the neutral particles has been imposed as incoming boundary conditions. The pumping efficiency of the DTT divertor is examined by considering various cases with respect to the pumping probability, and the effect of the toroidal and poloidal leakages is quantified. The results show that a significant percentage of the incoming flux of neutrals returns to the plasma side through the entry gaps (60% for deuterium and 40% for neon) and, consequently, only a small percentage (≈2%–15%) of the incoming flux can be pumped out of the system. The toroidal leakages significantly affect the pumping performance of the divertor, causing a decrease in the pumped flux (and also in the pressure at the pumping opening) of about 37%–47% for deuterium and 43%–56% for neon. It is discussed how many pumping ports are needed depending on the achievable pumping performance per port; the number can be reduced by closing the toroidal gaps. The analysis shows that enlarging the poloidal gaps by a factor of two causes a significant increase in the poloidal flux losses by a factor of 1.7. It is also illustrated how the presence of the cooling pipes leads to conductance losses.
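To make the flux bookkeeping above concrete, here is a deliberately simplified test-particle sketch in Python of how pumped and returned flux fractions can be estimated once per-encounter probabilities are known; this is not DIVGAS, and all probability values below are made-up placeholders.

import random

def flux_fractions(n_particles, p_pump_hit, xi_pump, p_return, max_bounces=10_000):
    """Toy test-particle estimate: at each encounter a particle either
    reaches the pumping opening (and is captured with probability xi_pump),
    escapes back to the plasma side through an entry gap, or reflects
    diffusely and keeps bouncing."""
    pumped = returned = 0
    for _ in range(n_particles):
        for _ in range(max_bounces):
            r = random.random()
            if r < p_pump_hit:
                if random.random() < xi_pump:
                    pumped += 1
                    break   # captured by the pump
            elif r < p_pump_hit + p_return:
                returned += 1
                break       # lost back to the plasma
            # otherwise: diffuse wall reflection, continue bouncing
    return pumped / n_particles, returned / n_particles

f_pump, f_ret = flux_fractions(50_000, p_pump_hit=0.05, xi_pump=0.3, p_return=0.02)
print("pumped fraction %.3f, returned fraction %.3f" % (f_pump, f_ret))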
A test of the Feynman scaling in the fragmentation region
The result of a direct measurement in the fragmentation region will be presented. The result will be obtained at the CERN proton-antiproton collider by exposing silicon calorimeters inside the beam pipe. This experiment clarifies a long-standing riddle of cosmic-ray physics: whether Feynman scaling is violated in the fragmentation region, or whether the iron component is increasing at 10^15 eV.
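For context, Feynman scaling is the statement that at high energies the invariant inclusive cross section, expressed in the Feynman variable, becomes independent of the collision energy; in LaTeX notation:

x_F = \frac{p_L^{*}}{p_{L,\mathrm{max}}^{*}} \simeq \frac{2\,p_L^{*}}{\sqrt{s}},
\qquad
E\,\frac{d^3\sigma}{dp^3}\Big|_{x_F\ \mathrm{fixed}} \longrightarrow f(x_F, p_T)
\quad (\sqrt{s} \to \infty),

so a violation shows up as a residual \sqrt{s} dependence of f at fixed x_F in the fragmentation region (x_F near 1).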
The LCG PI project: using interfaces for physics data analysis
In the context of the LHC Computing Grid (LCG) project, the applications area develops and maintains the part of the physics applications software and associated infrastructure that is shared among the LHC experiments. The "physicist interface" (PI) project of the LCG applications area encompasses the interfaces and tools by which physicists will directly use the software, providing implementations based on agreed standards such as the Abstract Interfaces for Data Analysis (AIDA). In collaboration with users from the experiments, work has started on implementing the AIDA interfaces for (binned and unbinned) histogramming, fitting and minimization, as well as manipulation of tuples. These implementations have been developed by re-using existing packages, either directly or through a thin layer of wrappers. In addition, bindings of these interfaces to the Python interpreted language have been created using the dictionary subsystem of the LCG applications area SEAL project. The current status and future planning of the project will be presented.
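The wrapper approach mentioned above can be illustrated with a short Python sketch: an abstract AIDA-style histogram interface plus a thin implementation layered on a plain array. The names are modeled on, but not identical to, the real AIDA interfaces.

from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    """AIDA-style abstract interface (illustrative names only)."""
    @abstractmethod
    def fill(self, x, weight=1.0): ...
    @abstractmethod
    def bin_height(self, index): ...

class ArrayHistogram1D(IHistogram1D):
    """Thin wrapper implementing the interface over a plain list."""
    def __init__(self, nbins, low, high):
        self.nbins, self.low, self.high = nbins, low, high
        self._heights = [0.0] * nbins

    def fill(self, x, weight=1.0):
        if self.low <= x < self.high:
            i = int((x - self.low) / (self.high - self.low) * self.nbins)
            self._heights[i] += weight

    def bin_height(self, index):
        return self._heights[index]

h = ArrayHistogram1D(10, 0.0, 100.0)
for v in (12.0, 17.5, 43.0, 99.9):
    h.fill(v)
print(h.bin_height(1))  # bin [10, 20) holds two entries -> 2.0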
Air quality during uncontrolled fires: a multi-year case study
Exposure to high levels of pollutants as a consequence of uncontrolled fires is an issue that must be managed properly in order to protect the environment and ensure a safe habitat for humans, flora and fauna, because it is well known that the emissions occurring during such events can seriously contaminate air, soil and water, and some pollutants can be hazardous to human health (Lemieux, 2002). During uncontrolled fires many contaminants may be emitted, but of highest concern for human health are persistent organic pollutants (POPs) and PAHs (Coudon et al., 2019; Zhang et al., 2008). Moreover, uncontrolled burning can release dioxin-like polychlorinated biphenyls (PCB dl), which are generated as by-products of industrial combustion. These pollutants are all of high concern for human health because they have well-known carcinogenic and mutagenic properties; for example, it is well known that PAHs are the main carcinogenic constituents of ambient aerosol (Zhang et al., 2008; Fent et al., 2018; Ravindra et al., 2008). Moreover, PCDD/PCDF, frequently referred to as dioxins, are recognized as toxic chemical pollutants with endocrine-disrupting properties, and the most toxic dioxin congener is classified as a Group 1 carcinogen by the International Agency for Research on Cancer (IARC).
The aim of this study is to evaluate how uncontrolled fires can affect air quality by characterizing the persistent organic pollutants emitted during several events that occurred from 2015 to 2018 in the Veneto region (northern Italy). This area is one of the most polluted and urbanized areas in Europe (Larsen et al., 2012), and uncontrolled fires can further worsen this severe situation, driving air pollution to critical levels.
During these accidental events the Environmental Protection Agency of Veneto (ARPAV), in order to monitor the effects of the fires and protect public health, collected air samples using Hi-vol samplers equipped with a quartz fiber filter (QFF) for collecting particulate-phase compounds and a polyurethane foam plug (PUF) for retaining gas-phase compounds. Subsequently, polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/PCDF), dioxin-like polychlorinated biphenyls (PCB dl) and polycyclic aromatic hydrocarbons (PAHs) were analysed using high-resolution gas chromatography (HRGC) coupled with high-resolution mass spectrometry (HRMS). As expected, the results show a large increase of PCDD/PCDF, PCB dl and PAHs during and immediately after the fires, with differences in pollutant composition. It is noticeable how, within a short time (hours to days), the pollutant concentrations showed a clear and strong drop, returning air quality to better conditions. This drop is probably due to meteorological factors, which will be investigated.
Persistent storage of non-event data in the CMS databases
In the CMS experiment, the non-event data needed to set up the detector, or produced by it, and needed to calibrate the physical response of the detector itself are stored in ORACLE databases. The large amount of data to be stored, the number of clients involved and the performance requirements make the database system an essential service for the experiment to run. This note describes the CMS condition database architecture, the data flow, and PopCon, the tool built to populate the offline databases. Finally, the first results and experience obtained during the 2008 and 2009 cosmic data taking are presented.
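As an illustration of the PopCon idea (pulling conditions from an external source and appending them to the offline database, each with its interval of validity), here is a minimal Python sketch; the class and function names are hypothetical stand-ins, not the actual CMS PopCon or ORACLE interfaces.

class InMemoryCondDB:
    """Stand-in for the ORACLE-backed conditions database."""
    def __init__(self):
        self.iovs = {}  # tag -> list of (since_run, payload)

    def last_since(self, tag):
        entries = self.iovs.get(tag, [])
        return entries[-1][0] if entries else 0

    def append_iov(self, tag, since_run, payload):
        self.iovs.setdefault(tag, []).append((since_run, payload))

def populate(fetch_new, db, tag):
    """PopCon-like loop: pull conditions newer than what is stored
    and append them, each with its own interval of validity."""
    for since_run, payload in fetch_new(after=db.last_since(tag)):
        db.append_iov(tag, since_run, payload)

# Example external source: two new high-voltage payloads to transfer
def hv_export(after):
    data = [(100, {"hv": 1490.0}), (250, {"hv": 1502.0})]
    return [(run, p) for run, p in data if run > after]

db = InMemoryCondDB()
populate(hv_export, db, tag="EcalHV_v1")
print(db.iovs["EcalHV_v1"])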
Distributed Computing Grid Experiences in CMS
The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
A non-convex control allocation strategy as energy-efficient torque distributors for on-road and off-road vehicles
A vehicle with multiple drivetrains, such as a hybrid electric vehicle, is an over-actuated system, which means there is an infinite number of combinations of torques that the individual drivetrains can supply to provide a given total torque demand. Energy efficiency is considered as the secondary objective to determine the optimum solution among these feasible combinations. The resulting optimisation problem, which is nonlinear due to the multimodal operation of electric machines, must be solved quickly to comply with the stability requirements of the vehicle dynamics. A theorem is developed, for the first time, to formulate and parametrically solve the energy-efficient torque distribution problem of a vehicle with multiple different drivetrains. The parametric solution is deployable on an ordinary electronic control unit (ECU) as a small lookup table, which makes it significantly faster in operation. The fuel economy of combustion engines, load transfers due to longitudinal and lateral accelerations, and the traction efficiency of off-road conditions are integrated into the developed theorem. Simulation results illustrate the effectiveness of the proposed optimal strategy as a torque distributor for on-road and off-road electrified vehicles with multiple different drivetrains.
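The two-stage structure implied by this abstract (solve offline, look up online) can be sketched in a few lines of Python; the loss maps below are arbitrary placeholders, not the paper's models, and a real implementation would also handle constraints such as torque limits and load transfer.

import numpy as np

# Placeholder drivetrain loss maps (illustrative only)
def loss_em(t):  return 0.004 * t**2                      # electric machine
def loss_ice(t): return 0.0015 * t**2 + 0.08 * np.abs(t)  # engine

# Offline stage: grid-search the split that minimises combined
# losses for each total demand, and store it as a lookup table.
T_demand = np.linspace(0.0, 400.0, 81)   # total torque demand [Nm]
splits = np.linspace(0.0, 1.0, 201)      # fraction sent to the EM
best_split = np.empty_like(T_demand)
for k, T in enumerate(T_demand):
    cost = loss_em(splits * T) + loss_ice((1.0 - splits) * T)
    best_split[k] = splits[np.argmin(cost)]

# Online stage (what the ECU would run): a cheap interpolation.
def torque_split(T):
    a = np.interp(T, T_demand, best_split)
    return a * T, (1.0 - a) * T          # (EM torque, ICE torque)

print(torque_split(180.0))

The point of a parametric, lookup-table formulation is that the expensive non-convex search happens entirely offline, leaving only an interpolation inside the real-time control loop.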
Diffractive Dissociation In The Interacting Gluon Model
We have extended the Interacting Gluon Model (IGM) to calculate diffractive mass spectra generated in hadronic collisions. We show that it is possible to treat both diffractive and non-diffractive events on the same footing, in terms of gluon-gluon collisions. A systematic analysis of available data is performed. The energy dependence of diffractive mass spectra is addressed; they show a moderate narrowing at increasing energies. Predictions for LHC energies are presented.
Comment: 12 pages, LaTeX, 14 figures (PostScript files included); accepted for publication in Phys. Rev. D (Feb. 97)
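As general background (not specific to the IGM), diffractive mass spectra are conventionally characterized by the variable \xi = M_X^2/s, with the classic triple-Pomeron benchmark

\frac{d\sigma}{dM_X^2} \;\propto\; \frac{1}{(M_X^2)^{\alpha}},
\qquad \alpha \simeq 1,

so, roughly speaking, the "narrowing" discussed above corresponds to the spectrum concentrating at smaller \xi as \sqrt{s} increases.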
Discriminating signal from background using neural networks. Application to top-quark search at the Fermilab Tevatron
The application of neural networks in high energy physics to the separation of signal from background events is studied. A variety of problems usually encountered in this sort of analysis, from variable selection to systematic errors, are presented. The top-quark search is used as an example to illustrate the problems and proposed solutions.
Comment: 11 pages, 3 figures, psfi
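As a schematic of the signal/background separation technique discussed, the following Python sketch trains a small feed-forward network on toy two-variable Gaussian data and cuts on its output; the data are invented stand-ins, not Tevatron top-quark events.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Toy kinematic variables: overlapping Gaussians for signal and background
n = 5000
signal = rng.normal(loc=[1.0, 0.5], scale=0.8, size=(n, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))
X = np.vstack([signal, background])
y = np.array([1] * n + [0] * n)  # 1 = signal, 0 = background

# Small network whose output approximates P(signal | x); a cut on
# this output defines the event selection.
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
net.fit(X, y)

p_signal = net.predict_proba(X)[:, 1]
selected = p_signal > 0.5
print("signal efficiency %.2f" % selected[:n].mean())
print("background acceptance %.2f" % selected[n:].mean())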