
    Custodial SU(2) Violation and the Origin of Fermion Masses

    Custodial SU(2) breaking due to dynamical fermion masses is studied in a rather general context and it is shown how some well known limiting cases are correctly described. The type of "gap equation" which can systematically lead to extra negative contributions to the so-called ρ-parameter is emphasized. Furthermore, general model-independent features are discussed and it is shown how electroweak precision measurements can be sensitive to the fermion content and/or dynamical features of a given theory. Comment: HD-THEP-92-55, 18 pages and 2 pages of figures appended as PostScript file
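    For orientation, the quantity the abstract refers to can be written out explicitly. Below is the tree-level definition of the ρ-parameter and the standard one-loop shift from a fermion doublet with split masses, quoted only as textbook background and not as the gap-equation analysis of the paper:

```latex
% Tree-level rho-parameter (equal to 1 in the custodial limit) and the
% standard one-loop contribution of a fermion doublet (m_t, m_b) with
% N_c colours; quoted as background, not the paper's dynamical result.
\[
  \rho \;=\; \frac{m_W^2}{m_Z^2 \cos^2\theta_W} \;=\; 1 \quad \text{(custodial limit)}
\]
\[
  \Delta\rho \;=\; \frac{N_c\, G_F}{8\sqrt{2}\,\pi^2}
  \left[\, m_t^2 + m_b^2
        - \frac{2\, m_t^2 m_b^2}{m_t^2 - m_b^2}\,
          \ln\frac{m_t^2}{m_b^2} \right]
\]
% For m_b -> 0 this reduces to Delta rho = N_c G_F m_t^2 / (8 sqrt(2) pi^2),
% a positive shift; the paper discusses gap equations that can instead
% produce extra negative contributions.
```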

    Heat flux measurements and modelling in the RFX-mod experiment

    The knowledge of edge plasma transport parameters and plasma edge phenomena is a key element in the design of the first wall for a magnetically confined fusion experiment. In RFX-mod, heat flux measurements and edge transport modelling have been carried out to improve the understanding of this aspect. Heat flux deposition profiles have been evaluated from infrared temperature measurements of insertable graphite limiters. These were inserted up to 12 mm into the reversed field pinch plasma of ohmically heated discharges with Ip = 0.6-1.0 MA, ne = 0.5-3·10^19 m^-3 (n/nG < 0.7) and a total power of about 10-15 MW. Strong asymmetries in heat flux deposition have been measured in the poloidal direction at low density between the electron and the ion drift side, and smaller ones in the toroidal direction when q(a) ≠ 0. The poloidal asymmetry has been associated with the presence of superthermal electrons [1], while the toroidal one has been less clearly identified as due to the small toroidal extension of the limiters. To account for the 2D nature of the heat load deposited on the surface of the employed limiters, a simple 3D code has been developed to evaluate the heat flux from the temperature data. In this way, at the deeper limiter insertions, heat flux decay lengths of about 2 mm and 2.5 mm have been evaluated on the electron and ion drift sides, respectively. Modelling of the evaluated heat fluxes has been done using the SOLEDGE2D-EIRENE edge code [2]. This fluid code is well suited to the RFX-mod wall limiter configuration because, thanks to the implemented penalization technique, the computational domain can be extended up to the entire first wall. Edge modelling has shown that the measured decay lengths are compatible with energy diffusion coefficients in the Scrape-Off Layer (SOL) smaller than those commonly evaluated at the plasma edge; the cause of the reduced diffusion in the SOL will be discussed in the paper.
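    As a rough illustration of how a scrape-off-layer decay length of the quoted order (a few mm) can be extracted from a deposition profile, here is a minimal sketch; the data are synthetic placeholders, and this is not the 3D code or the RFX-mod analysis chain described above:

```python
# Minimal sketch: fit an exponential SOL heat-flux profile
#   q(r) = q0 * exp(-r / lambda_q)
# to a deposition profile to extract the decay length lambda_q.
# The "measured" points below are synthetic, not RFX-mod data.
import numpy as np
from scipy.optimize import curve_fit

def sol_profile(r_mm, q0, lambda_q):
    """Exponential heat-flux fall-off with radial distance into the SOL."""
    return q0 * np.exp(-r_mm / lambda_q)

# Synthetic profile: distance into the SOL [mm] vs heat flux [MW/m^2],
# generated with a true decay length of about 2 mm plus noise.
r_mm = np.linspace(0.0, 10.0, 25)
rng = np.random.default_rng(0)
q_meas = 8.0 * np.exp(-r_mm / 2.0) + rng.normal(0.0, 0.1, r_mm.size)

(q0_fit, lambda_fit), _ = curve_fit(sol_profile, r_mm, q_meas, p0=(5.0, 1.0))
print(f"fitted q0 = {q0_fit:.2f} MW/m^2, lambda_q = {lambda_fit:.2f} mm")
```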

    The LCG PI project: using interfaces for physics data analysis

    In the context of the LHC Computing Grid (LCG) project, the applications area develops and maintains that part of the physics applications software and associated infrastructure that is shared among the LHC experiments. The "physicist interface" (PI) project of the LCG applications area encompasses the interfaces and tools by which physicists will directly use the software, providing implementations based on agreed standards such as the AIDA (Abstract Interfaces for Data Analysis) interfaces for data analysis. In collaboration with users from the experiments, work has started on implementing the AIDA interfaces for (binned and unbinned) histogramming, fitting and minimization, as well as manipulation of tuples. These implementations have been developed by re-using existing packages, either directly or through a (thin) layer of wrappers. In addition, bindings of these interfaces to the Python interpreted language have been created using the dictionary subsystem of the LCG applications area/SEAL project. The current status and future plans of the project are presented.
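    The "re-use existing packages through a thin layer of wrappers" approach can be pictured with a schematic example. The class and method names below are purely illustrative assumptions, not the actual AIDA or PI interfaces:

```python
# Schematic sketch of the thin-wrapper idea: expose a small, stable
# histogram interface to analysis code while delegating the actual work
# to an existing backend (here a plain NumPy array). Names are illustrative.
import numpy as np

class Histogram1D:
    """Thin wrapper presenting a minimal 1D-histogram interface."""

    def __init__(self, title, nbins, low, high):
        self.title = title
        self._edges = np.linspace(low, high, nbins + 1)
        self._counts = np.zeros(nbins)

    def fill(self, value, weight=1.0):
        """Add one entry, ignoring values outside the axis range."""
        idx = np.searchsorted(self._edges, value, side="right") - 1
        if 0 <= idx < self._counts.size:
            self._counts[idx] += weight

    def bin_height(self, idx):
        return self._counts[idx]

# Usage: analysis code only ever sees the wrapper interface, so the
# backend can be swapped without touching user code.
h = Histogram1D("pt spectrum", nbins=50, low=0.0, high=100.0)
for pt in np.random.default_rng(1).exponential(20.0, 10000):
    h.fill(pt)
print(h.bin_height(0))
```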

    Long-term trends of PM10-bound arsenic, cadmium, nickel, and lead across the Veneto region (NE Italy)

    Since the mid-90s, the European Community has adopted increasingly stringent air quality standards. Consequently, air quality has generally improved across Europe. However, current EU standards are still breached in some European hotspots. The Veneto region (NE Italy) lies in the eastern part of the Po Valley, a major European hotspot for air pollution, where EU standards for particulate matter, nitrogen oxides and ozone are still breached at some sites. This study analyses the PM10-bound arsenic, cadmium, nickel, and lead concentrations over the period 2010-2020 in the Veneto region, using data collected by the local environmental protection agency (ARPAV) at 20 sampling stations mostly distributed across the plain areas of the region and categorized as rural (RUR), urban (URB) and suburban (SUB) background, industrial (IND) and traffic (TRA) hotspots (Figure 1). This comprehensive dataset was statistically investigated to detect seasonal trends, their relationship with other air pollutants and meteorological parameters, and their spatial variations at the regional scale. The study complements previous air quality studies over the Veneto region for gaseous pollutants and bulk PM10 (Masiol et al. 2017). Samplings were carried out according to the CEN EN 12341:1998 standard on quartz fibre filters and were continuous for 24 h, starting at midnight. PM10 mass was determined gravimetrically following the CEN EN 12341:2014 standard, and the elemental analysis was performed using an ICP-MS (Agilent 7700) after acid digestion (EN 14902:2005). Trends were analysed using different approaches on the monthly-averaged data. The shape of the trends and their seasonal variations were assessed through the seasonal-trend decomposition procedure based on Loess (STL). Linear trends were computed with the Mann-Kendall trend test (p < 0.05) and the Theil-Sen nonparametric slope estimator (MK-TS). Since this latter analysis assumes monotonic linear trends and does not consider the shape of the trends, the presence of possible breakpoints was investigated using piecewise regression. Generally, the monthly patterns of all analysed elements show higher concentrations during winter, following PM10 concentrations; some exceptions were detected and are discussed. The trend analysis indicates statistically significant negative (decreasing) or null linear trends at almost all stations. A few positive (increasing) but not statistically significant trends were also detected. Some sites showed rapid decreases over short periods linked to peculiar events or local causes. Among others, several sites across the Venice area showed significant drops in arsenic concentrations after the implementation of REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) (Formenton et al., 2021). Data used in this study are provided by ARPAV (Agenzia Regionale per la Prevenzione e Protezione Ambientale del Veneto, https://www.arpa.veneto.it/).
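    A minimal sketch of the trend-analysis chain described above (STL decomposition of the monthly series, a Mann-Kendall-type monotonic-trend test and a Theil-Sen slope estimate), run here on synthetic data standing in for an ARPAV monthly series:

```python
# Sketch of the trend analysis described above, applied to a synthetic
# monthly series standing in for a PM10-bound metal concentration.
import numpy as np
import pandas as pd
from scipy.stats import kendalltau, theilslopes
from statsmodels.tsa.seasonal import STL

# Synthetic monthly series (2010-2020): seasonal cycle + slow decline + noise.
idx = pd.date_range("2010-01-01", "2020-12-01", freq="MS")
t = np.arange(idx.size)
rng = np.random.default_rng(42)
conc = 2.0 + 0.8 * np.cos(2 * np.pi * t / 12) - 0.005 * t + rng.normal(0.0, 0.2, t.size)
series = pd.Series(conc, index=idx)

# Seasonal-trend decomposition based on Loess (STL).
stl_result = STL(series, period=12).fit()

# Monotonic-trend test (Kendall's tau against time, the core of the
# Mann-Kendall test) and Theil-Sen nonparametric slope estimate.
tau, p_value = kendalltau(t, series.values)
slope, intercept, lo_slope, hi_slope = theilslopes(series.values, t)
print(f"tau = {tau:.2f}, p = {p_value:.3f}, Theil-Sen slope = {slope:.4f} per month")
```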

    Air quality during uncontrolled fires: a multi-year case study

    Exposure to high levels of pollutants as a consequence of uncontrolled fires is an issue that must be managed properly in order to protect the environment and ensure a safe habitat for humans, flora and fauna, because it is well known that the emissions produced during such events can seriously contaminate air, soil and water, and some pollutants are hazardous to human health (Lemieux, 2002). During uncontrolled fires many contaminants may be emitted; of particular concern for human health are persistent organic pollutants (POPs) and polycyclic aromatic hydrocarbons (PAHs) (Coudon et al., 2019; Zhang et al., 2008). Moreover, uncontrolled burning can release dioxin-like polychlorinated biphenyls (dl-PCBs), which are generated as by-products of industrial combustion. These pollutants are all of high concern because of their well-known carcinogenic and mutagenic properties; PAHs, for example, are regarded as the main carcinogenic constituents of ambient aerosol (Zhang et al., 2008; Fent et al., 2018; Ravindra et al., 2008). Moreover, polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/PCDF), frequently referred to as dioxins, are recognized as toxic pollutants with endocrine-disrupting properties, and the most toxic dioxin congener is classified as a Group 1 carcinogen by the International Agency for Research on Cancer (IARC). The aim of this study is to evaluate how uncontrolled fires can affect air quality by characterizing the persistent organic pollutants emitted during events that occurred from 2015 to 2018 in the Veneto region (northern Italy). This area is one of the most polluted and urbanized areas in Europe (Larsen et al., 2012), and uncontrolled fires can further worsen this already severe situation, driving air pollution to critical levels. During these accidental events, the Environmental Protection Agency of Veneto (ARPAV), in order to monitor the effects of the fires and protect public health, collected air samples using Hi-vol samplers equipped with a quartz fibre filter (QFF) to collect particulate-phase compounds and a polyurethane foam plug (PUF) to retain gas-phase compounds. Subsequently, PCDD/PCDF, dl-PCBs and PAHs were analysed by high-resolution gas chromatography (HRGC) coupled with high-resolution mass spectrometry (HRMS). As expected, the results show large increases of PCDD/PCDF, dl-PCBs and PAHs during and immediately after the accidental fires, with differences in pollutant composition. Notably, within a short time (hours to days) pollutant concentrations showed a clear and strong drop, returning air quality to better conditions. This drop is probably due to meteorological factors, which will be investigated.
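    For context, dioxin and dl-PCB results of this kind are usually reported as a toxic equivalent (TEQ), i.e. each congener concentration weighted by its toxic equivalency factor (TEF) and summed. The sketch below uses placeholder concentrations and only a small illustrative subset of TEF values; the official WHO-2005 TEF list should be used for real data:

```python
# Minimal sketch of a toxic-equivalent (TEQ) calculation:
#   TEQ = sum over congeners of (concentration * TEF).
# Concentrations are hypothetical placeholders; the TEFs are an
# illustrative subset (use the full WHO-2005 TEF list in practice).
tef = {
    "2,3,7,8-TCDD": 1.0,
    "1,2,3,7,8-PeCDD": 1.0,
    "OCDD": 0.0003,
    "2,3,7,8-TCDF": 0.1,
    "PCB-126": 0.1,
}

# Hypothetical congener concentrations in an air sample [fg/m^3].
concentration = {
    "2,3,7,8-TCDD": 5.0,
    "1,2,3,7,8-PeCDD": 12.0,
    "OCDD": 900.0,
    "2,3,7,8-TCDF": 40.0,
    "PCB-126": 25.0,
}

teq = sum(c * tef[name] for name, c in concentration.items())
print(f"TEQ = {teq:.2f} fg WHO-TEQ/m^3")
```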

    Distributed Computing Grid Experiences in CMS

    The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.

    Diffractive Dissociation In The Interacting Gluon Model

    We have extended the Interacting Gluon Model (IGM) to calculate diffractive mass spectra generated in hadronic collisions. We show that it is possible to treat both diffractive and non-diffractive events on the same footing, in terms of gluon-gluon collisions. A systematic analysis of available data is performed. The energy dependence of diffractive mass spectra is addressed; they show a moderate narrowing at increasing energies. Predictions for LHC energies are presented. Comment: 12 pages, LaTeX, 14 figures (PostScript files included); accepted for publication in Phys. Rev. D (Feb. 97)
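    For reference, the qualitative benchmark such diffractive mass spectra are usually compared against is an approximately 1/M_X^2 fall-off; a schematic (triple-Regge-like) form is sketched below, purely as background and not as the IGM expression itself:

```latex
% Schematic single-diffractive mass spectrum, quoted only as the usual
% qualitative benchmark (not the IGM result derived in the paper):
\[
  \frac{d\sigma_{\mathrm{SD}}}{dM_X^2} \;\propto\; \frac{1}{\bigl(M_X^2\bigr)^{1+\epsilon}},
  \qquad \epsilon \ll 1 ,
\]
% i.e. the spectrum falls roughly like 1/M_X^2 over most of the mass range.
```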

    Persistent storage of non-event data in the CMS databases

    In the CMS experiment, the non-event data needed to set up the detector, or produced by it, and needed to calibrate its physical response are stored in ORACLE databases. The large amount of data to be stored, the number of clients involved and the performance requirements make the database system an essential service for the experiment to run. This note describes the CMS condition database architecture, the data flow and PopCon, the tool built to populate the offline databases. Finally, the first results obtained during the 2008 and 2009 cosmic data taking are presented.