The CLIC Programme: Towards a Staged e+e- Linear Collider Exploring the Terascale: CLIC Conceptual Design Report
This report describes the exploration of fundamental questions in particle
physics at the energy frontier with a future TeV-scale e+e- linear collider
based on the Compact Linear Collider (CLIC) two-beam acceleration technology. A
high-luminosity high-energy e+e- collider allows for the exploration of
Standard Model physics, such as precise measurements of the Higgs, top and
gauge sectors, as well as for a multitude of searches for New Physics, either
through direct discovery or indirectly, via high-precision observables. Given
the current state of knowledge, following the observation of a 125 GeV
Higgs-like particle at the LHC, and pending further LHC results at 8 TeV and 14
TeV, a linear e+e- collider built and operated in centre-of-mass energy stages
from a few-hundred GeV up to a few TeV will be an ideal physics exploration
tool, complementing the LHC. In this document, an overview of the physics
potential of CLIC is given. Two example scenarios are presented for a CLIC
accelerator built in three main stages of 500 GeV, 1.4 (1.5) TeV, and 3 TeV,
together with operating schemes that will make full use of the machine capacity
to explore the physics. The accelerator design, construction, and performance
are presented, as well as the layout and performance of the experiments. The
proposed staging example is accompanied by cost estimates of the accelerator
and detectors and by estimates of operating parameters, such as power
consumption. The resulting physics potential and measurement precisions are
illustrated through detector simulations under realistic beam conditions.
Comment: 84 pages, published as CERN Yellow Report,
https://cdsweb.cern.ch/record/147522
Estimation of uncertainty in flood forecasts - a comparison of methods
The scientific literature offers many methods for estimating uncertainty; however, there is a lack of information about the characteristics, merits and limitations of the individual methods, particularly for making decisions in practice. This paper provides an overview of the different uncertainty methods for flood forecasting that are reported in the literature, concentrating on two established approaches defined as the ensemble and the statistical approach. Owing to the variety of flood forecasting and warning systems in operation, the question "which uncertainty method is most suitable for which application" is difficult to answer readily. The paper aims to assist practitioners in understanding how to match an uncertainty quantification method to their particular application, using two flood forecasting system case studies in Belgium and Canada. These two specific applications of uncertainty estimation from the literature are compared, illustrating statistical and ensemble methods and indicating the information and output that these two types of method offer. The advantages, disadvantages and application of the two types of method are identified. Although there is no one "best" uncertainty method to fit all forecasting systems, this review helps to explain the commonly used methods from the available literature for the non-specialist.
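As a rough illustration of the two families of methods this abstract contrasts, the sketch below compares an ensemble spread estimate with a simple statistical (past-error quantile) estimate on synthetic discharge data. All numbers, member counts and variable names are invented for illustration and do not come from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Ensemble approach: run the model under perturbed inputs and use
# --- the spread of the resulting forecasts as the uncertainty band.
n_members = 51                      # assumed ensemble size
true_discharge = 120.0              # synthetic "true" discharge (m^3/s)
ensemble = true_discharge + rng.normal(0.0, 15.0, size=n_members)
ens_lo, ens_hi = np.percentile(ensemble, [5, 95])
print(f"Ensemble 90% band: [{ens_lo:.1f}, {ens_hi:.1f}] m^3/s")

# --- Statistical approach: quantify uncertainty from the distribution
# --- of past forecast errors and attach it to a single deterministic
# --- forecast.
past_errors = rng.normal(0.0, 12.0, size=500)  # invented error archive
deterministic_forecast = 118.0
err_lo, err_hi = np.percentile(past_errors, [5, 95])
print(f"Statistical 90% band: [{deterministic_forecast + err_lo:.1f}, "
      f"{deterministic_forecast + err_hi:.1f}] m^3/s")
```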
Integrated fecal microbiome–metabolome signatures reflect stress and serotonin metabolism in irritable bowel syndrome
To gain insight into the complex microbiome-gut-brain axis in irritable bowel syndrome (IBS), several modalities of biological and clinical data must be combined. We aimed to identify profiles of faecal microbiota and metabolites associated with IBS and to delineate specific phenotypes of IBS that represent potential pathophysiological mechanisms. Faecal metabolites were measured using proton nuclear magnetic resonance (1H-NMR) spectroscopy and the gut microbiome using shotgun metagenomic sequencing (MGS) in a combined dataset of 142 IBS patients and 120 healthy controls (HC) with extensive clinical, biological and phenotype information. Data were analysed using support vector classification and regression and kernel t-SNE. Microbiome and metabolome profiles could distinguish IBS from HC with an area under the receiver operating characteristic curve (AUC) of 77.3% and 79.5%, respectively, which improved to 83.6% when microbiota and metabolites were combined. No significant differences in the predictive ability of the microbiome-metabolome data were observed between the three classical, stool pattern-based IBS subtypes. However, unsupervised clustering showed distinct subsets of IBS patients based on faecal microbiome-metabolome data. These clusters could be related to plasma levels of serotonin and its metabolite 5-hydroxyindoleacetate, effects of psychological stress on gastrointestinal symptoms, onset of IBS after stressful events, medical history of previous abdominal surgery, dietary caloric intake and IBS symptom duration. Furthermore, pathways in metabolic reaction networks were integrated with microbiota data to reflect host-microbiome interactions in IBS. The identified microbiome-metabolome signatures for IBS, associated with altered serotonin metabolism and an unfavourable stress response related to gastrointestinal symptoms, support the microbiota-gut-brain link in the pathogenesis of IBS.
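The classification step described above (support vector classification scored by AUC) can be sketched as follows. The synthetic feature matrix, the kernel choice and the cross-validation setup are illustrative assumptions, not the study's actual pipeline; only the class sizes (142 vs 120) echo the abstract.

```python
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in for combined microbiome + metabolome features:
# 142 "patients" and 120 "controls", 50 features each (all invented).
X = np.vstack([rng.normal(0.3, 1.0, (142, 50)),
               rng.normal(0.0, 1.0, (120, 50))])
y = np.array([1] * 142 + [0] * 120)

# Support vector classifier evaluated by cross-validated AUC,
# mirroring the kind of score reported in the abstract.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_predict(clf, X, y, cv=5, method="decision_function")
print(f"cross-validated AUC: {roc_auc_score(y, scores):.3f}")
```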
Continental and global scale flood forecasting systems
Floods are the most frequent of natural disasters, affecting millions of people across the globe every year. The anticipation and forecasting of floods at the global scale is crucial to preparing for severe events and providing early awareness where local flood models and warning services may not exist. As numerical weather prediction models continue to improve, operational centres increasingly use their meteorological output to drive hydrological models, creating hydrometeorological systems capable of forecasting river flow and flood events at much longer lead times than was previously possible. Furthermore, recent developments in modelling capabilities, data and resources have made it possible to produce global scale flood forecasting systems. In this paper, the current state of operational large scale flood forecasting is discussed, including probabilistic forecasting of floods using ensemble prediction systems. Six state-of-the-art operational large scale flood forecasting systems are reviewed, describing similarities and differences in their approaches to forecasting floods at the global and continental scale. Current operational systems can produce coarse-scale discharge forecasts in the medium range and disseminate forecasts, and in some cases early warning products, in real time across the globe in support of national forecasting capabilities. With improvements in seasonal weather forecasting, future advances may include more seamless hydrological forecasting at the global scale, alongside a move towards multi-model forecasts and grand ensemble techniques, responding to the requirement of developing multi-hazard early warning systems for disaster risk reduction.
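A core output of the probabilistic systems reviewed above is the probability that discharge exceeds a flood threshold, typically estimated as the fraction of ensemble members above it. The sketch below illustrates that computation with invented numbers; the threshold, distribution and member count are assumptions, not values from any of the six systems.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented ensemble of medium-range discharge forecasts (m^3/s) for one
# river point: 51 members, a common size in operational ensembles.
members = rng.gamma(shape=9.0, scale=50.0, size=51)

flood_threshold = 600.0  # assumed warning threshold (m^3/s)

# Exceedance probability = fraction of members above the threshold.
p_exceed = np.mean(members > flood_threshold)
print(f"P(discharge > {flood_threshold:.0f} m^3/s) = {p_exceed:.2f}")
```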
Global QCD Analysis and the CTEQ Parton Distributions
The CTEQ program for the determination of parton distributions through a
global QCD analysis of data for various hard scattering processes is fully
described. A new set of distributions, CTEQ3, incorporating several new types
of data is reported and compared to the two previous sets of CTEQ
distributions. Comparison with current data is discussed in some detail. The
remaining uncertainties in the parton distributions and methods to further
reduce them are assessed. Comparisons with the results of other global analyses
are also presented.
Comment: (Change in LaTeX style only: 2up style removed since many don't have it.) 35 pages, 23 figures separately submitted as uuencoded compressed ps-file; Michigan State Report # MSU-HEP/41024 and CTEQ 40
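Conceptually, a global QCD analysis of the kind described above amounts to minimising a chi-square between data from many hard-scattering processes and theory predictions that depend on parton-distribution parameters. The toy fit below illustrates only that generic structure with an invented one-parameter shape; it has no connection to the actual CTEQ3 parametrisation, datasets or treatment of correlated errors.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Invented "data": cross sections at several x values with errors.
x = np.array([0.01, 0.05, 0.1, 0.3, 0.5])
sigma_err = np.array([0.05, 0.04, 0.04, 0.03, 0.02])
true_a = 0.4
data = 0.1 * (1 - x) ** 3 * x ** (-true_a) + rng.normal(0, sigma_err)

def theory(a):
    # Toy stand-in for a prediction driven by a PDF-like shape x^-a (1-x)^3.
    return 0.1 * (1 - x) ** 3 * x ** (-a)

def chi2(params):
    return np.sum(((data - theory(params[0])) / sigma_err) ** 2)

fit = minimize(chi2, x0=[0.3], method="Nelder-Mead")
print(f"fitted exponent a = {fit.x[0]:.3f}, chi2 = {fit.fun:.2f}")
```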
Measuring Parton Densities in the Pomeron
We present a program to measure the parton densities in the pomeron using
diffractive deep inelastic scattering and diffractive photoproduction, and to
test the resulting parton densities by applying them to other processes such as
the diffractive production of jets in hadron-hadron collisions. Since QCD
factorization has been predicted NOT to apply to hard diffractive scattering,
this program of fitting and using parton densities might be expected to fail.
Its success or failure will provide useful information on the space-time
structure of the pomeron.
Comment: Contains revisions based on Phys. Rev. D referee comments. RevTeX version 3, epsf, 31 pages. Uuencoded compressed postscript figures appended. Uncompressed postscript files available at ftp://ftp.phys.psu.edu/pub/preprint/psuth136
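The fitting programme described above rests on a factorised ansatz whose validity is exactly what the proposed tests probe: the diffractive structure function is written as a pomeron flux times pomeron parton densities. A standard Ingelman-Schlein-type form, given here as a hedged sketch rather than the authors' exact notation, is

$$ F_2^{D(4)}(x_{\mathbb{P}}, t, \beta, Q^2) \;=\; f_{\mathbb{P}/p}(x_{\mathbb{P}}, t)\, F_2^{\mathbb{P}}(\beta, Q^2), $$

where $x_{\mathbb{P}}$ is the proton momentum fraction carried by the pomeron and $\beta = x/x_{\mathbb{P}}$; a breakdown of QCD factorisation would show up as process dependence of the extracted pomeron parton densities.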
The Azimuthal Decorrelation of Jets Widely Separated in Rapidity
This study reports the first measurement of the azimuthal decorrelation between jets with pseudorapidity separations of up to five units. The data were accumulated using the DØ detector during the 1992--1993 collider run of the Fermilab Tevatron at $\sqrt{s} = 1.8$ TeV. These results are compared to next-to-leading order (NLO) QCD predictions and to two leading-log approximations (LLA) in which the leading-log terms are resummed to all orders in $\alpha_s$. The final state jets as predicted by NLO QCD show less azimuthal decorrelation than the data. The parton shower LLA Monte Carlo HERWIG describes the data well; an analytical LLA prediction based on BFKL resummation shows more decorrelation than the data.
Comment: 6 pages with 4 figures, all uuencoded and gzipped
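The decorrelation observable here is essentially the distribution of the azimuthal angle difference Δφ between the two jets most widely separated in pseudorapidity: fully correlated back-to-back jets sit at Δφ = π, and additional radiation smears them away from it. The snippet below only illustrates how Δφ is computed and folded into [0, π] for invented jet angles; it is not the D0 analysis code, and the smearing width is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented azimuthal angles (radians) of the two jets with the largest
# pseudorapidity separation in 10000 toy events: roughly back-to-back,
# smeared by additional radiation.
phi1 = rng.uniform(0.0, 2.0 * np.pi, 10000)
phi2 = phi1 + np.pi + rng.normal(0.0, 0.4, 10000)

# Fold the azimuthal difference into [0, pi].
dphi = np.abs(phi1 - phi2) % (2.0 * np.pi)
dphi = np.where(dphi > np.pi, 2.0 * np.pi - dphi, dphi)

# Perfectly correlated jets would give <cos(pi - dphi)> = 1;
# decorrelation pushes the mean below 1.
print(f"<cos(pi - dphi)> = {np.mean(np.cos(np.pi - dphi)):.3f}")
```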