896 research outputs found

    The CLIC Programme: Towards a Staged e+e- Linear Collider Exploring the Terascale : CLIC Conceptual Design Report

    This report describes the exploration of fundamental questions in particle physics at the energy frontier with a future TeV-scale e+e- linear collider based on the Compact Linear Collider (CLIC) two-beam acceleration technology. A high-luminosity high-energy e+e- collider allows for the exploration of Standard Model physics, such as precise measurements of the Higgs, top and gauge sectors, as well as for a multitude of searches for New Physics, either through direct discovery or indirectly, via high-precision observables. Given the current state of knowledge, following the observation of a 125 GeV Higgs-like particle at the LHC, and pending further LHC results at 8 TeV and 14 TeV, a linear e+e- collider built and operated in centre-of-mass energy stages from a few-hundred GeV up to a few TeV will be an ideal physics exploration tool, complementing the LHC. In this document, an overview of the physics potential of CLIC is given. Two example scenarios are presented for a CLIC accelerator built in three main stages of 500 GeV, 1.4 (1.5) TeV, and 3 TeV, together with operating schemes that will make full use of the machine capacity to explore the physics. The accelerator design, construction, and performance are presented, as well as the layout and performance of the experiments. The proposed staging example is accompanied by cost estimates of the accelerator and detectors and by estimates of operating parameters, such as power consumption. The resulting physics potential and measurement precisions are illustrated through detector simulations under realistic beam conditions. (Comment: 84 pages; published as CERN Yellow Report, https://cdsweb.cern.ch/record/147522)

    Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    This is the published version. Copyright 2014 American Geophysical Union. This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based “local” methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative “bucket-style” hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km²) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
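    The core idea of DELSA, derivative-based first-order sensitivity indices evaluated at many points sampled across the parameter space, can be sketched as follows. This is a simplified illustration rather than the authors' implementation; the toy power-law model, uniform priors, and sampling scheme are assumptions:

    ```python
    import numpy as np

    def delsa_first_order(model, theta, bounds, eps=1e-4):
        """DELSA-style local first-order sensitivity indices at one parameter set.

        model  : callable mapping a parameter vector to a scalar output
        theta  : 1-D array of parameter values
        bounds : (low, high) arrays; a uniform prior on [low, high] has
                 variance (high - low)**2 / 12
        """
        theta = np.asarray(theta, dtype=float)
        low, high = (np.asarray(b, dtype=float) for b in bounds)
        prior_var = (high - low) ** 2 / 12.0

        grad = np.empty_like(theta)
        y0 = model(theta)
        for j in range(theta.size):
            step = eps * max(abs(theta[j]), 1.0)
            t = theta.copy()
            t[j] += step
            grad[j] = (model(t) - y0) / step  # one-sided finite difference

        # Each parameter's share of the locally linearized output variance
        contrib = grad ** 2 * prior_var
        total = contrib.sum()
        return contrib / total if total > 0 else np.zeros_like(contrib)

    # Toy two-parameter nonlinear "reservoir" response (assumed for illustration)
    model = lambda p: p[0] * 10.0 ** p[1]
    rng = np.random.default_rng(0)
    low, high = np.array([0.1, 0.5]), np.array([1.0, 2.0])
    samples = low + rng.random((100, 2)) * (high - low)
    indices = np.array([delsa_first_order(model, s, (low, high)) for s in samples])
    # indices[:, j] shows how the importance of parameter j varies across the space
    ```

    Inspecting the spread of `indices` across samples is what gives the "distributed" view: a parameter can dominate in one region of parameter space and be unimportant in another.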

    Alternative configurations of quantile regression for estimating predictive uncertainty in water forecasts for the upper Severn River: a comparison

    The present study comprises an intercomparison of different configurations of a statistical post-processor that is used to estimate predictive hydrological uncertainty. It builds on earlier work by Weerts, Winsemius and Verkade (2011; hereafter referred to as WWV2011), who used the quantile regression technique to estimate predictive hydrological uncertainty using a deterministic water level forecast as a predictor. The various configurations are designed to address two issues with the WWV2011 implementation: (i) quantile crossing, which causes non-strictly rising cumulative predictive distributions, and (ii) the use of linear quantile models to describe joint distributions that may not be strictly linear. Thus, four configurations were built: (i) a "classical" quantile regression, (ii) a configuration that implements a non-crossing quantile technique, (iii) a configuration where quantile models are built in normal space after application of the normal quantile transformation (NQT) (similar to the implementation used by WWV2011), and (iv) a configuration that builds quantile models separately on separate domains of the predictor. Using each configuration, four reforecasting series of water levels at 14 stations in the upper Severn River were established. The quality of these four series was intercompared using a set of graphical and numerical verification metrics. Intercomparison showed that reliability and sharpness vary across configurations, but in none of the configurations do these two forecast quality aspects improve simultaneously. Further analysis shows that skill in terms of the Brier skill score, mean continuous ranked probability skill score and relative operating characteristic score is very similar across the four configurations.
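    The basic setup, linear quantile models fitted against a deterministic forecast, plus a simple repair for quantile crossing, can be sketched generically. This is a minimal illustration, not the paper's implementation (the paper's non-crossing technique and NQT configuration differ); the synthetic heteroscedastic data and the sorting-based rearrangement are assumptions:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def pinball_loss(params, x, y, tau):
        """Pinball (quantile) loss for a linear model q_tau(x) = a + b*x."""
        resid = y - (params[0] + params[1] * x)
        return np.mean(np.where(resid >= 0, tau * resid, (tau - 1) * resid))

    # Synthetic pairs of deterministic forecast x and observation y (assumed),
    # with spread growing with the forecast value (heteroscedastic errors)
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 10, 500)
    y = 1.0 + 0.8 * x + rng.normal(0, 0.5 + 0.1 * x)

    taus = [0.05, 0.25, 0.5, 0.75, 0.95]
    coefs = [
        minimize(pinball_loss, x0=[0.0, 1.0], args=(x, y, tau),
                 method="Nelder-Mead").x
        for tau in taus
    ]

    # Predictive quantiles for a new forecast value; independently fitted
    # quantile lines can cross, so apply a monotone rearrangement (sorting)
    x_new = 8.0
    q_raw = np.array([a + b * x_new for a, b in coefs])
    q_fixed = np.sort(q_raw)
    ```

    Sorting the predicted quantiles is one generic way to enforce a non-decreasing predictive distribution; constrained joint estimation of the quantile models is another.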

    State updating of a distributed hydrological model with Ensemble Kalman Filtering: Effects of updating frequency and observation network density on forecast accuracy

    This paper presents a study on the optimal setup for discharge assimilation within a spatially distributed hydrological model. The Ensemble Kalman filter (EnKF) is employed to update the grid-based distributed states of such an hourly spatially distributed version of the HBV-96 model. By using a physically based model for the routing, the time delay and attenuation are modelled more realistically. The discharge and states at a given time step are assumed to be dependent on the previous time step only (Markov property). Synthetic and real world experiments are carried out for the Upper Ourthe (1600 km²), a relatively quickly responding catchment in the Belgian Ardennes. We assess the impact on the forecasted discharge of (1) various sets of the spatially distributed discharge gauges and (2) the filtering frequency. The results show that the hydrological forecast at the catchment outlet is improved by assimilating interior gauges. This augmentation of the observation vector improves the forecast more than increasing the updating frequency. In terms of the model states, the EnKF procedure is found to mainly change the pdfs of the two routing model storages, even when the uncertainty in the discharge simulations is smaller than the defined observation uncertainty.
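    A single stochastic EnKF analysis step of the kind used for such state updating can be sketched generically. The toy three-storage state vector and linear observation operator below are assumptions for illustration, not the HBV-96 setup from the study:

    ```python
    import numpy as np

    def enkf_update(ensemble, obs, obs_operator, obs_var, rng):
        """Stochastic EnKF analysis step for a scalar observation.

        ensemble     : (n_state, n_members) forecast state ensemble
        obs          : scalar observation (e.g. discharge at a gauge)
        obs_operator : (n_state,) linear map H from state to observation
        obs_var      : observation error variance
        """
        n_state, n_members = ensemble.shape
        Hx = obs_operator @ ensemble                 # predicted observations
        x_mean = ensemble.mean(axis=1, keepdims=True)
        Hx_mean = Hx.mean()
        # Sample cross-covariance between states and predicted observation
        P_xy = (ensemble - x_mean) @ (Hx - Hx_mean) / (n_members - 1)
        P_yy = np.var(Hx, ddof=1) + obs_var
        K = P_xy / P_yy                              # Kalman gain, (n_state,)
        # Perturbed observations keep the analysis ensemble spread consistent
        perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), n_members)
        return ensemble + np.outer(K, perturbed - Hx)

    rng = np.random.default_rng(2)
    states = rng.normal(10.0, 2.0, size=(3, 50))  # e.g. three storage states
    H = np.array([0.5, 0.3, 0.2])                 # assumed observation operator
    updated = enkf_update(states, obs=9.0, obs_operator=H, obs_var=0.25, rng=rng)
    ```

    Assimilating several interior gauges corresponds to stacking more rows into the observation operator, which is the "augmentation of the observation vector" the abstract refers to.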

    Attenuated live infectious bronchitis virus QX vaccine disseminates slowly to target organs distant from the site of inoculation

    Infectious bronchitis (IB) is a highly contagious respiratory disease of poultry, caused by the avian coronavirus infectious bronchitis virus (IBV). Currently, one of the most relevant genotypes circulating worldwide is IBV-QX (GI-19), for which vaccines have been developed by passaging virulent QX strains in embryonated chicken eggs. Here we explored the attenuated phenotype of a commercially available QX live vaccine, IB Primo QX, in specific-pathogen-free broilers. At hatch, birds were inoculated with QX vaccine or its virulent progenitor IBV-D388, and postmortem swabs and tissues were collected each day up to eight days post infection to assess viral replication and morphological changes. In the trachea, viral RNA replication and protein expression were comparable in both groups. Both viruses induced morphologically comparable lesions in the trachea, albeit with a short delay in the vaccinated birds. In contrast, in the kidney, QX vaccine viral RNA was nearly absent, which coincided with the lack of any morphological changes in this organ. This was in contrast to high viral RNA titers and abundant lesions in the kidney after IBV-D388 infection. Furthermore, QX vaccine showed reduced ability to reach and replicate in conjunctivae and intestines including cloaca, resulting in significantly lower titers and delayed protein expression, respectively. Nephropathogenic IBVs might reach the kidney also via an ascending route from the cloaca, based on our observation that viral RNA was detected in the cloaca one day before detection in the kidney. In the kidney, distal tubular segments, collecting ducts and the ureter were positive for viral antigen. Taken together, the attenuated phenotype of QX vaccine seems to rely on slower dissemination and lower replication in target tissues other than the site of inoculation.

    Estimating Regionalized Hydrological Impacts of Climate Change Over Europe by Performance-Based Weighting of CORDEX Projections

    Ensemble projections of future changes in discharge over Europe show large variation. Several methods for performance-based weighting exist that have the potential to increase the robustness of the change signal. Here we use future projections of an ensemble of three hydrological models forced with climate datasets from the Coordinated Downscaling Experiment - European Domain (EURO-CORDEX). The experiment is set up for nine river basins spread over Europe with different climate and catchment characteristics. We evaluate the ensemble consistency and apply two weighting approaches: the Climate model Weighting by Independence and Performance (ClimWIP), which focuses on meteorological variables, and the Reliability Ensemble Averaging (REA), in our study applied to discharge statistics per basin. For basins with a strong climate signal, in Southern and Northern Europe, the consistency in the set of projections is large. For rivers in Central Europe the differences between models become more pronounced. Both weighting approaches assign high weights to single General Circulation Models (GCMs). The ClimWIP method results in ensemble mean weighted changes that differ only slightly from the non-weighted mean. The REA method influences the weighted mean more, but the weights vary strongly from basin to basin. We see that high weights obtained through past good performance can provide deviating projections for the future. It is not apparent that the GCM signal dominates the overall change signal, i.e., there is no strong intra-GCM consistency. However, both weighting methods favored projections from the same GCM.
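    The flavor of REA-style weighting, combining a performance factor (historical bias) with a convergence factor (distance from the weighted ensemble mean), can be sketched schematically. This is a simplified caricature, not the study's actual criteria; the thresholds and the toy change/bias values are assumptions:

    ```python
    import numpy as np

    def rea_weights(changes, biases, eps_bias):
        """Simplified REA-style reliability weights for an ensemble.

        changes : projected change signal per ensemble member
        biases  : absolute historical bias of each member vs. observations
        eps_bias: natural-variability threshold; members with bias below it
                  get full performance credit (assumed single threshold here)
        """
        changes = np.asarray(changes, float)
        biases = np.asarray(biases, float)
        # Performance factor: penalize members whose bias exceeds the threshold
        r_b = np.minimum(1.0, eps_bias / np.maximum(biases, 1e-12))
        # Convergence factor: penalize members far from the weighted mean,
        # iterated toward a fixed point (a few iterations suffice here)
        w = r_b.copy()
        for _ in range(10):
            mean = np.average(changes, weights=w)
            dist = np.abs(changes - mean)
            r_d = np.minimum(1.0, eps_bias / np.maximum(dist, 1e-12))
            w = r_b * r_d
        return w / w.sum()

    changes = np.array([5.0, 8.0, 7.0, -2.0, 6.0])  # assumed % change per member
    biases  = np.array([1.0, 0.5, 2.0, 4.0, 0.8])   # assumed % historical bias
    w = rea_weights(changes, biases, eps_bias=1.0)
    weighted_change = np.average(changes, weights=w)
    ```

    Note how the member with both a large bias and an outlying change signal receives a very small weight: this double penalty is exactly why REA weights can vary strongly from basin to basin.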