
    Comparison of rainfall-runoff models for flood forecasting. Part 2: Calibration and evaluation of models

    The purpose of the project “Comparison of Rainfall-Runoff Models for Flood Forecasting” is to provide guidance to the Environment Agency on the choice of rainfall-runoff model for use in different catchments for flood forecasting purposes. A literature review of models presented in the Part 1 Report recognised that, whilst there is a plethora of “brand-name” models, there is much similarity between many of them: a rather small set of model functions is common to many models, and they differ in the detail of their configuration. Eight models were selected for a more detailed assessment of performance using data from nine catchments of varied character spread throughout the regions of the Agency. The results are reported in this Part 2 Report along with conclusions and recommendations. The chosen models encompass those used operationally by the EA together with one overseas model and a simple distributed model previously developed for the Agency. Four of the models are lumped, conceptual models with continuous water accounting procedures: the Thames Catchment Model (TCM), the Midlands Catchment Runoff Model (MCRM), the Probability Distributed Moisture model (PDM), and the US National Weather Service Sacramento model (NWS). A fifth model, the Isolated Event Model (IEM), is an event model modified to operate continuously in real-time. Water balance principles are used for soil moisture accounting and water storage routing, but an empirical function links the two components, controlling runoff production as a function of soil moisture. The sixth model is a simple Transfer Function (TF) model, whilst the seventh is a constrained form of TF model referred to as the Physically Realisable Transfer Function (PRTF). The TF types of model are black-box models which empirically relate rainfall and flow; they can be related to unit hydrographs and can be given a conceptual interpretation as forms of routing function.
The last model, the Grid Model, is included as a simple form of distributed conceptual rainfall-runoff model suitable for use in flood forecasting and able to use weather radar estimates of rainfall in grid form. Each model is associated with an updating procedure whereby recent measurements of flow are incorporated into the model so as to improve forecast performance in real-time. The assessment strategy is based on first calibrating all models in “simulation-mode”, where each model is used to transform rainfall (and potential evaporation) to runoff without using flow to update the model forecast. Each model is then evaluated using periods of data not used for calibration. This simulation-mode evaluation serves to focus on the process model capabilities of a given model. Subsequently, each model is evaluated in “forecast-mode”, in which an updating scheme is used to incorporate measurements of flow up to the “forecast time-origin”. This emulates the forecast performance expected operationally at different forecast lead times. Perfect foreknowledge of rainfall is assumed so as not to confound the model assessment with uncertainties in rainfall forecasts. The statistics used for model assessment are R2 and a Threshold CSI (Critical Success Index). The R2 statistic, giving the proportion of the variability in the flow accounted for by the model forecasts, is used to provide a broad guide to model performance. The Threshold CSI statistic is used to judge the efficacy of a model in correctly forecasting the exceedance of a set of flow thresholds, particularly relevant to the use of a forecast to trigger an alert level of a given severity. Forecasts are also judged more informally via hydrograph plots and scatter plots of observed and forecast flood peaks.
Whilst the main assessment relates to the use of raingauge estimates of rainfall as input to the models, for three of the catchments the assessment extends to the use of weather radar, both in raw form and as raingauge-calibrated radar estimates of rainfall. Whilst forecast accuracy is the focus of the model assessment, other issues are taken into consideration, including ease of model configuration, initialisation and calibration. The form of model assessment used, employing long continuous records typically eight months in duration at a fixed 15-minute time-step, has meant that it has been difficult to emulate the operational performance of TF and PRTF models. These models are used by the Agency in “event mode” and commonly operate on baseflow-separated runoff, where baseflow is taken as the flow at the start of the event. The opportunity exists to manually adjust the model parameters affecting the volume, shape and timing of the forecast as the flood develops. Also, the model time-step and model order are commonly chosen with regard to the response characteristics of the catchment. The results reported here relate to TF/PRTF models without baseflow separation, using an automated method of model parameter adjustment and using a fixed model time-step and model order. The approach most resembles that used on the River Medway to support the operation of the Leigh Flood Barrier, and the method of automated model gain adjustment has also been used in Anglian Region. The results relating to TF/PRTF models should be interpreted against this background. Overall, the results suggest that no one model consistently out-performs all others across all catchments. The TCM is one of the best performing models when judged using the R2 statistic, whilst the PDM is more successful according to the Threshold CSI criterion (relevant to the issuing of flood alerts) and in forecasting flood peaks.
Whilst the TCM is the most complex model and can be a challenge to calibrate, the PDM is of intermediate complexity. For a simple model, the IEM is surprisingly successful, particularly in terms of Threshold CSI. The simplest models, the TF and PRTF, are easiest to calibrate and initialise and can provide acceptable forecasts for some catchments. For smaller catchments in particular, TF models compare favourably with other models when used with error prediction rather than state updating. The MCRM proved sensitive to initial conditions of soil moisture but can work well on small-to-medium sized catchments. The NWS model, despite its large number of parameters, proved easy to calibrate using automatic optimisation and provided reasonable performance. Use of radar data gave results as good as, and sometimes slightly better than, raingauge data alone, provided the radar was functioning well; raingauge calibration of the radar generally helped. The Grid Model was the only distributed model assessed and can utilise radar data in grid form. For the three catchments on which it was evaluated in simulation-mode it consistently gave the second best model simulations in terms of R2 but did less well according to the Threshold CSI criterion. Operationally, of those assessed, the TCM, PDM and IEM appear to be the most appropriate flood forecasting models to use, the choice depending on the complexity of catchment response, whilst all models have value in the right situation. The advantage of model familiarity acquired through past use is employed to guide more specific recommendations for each EA region. It is recommended that more automated applications of TF/PRTF models be pursued which accommodate the effects of catchment wetness on runoff production through effective rainfall transformations and incorporate baseflow via parallel “fast” and “slow” transfer function routing components.
Opportunities for further research on model formulation and configuration, updating schemes, and catchment-scale rainfall estimation are identified.
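
The two assessment statistics described above have standard definitions that can be made concrete. Below is a minimal Python sketch, assuming the conventional Nash-Sutcliffe form of the R2 statistic and the usual contingency-table form of the Critical Success Index; the report's exact conventions (e.g. how scores are stratified by lead time) may differ, and the function names are illustrative only.

```python
import numpy as np

def threshold_csi(observed, forecast, threshold):
    """Critical Success Index for exceedance of a single flow threshold:
    CSI = hits / (hits + misses + false alarms), where a hit is a time
    step on which both observed and forecast flows exceed the threshold."""
    obs_up = np.asarray(observed) > threshold
    fc_up = np.asarray(forecast) > threshold
    hits = np.sum(obs_up & fc_up)
    misses = np.sum(obs_up & ~fc_up)
    false_alarms = np.sum(~obs_up & fc_up)
    denom = hits + misses + false_alarms
    return hits / denom if denom > 0 else np.nan

def r_squared(observed, forecast):
    """Nash-Sutcliffe form of R2: the proportion of the variability in
    the observed flow accounted for by the model forecasts."""
    obs = np.asarray(observed, dtype=float)
    fc = np.asarray(forecast, dtype=float)
    return 1.0 - np.sum((obs - fc) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

Under these definitions the CSI reaches 1 only when every observed exceedance of the threshold is forecast and no false alarms are issued, which is why it suits the alert-triggering use case better than a variance-based score.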

    Nucleic acid vibrational circular dichroism, absorption, and linear dichroism spectra. I. A DeVoe theory approach

    Infrared (IR) vibrational circular dichroism (VCD), absorption, and linear dichroism (LD) spectra of four homopolyribonucleotides, poly(rA), poly(rG), poly(rC), and poly(rU), have been calculated, in the 1750–1550 cm-1 spectral region, using the DeVoe polarizability theory. A newly derived algorithm, which approximates the Hilbert transform relating the imaginary and real parts, was used in the calculations to obtain the real parts of the oscillator polarizabilities associated with each normal mode. The calculated spectra of the polynucleotides were compared with previously measured solution spectra. The good agreement between calculated and measured polynucleotide spectra indicates, for the first time, that the DeVoe theory is a useful means of calculating the VCD and IR absorption spectra of polynucleotides. Calculated DeVoe theory VCD and IR absorption spectra of oriented polynucleotides are also presented for the first time. The calculated VCD spectra for the oriented polynucleotides serve as predictions for such measurements made in the future. The calculated IR spectra for the oriented polynucleotides are useful in interpreting the linear dichroism of the polynucleotides.
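
The Hilbert-transform step has a standard numerical analogue: the real part of a polarizability follows from its imaginary (absorptive) part through a Kramers-Kronig relation. The sketch below is a crude discretised principal-value sum, not the authors' newly derived algorithm; the function name, grid, and quadrature are illustrative assumptions only.

```python
import numpy as np

def kramers_kronig_real(freq, imag_part):
    """Estimate the real part of a polarizability from its imaginary
    part via a discretised Kramers-Kronig (Hilbert-transform) relation:
    Re a(nu) = (2/pi) P-int nu' Im a(nu') / (nu'^2 - nu^2) dnu'."""
    freq = np.asarray(freq, dtype=float)
    imag_part = np.asarray(imag_part, dtype=float)
    dnu = np.gradient(freq)           # grid spacing (handles nonuniform grids)
    real_part = np.empty_like(imag_part)
    for i, nu in enumerate(freq):
        mask = np.ones(freq.size, dtype=bool)
        mask[i] = False               # crude principal-value treatment: skip the pole
        integrand = freq[mask] * imag_part[mask] / (freq[mask] ** 2 - nu ** 2)
        real_part[i] = (2.0 / np.pi) * np.sum(integrand * dnu[mask])
    return real_part
```

For an absorption band centred inside the grid, this yields the expected dispersive shape: positive real part below the band and negative above it. A production algorithm would treat the singular point more carefully than simply omitting it.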

    Geometric Aspects of D-branes and T-duality

    We explore the differential geometry of T-duality and D-branes. Because D-branes and RR-fields are properly described via K-theory, we discuss the (differential) K-theoretic generalization of T-duality and its application to the coupling of D-branes to RR-fields. This leads to a puzzle involving the transformation of the A-roof genera in the coupling. Comment: 26 pages, JHEP format, uses dcpic.sty; v2: references added; v3: minor changes

    Numerical studies of the two- and three-dimensional gauge glass at low temperature

    We present results from Monte Carlo simulations of the two- and three-dimensional gauge glass at low temperature using the parallel tempering Monte Carlo method. Our results in two dimensions strongly support the transition being at T_c = 0. A finite-size scaling analysis, which works well only for the larger sizes and lower temperatures, gives the stiffness exponent theta = -0.39 +/- 0.03. In three dimensions we find theta = 0.27 +/- 0.01, compatible with recent results from domain wall renormalization group studies. Comment: 7 pages, 10 figures, submitted to PR
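
The stiffness exponent quoted above is defined through the finite-size scaling of the defect (domain-wall) energy, Delta E(L) ~ L^theta, so a negative theta (as found here in two dimensions) means boundary-condition twists cost vanishing energy for large L and order cannot survive at finite temperature. A minimal sketch of extracting theta by a least-squares fit in log-log space, with purely illustrative data (not the paper's):

```python
import numpy as np

def stiffness_exponent(sizes, defect_energies):
    """Fit the finite-size scaling law Delta E(L) ~ L**theta by linear
    least squares on log Delta E versus log L; returns the slope theta."""
    logL = np.log(np.asarray(sizes, dtype=float))
    logE = np.log(np.asarray(defect_energies, dtype=float))
    theta, _intercept = np.polyfit(logL, logE, 1)
    return theta
```

In practice one would fit only the larger sizes (as the abstract notes, the scaling form holds there) and propagate the Monte Carlo error bars into the fit.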

    The Elliptic curves in gauge theory, string theory, and cohomology

    Elliptic curves play a natural and important role in elliptic cohomology. In earlier work with I. Kriz, these elliptic curves were interpreted physically in two ways: as corresponding to the intersection of M2 and M5 in the context of (the reduction of M-theory to) type IIA, and as the elliptic fiber leading to F-theory for type IIB. In this paper we elaborate on the physical setting for various generalized cohomology theories, including elliptic cohomology, and we note that the above two seemingly unrelated descriptions can be unified using Sen's picture of the orientifold limit of F-theory compactification on K3, which unifies the Seiberg-Witten curve with the F-theory curve, and through which we naturally explain the constancy of the modulus that emerges from elliptic cohomology. This also clarifies the orbifolding performed in the previous work and justifies the appearance of the w_4 condition in the elliptic refinement of the mod 2 part of the partition function. We comment on the cohomology theory needed for the case when the modular parameter varies in the base of the elliptic fibration. Comment: 23 pages, typos corrected, minor clarification

    Duality symmetry and the form fields of M-theory

    In previous work we derived the topological terms in the M-theory action in terms of certain characters that we defined. In this paper, we propose the extension of these characters to include the dual fields. The unified treatment of the M-theory four-form field strength and its dual leads to several observations. In particular we elaborate on the possibility of a twisted cohomology theory with a twist given by degrees greater than three. Comment: 12 pages, modified material on the differentia

    Too Afraid to Learn: Attitudes towards Statistics as a Barrier to Learning Statistics and to Acquiring Quantitative Skills

    Quantitative skills are important for studying and understanding social reality. Political science students, however, experience difficulties in acquiring and retaining such skills. Fear of statistics has often been listed among the major causes of this problem. This study aims at understanding the underlying factors behind this anxiety and proposes a potential remedy. More specifically, we advocate the integration of quantitative material into non-methodological courses. After assessing the influence of dispositional, course-related and person-related factors on the attitudes towards statistics among political science students, we provide insights into the relation between these attitudes on the one hand and the learning and retention of statistics skills on the other. Our results indicate that a curriculum-wide approach to normalise the use of quantitative methods can foster not only interest in statistics but also retention of the acquired skills.

    Twisted K-Theory of Lie Groups

    I determine the twisted K-theory of all compact simply connected simple Lie groups. The computation reduces via the Freed-Hopkins-Teleman theorem to the CFT prescription, and thus explains why it gives the correct result. Finally, I analyze the exceptions noted by Bouwknegt et al. Comment: 16 pages

    Nature of the Spin-glass State in the Three-dimensional Gauge Glass

    We present results from simulations of the gauge glass model in three dimensions using the parallel tempering Monte Carlo technique. Critical fluctuations should not affect the data, since we equilibrate down to low temperatures for moderate sizes. Our results are qualitatively consistent with earlier work on the three- and four-dimensional Edwards-Anderson Ising spin glass. We find that large-scale excitations cost only a finite amount of energy in the thermodynamic limit, and that those excitations have a surface whose fractal dimension is less than the space dimension, consistent with a scenario proposed by Krzakala and Martin, and Palassini and Young. Comment: 5 pages, 7 figures