
    From hydrological modelling to decision support

    Decision support for planning and management of water resources needs to consider many target criteria simultaneously, such as water availability, water quality, flood protection, agriculture and ecology. Hydrologic models provide information about the water balance components and are fundamental for the simulation of ecological processes. The objective of this contribution is to discuss the suitability of classical hydrologic models on the one hand and of complex eco-hydrologic models on the other for use as part of decision support systems. The discussion is based on results from two model comparison studies. It becomes clear that none of the hydrologic models tested fulfils all requirements optimally. Regarding the simulation of water quality parameters such as nitrogen leaching, a high uncertainty needs to be considered. For decision support, a hybrid metamodel approach is recommended, which comprises a hydrologic model and empirical relationships for the less dynamic processes, and which makes use of simulation results from complex eco-hydrologic models through second-order modelling at a generalized level.

    Stochastic precipitation modeling using circulation patterns to analyze climate impact on floods

    This paper presents the conditioning of a precipitation model on objectively classified circulation patterns (CPs). The application of CPs is considered useful with regard to improving model accuracy and to preparing a downscaling model by using CPs classified from climate model data. As this study aims to produce rainfall as input for derived flood frequency analyses, the validation focuses on extreme values and precipitation events. The analysis is carried out with modifications of a well-tested alternating renewal precipitation model.
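The alternating renewal idea conditioned on circulation patterns can be sketched as follows. This is a hedged illustration only: the CP labels and all parameter values are invented, and the study's actual spell and intensity distributions differ.

```python
import random

# Hypothetical sketch: an alternating renewal process in which spell
# lengths and mean intensity depend on the circulation pattern (CP).
# All CP labels and parameter values are invented for illustration.
CP_PARAMS = {
    "zonal":   {"mean_dry_h": 30.0, "mean_wet_h": 6.0, "mean_int": 1.2},
    "blocked": {"mean_dry_h": 60.0, "mean_wet_h": 3.0, "mean_int": 0.6},
}

def simulate_events(cp_sequence, rng):
    """Draw one dry spell and one wet event per CP label in cp_sequence."""
    events = []
    for cp in cp_sequence:
        p = CP_PARAMS[cp]
        dry = rng.expovariate(1.0 / p["mean_dry_h"])      # dry-spell length (h)
        wet = rng.expovariate(1.0 / p["mean_wet_h"])      # wet-spell length (h)
        intensity = rng.expovariate(1.0 / p["mean_int"])  # mean intensity (mm/h)
        events.append({"cp": cp, "dry_h": dry, "wet_h": wet,
                       "depth_mm": wet * intensity})
    return events

rng = random.Random(42)
events = simulate_events(["zonal"] * 500 + ["blocked"] * 500, rng)
```

With CP-dependent parameters, the simulated wet spells under the "zonal" label come out longer on average than under "blocked", which is the kind of structural difference the conditioning is meant to capture.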

    The value of weather radar data for the estimation of design storms - an analysis for the Hannover region

    Pure radar rainfall, station rainfall and radar–station merging products are analysed with regard to extreme rainfall frequencies for durations from 5 min to 6 h and return periods from 1 to 30 years. Partial duration series of the extremes are derived from the data and probability distributions are fitted. The performance of the design rainfall estimates is assessed based on cross-validations for observed station points, which are used as reference. For design rainfall estimation using the pure radar data, the pixel value at the station location is taken; for the merging products, spatial interpolation methods are applied. The results show that pure radar data are not suitable for the estimation of extremes. They usually lead to an overestimation compared to the observations, which is opposite to the usual behaviour of radar rainfall. The merging products between radar and station data, on the other hand, usually lead to an underestimation. They can only outperform the station observations for longer durations. The main obstacle to a good estimation of extremes appears to be the poor radar data quality.
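The partial-duration-series step can be illustrated with a minimal sketch: fitting an exponential excess model to hypothetical peaks over a threshold and evaluating return levels. The data values, threshold and distribution choice are assumptions for illustration, not the study's actual setup.

```python
import math

def fit_pds_exponential(exceedances, years, threshold):
    """Fit an exponential tail to a partial duration series (peaks over
    threshold) and return a function giving the T-year return level:

        x_T = threshold + beta * ln(lambda * T)

    where beta is the mean excess over the threshold and lambda the mean
    number of exceedances per year."""
    beta = sum(x - threshold for x in exceedances) / len(exceedances)
    lam = len(exceedances) / years
    return lambda T: threshold + beta * math.log(lam * T)

# Hypothetical 1-h rainfall peaks (mm) above a 10 mm threshold, 30 years of data
peaks = [10.5, 11.2, 12.0, 13.5, 10.8, 14.9, 11.7, 12.6, 16.1, 11.1,
         10.3, 12.9, 13.1, 11.5, 10.9]
return_level = fit_pds_exponential(peaks, years=30, threshold=10.0)
```

With 15 peaks in 30 years, lambda is 0.5, so the 2-year return level equals the threshold itself (ln(1) = 0), and return levels grow logarithmically with the return period.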

    Uncertainty estimation of regionalised depth–duration–frequency curves in Germany

    The estimation of rainfall depth–duration–frequency (DDF) curves is necessary for the design of several water systems and protection works. These curves are typically estimated from observed locations, but due to different sources of uncertainties, the risk may be underestimated. Therefore, it becomes crucial to quantify the uncertainty ranges of such curves. For this purpose, the propagation of different uncertainty sources in the regionalisation of the DDF curves for Germany is investigated. Annual extremes are extracted at each location for different durations (from 5 min up to 7 d), and local extreme value analysis is performed according to Koutsoyiannis et al. (1998). Following this analysis, five parameters are obtained for each station, from which four are interpolated using external drift kriging, while one is kept constant over the whole region. Finally, quantiles are derived for each location, duration and given return period. Through a non-parametric bootstrap and geostatistical spatial simulations, the uncertainty is estimated in terms of precision (width of 95 % confidence interval) and accuracy (expected error) for three different components of the regionalisation: (i) local estimation of parameters, (ii) variogram estimation and (iii) spatial estimation of parameters. First, two methods were tested for their suitability in generating multiple equiprobable spatial simulations: sequential Gaussian simulations (SGSs) and simulated annealing (SA) simulations. Between the two, SGS proved to be more accurate and was chosen for the uncertainty estimation from spatial simulations. Next, 100 realisations were run at each component of the regionalisation procedure to investigate their impact on the final regionalisation of parameters and DDF curves, and later combined simulations were performed to propagate the uncertainty from the main components to the final DDF curves. 
It was found that spatial estimation is the major uncertainty component in the chosen regionalisation procedure, followed by the local estimation of rainfall extremes. In particular, the variogram uncertainty had very little effect on the overall estimation of DDF curves. We conclude that the best way to estimate the total uncertainty is a combination of local resampling and spatial simulations, which resulted in more precise estimation at locations with long observation records and a decline in precision at unobserved locations according to the distance to and density of the observations in the vicinity. Through this combination, the total uncertainty was simulated by 10 000 runs in Germany, indicating that, depending on the location and duration level, tolerance ranges of ±10 %–30 % for low return periods (below 10 years) and of ±15 %–60 % for high return periods (above 10 years) should be expected, with very short durations (5 min) being more uncertain than long durations.
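Two ingredients described above lend themselves to a minimal sketch: a Koutsoyiannis-type DDF relation of the form i(d, T) = a(T) / (d + θ)^η, and a non-parametric bootstrap confidence interval for a quantile of annual maxima. The sample values, parameter values and the nearest-rank quantile rule below are assumptions for illustration, not the study's estimates.

```python
import random

def ddf_intensity(a_T, d, theta, eta):
    """Koutsoyiannis-type DDF relation: i(d, T) = a(T) / (d + theta)**eta,
    with duration d in hours; intensity decreases with duration."""
    return a_T / (d + theta) ** eta

def bootstrap_quantile_ci(annual_maxima, q, n_boot, rng):
    """Non-parametric bootstrap 95 % confidence interval for the empirical
    q-quantile (nearest-rank rule) of a sample of annual maxima."""
    n = len(annual_maxima)
    estimates = []
    for _ in range(n_boot):
        resample = sorted(rng.choices(annual_maxima, k=n))
        estimates.append(resample[min(n - 1, int(q * n))])
    estimates.sort()
    return estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot)]

rng = random.Random(7)
# Hypothetical annual maximum 1-h intensities (mm/h)
maxima = [12.1, 15.3, 9.8, 22.4, 17.0, 11.5, 14.2, 19.8, 13.3, 16.6,
          10.9, 18.4, 21.0, 12.7, 15.9, 9.5, 24.2, 13.9, 17.7, 11.1]
lo, hi = bootstrap_quantile_ci(maxima, q=0.9, n_boot=2000, rng=rng)
```

The width of the interval (hi − lo) is the precision measure referred to in the abstract; the study additionally propagates variogram and spatial-simulation uncertainty, which this local sketch omits.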

    Areal rainfall estimation using moving cars as rain gauges - A modelling study

    Optimal spatial assessment of short-time-step precipitation for hydrological modelling is still an important research question, considering the poor observation networks for high-time-resolution data. The main objective of this paper is to present a new approach for rainfall observation. The idea is to consider motorcars as moving rain gauges with windscreen wipers as sensors to detect precipitation. This idea is technically easy to realise if the cars are provided with GPS and a small memory chip for recording the coordinates, car speed and wiper frequency. This study explores the benefits of such an approach theoretically. For this, a valid relationship between wiper speed and rainfall rate, considering uncertainty, was assumed. A simple traffic model is applied to generate motorcars on roads in a river basin. Radar data are used as reference rainfall fields. Rainfall from these fields is sampled with a conventional rain gauge network and with several dynamic networks consisting of moving motorcars, using different assumptions such as accuracy levels for measurements and sensor equipment rates for the car networks. The observed point rainfall data from the different networks are then used to calculate areal rainfall for different scales. Ordinary kriging and indicator kriging are applied for interpolation of the point data, with the latter considering uncertain rainfall observation by cars, e.g. according to a discrete number of windscreen wiper operation classes. The results are compared with the values from the radar observations. The study is carried out for the 3300 km² Bode river basin located in the Harz Mountains in Northern Germany. The results show that the idea is theoretically feasible and motivate practical experiments. Only a small portion of the cars needs to be equipped with sensors for sufficient areal rainfall estimation. Regarding the required sensitivity of the potential rain sensors in cars, it could be shown that a few classes for rainfall observation are often enough for satisfactory areal rainfall estimation. The findings of the study also suggest revisiting the rain gauge network optimisation problem.
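The class-based observation and interpolation steps can be sketched as follows. The class boundaries are hypothetical, and inverse-distance weighting stands in for the ordinary and indicator kriging interpolators actually used in the study.

```python
# Hypothetical wiper-speed class boundaries (mm/h); the study's sensor
# classes differ.
CLASS_EDGES = [0.0, 0.5, 2.0, 8.0, 30.0]

def to_class_value(rain_mm_h):
    """Quantise a rain rate to its class midpoint, mimicking a car that
    reports only a discrete wiper level rather than an exact rate."""
    for lo, hi in zip(CLASS_EDGES, CLASS_EDGES[1:]):
        if rain_mm_h < hi:
            return 0.5 * (lo + hi)
    return CLASS_EDGES[-1]

def idw(points, target, power=2.0):
    """Inverse-distance weighting, a simple stand-in for the kriging
    interpolators used in the study. points: list of (x, y, value)."""
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Cars on a road report quantised rates; interpolate to an off-road point
cars = [(0.0, 0.0, to_class_value(1.0)), (4.0, 0.0, to_class_value(3.0)),
        (2.0, 3.0, to_class_value(0.2))]
areal_estimate = idw(cars, target=(2.0, 1.0))
```

The quantisation step is exactly where the study's finding applies: even with only a handful of classes, the interpolated field retains enough information for areal estimation.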

    A semi-parametric hourly space–time weather generator

    Long continuous time series of meteorological variables (i.e. rainfall, temperature and radiation) are required for applications such as derived flood frequency analyses. However, observed time series are generally too short, too sparse in space or incomplete, especially at the sub-daily time step. Stochastic weather generators overcome this problem by generating time series of arbitrary length. This study presents a major revision to an existing space–time hourly rainfall model based on a point alternating renewal process, now coupled to a k-NN resampling model for conditioned simulation of non-rainfall climate variables. The point-based rainfall model is extended into space by the resampling of simulated rainfall events via a simulated annealing optimisation approach. This approach enforces observed spatial dependency as described by three bivariate spatial rainfall criteria. A new non-sequential branched shuffling approach is introduced which allows the modelling of large station networks (N>50) with no significant loss in the spatial dependence structure. Modelling of non-rainfall climate variables, i.e. temperature, humidity and radiation, is achieved using a non-parametric k-nearest neighbour (k-NN) resampling approach, coupled to the space–time rainfall model via the daily catchment rainfall state. As input, a gridded daily observational dataset (HYRAS) was used. A final deterministic disaggregation step was then performed on all non-rainfall climate variables to achieve an hourly output temporal resolution. The proposed weather generator was tested on 400 catchments of varying size (50–20 000 km²) across Germany, comprising 699 sub-daily rainfall recording stations. Results indicate no major loss of model performance with increasing catchment size and a generally good reproduction of observed climate and rainfall statistics.
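The coupling of the non-rainfall variables to the rainfall model via the daily catchment rainfall state can be sketched as a k-NN resampling step. The history records and variable values below are invented for illustration; the study resamples full daily records from the HYRAS-based dataset.

```python
import random

def knn_resample(history, rain_state, k, rng):
    """Choose non-rainfall variables for a simulated day by resampling
    among the k historical days whose daily catchment rainfall state is
    closest to the simulated one. history: list of (state, variables)."""
    ranked = sorted(history, key=lambda rec: abs(rec[0] - rain_state))
    return rng.choice(ranked[:k])[1]

# Hypothetical history: (daily catchment rainfall state in mm, variables)
history = [
    (0.0,  {"temp_C": 22.0, "rad_W_m2": 250.0}),
    (1.5,  {"temp_C": 18.0, "rad_W_m2": 180.0}),
    (4.0,  {"temp_C": 15.0, "rad_W_m2": 120.0}),
    (12.0, {"temp_C": 12.0, "rad_W_m2": 80.0}),
]
rng = random.Random(3)
day_vars = knn_resample(history, rain_state=0.5, k=2, rng=rng)
```

Resampling whole days at once preserves the observed cross-correlations between temperature, humidity and radiation, which is the main appeal of the non-parametric approach.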

    Improving radar-based rainfall nowcasting by a nearest-neighbour approach – Part 1: Storm characteristics

    The nowcasting of rainfall storms at fine temporal and spatial resolutions is quite challenging due to the unpredictable nature of rainfall at such scales. Typically, rainfall storms are recognized by weather radar and extrapolated into the future by Lagrangian persistence. However, storm evolution is much more dynamic and complex than Lagrangian persistence assumes, leading to short forecast horizons, especially for convective events. Thus, the aim of this paper is to investigate the improvement that past similar storms can bring to the object-oriented radar-based nowcast. Here we propose a nearest-neighbour approach that first measures the similarity between the “to-be-nowcasted” storm and past observed storms and then uses the behaviour of the most similar past storms to issue either a single nowcast (by averaging the responses of the 4 most similar storms) or an ensemble nowcast (by considering the 30 most similar storm responses). Three questions are tackled here. (i) What features should be used to describe storms in order to check for similarity? (ii) How should similarity between past storms be measured? (iii) Is this similarity useful for the object-oriented nowcast? For this purpose, individual storms from 110 events in the period 2000–2018 recognized within the Hanover radar range (radius ∼115 km), Germany, are used as a basis for investigation. A “leave-one-event-out” cross-validation is employed to test the nearest-neighbour approach for the prediction of the area, mean intensity, the x and y velocity components, and the total lifetime of the to-be-nowcasted storm for lead times from +5 min up to +3 h. Prior to the application, two importance analysis methods (Pearson correlation and partial information correlation) are employed to identify the most important predictors.
The results indicate that most of the storms behave similarly, and the knowledge obtained from such similar past storms helps to better capture storm dissipation and improves the nowcast compared to Lagrangian persistence, especially for convective events (storms shorter than 3 h) and longer lead times (from 1 to 3 h). The main advantage of the nearest-neighbour approach is seen when applied in a probabilistic way (with the 30 closest neighbours as ensembles) rather than in a deterministic way (averaging the responses of the four closest neighbours). The probabilistic approach seems promising, especially for convective storms, and it can be further improved by increasing the sample size, employing more suitable methods for predictor identification, or selecting physical predictors.
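The deterministic variant — averaging the responses of the 4 most similar past storms — can be sketched as follows. The feature vectors and response values are invented, and Euclidean distance on normalised features is one plausible choice of similarity measure among those the paper investigates.

```python
def nn_nowcast(target_features, past_storms, k=4):
    """Deterministic nowcast of one storm property as the mean response
    of the k most similar past storms (Euclidean distance in a
    normalised feature space). past_storms: dicts with 'features'
    and 'response'."""
    def dist(storm):
        return sum((a - b) ** 2
                   for a, b in zip(storm["features"], target_features)) ** 0.5
    nearest = sorted(past_storms, key=dist)[:k]
    return sum(s["response"] for s in nearest) / len(nearest)

# Features could be e.g. (area, mean intensity, lifetime so far), normalised;
# the response could be the remaining lifetime in minutes. Values invented.
past = [
    {"features": (0.10, 0.20, 0.30), "response": 10.0},
    {"features": (0.12, 0.18, 0.33), "response": 12.0},
    {"features": (0.15, 0.25, 0.28), "response": 14.0},
    {"features": (0.09, 0.22, 0.35), "response": 16.0},
    {"features": (0.90, 0.95, 0.80), "response": 100.0},  # dissimilar storm
]
prediction = nn_nowcast((0.11, 0.21, 0.31), past, k=4)
```

The ensemble variant simply returns the 30 nearest responses as-is instead of averaging, which is what gives the probabilistic nowcast its spread.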

    Evaluation of different calibration strategies for large scale continuous hydrological modelling

    For the analysis of climate impact on flood flows and flood frequency in macroscale river basins, hydrological models can be forced by several sets of hourly long-term climate time series. Considering the large number of model units, the small time step and the required recalibrations for different model forcings, an efficient calibration strategy and optimisation algorithm are essential. This study investigates the impact of different calibration strategies and different optimisation algorithms on the performance and robustness of a semi-distributed model. The different calibration strategies were (a) Lumped, (b) 1-Factor, (c) Distributed and (d) Regionalisation. The latter uses catchment characteristics and estimates parameter values via transfer functions. These methods were applied in combination with three different optimisation algorithms: PEST, DDS, and SCE. In addition to the standard temporal evaluation of the calibration strategies, a spatial evaluation was applied. This was done by transferring the parameters from calibrated catchments to uncalibrated ones and validating the model performance for these uncalibrated catchments. The study was carried out for five sub-catchments of the Aller-Leine River Basin in Northern Germany. The best result for temporal evaluation was achieved by the combination of the DDS optimisation with the Distributed strategy. The Regionalisation method obtained the weakest performance for temporal evaluation. However, for spatial evaluation the Regionalisation indicated more robust models, closely followed by the Lumped method. The 1-Factor and the Distributed strategies showed clear disadvantages regarding spatial parameter transferability. For parameter estimation based on catchment descriptors, as required for ungauged basins, the Regionalisation strategy seems to be a promising tool, particularly in climate impact analysis and for hydrological modelling in general.
    Funding: Ministry for Science and Culture of Lower Saxony
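Of the optimisation algorithms named, DDS (Dynamically Dimensioned Search) lends itself to a compact sketch. The version below is a minimal greedy implementation on a toy objective; note that it clamps out-of-bounds candidates, whereas the published algorithm reflects them, and the objective stands in for an actual model-performance criterion.

```python
import math
import random

def dds(objective, bounds, max_iter, r=0.2, seed=1):
    """Minimal Dynamically Dimensioned Search sketch. Greedy acceptance:
    a candidate replaces the current best only if it improves the
    objective. The probability of perturbing each dimension shrinks as
    iterations proceed, narrowing the search from global to local."""
    rng = random.Random(seed)
    best_x = [lo + rng.random() * (hi - lo) for lo, hi in bounds]
    best_f = objective(best_x)
    for i in range(1, max_iter + 1):
        p = 1.0 - math.log(i) / math.log(max_iter)  # shrinking probability
        dims = [d for d in range(len(bounds)) if rng.random() < p]
        if not dims:
            dims = [rng.randrange(len(bounds))]     # always perturb something
        cand = best_x[:]
        for d in dims:
            lo, hi = bounds[d]
            # Gaussian perturbation scaled to the parameter range; clamped
            # here for simplicity (the original algorithm reflects).
            cand[d] = min(hi, max(lo, cand[d] + rng.gauss(0.0, r * (hi - lo))))
        f = objective(cand)
        if f < best_f:
            best_x, best_f = cand, f
    return best_x, best_f

# Toy objective standing in for a model-performance criterion
sphere = lambda x: sum((xi - 3.0) ** 2 for xi in x)
x_opt, f_opt = dds(sphere, bounds=[(0.0, 10.0)] * 2, max_iter=500)
```

The single tuning parameter r and the fixed evaluation budget are what make DDS attractive for expensive, many-parameter hydrological calibrations.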

    Influence of spatial interpolation methods for climate variables on the simulation of discharge and nitrate fate with SWAT

    For ecohydrological modeling, climate variables are needed on a subbasin basis. Since they usually originate from point measurements, spatial interpolation is required during preprocessing. Different interpolation methods yield data of varying quality, which can strongly influence modeling results. Four interpolation methods were selected for comparison: nearest neighbour, inverse distance, ordinary kriging, and kriging with external drift (Goovaerts, 1997). This study presents three strategies to evaluate the influence of the interpolation method on the modeling results for discharge and in-stream nitrate load in a mesoscale river catchment (∼1000 km²) using the Soil and Water Assessment Tool (SWAT, Neitsch et al., 2005) model: I. automated calibration of the model with a mixed climate data set and consecutive application of the four interpolated data sets; II. consecutive automated calibration of the model with each of the four climate data sets; III. random generation of 1000 model parameter sets and consecutive application of the four interpolated climate data sets to each of the 1000 realisations, evaluating the number of realisations above a certain quality criterion threshold. Results show that strategies I and II are not suitable for evaluating the quality of the interpolated data. Strategy III, however, proves a significant influence of the interpolation method on nitrate modeling. A rank order from the simplest to the most sophisticated method is visible, with kriging with external drift (KED) outperforming all others. Responsible for this behaviour is the variable temperature, which benefits most from the more sophisticated methods and at the same time is the main driving force for the nitrate cycle. The lack of influence of the interpolation methods on discharge modeling is explained by the much higher measuring network density for precipitation than for all other climate variables.
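The essence of kriging with external drift — removing a trend explained by an auxiliary variable (here elevation, relevant for temperature) before spatial interpolation — can be sketched with a toy stand-in that interpolates the detrended residuals by inverse distance rather than by kriging. Station coordinates and temperatures are invented for illustration.

```python
def ked_like(points, target):
    """Toy stand-in for kriging with external drift (KED): fit a linear
    temperature-elevation trend across the stations, interpolate the
    residuals by inverse-distance weighting, and add the trend back at
    the target. points: (x, y, elevation_m, temp_C); target likewise
    without the temperature."""
    xs = [p[2] for p in points]          # elevations (the external drift)
    ys = [p[3] for p in points]          # temperatures
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    num = den = 0.0
    for x, y, elev, temp in points:
        resid = temp - (slope * elev + intercept)
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return temp
        num += resid / d2
        den += 1.0 / d2
    return slope * target[2] + intercept + num / den

# Hypothetical stations following a -6.5 K/km lapse rate exactly
stations = [(0.0, 0.0, 100.0, 19.35), (1.0, 0.0, 500.0, 16.75),
            (0.0, 1.0, 900.0, 14.15), (1.0, 1.0, 300.0, 18.05)]
t_est = ked_like(stations, target=(0.5, 0.5, 700.0))
```

Because the drift term carries the elevation signal to the target location, methods of this family can outperform purely distance-based interpolators for temperature, which is the behaviour the study observed.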