
    Analysis of rainfall variability in the Logone catchment, Lake Chad basin

    The socio-economic consequences posed by climate change in Africa are giving increasing emphasis to the need for trend analysis and detection of changes in hydro-climatic variables in data-deficient areas. This study analyzes rainfall data from seventeen rain gauges unevenly distributed across the Logone catchment in the Lake Chad basin (LCB) over a fifty-year period (1951-2000). After quality control of the rainfall data using homogeneity tests, the non-parametric Mann-Kendall (MK) and Spearman's rho tests were applied to detect the presence of trends. Trend magnitude was calculated using Sen's slope estimator. Results of the homogeneity test showed that rainfall was homogeneous across the catchment. Trend analysis revealed negative trends for annual rainfall at all the stations. Long-term trend analysis at a monthly time scale revealed statistically insignificant positive trends at 32% of the stations. Spatially, the analysis showed a clear distinction in rainfall magnitude between the semi-arid and Sudano zones. The slope of the trend lines for annual rainfall averaged over the respective zones was steeper in the semi-arid zone (-4.37) than in the Sudano zone (-4.02). However, the station with the greatest reduction in annual rainfall (-8.06 mm) was located in the Sudano zone.
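
    The trend tests named above have compact standard forms; the sketch below (Python, written here for illustration rather than taken from the study) computes the Mann-Kendall statistic and Sen's slope for a single station's annual rainfall series. The input file name and variable names are placeholders.

    # Minimal sketch, not the authors' code: Mann-Kendall trend test and Sen's slope
    # for one annual rainfall series. Requires numpy and scipy.
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Return the MK statistic S, the normalized Z score and a two-sided p-value."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # S counts concordant minus discordant pairs
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        # Variance of S without tie correction (adequate for continuous rainfall totals)
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        p = 2 * (1 - norm.cdf(abs(z)))
        return s, z, p

    def sens_slope(x):
        """Sen's slope: median of all pairwise slopes (mm per year for annual totals)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
        return float(np.median(slopes))

    # Hypothetical usage with a 50-year annual rainfall series (values in mm):
    # annual_rain = np.loadtxt("station_annual_rainfall.txt")
    # s, z, p = mann_kendall(annual_rain)
    # slope = sens_slope(annual_rain)   # a negative slope indicates a declining trend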

    Evaluating global reanalysis datasets as input for hydrological modelling in the Sudano-Sahel region

    This paper investigates the potential of using global reanalysis datasets as input for hydrological modelling in the data-scarce Sudano-Sahel region. To achieve this, we used two global atmospheric reanalysis datasets (the Climate Forecast System Reanalysis (CFSR) and the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim) and one global meteorological forcing dataset, the WATCH Forcing Data methodology applied to ERA-Interim (WFDEI). These datasets were used to drive the Soil and Water Assessment Tool (SWAT) in the Logone catchment in the Lake Chad basin. Model performance indicators after calibration showed that, at daily and monthly time steps, only WFDEI produced Nash-Sutcliffe Efficiency (NSE) and coefficient of determination (R2) values above 0.50. Despite a general underperformance compared to WFDEI, CFSR performed better than ERA-Interim. Model uncertainty analysis after calibration showed that more than 60% of all daily and monthly observed streamflow values at all hydrometric stations were bracketed within the 95 percent prediction uncertainty (95PPU) range for all datasets. Results from this study also show significant differences in simulated actual evapotranspiration estimates from the datasets. Overall, the bias-corrected WFDEI outperformed the two reanalysis datasets, while CFSR performed better than ERA-Interim. We conclude that, in the absence of gauged hydro-meteorological data, WFDEI and CFSR could be used for hydrological modelling in data-scarce areas such as the Sudano-Sahel region.
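
    For reference, the performance indicators cited above can be computed as follows. This is a minimal illustrative sketch with assumed variable names (q_obs, q_sim, ppu_low, ppu_high), not the calibration workflow used in the study.

    # Minimal sketch: NSE, R2 and the fraction of observations inside the 95PPU band.
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe Efficiency: 1 is a perfect fit; values above 0.5 count as acceptable above."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def r_squared(obs, sim):
        """Coefficient of determination (squared Pearson correlation)."""
        return float(np.corrcoef(obs, sim)[0, 1] ** 2)

    def p_factor(obs, lower, upper):
        """Fraction of observed values bracketed by the 95PPU band (lower/upper bounds)."""
        obs, lower, upper = map(np.asarray, (obs, lower, upper))
        return float(np.mean((obs >= lower) & (obs <= upper)))

    # Hypothetical usage with streamflow simulated from WFDEI, CFSR or ERA-Interim forcing:
    # print(nse(q_obs, q_sim), r_squared(q_obs, q_sim), p_factor(q_obs, ppu_low, ppu_high))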

    Evaluating global reanalysis precipitation datasets with rain gauge measurements in the Sudano-Sahel region: case study of the Logone catchment, Lake Chad Basin

    Africa has a paucity of long-term, reliable meteorological ground station data, and reanalysis products are used to provide the climate estimates that are important for climate change projections. This paper uses monthly observed precipitation records in the Logone catchment of the Lake Chad Basin (LCB) to evaluate the performance of two global reanalysis products: the Climate Forecast System Reanalysis (CFSR) and ERA-Interim datasets. The two reanalysis products reproduced the monthly, annual and decadal cycle of precipitation and variability relatively accurately, albeit with some discrepancies. The catchment rainfall gradient was also well captured by the two products. There are good correlations between the reanalysis and rain gauge datasets, though significant deviations exist, especially for CFSR. Both reanalysis products overestimated rainfall at 68% of the rain gauge stations. ERA-Interim produced the lowest bias and mean absolute error (MAE), with average values of 2% and 6.5 mm/month respectively, compared to 15% and 34 mm/month for CFSR. However, both reanalysis products systematically underestimated annual rainfall in the catchment during the periods 1997-2002 for ERA-Interim and 1998-2000 for CFSR. This research demonstrates that evaluating reanalysis products in remote areas like the Logone catchment enables users to identify artefacts inherent in reanalysis datasets. This will facilitate improvements in certain aspects of the reanalysis forecast model physics and parametrisation to improve reanalysis dataset quality. Our study concludes that the application of each reanalysis product in the catchment will depend on the purpose for which it is to be used and the spatial scale required.
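
    A minimal sketch of the two evaluation metrics reported above (percentage bias and MAE), assuming paired monthly gauge and reanalysis series; the variable names are placeholders and this is not the paper's own code.

    # Minimal sketch: percentage bias and mean absolute error between co-located
    # monthly reanalysis and rain gauge precipitation series.
    import numpy as np

    def percent_bias(gauge, reanalysis):
        """Positive values mean the reanalysis overestimates gauge rainfall."""
        gauge, reanalysis = np.asarray(gauge, float), np.asarray(reanalysis, float)
        return 100.0 * np.sum(reanalysis - gauge) / np.sum(gauge)

    def mae(gauge, reanalysis):
        """Mean absolute error in mm/month."""
        gauge, reanalysis = np.asarray(gauge, float), np.asarray(reanalysis, float)
        return float(np.mean(np.abs(reanalysis - gauge)))

    # Hypothetical usage with monthly series at one station:
    # print(percent_bias(p_gauge, p_era),  mae(p_gauge, p_era))   # ERA-Interim vs gauge
    # print(percent_bias(p_gauge, p_cfsr), mae(p_gauge, p_cfsr))  # CFSR vs gauge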

    Effect of single and multi-site calibration techniques on hydrological model performance, parameter estimation and predictive uncertainty: a case study in the Logone catchment, Lake Chad basin

    Understanding hydrological processes at catchment scale through the use of hydrological model parameters is essential for enhancing water resource management. Given the difficulty of using lumped parameters to calibrate distributed hydrological models in spatially heterogeneous catchments, a multiple-calibration approach was adopted to enhance model calibration in this study. Different calibration techniques were used to calibrate the Soil and Water Assessment Tool (SWAT) model at different locations along the Logone river channel: single-site calibration (SSC), sequential calibration (SC), and simultaneous multi-site calibration (SMSC). Results indicate that it is possible to reveal differences in hydrological behavior between the upstream and downstream parts of the catchment using different parameter values. Using all calibration techniques, model performance indicators were mostly above the minimum thresholds of 0.60 and 0.65 for Nash-Sutcliffe Efficiency (NSE) and coefficient of determination (R2) respectively, at both daily and monthly time-steps. Model uncertainty analysis showed that more than 60% of observed streamflow values were bracketed within the 95% prediction uncertainty (95PPU) band after calibration and validation. Furthermore, results indicated that the SC technique outperformed the other two methods (SSC and SMSC). It was also observed that although the SMSC technique uses streamflow data from all gauging stations during calibration and validation, thereby taking into account the catchment's spatial variability, the choice of calibration method will depend on the application and the spatial scale at which the modelling results are implemented in the catchment.
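
    One common way to pose simultaneous multi-site calibration is to maximize an aggregate goodness-of-fit over all gauging stations; the sketch below illustrates that idea with a weighted mean NSE. This is an assumption made for illustration, not the calibration procedure used in the study.

    # Minimal sketch: a multi-site objective that an optimizer could maximize while
    # proposing SWAT parameter sets, versus optimizing one station at a time (SSC).
    import numpy as np

    def nse(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def multi_site_objective(obs_by_station, sim_by_station, weights=None):
        """Weighted mean NSE over all gauging stations; maximizing it calibrates all sites at once."""
        stations = sorted(obs_by_station)
        scores = np.array([nse(obs_by_station[s], sim_by_station[s]) for s in stations])
        if weights is None:
            weights = np.ones(len(stations)) / len(stations)
        return float(np.dot(weights, scores))

    # Hypothetical usage: for each candidate parameter set, run the model, collect
    # simulated series per station, and keep the set with the highest objective value.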

    Quantum phases in entropic dynamics

    In the Entropic Dynamics framework the dynamics is driven by maximizing entropy subject to appropriate constraints. In this work we bring Entropic Dynamics one step closer to full equivalence with quantum theory by identifying constraints that lead to wave functions that remain single-valued even for multi-valued phases, recognizing the intimate relation between quantum phases, gauge symmetry, and charge quantization. Comment: Presented at MaxEnt 2017, the 37th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (July 9-14, 2017, Jarinu, Brazil).
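
    For context, the single-valuedness requirement mentioned above can be stated in the standard textbook form (background only, not a reproduction of the paper's constraints): writing the wave function in polar form, the phase may be multi-valued provided its change around any closed loop is an integer multiple of 2*pi.

    % Single-valuedness of \Psi despite a multi-valued phase \phi:
    \Psi = \rho^{1/2}\, e^{i\phi}, \qquad
    \oint_{\Gamma} \nabla\phi \cdot d\boldsymbol{\ell} = 2\pi n, \quad n \in \mathbb{Z}
    % The integer winding condition is what ties phases to quantization.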

    A scheduling theory framework for GPU tasks efficient execution

    Concurrent execution of tasks in GPUs can reduce the computation time of a workload by overlapping data transfer and execution commands. However, it is difficult to implement an efficient run-time scheduler that minimizes the workload makespan, as many execution orderings must be evaluated. In this paper, we employ scheduling theory to build a model that takes into account the device capabilities, workload characteristics, constraints and objective functions. In our model, GPU task scheduling is reformulated as a flow shop scheduling problem, which allows us to apply and compare well-known methods already developed in the operations research field. In addition, we develop a new heuristic, specifically focused on executing GPU commands, that achieves better scheduling results than previous techniques. Finally, a comprehensive evaluation, showing the suitability and robustness of this new approach, is conducted on three different NVIDIA architectures (Kepler, Maxwell and Pascal). Project TIN2016-0920R, Universidad de Málaga (Campus de Excelencia Internacional Andalucía Tech) and the NVIDIA Corporation donation programme.
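
    To make the flow shop reformulation concrete: if each GPU task is reduced to a host-to-device transfer stage followed by a kernel execution stage, ordering the tasks is a two-machine flow shop, for which Johnson's rule gives a makespan-optimal permutation. The sketch below is illustrative only and is not the heuristic proposed in the paper; the task names and times are made up.

    # Minimal sketch: Johnson's rule for a two-stage (copy engine + compute engine) model.
    def johnson_order(tasks):
        """tasks: list of (name, transfer_time, exec_time). Returns a task-name order."""
        front, back = [], []
        for name, t_copy, t_exec in sorted(tasks, key=lambda t: min(t[1], t[2])):
            if t_copy <= t_exec:
                front.append(name)      # short transfers go early to keep the GPU busy
            else:
                back.insert(0, name)    # short executions go late
        return front + back

    def makespan(tasks, order):
        """Completion time when transfers and executions overlap across tasks."""
        times = {name: (c, e) for name, c, e in tasks}
        copy_done, exec_done = 0.0, 0.0
        for name in order:
            c, e = times[name]
            copy_done += c   # transfers are serialized on the copy engine
            # a kernel starts once its data has arrived and the previous kernel has finished
            exec_done = max(exec_done, copy_done) + e
        return exec_done

    # Hypothetical workload (times in ms):
    # tasks = [("A", 2, 5), ("B", 4, 1), ("C", 3, 3)]
    # order = johnson_order(tasks); print(order, makespan(tasks, order))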

    On the Neutrality of Flowshop Scheduling Fitness Landscapes

    Solving complex problems efficiently using metaheuristics, and in particular local searches, requires incorporating knowledge about the problem to be solved. In this paper, the permutation flowshop problem is studied. It is well known that in such problems several solutions may have the same fitness value. As this neutrality property is an important one, it should be taken into account during the design of optimization methods. In the context of the permutation flowshop, a deep landscape analysis focused on the neutrality property is carried out, and propositions on how to use this neutrality to guide the search efficiently are given. Comment: Learning and Intelligent OptimizatioN Conference (LION 5), Rome, Italy (2011).
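
    To make the neutrality notion concrete: in a permutation flowshop, two different job orders can yield exactly the same makespan. The sketch below (illustrative only, not the paper's experimental code) computes the makespan of a permutation and counts its adjacent-swap neighbours with identical fitness.

    # Minimal sketch: permutation flowshop makespan and the neutral degree of a solution.
    def makespan(perm, p):
        """p[j][m] = processing time of job j on machine m; perm = job order."""
        n_machines = len(p[0])
        completion = [0.0] * n_machines
        for j in perm:
            completion[0] += p[j][0]
            for m in range(1, n_machines):
                completion[m] = max(completion[m], completion[m - 1]) + p[j][m]
        return completion[-1]

    def neutral_degree(perm, p):
        """How many adjacent swaps leave the fitness unchanged (the neutrality studied above)."""
        base = makespan(perm, p)
        count = 0
        for i in range(len(perm) - 1):
            neighbour = list(perm)
            neighbour[i], neighbour[i + 1] = neighbour[i + 1], neighbour[i]
            if makespan(neighbour, p) == base:
                count += 1
        return count

    # Hypothetical instance: 4 jobs on 3 machines with integer processing times.
    # p = [[3, 2, 4], [2, 5, 1], [4, 1, 3], [2, 2, 2]]
    # perm = [0, 1, 2, 3]
    # print(makespan(perm, p), neutral_degree(perm, p))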