158 research outputs found

    An Evolutionary Approach to Adaptive Image Analysis for Retrieving and Long-term Monitoring Historical Land Use from Spatiotemporally Heterogeneous Map Sources

    Land use changes have become a major contributor to anthropogenic global change. The ongoing dispersion and concentration of the human species, unprecedented in their magnitude, have indisputably altered Earth’s surface and atmosphere. The effects are so salient and irreversible that a new geological epoch, following the interglacial Holocene, has been announced: the Anthropocene. While some scholars date its onset back to the Neolithic revolution, it is commonly placed in the late 18th century. The rapid development since the industrial revolution and its implications gave rise to an increasing awareness of extensive anthropogenic land change and led to an urgent need for sustainable strategies for land use and land management. By preserving landscape and settlement patterns at discrete points in time, archival geospatial data sources such as remote sensing imagery and, in particular, historical geotopographic maps can give evidence of the dynamic land use change during this crucial period. In this context, this thesis set out to explore the potential of retrospective geoinformation for monitoring, communicating, modeling and eventually understanding the complex and gradually evolving processes of land cover and land use change. Currently, large amounts of geospatial data sources such as archival maps are being made accessible online worldwide by libraries and national mapping agencies. Despite their abundance and relevance, the use of historical land use and land cover information in research is still often hindered by laborious visual interpretation, limiting the temporal and spatial coverage of studies. Thus, the core of the thesis is dedicated to the computational acquisition of geoinformation from archival map sources by means of digital image analysis. 
Based on a comprehensive review of the literature as well as the data and proposed algorithms, two major challenges for long-term retrospective information acquisition and change detection were identified: first, the diversity of geographical entity representations over space and time, and second, the uncertainty inherent to both the data source itself and its utilization for land change detection. To address the former challenge, image segmentation is considered a global non-linear optimization problem. The segmentation methods and parameters are adjusted using a metaheuristic, evolutionary approach. To preserve adaptability in high-level image analysis, a hybrid model- and data-driven strategy, combining a knowledge-based and a neural net classifier, is recommended. To address the second challenge, a probabilistic object- and field-based change detection approach is developed for modeling the positional, thematic, and temporal uncertainty inherent to both data and processing. Experimental results indicate the suitability of the methodology in support of land change monitoring. In conclusion, potential applications and directions for further research are given.
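The idea of tuning segmentation parameters with an evolutionary metaheuristic can be sketched in miniature. The snippet below is illustrative only (the thesis's actual fitness function and parameter set are not given here): a (1+1) evolution strategy mutates a single grey-value threshold and keeps the mutant whenever it segments a toy 1-D "map scanline" at least as well, measured by within-class variance.

```python
import random

# Hypothetical segmentation fitness: how well threshold t splits a 1-D list of
# grey values into two homogeneous classes (lower within-class variance = better).
def fitness(t, pixels):
    lo = [p for p in pixels if p < t]
    hi = [p for p in pixels if p >= t]
    def var(xs):
        if len(xs) < 2:
            return 0.0
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return -(var(lo) + var(hi))

def evolve_threshold(pixels, generations=200, sigma=10.0, seed=0):
    """(1+1) evolution strategy: Gaussian mutation, greedy (elitist) selection."""
    rng = random.Random(seed)
    t = sum(pixels) / len(pixels)          # start at the mean grey value
    best = fitness(t, pixels)
    for _ in range(generations):
        cand = t + rng.gauss(0.0, sigma)   # mutate the candidate threshold
        f = fitness(cand, pixels)
        if f >= best:                      # keep the mutant if no worse
            t, best = cand, f
    return t

# Two grey-value populations: dark map background vs. bright map symbols.
pixels = [20, 22, 25, 19, 23, 200, 205, 198, 210, 202]
t = evolve_threshold(pixels)               # converges between the two clusters
```

The same loop generalizes to vectors of segmentation parameters; the point is that only fitness evaluations are needed, never gradients of the segmentation pipeline.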

    Flood Forecasting Using Machine Learning Methods

    This book is a printed edition of the Special Issue Flood Forecasting Using Machine Learning Methods that was published in the journal Water.

    Analyse von hydrologischen Extremereignissen unter Berücksichtigung von Unsicherheiten

    In engineering practice for flood risk assessment, it is of primary importance to provide an accurate design flood estimate corresponding to a given risk level. Developing efficient methodologies for assessing flood quantiles in ungauged river basins means focusing on Uncertainty Quantification (UQ). While the uncertainty of model parameters and observed measurements is the subject of relevant and ongoing research activity, in assessing the uncertainty in the design flood we deal with the uncertainty of the model output. In this thesis, the evaluation of the flood quantiles and the predictive uncertainty of these variables is provided by two different models. Within the framework of regional flood frequency analysis approaches, the Top-Kriging interpolation technique is used and the results are compared with the estimates of flood quantiles provided by an at-site flood frequency analysis. Moreover, an identification procedure for the uncertain parameters of the distributed hydrological model MOBIDIC (MOdello di Bilancio Idrologico DIstribuito e Continuo) was developed. Efficient tools to tackle parameter identification and the evolution of uncertainty in hydrological modelling have been researched. While Monte Carlo and related techniques, i.e. sampling or ensemble procedures, are well known, methods based on functional approximation, where the unknown Random Variables (RVs) are represented as functions of known and simpler independent RVs, are very recent and can help to accelerate the Bayesian update. In order to find the Bayesian solution of the inverse problem, the Ensemble Kalman Filter (EnKF) and Wiener’s Polynomial Chaos Expansion (PCE) methods are compared. The numerical evaluation of the analyzed Bayesian updating methods is carried out with reference to the hydrological model MOBIDIC. The proposed methodologies are applied to the case study of the Arno river basin, in the Tuscany Region, Italy. 
The actual values of some model parameters are described in a Bayesian way through a probabilistic model: the parameters are considered as RVs, and the impact of errors, or uncertainty, in the data is investigated. The accuracy of the different models is quantified, and the results from the interpolation techniques and from the hydrological model MOBIDIC are compared. Finally, a preliminary discussion is carried out on ways to convey the results of UQ to stakeholders and to communicate the outcomes for flood risk assessment.

In the engineering practice of flood risk assessment, it is important to provide a precise water-level forecast together with the probability of its occurrence. To develop effective methods for assessing flood quantiles in rivers without gauge measurements, the focus must be placed on the quantification of uncertainty. Uncertainty in model parameters and measurements is the subject of current research; in assessing uncertainty in flood simulation, we consider the uncertainty of the model. In this dissertation, the flood quantiles and the predictive uncertainty of these variables are evaluated by two different models. The Top-Kriging interpolation technique is used, and the results are compared with estimates of the flood quantiles obtained from an at-site flood frequency analysis. In addition, the distributed hydrological model MOBIDIC, through which the uncertain parameters are identified, is introduced. Efficient tools for parameter identification and for the evolution of uncertainty in hydrological models are identified. While Monte Carlo and related techniques, e.g. ensemble procedures, are well known, methods of functional approximation, in which the unknown random variables are represented as functions of known and simpler random variables, are relatively new and can serve to accelerate Bayesian updating. To find a Bayesian solution of the inverse problem, the Ensemble Kalman Filter and Wiener’s Polynomial Chaos Expansion are compared. The numerical evaluation of the analyzed Bayesian updating methods is carried out with reference to the hydrological model MOBIDIC. The values of some model parameters are described in a Bayesian way by means of a probabilistic model: the parameters are treated as random variables, and the impact of errors or uncertainties in the data is investigated. The accuracy of the different models is quantified, and the results from the interpolation techniques and from the MOBIDIC model are compared. Finally, a preliminary discussion follows on how to convey the results of uncertainty quantification to stakeholders and how to communicate the outcomes of this flood risk assessment.
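The EnKF analysis step mentioned above can be sketched for a single scalar parameter. This is a generic perturbed-observation EnKF update with a toy linear forward model, not the MOBIDIC setup itself; all numbers are illustrative.

```python
import random

# Scalar Ensemble Kalman filter (EnKF) analysis step.
# ensemble: prior samples of the parameter; h: forward model mapping the
# parameter to the observed quantity; y_obs, obs_var: observation and its variance.
def enkf_update(ensemble, y_obs, obs_var, h, seed=0):
    rng = random.Random(seed)
    hx = [h(x) for x in ensemble]
    n = len(ensemble)
    mx = sum(ensemble) / n
    mh = sum(hx) / n
    # Sample cross-covariance between parameter and prediction, and prediction variance.
    cov_xh = sum((x - mx) * (v - mh) for x, v in zip(ensemble, hx)) / (n - 1)
    var_h = sum((v - mh) ** 2 for v in hx) / (n - 1)
    gain = cov_xh / (var_h + obs_var)                 # Kalman gain
    # Perturbed-observation update: each member assimilates a noisy copy of y_obs.
    return [x + gain * (y_obs + rng.gauss(0.0, obs_var ** 0.5) - v)
            for x, v in zip(ensemble, hx)]

h = lambda x: 2.0 * x                                 # toy linear forward model
rng = random.Random(1)
prior = [rng.gauss(5.0, 2.0) for _ in range(500)]     # prior parameter ensemble
posterior = enkf_update(prior, y_obs=16.0, obs_var=0.5, h=h)
```

With a precise observation, the posterior ensemble shifts from the prior mean toward the value explaining the observation (here x ≈ 8) and its spread shrinks, which is the essence of the Bayesian update the thesis accelerates with PCE.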

    Explainable Physics-informed Deep Learning for Rainfall-runoff Modeling and Uncertainty Assessment across the Continental United States

    Hydrologic models provide a comprehensive tool to calibrate streamflow response to environmental variables. Various hydrologic modeling approaches, ranging from physically based to conceptual to entirely data-driven models, have been widely used for hydrologic simulation. In recent years, however, Deep Learning (DL), a new generation of Machine Learning (ML), has taken hydrologic simulation research in a new direction. DL methods have recently been proposed for rainfall-runoff modeling that complement both distributed and conceptual hydrologic models, particularly in catchments where data to support a process-based model are scarce and limited. This dissertation investigated the applicability of two advanced probabilistic physics-informed DL algorithms, i.e., the deep autoregressive network (DeepAR) and the temporal fusion transformer (TFT), for daily rainfall-runoff modeling across the continental United States (CONUS). We benchmarked our proposed models against several physics-based hydrologic approaches such as the Sacramento Soil Moisture Accounting Model (SAC-SMA), Variable Infiltration Capacity (VIC), Framework for Understanding Structural Errors (FUSE), Hydrologiska Byråns Vattenbalansavdelning (HBV), and the mesoscale hydrologic model (mHM). These benchmark models fall into two groups. The first group comprises the models calibrated for each basin individually (e.g., SAC-SMA, VIC, FUSE2, mHM and HBV), while the second group, including our physics-informed approaches, is made up of the models that were regionally calibrated; models in this group share one parameter set for all basins in the dataset. All the approaches were implemented and tested using the Catchment Attributes and Meteorology for Large-sample Studies (CAMELS) Maurer datasets. We developed the TFT and DeepAR with two different configurations, i.e., with (physics-informed model) and without (the original model) static attributes. 
Various static and dynamic catchment physical attributes, with various spatiotemporal variabilities, were incorporated into the pipeline to simulate how a drainage system responds to rainfall-runoff processes. To demonstrate how the model learned to differentiate between different rainfall–runoff behaviors across different catchments and to identify the dominant processes, sensitivity and explainability analyses of the modeling outcomes were also performed. Despite recent advancements, deep networks are perceived as challenging to parameterize; thus, their simulation may propagate error and uncertainty in modeling. To address uncertainty, a quantile likelihood function was incorporated as the TFT loss function. The results suggest that the physics-informed TFT model was superior in predicting high and low flow fluctuations compared to the original TFT and DeepAR models (without static attributes) and even the physics-informed DeepAR. The physics-informed TFT model successfully recognized which static attributes contribute most to streamflow generation in each specific catchment, considering its climate, topography, land cover, soil, and geological conditions. The interpretability of the physics-informed TFT model and its ability to assimilate multiple sources of information and parameters make it a strong candidate for regional as well as continental-scale hydrologic simulations. It was noted that both the physics-informed TFT and DeepAR were more successful in learning the intermediate and high flow regimes than the low flow regime. The advantage in high flow can be attributed to learning a more generalizable mapping between static and dynamic attributes and runoff parameters. 
It seems both TFT and DeepAR may have enabled the learning of some true processes that are missing from both conceptual and physics-based models, possibly related to deep soil water storage (the layer where soil water is not sensitive to daily evapotranspiration), saturated hydraulic conductivity, and vegetation dynamics.
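The quantile likelihood used as the TFT loss is commonly the "pinball" loss; a minimal framework-independent sketch (the quantile level 0.9 below is illustrative):

```python
# Pinball (quantile) loss for target quantile level q in (0, 1).
# Minimising it pushes predictions toward the q-th quantile of the target,
# which is how the TFT produces predictive intervals rather than point estimates.
def quantile_loss(y_true, y_pred, q):
    err = y_true - y_pred
    return max(q * err, (q - 1.0) * err)

# At q = 0.9, under-prediction (err > 0) is penalised 9x more than
# over-prediction, so the fitted curve sits near the 90th flow percentile.
losses = [quantile_loss(10.0, p, q=0.9) for p in (8.0, 12.0)]
```

Training one network head per quantile level (e.g. 0.1, 0.5, 0.9) yields the uncertainty bands reported for high- and low-flow predictions.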

    Statistics and Pattern Recognition Applied to the Spatio-Temporal Properties of Seismicity

    Due to the significant increase in the availability of new data in recent years, as a result of the expansion of available seismic stations, laboratory experiments, and the availability of increasingly reliable synthetic catalogs, considerable progress has been made in understanding the spatiotemporal properties of earthquakes. The study of the preparatory phase of earthquakes and the analysis of past seismicity have led to the formulation of seismicity models for forecasting future earthquakes and to the development of seismic hazard maps. The results are tested and validated by increasingly accurate statistical methods. A relevant part of the development of many models is the correct identification of seismicity clusters and scaling laws of background seismicity. In this collection, we present eight innovative papers that address all the above topics. The occurrence of strong earthquakes (mainshocks) is analyzed from different perspectives in this Special Issue.

    Using Kriging to Interpolate Spatially Distributed Volumetric Medical Data

    Routine cases in diagnostic radiology require the interpolation of volumetric medical imaging data sets. Inaccurate renditions of interpolated volumes can lead to the misdiagnosis of a patient's condition. It is therefore essential that interpolated modality-space estimates accurately portray patient space. Kriging is investigated in this research to interpolate medical imaging volumes. Kriging requires data to be spatially distributed. Therefore, magnetic resonance imaging (MRI) data is shown to exhibit spatially regionalized characteristics such that it can be modeled using regionalized variables and subsequently be interpolated using kriging. A comprehensive, automated, three-dimensional structural analysis of the MRI data is performed to derive a mathematical model of spatial variation about each interpolated point. Kriging uses these models to compute estimates of minimal estimation variance. The estimation accuracy of the kriged, interpolated MRI volume is demonstrated to exceed that achieved using trilinear interpolation if the derived model of spatial variation sufficiently represents the regionalized neighborhoods about each interpolated voxel. Models of spatial variation that assume an ellipsoidal extent with orthogonal axes of continuity are demonstrated to insufficiently characterize modality-space MRI data. Model accuracy is concluded to be critical for achieving estimation accuracies that exceed those of trilinear interpolation.
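The kriging idea can be sketched in its simplest form: estimating one unknown voxel from two known samples on a 1-D transect via simple kriging. This is an illustrative sketch, not the thesis's full 3-D structural analysis; the exponential covariance model and all values are assumptions.

```python
import math

def cov(h, sill=1.0, corr_range=4.0):
    """Assumed exponential covariance model C(h) = sill * exp(-|h| / range)."""
    return sill * math.exp(-abs(h) / corr_range)

def simple_krige(xs, zs, mean, x0):
    """Simple-kriging estimate at x0 from two samples (xs, zs) with known mean."""
    # 2x2 covariance system C w = c0, solved with Cramer's rule.
    c11, c12, c22 = cov(0.0), cov(xs[1] - xs[0]), cov(0.0)
    b1, b2 = cov(x0 - xs[0]), cov(x0 - xs[1])
    det = c11 * c22 - c12 * c12
    w1 = (b1 * c22 - b2 * c12) / det       # kriging weights: minimise
    w2 = (c11 * b2 - c12 * b1) / det       # the estimation variance
    return mean + w1 * (zs[0] - mean) + w2 * (zs[1] - mean)

# Two known grey values at positions 0 and 4; estimate the midpoint voxel.
z_hat = simple_krige(xs=[0.0, 4.0], zs=[100.0, 140.0], mean=120.0, x0=2.0)
```

Unlike trilinear interpolation, the weights come from the modeled spatial covariance rather than from geometric distance alone, which is why the quality of the fitted model of spatial variation governs the estimation accuracy.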

    Water Resources Management and Modeling

    Hydrology is the science that deals with the processes governing the depletion and replenishment of the water resources of the earth's land areas. The purpose of this book is to bring together recent developments in hydrology and water resources engineering. The first section covers surface water modeling and the second section deals with groundwater modeling. The aim of the book is to focus attention on the management of surface water and groundwater resources. Meeting the challenges and the impact of climate change on water resources is also discussed. Most chapters give insights into the interpretation of field information, the development of models, the use of computational models based on analytical and numerical techniques, the assessment of model performance and the use of these models for predictive purposes. It is written for practicing professionals and students, mathematical modelers, hydrogeologists and water resources specialists.

    Improvements on the bees algorithm for continuous optimisation problems

    This work focuses on improvements to the Bees Algorithm that enhance the algorithm’s performance, especially in terms of convergence rate. For the first enhancement, a pseudo-gradient Bees Algorithm (PG-BA) compares the fitness as well as the position of previous and current bees so that the best bees in each patch are appropriately guided towards a better search direction after each consecutive cycle. This method eliminates the need to differentiate the objective function, unlike the typical gradient search method. The improved algorithm is subjected to several numerical benchmark test functions as well as the training of a neural network. The results from the experiments are then compared to the standard variant of the Bees Algorithm and other swarm intelligence procedures. The data analysis generally confirmed that the PG-BA is effective at speeding up the convergence time to the optimum. Next, an approach to avoid the formation of overlapping patches is proposed. The Patch Overlap Avoidance Bees Algorithm (POA-BA) is designed to avoid redundancy in the search area, especially if a site is deemed unprofitable. This method is quite similar to Tabu Search (TS) in that the POA-BA forbids the exact exploitation of previously visited solutions along with their corresponding neighbourhoods. Patches are not allowed to intersect, not just in the next generation but also in the current cycle. This reduces the number of patches that materialise on the same peak (maximisation) or in the same valley (minimisation), which ensures a thorough search of the problem landscape as bees are distributed around the scaled-down area. The same benchmark problems as for the PG-BA were applied to this modified strategy with reasonable success. Finally, the Bees Algorithm is revised to have the capability of locating all of the global optima as well as the substantial local peaks in a single run. 
These multiple solutions of comparable fitness offer some alternatives for decision makers to choose from. In this so-called Extended Bees Algorithm (EBA), patches are formed only if the bees are the fittest from different peaks, determined using a hill-valley mechanism. This permits the maintenance of diversified solutions throughout the search process, in addition to minimising the chances of getting trapped. This version proved beneficial when tested on numerous multimodal optimisation problems.
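The patch-based search that PG-BA, POA-BA and EBA build on can be sketched as a minimal standard Bees Algorithm for 1-D continuous minimisation. All parameter values below are illustrative defaults, not the thesis's settings:

```python
import random

def bees_algorithm(f, lo, hi, n_scouts=20, n_sites=5, n_recruits=10,
                   patch=0.5, iterations=100, seed=0):
    """Minimise f on [lo, hi] with scout bees + neighbourhood (patch) search."""
    rng = random.Random(seed)
    # Scout phase: random solutions across the whole search interval.
    scouts = [rng.uniform(lo, hi) for _ in range(n_scouts)]
    for _ in range(iterations):
        scouts.sort(key=f)                 # best (lowest f) sites first
        new = []
        # Local search: recruit bees around each selected site; keep the best
        # bee per patch (elitism keeps the site itself as a fallback).
        for site in scouts[:n_sites]:
            recruits = [min(hi, max(lo, site + rng.uniform(-patch, patch)))
                        for _ in range(n_recruits)]
            new.append(min(recruits + [site], key=f))
        # Global search: remaining bees scout fresh random sites.
        new += [rng.uniform(lo, hi) for _ in range(n_scouts - n_sites)]
        scouts = new
        patch *= 0.95                      # shrink patches as search converges
    return min(scouts, key=f)

best = bees_algorithm(lambda x: (x - 3.0) ** 2, lo=-10.0, hi=10.0)
```

The enhancements in the thesis modify this skeleton: PG-BA biases the recruit positions along an estimated improvement direction, POA-BA rejects patches that overlap previously visited ones, and EBA keeps one patch per distinct peak.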

    Earthquake Engineering

    The book Earthquake Engineering - From Engineering Seismology to Optimal Seismic Design of Engineering Structures contains fifteen chapters written by researchers and experts in the fields of earthquake and structural engineering. It provides the state of the art on recent progress in seismology, earthquake engineering and structural engineering, and should be useful to graduate students, researchers and practicing structural engineers. It deals with seismicity, seismic hazard assessment and system-oriented emergency response to abrupt earthquake disasters, the nature and components of strong ground motions, and several other interesting topics such as dam-induced earthquakes, seismic stability of slopes and landslides. The book also tackles the dynamic response of underground pipes to blast loads, the optimal seismic design of RC multi-storey buildings, the finite-element analysis of cable-stayed bridges under strong ground motions and acute psychiatric trauma intervention after earthquakes.