    Uncertainty quantification of granular computing‑neural network model for prediction of pollutant longitudinal dispersion coefficient in aquatic streams

    Discharge of pollution loads into natural water systems remains a global challenge that threatens water and food supplies and endangers ecosystem services. Natural rehabilitation of contaminated streams is governed mainly by the longitudinal dispersion coefficient, or rate of longitudinal dispersion (Dx), a key parameter with large spatiotemporal fluctuations that characterizes pollutant transport. The large uncertainty in estimating Dx limits both water quality assessment in natural streams and the design of water quality enhancement strategies. This study develops an artificial intelligence-based predictive model coupling granular computing and neural network models (GrC-ANN) to provide robust estimates of Dx and its uncertainty for a range of flow-geometric conditions with high spatiotemporal variability. Uncertainty in the Dx estimates of the proposed GrC-ANN model was analyzed by altering the training data used to tune the model: a modified bootstrap method was employed to generate different training patterns through resampling from a global database of tracer experiments in streams comprising 503 data points. Comparison of the Dx values estimated by GrC-ANN with those determined from tracer measurements shows the appropriateness and robustness of the proposed method in determining the rate of longitudinal dispersion. The GrC-ANN model with the narrowest estimated uncertainty band (bandwidth factor = 0.56) that brackets the highest percentage of true Dx data (i.e., 100%) is the best model for computing Dx in streams. Given the significant inherent uncertainty reported for previous Dx models, the GrC-ANN model developed in this study is shown to perform robustly in evaluating pollutant mixing (Dx) in turbulent environmental flow systems.
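
    A minimal sketch of the bootstrap-style uncertainty estimation described above, assuming a generic scikit-learn regressor in place of the paper's GrC-ANN model; the function name and the percentile-based prediction band are illustrative, not the authors' implementation:

```python
# Hedged sketch: bootstrap resampling to quantify prediction uncertainty.
# A plain MLPRegressor stands in for the paper's GrC-ANN model.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.utils import resample

def bootstrap_dx_predictions(X_train, y_train, X_test, n_models=100, seed=0):
    """Train an ensemble on bootstrap resamples of the tracer database
    and return the mean prediction with a 95% uncertainty band."""
    rng = np.random.RandomState(seed)
    all_preds = []
    for _ in range(n_models):
        # Each resample plays the role of one alternative training pattern.
        X_b, y_b = resample(X_train, y_train, random_state=rng)
        model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                             random_state=rng.randint(1 << 30))
        model.fit(X_b, y_b)
        all_preds.append(model.predict(X_test))
    all_preds = np.asarray(all_preds)             # (n_models, n_test)
    lower, upper = np.percentile(all_preds, [2.5, 97.5], axis=0)
    return all_preds.mean(axis=0), lower, upper
```

    The fraction of true Dx values falling inside such a band, together with the band's average width, mirrors the bracketing percentage and bandwidth factor the study uses to rank candidate models.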

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out through cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was to train students of Architecture and Engineering courses in geospatial data acquisition and processing, in order to start up a team of "volunteer mappers". The project aims to document environmental and built heritage subject to disaster, and to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The area proposed for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997, which was affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques, such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the data collected with local authorities and the Civil Protection.

    Landslide susceptibility mapping using remote sensing data and geographic information system-based algorithms

    Whether they occur due to natural triggers or human activities, landslides lead to loss of life and damage to property, affecting infrastructure, road networks and buildings. A Landslide Susceptibility Map (LSM) provides policy and decision makers with valuable information. This study aims to detect landslide locations using Sentinel-1 data, the only freely available online radar imagery, and to map areas prone to landslides using a novel AB-ADTree algorithm in Cameron Highlands, Pahang, Malaysia. A total of 152 landslide locations were detected by integrating the Interferometric Synthetic Aperture Radar (InSAR) technique, Google Earth (GE) images and extensive field survey. Of these data, 80% were employed for training the machine learning algorithms and the remaining 20% for validation. Seventeen triggering and conditioning factors, namely slope, aspect, elevation, distance to road, distance to river, proximity to fault, road density, river density, Normalized Difference Vegetation Index (NDVI), rainfall, land cover, lithology, soil type, curvature, profile curvature, Stream Power Index (SPI) and Topographic Wetness Index (TWI), were extracted from satellite imagery, a digital elevation model (DEM), and geological and soil maps. These factors were used to generate landslide susceptibility maps with a Logistic Regression (LR) model, a Logistic Model Tree (LMT), Random Forest (RF), an Alternating Decision Tree (ADTree), Adaptive Boosting (AdaBoost), and a novel hybrid of the ADTree and AdaBoost models, the AB-ADTree. Validation was based on the area under the ROC curve (AUC) and the statistical measures of Positive Predictive Value (PPV), Negative Predictive Value (NPV), sensitivity, specificity, accuracy and Root Mean Square Error (RMSE). The results showed an AUC of 90%, 92%, 88%, 59%, 96% and 94% for the LR, LMT, RF, ADTree, AdaBoost and AB-ADTree algorithms, respectively. Non-parametric Friedman and Wilcoxon tests were also applied to assess model performance; the findings revealed that ADTree is inferior to the other models used in this study. Using a handheld Global Positioning System (GPS), field validation was performed for almost 20% (30 locations) of the detected landslide locations, and the results confirmed that they were correctly detected. In conclusion, this study is applicable for hazard mitigation purposes and regional planning.
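
    As an illustration of the validation step, a short sketch computing the same statistics reported above (AUC, PPV, NPV, sensitivity, specificity, accuracy, RMSE) from a generic binary classifier's outputs; the inputs are placeholders, not the study's AB-ADTree pipeline:

```python
# Hedged sketch: the validation statistics listed in the abstract,
# computed from true labels and predicted landslide probabilities.
import numpy as np
from sklearn.metrics import confusion_matrix, mean_squared_error, roc_auc_score

def susceptibility_metrics(y_true, y_prob, threshold=0.5):
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "AUC": roc_auc_score(y_true, y_prob),
        "PPV": tp / (tp + fp),                    # positive predictive value
        "NPV": tn / (tn + fn),                    # negative predictive value
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "RMSE": np.sqrt(mean_squared_error(y_true, y_prob)),
    }
```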

    Recommendations for the quantitative analysis of landslide risk

    This paper presents recommended methodologies for the quantitative analysis of landslide hazard, vulnerability and risk at different spatial scales (site-specific, local, regional and national), as well as for the verification and validation of the results. The methodologies described focus on the evaluation of the probabilities of occurrence of different landslide types with certain characteristics. Methods used to determine the spatial distribution of landslide intensity, the characterisation of the elements at risk, the assessment of the potential degree of damage and the quantification of the vulnerability of the elements at risk, and those used to perform the quantitative risk analysis are also described. The paper is intended for use by scientists and practising engineers, geologists and other landslide experts.
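
    As a hedged illustration of what a quantitative landslide risk estimate of this kind typically multiplies together, a minimal sketch following the common QRA formulation (the factor names are generic, not necessarily the paper's exact notation):

```python
# Hedged sketch: annualised landslide risk to a single element at risk,
# as the product of hazard, exposure, and vulnerability terms.
def annual_risk(p_landslide, p_reach, p_present, vulnerability, element_value):
    """R = P(event) * P(event reaches element) * P(element present)
           * vulnerability (0-1) * value of the element at risk."""
    return p_landslide * p_reach * p_present * vulnerability * element_value

# Example: 1-in-100-year event, 30% runout reach probability, element occupied
# 80% of the time, 50% expected damage to a property worth 250,000.
print(annual_risk(0.01, 0.3, 0.8, 0.5, 250_000))  # ~300 per year
```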

    Coastal management and adaptation: an integrated data-driven approach

    Coastal regions are among the most exposed to environmental hazards, yet the coast is the preferred settlement site for a high percentage of the global population, and most major global cities are located on or near the coast. This research adopts a predominantly anthropocentric approach to the analysis of coastal risk and resilience, centring on the pervasive hazards of coastal flooding and erosion. Coastal management decision-making practices are shown to rely on access to current and accurate information; however, information flows between scientists, policy makers and practitioners are constrained by a lack of awareness and utilisation of available data sources. This research tackles this issue by evaluating how innovations in the use of data and analytics can further the application of science within decision-making processes related to coastal risk adaptation. To achieve this aim, a range of research methodologies was employed, and the progression of topics covered marks a shift from themes of risk to resilience. The work focuses on a case study region of East Anglia, UK, benefiting from the input of a partner organisation responsible for the region's coasts: Coastal Partnership East. An initial review revealed how data can be utilised effectively within coastal decision-making practices, highlighting scope for applying advanced Big Data techniques to the analysis of coastal datasets. The process of risk evaluation was examined in detail, and the range of possibilities afforded by open-source coastal datasets was revealed. Subsequently, open-source coastal terrain and bathymetric point cloud datasets were identified for 14 sites within the case study area and used in a practical application of a geomorphological change detection (GCD) method. This revealed how analysis of high spatial and temporal resolution point cloud data can accurately reveal and quantify physical coastal impacts. Additionally, the research reveals how data innovations can facilitate adaptation through insurance: specifically, how the use of empirical evidence in pricing coastal flood insurance can result in both communication and distribution of risk. The various strands of knowledge generated throughout this study reveal how an extensive range of data types, sources and advanced forms of analysis can together allow coastal resilience assessments to be founded on empirical evidence. This research demonstrates how the application of advanced data-driven analytical processes can reduce the levels of uncertainty and subjectivity inherent in current coastal environmental management practices. Adoption of the methods presented within this research could further the possibilities for sustainable and resilient management of the incredibly valuable environmental resource that is the coast.
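
    The geomorphological change detection step reduces, in essence, to differencing two digital elevation models built from successive surveys and discarding changes below the propagated survey error; a minimal NumPy sketch, with the gridding assumed done and the uncertainty values purely illustrative:

```python
# Hedged sketch: DEM-of-difference (DoD) change detection between two surveys.
# Both DEMs are assumed already gridded to the same extent and cell size.
import numpy as np

def dem_of_difference(dem_t1, dem_t2, cell_size, sigma_t1=0.05, sigma_t2=0.05,
                      k=1.96):
    """Return the thresholded DoD and the net volumetric change (m^3).

    sigma_*: per-survey vertical uncertainty in metres; k: confidence multiplier.
    """
    dod = dem_t2 - dem_t1                         # elevation change per cell (m)
    min_lod = k * np.hypot(sigma_t1, sigma_t2)    # minimum level of detection
    dod_masked = np.where(np.abs(dod) > min_lod, dod, 0.0)  # drop survey noise
    net_volume = dod_masked.sum() * cell_size ** 2
    return dod_masked, net_volume
```

    Positive masked cells indicate deposition and negative cells erosion, so the signed volume summarises net coastal change between the two survey dates.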

    A consensus-based approach for structural resilience to earthquakes using machine learning techniques

    Seismic hazards represent a constant threat to the built environment and, above all, to human lives. Past approaches to seismic engineering treated building deformability as limited to the elastic range. Following the introduction of performance-based approaches, a whole new methodology for seismic design and assessment was proposed, relying on the ability of a building to extend its deformability into the plastic domain. This reflects the ability of a building to undergo large deformations yet still withstand them, thereby safeguarding human lives, and it allows transient and permanent deformations to be distinguished under dynamic (e.g., seismic) stresses. In parallel, a whole new discipline is flourishing in which traditional structural analysis methods are coupled with Artificial Intelligence (AI) strategies, and the emerging discipline of resilience has been widely implemented in the domain of disaster management as well as in structural engineering. However, an extensive literature review shows that current approaches to disaster management at the building and district levels exhibit significant fragmentation in strategies and objectives, highlighting the need for a more holistic conceptualization. The proposed methodology therefore addresses both the building and district levels, adopting scale-specific methodologies suited to each scale of analysis. At the building level, an analytical three-stage methodology is proposed to enhance traditional investigation and structural optimization strategies through object-oriented programming, evolutionary computing and deep learning techniques. This is validated by applying the proposed methodology to a real building in Old Beichuan that suffered seismically triggered damage in the 2008 Wenchuan Earthquake. At the district scale, a qualitative methodology is proposed to evaluate resilience in the face of geo-environmental hazards, specifically targeting the built environment; a Delphi expert consultation is adopted and a framework is presented. To combine the two scales, a high-level strategy is ultimately proposed to interlace the building- and district-scale simulations so that they are organically interlinked. To this end, a multi-dimensional mapping of the area of Old Beichuan is presented to aid the identification of some key indicators of the district-level framework. The research was conducted in the context of the REACH project, investigating the built environment's resilience in the face of seismically triggered geo-environmental hazards in the context of the 2008 Wenchuan Earthquake in China. Results show that an optimized performance-based approach would significantly enhance traditional analysis and investigation strategies, providing an approximate damage reduction of 25% for a cost increase of 20%. In addition, using deep learning techniques to replace the traditional simulation engine attained a result precision of up to 98%, making it reliable for conducting investigation campaigns focused on specific building features when traditional methods fail because the building cannot be accessed or pertinent documentation cannot be traced. It is therefore demonstrated that challenging regulatory frameworks is sometimes a necessary step to enhance the resilience of buildings in the face of seismic hazards.
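
    A sketch of the surrogate-modelling idea mentioned above: a network trained on simulator input-output pairs standing in for the simulation engine itself. The feature set, architecture, and synthetic response here are placeholders, not the study's data or model:

```python
# Hedged sketch: training a neural-network surrogate on simulator runs so it
# can replace the expensive structural simulation engine at inference time.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 6))        # e.g. geometry, materials, ground motion
w = rng.uniform(size=6)
y = np.sin(X @ w) + 0.01 * rng.normal(size=2000)   # stand-in damage response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                         random_state=0)
surrogate.fit(X_tr, y_tr)
print(f"held-out R^2: {surrogate.score(X_te, y_te):.3f}")
```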

    Remote Sensing of Natural Hazards

    Each year, natural hazards such as earthquakes, cyclones, flooding, landslides, wildfires, avalanches, volcanic eruptions, extreme temperatures, storm surges and drought result in widespread loss of life, livelihood, and critical infrastructure globally. With the unprecedented growth of the human population, large-scale development activities, and changes to the natural environment, the frequency and intensity of extreme natural events and their consequent impacts are expected to increase in the future. Technological interventions provide essential provisions for the prevention and mitigation of natural hazards. The data obtained through remote sensing systems with varied spatial, spectral, and temporal resolutions particularly provide prospects for furthering knowledge on spatiotemporal patterns and forecasting of natural hazards. The collection of data using Earth observation systems has been valuable for alleviating the adverse effects of natural hazards, especially given their near-real-time capabilities for tracking extreme natural events. Remote sensing systems from different platforms also serve as an important decision-support tool for devising response strategies, coordinating rescue operations, and making damage and loss estimations. With these in mind, this book seeks original contributions on the advanced applications of remote sensing and geographic information system (GIS) techniques to understanding various dimensions of natural hazards through new theory, data products, and robust approaches.
