865 research outputs found

    DeFault: Deep-learning-based Fault Delineation

    The carbon capture, utilization, and storage (CCUS) framework is an essential component in reducing greenhouse gas emissions, and its success hinges on comprehensive knowledge of subsurface geology and geomechanics. Passive seismic event relocation and fault detection serve as indispensable tools, offering vital insights into subsurface structures and fluid migration pathways. Accurate identification and localization of seismic events, however, face significant challenges, including the necessity for high-quality seismic data and advanced computational methods. To address these challenges, we introduce a novel deep learning method, DeFault, specifically designed for passive seismic source relocation and fault delineation in passive seismic monitoring projects. By leveraging data-domain adaptation, DeFault allows us to train a neural network with labeled synthetic data and apply it directly to field data. Using DeFault, passive seismic sources are automatically clustered based on their recording time and spatial locations, and faults and fractures are then delineated accordingly. We demonstrate the efficacy of DeFault in a field case study involving CO2-injection-related microseismic data from the Decatur, Illinois area. Our approach accurately and efficiently relocated passive seismic events, identified faults, and aided in the prevention of potential geological hazards. Our results highlight the potential of DeFault as a valuable tool for passive seismic monitoring, emphasizing its role in ensuring CCUS project safety. This research bolsters the understanding of subsurface characterization in CCUS, illustrating machine learning's capacity to refine these methods. Ultimately, our work bears significant implications for the deployment of CCUS technology, an essential strategy in combating climate change.
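    The delineation step described above rests on a simple geometric idea: relocated microseismic events belonging to one fault tend to cluster along a line, so the principal axis of the event cloud approximates the fault's strike. A minimal sketch of that idea (this is not the DeFault network; the covariance-based fit and the synthetic event coordinates are illustrative assumptions):

```python
import math

def fit_strike(events):
    """Fit the principal axis of a 2-D event cloud via its covariance
    matrix; events aligned along a fault concentrate on a line, and the
    dominant eigenvector's orientation approximates the fault strike."""
    n = len(events)
    mx = sum(x for x, _ in events) / n
    my = sum(y for _, y in events) / n
    sxx = sum((x - mx) ** 2 for x, _ in events) / n
    syy = sum((y - my) ** 2 for _, y in events) / n
    sxy = sum((x - mx) * (y - my) for x, y in events) / n
    # Closed-form orientation of the leading eigenvector of a 2x2 covariance.
    angle = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.degrees(angle)

# Synthetic events scattered tightly around a line striking at 45 degrees.
events = [(float(t), t + 0.01 * ((-1) ** i)) for i, t in enumerate(range(10))]
strike = fit_strike(events)
```

    Running the clustering first (as the abstract describes) and then fitting one axis per cluster yields one delineated lineament per fault.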

    A New-Fangled FES-k-Means Clustering Algorithm for Disease Discovery and Visual Analytics

    The central purpose of this study is to further evaluate the quality of the performance of a new algorithm. The study provides additional evidence on this algorithm, which was designed to increase the overall efficiency of the original k-means clustering technique: the Fast, Efficient, and Scalable k-means algorithm (FES-k-means). The FES-k-means algorithm uses a hybrid approach that comprises the k-d tree data structure to enhance the nearest neighbor query, the original k-means algorithm, and an adaptation rate proposed by Mashor. This algorithm was tested using two real datasets and one synthetic dataset. It was employed twice on all three datasets: once on data trained by the innovative MIL-SOM method and then on the actual untrained data in order to evaluate its competence. This two-step approach of data training prior to clustering provides a solid foundation for knowledge discovery and data mining, otherwise unclaimed by clustering methods alone. The benefits of this method are that it produces clusters similar to the original k-means method at a much faster rate, as shown by runtime comparison data, and it provides efficient analysis of large geospatial data with implications for disease mechanism discovery. From a disease mechanism discovery perspective, it is hypothesized that the linear-like pattern of elevated blood lead levels discovered in the city of Chicago may be spatially linked to the city's water service lines.
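    The hybrid described above can be sketched as k-means whose centroid update uses a decaying adaptation (learning) rate, in the spirit of Mashor's proposal; a real FES-k-means implementation would additionally answer the nearest-centroid query with a k-d tree. The rate schedule and toy data below are illustrative assumptions, not the published algorithm:

```python
import random

def kmeans_adaptive(points, k, iters=50, seed=0):
    """k-means with a decaying adaptation rate on the centroid update.
    The nearest-centroid search here is a linear scan; FES-k-means
    accelerates exactly this query with a k-d tree."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for t in range(iters):
        rate = 1.0 / (t + 1)  # decaying adaptation rate
        for p in points:
            # index of the nearest centroid (k-d tree answers this faster)
            j = min(range(k), key=lambda i: (p[0] - centroids[i][0]) ** 2
                                            + (p[1] - centroids[i][1]) ** 2)
            cx, cy = centroids[j]
            # move the winning centroid toward the point by the current rate
            centroids[j] = (cx + rate * (p[0] - cx), cy + rate * (p[1] - cy))
    return centroids

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
cents = kmeans_adaptive(pts, 2)
```

    With two well-separated groups of points, the two centroids settle near the two group centres.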

    Human and environmental exposure to hydrocarbon pollution in the Niger Delta: a geospatial approach

    This study undertook an integrated geospatial assessment of human and environmental exposure to oil pollution in the Niger Delta using primary and secondary spatial data. This thesis begins by presenting a clear rationale for the study of extensive oil pollution in the Niger Delta, followed by a critical literature review of the potential application of geospatial techniques for monitoring and managing the problem. Three analytical chapters report on the methodological developments and applications of geospatial techniques that contribute to achieving the aim of the study. Firstly, a quantitative assessment of human and environmental exposure to oil pollution in the Niger Delta was performed using a government spill database. This was carried out using Spatial Analysis along Networks (SANET), a geostatistical tool, since oil spills in the region tend to follow the linear patterns of the pipelines. Spatial data on pipelines, oil spills, population and land cover data were analysed in order to quantify the extent of human and environmental exposure to oil pollution. The major causes of spills and spatial factors potentially reinforcing reported causes were analysed. Results show extensive general exposure and sabotage as the leading cause of oil pollution in the Niger Delta. Secondly, a method of delineating the river network in the Niger Delta using Sentinel-1 SAR data was developed, as a basis for modelling potential flow of pollutants in the distributary pathways of the network. The cloud penetration capabilities of SAR sensing are particularly valuable for this application since the Niger Delta is notorious for cloud cover. Vector and raster-based river networks derived from Sentinel-1 were compared to alternative river map products including those from the USGS and ESA. 
This demonstrated the superiority of the Sentinel-1 derived river network, which was subsequently used in a flow routing analysis to demonstrate the potential for understanding oil spill dispersion. Thirdly, the study applied optical remote sensing for indirect detection and mapping of oil spill impacts on vegetation. Multi-temporal Landsat data was used to delineate the spill impact footprint of a notable 2008 oil spill incident in Ogoniland and population exposure was evaluated. The optical data was effective in impact area delineation, demonstrating extensive and long-lasting population exposure to oil pollution. Overall, this study has successfully assembled and produced relevant spatial and attribute data sets and applied integrated geostatistical analytical techniques to understand the distribution and impacts of oil spills in the Niger Delta. The study has revealed the extensive level of human and environmental exposure to hydrocarbon pollution in the Niger Delta and introduced new methods that will be valuable fo

    Doctor of Philosophy

    There is a need to improve the methods involved with targeted implementation and design of distributed, watershed-scale low impact development (LID) practices. The goal of this dissertation was to improve the targeted implementation and design of distrib

    Assessment of earthquake-triggered landslides in Central Nepal

    Landslides are recurrent in Nepal due to active tectonics, high precipitation, complex topography, geology, and land use practices. Reliable landslide susceptibility maps are crucial for effective disaster management. Ongoing research has improved landslide mapping approaches, while further efforts are needed to assess inventories and enhance susceptibility mapping methods. This thesis aims to evaluate the landslides caused by the 2015 Gorkha earthquake and develop reliable landslide susceptibility maps using statistical and geospatial techniques. There are four main objectives: (i) proposing clustering-based sampling strategies to increase the efficiency of landslide susceptibility maps over random selection methods, (ii) identifying and delineating effective landslide mapping units, (iii) proposing an innovative framework for comparing inventories and their corresponding susceptibility maps, and (iv) implementing a methodology for landslide-specific susceptibility mapping. Firstly, a comprehensive Gorkha earthquake-induced landslide inventory was compiled, and six unsupervised clustering algorithms were employed to generate six distinct training datasets. An additional training dataset was also prepared using a randomised approach. Among the tested algorithms, Expectation Maximization using the Gaussian Mixture Model (EM/GMM) demonstrated the highest accuracy, confirming the importance of prioritising clustering patterns for training landslide inventory datasets. Secondly, slope units were introduced as an effective mapping unit for assessing landslides, delineating 112,674 slope unit polygons over an approximately 43,000 km² area in Central Nepal. This is the first instance of generating such comprehensive mapping and making it publicly accessible.
Thirdly, a comparison of five post-Gorkha earthquake inventories and their susceptibility maps was conducted, revealing similarities in causative factors and map performance but variations in spatial patterns. Lastly, a rockfall inventory along two significant highways was developed as a landslide-classified inventory, and the rockfall susceptibility was evaluated. A segment-wise map with a 1-to-5 scale indicating low to high susceptibility was published for public use. This thesis proposes new approaches to landslide inventory sampling and earthquake-triggered landslide assessment. It provides publicly accessible databases for Central Nepal's slope unit map and rockfall susceptibility along the major highways. These findings can benefit researchers, planners, and policymakers seeking to enhance risk management practices by advancing landslide assessment, particularly for earthquake-induced landslides in Central Nepal.
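    The EM/GMM clustering that the thesis found most accurate can be illustrated with a toy one-dimensional, two-component Gaussian mixture fitted by Expectation-Maximization. The initialisation, data, and component count here are illustrative assumptions; the thesis applies the method to multi-attribute landslide inventory data:

```python
import math

def em_gmm_1d(xs, iters=100):
    """Two-component 1-D Gaussian mixture fitted by EM: the E-step assigns
    soft responsibilities, the M-step re-estimates weights, means, variances."""
    mu = [min(xs), max(xs)]  # crude initialisation at the data extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: responsibility-weighted parameter updates
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return mu

xs = [0.0, 0.2, -0.1, 0.1, 10.0, 10.2, 9.9, 10.1]
mu = em_gmm_1d(xs)
```

    On two well-separated groups, the fitted means converge to the group centres, which is the clustering behaviour exploited for sampling training data.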

    Hydrography90m: a new high-resolution global hydrographic dataset

    The geographic distribution of streams and rivers drives a multitude of patterns and processes in hydrology, geomorphology, geography, and ecology. Therefore, a hydrographic network that accurately delineates both small streams and large rivers, along with their topographic and topological properties, with equal precision would be indispensable in the earth sciences. Currently, available global hydrographies do not feature small headwater streams in great detail. However, these headwaters are vital because they are estimated to contribute to more than 70 % of overall stream length. We aimed to fill this gap by using the MERIT Hydro digital elevation model at 3 arcsec (∼90 m at the Equator) to derive a globally seamless, standardised hydrographic network, the “Hydrography90m”, with corresponding stream topographic and topological information. A central feature of the network is the minimal upstream contributing area, i.e. flow accumulation, of 0.05 km² (or 5 ha) to initiate a stream channel, which allowed us to extract headwater stream channels in great detail. By employing a suite of GRASS GIS hydrological modules, we calculated the range-wide upstream flow accumulation and flow direction to delineate a total of 1.6 million drainage basins and extracted globally a total of 726 million unique stream segments with their corresponding sub-catchments. In addition, we computed stream topographic variables comprising stream slope, gradient, length, and curvature attributes as well as stream topological variables to allow for network routing and various stream order classifications. We validated the spatial accuracy and flow accumulation of Hydrography90m against NHDPlus HR, an independent, national high-resolution hydrographic network dataset of the United States. Our validation shows that the newly developed Hydrography90m has the highest spatial precision and contains more headwater stream channels compared to three other global hydrographic datasets.
This comprehensive approach provides a vital and long-overdue baseline for assessing actual streamflow in headwaters and opens new research avenues for high-resolution studies of surface water worldwide. Hydrography90m thus offers significant potential to facilitate the assessment of freshwater quantity and quality, inundation risk, biodiversity, conservation, and resource management objectives in a globally comprehensive and standardised manner. The Hydrography90m layers are available at https://doi.org/10.18728/igb-fred-762.1 (Amatulli et al., 2022a), and while they can be used directly in standard GIS applications, we recommend the seamless integration with hydrological modules in open-source QGIS and GRASS GIS software to further customise the data and derive optimal utility from it.
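    The core computation behind stream initiation, flow accumulation over D8 flow directions, can be shown on a toy DEM. This is a minimal sketch of the general technique, not the GRASS GIS modules the authors used; the tiny grid and the cell-count accumulation (rather than an area in km²) are illustrative assumptions:

```python
def d8_flow_accumulation(dem):
    """D8 flow accumulation on a small DEM grid: each cell drains to its
    steepest-descent neighbour, and accumulation counts the cell itself
    plus all upstream cells. Thresholding this quantity (e.g. at an area
    equivalent to 0.05 km² in Hydrography90m) marks where channels begin."""
    rows, cols = len(dem), len(dem[0])
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    flow = {}  # cell -> steepest downhill neighbour, or None (pit/outlet)
    for r in range(rows):
        for c in range(cols):
            best, target = 0.0, None
            for dr, dc in nbrs:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    drop = (dem[r][c] - dem[rr][cc]) / (dr * dr + dc * dc) ** 0.5
                    if drop > best:
                        best, target = drop, (rr, cc)
            flow[(r, c)] = target
    # Accumulate from highest to lowest cell: upstream is always processed first.
    acc = {cell: 1 for cell in flow}
    for r, c in sorted(flow, key=lambda rc: -dem[rc[0]][rc[1]]):
        t = flow[(r, c)]
        if t is not None:
            acc[t] += acc[(r, c)]
    return acc

dem = [[3, 2, 1],
       [3, 2, 1],
       [3, 2, 1]]  # uniform eastward slope: every row drains to the east edge
acc = d8_flow_accumulation(dem)
```

    On the uniform eastward slope each eastern-edge cell accumulates its whole row, i.e. three cells.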

    MULTI-SCALE DYNAMIC PARTITIONING SYSTEM OF URBAN SPATIAL UNITS

    As the spatial structure of cities becomes increasingly complex and sustainable development goals are promoted, society places higher demands on the management and planning of cities. As the basic unit of urban analysis, the combination pattern and scale of urban spatial units are crucial for the rational management and planning of cities. However, existing urban analysis and management systems often adopt a prefabricated, fixed cell division, which makes it difficult to meet the need for high-precision, multi-scale analysis of urban information. Therefore, this paper proposes an interactive dynamic partitioning technique and designs a multi-scale dynamic partitioning system for urban spatial units (SUPS) to meet the diverse needs of urban management and planning. The system consists of a data management module, a spatial unit module, an integration module and a visualization module. It not only realises multi-scale dynamic partitioning of spatial units through interactive operation, but also obtains more detailed identification results by applying the multi-scale spatial units to the identification of urban functional areas, verifying the effectiveness and feasibility of interactive multi-scale dynamic partitioning and providing new technical support for fine-grained urban management.
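    One common way to obtain spatial units whose size adapts to data density is a quadtree: dense areas are subdivided into fine units while sparse areas keep coarse ones. The sketch below illustrates that general multi-scale idea only; it is not the SUPS partitioning algorithm, and the capacity rule and point data are assumptions:

```python
def quadtree_partition(points, bbox, capacity=2, depth=0, max_depth=6):
    """Recursively split a rectangular extent into four quadrants until
    each leaf holds at most `capacity` points: dense areas end up with
    fine spatial units, sparse areas with coarse ones."""
    x0, y0, x1, y1 = bbox
    inside = [p for p in points if x0 <= p[0] < x1 and y0 <= p[1] < y1]
    if len(inside) <= capacity or depth == max_depth:
        return [bbox]  # this extent becomes one spatial unit
    xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
    leaves = []
    for sub in [(x0, y0, xm, ym), (xm, y0, x1, ym),
                (x0, ym, xm, y1), (xm, ym, x1, y1)]:
        leaves += quadtree_partition(inside, sub, capacity, depth + 1, max_depth)
    return leaves

# Points concentrated in the lower-left corner of a 0..8 extent.
pts = [(0.5, 0.5), (1.0, 1.0), (1.5, 0.5), (0.5, 1.5), (7.0, 7.0)]
units = quadtree_partition(pts, (0.0, 0.0, 8.0, 8.0))
```

    The dense corner is cut into 1x1 units while the sparse quadrant containing a single point remains one 4x4 unit.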

    Airborne LiDAR for DEM generation: some critical issues

    Airborne LiDAR is one of the most effective and reliable means of terrain data collection. Using LiDAR data for DEM generation is becoming standard practice in spatially related fields. However, the effective processing of raw LiDAR data and the generation of an efficient, high-quality DEM remain big challenges. This paper reviews recent advances in airborne LiDAR systems and the use of LiDAR data for DEM generation, with special focus on LiDAR data filters, interpolation methods, DEM resolution, and LiDAR data reduction. Separating LiDAR points into ground and non-ground is the most critical and difficult step in DEM generation from LiDAR data. Commonly used and recently developed LiDAR filtering methods are presented. Interpolation methods and the choice of a suitable interpolator and DEM resolution for LiDAR DEM generation are discussed in detail. To reduce data redundancy and increase efficiency in terms of storage and manipulation, LiDAR data reduction is required in the process of DEM generation. Feature-specific elements such as breaklines contribute significantly to DEM quality. Therefore, data reduction should be conducted in such a way that critical elements are kept while less important elements are removed. Given the high-density characteristic of LiDAR data, breaklines can be directly extracted from LiDAR data. Extraction of breaklines and integration of the breaklines into DEM generation are presented.
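    The two steps the review treats as central, ground/non-ground separation followed by interpolation, can be sketched with the crudest possible versions of each: a per-cell minimum-elevation filter and inverse-distance-weighted (IDW) interpolation. Real filters (morphological, progressive TIN densification) are far more robust; the cell size, point cloud, and query location below are illustrative assumptions:

```python
def grid_minimum_filter(points, cell=1.0):
    """Crude ground filter: keep only the lowest return in each grid cell,
    discarding higher returns (vegetation, buildings) as non-ground."""
    lowest = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in lowest or z < lowest[key][2]:
            lowest[key] = (x, y, z)
    return list(lowest.values())

def idw(ground, qx, qy, power=2.0):
    """Inverse-distance-weighted interpolation of ground elevations at (qx, qy)."""
    num = den = 0.0
    for x, y, z in ground:
        d2 = (x - qx) ** 2 + (y - qy) ** 2
        if d2 == 0:
            return z  # query coincides with a ground point
        wgt = 1.0 / d2 ** (power / 2)
        num += wgt * z
        den += wgt
    return num / den

# Two cells of bare ground at z=10, plus one 15 m shrub return in the first cell.
pts = [(0.2, 0.2, 10.0), (0.3, 0.3, 15.0), (1.5, 0.5, 10.0)]
ground = grid_minimum_filter(pts)
z = idw(ground, 0.75, 0.35)
```

    The shrub return is rejected by the filter, so the interpolated DEM surface between the two ground points sits at the true terrain height.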

    Google Earth Engine cloud computing platform for remote sensing big data applications: a comprehensive review

    Remote sensing (RS) systems have been collecting massive volumes of data for decades, and managing and analyzing these data is not practical using common software packages and desktop computing resources. In this regard, Google has developed a cloud computing platform, called Google Earth Engine (GEE), to effectively address the challenges of big data analysis. In particular, this platform facilitates processing big geo data over large areas and monitoring the environment for long periods of time. Although the platform was launched in 2010 and has proved its high potential for different applications, it was not fully investigated and utilized for RS applications until recent years. Therefore, this study aims to comprehensively explore different aspects of the GEE platform, including its datasets, functions, advantages/limitations, and various applications. For this purpose, 450 journal articles published in 150 journals between January 2010 and May 2020 were studied. It was observed that Landsat and Sentinel datasets were extensively utilized by GEE users. Moreover, supervised machine learning algorithms, such as Random Forest, were more widely applied to image classification tasks. GEE has also been employed in a broad range of applications, such as land cover/land use classification, hydrology, urban planning, natural disasters, climate analyses, and image processing. It was generally observed that the number of GEE publications has increased significantly during the past few years, and it is expected that GEE will be utilized by more users from different fields to resolve their big data processing challenges.