
    Sinkhole susceptibility mapping: A comparison between Bayes-based machine learning algorithms

    Land degradation has been recognized as one of the most adverse environmental impacts of the last century. The occurrence of sinkholes is increasing dramatically in many regions worldwide, contributing to land degradation. The rise in sinkhole frequency is largely due to human-induced hydrological alterations that favour dissolution and subsidence processes. Mitigating the detrimental impacts associated with sinkholes requires understanding different aspects of this phenomenon, such as the controlling factors and the spatial distribution patterns. This research illustrates the development and validation of sinkhole susceptibility models in Hamadan Province, Iran, where a large number of sinkholes are occurring under poorly understood circumstances. Several susceptibility models were developed with a training sample of sinkholes, a number of conditioning factors, and four different statistical approaches: naïve Bayes, Bayes net (BN), logistic regression, and Bayesian logistic regression. Ten conditioning factors were initially considered. Factors with negligible contribution to the quality of predictions, according to the information gain ratio technique, were discarded for the development of the final models. The validation of the susceptibility models, performed using different statistical indices and receiver operating characteristic (ROC) curves, revealed that the BN model has the highest prediction capability in the study area. This model provides reliable predictions of the future distribution of sinkholes, which can be used by watershed and land use managers for designing hazard and land-degradation mitigation plans.
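    As an illustrative sketch of this kind of model comparison (not the paper's actual implementation), the Python snippet below trains two of the four listed approaches, naïve Bayes and logistic regression, on synthetic stand-in data and compares them by ROC AUC; the Bayes net and Bayesian logistic regression models would require additional libraries. All data and values here are assumptions for demonstration.

```python
# Hedged sketch: compare two of the listed susceptibility models
# by ROC AUC. All data are synthetic placeholders, not the study's data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Stand-in data: rows = terrain cells, 10 conditioning factors as in the
# abstract; y = sinkhole presence/absence. (The paper additionally
# filtered factors by information gain ratio before fitting.)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in {"naive Bayes": GaussianNB(),
                    "logistic regression": LogisticRegression(max_iter=1000)}.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```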

    Investigating the use of multi-label classification methods for the purpose of classifying electromyographic signals

    The type of pattern recognition methods used for controlling modern prosthetics, referred to here as single-label classification methods, restricts users to a small number of movements. One prominent reason for this is that the accuracy of these classification methods decreases as the number of allowed movements is increased. This work presents a possible solution to this problem by investigating the use of multi-label classification for classifying electromyographic signals. This was accomplished by recording, processing, and classifying electromyographic data. To compare the performance of multi-label methods with that of single-label methods, four classification methods from each category were selected. Both categories were then tested on their ability to classify finger flexion movements. The most commonly tested movements were thumb, index, long, and ring finger flexions, in addition to all possible combinations of these four fingers. The two categories were also tested on their ability to learn finger combination movements when only individual finger movements were used as training data. The results show that the tested single- and multi-label methods obtain similar classification accuracy when the training data consist of both individual finger movements and finger combination movements. The results also show that none of the tested single-label methods, and only one of the tested multi-label methods (multi-label RBF neural networks), manages to learn finger combination movements when trained on only individual finger movements.

    Using multi-label classification methods to classify finger movements for hand prosthesis control: Losing a limb is a traumatic experience that greatly impacts a person's quality of life. Prosthetic devices were invented to help people who have suffered limb loss; the purpose of a prosthetic device is to mimic the function of the missing limb.
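    A minimal sketch of the multi-label idea described above, assuming a binary-relevance setup in scikit-learn rather than the thesis's multi-label RBF neural networks: each finger is one binary label, so a combined movement is simply several labels active at once. All data and dimensions are synthetic placeholders.

```python
# Hedged sketch of multi-label classification via binary relevance (a
# stand-in for the thesis's multi-label RBF networks). Data are synthetic.
import numpy as np
from sklearn.multioutput import MultiOutputClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_windows, n_features = 500, 32      # e.g. RMS/MAV features per EMG channel
X = rng.normal(size=(n_windows, n_features))
# Label columns = thumb, index, long, ring; 1 = that finger is flexed,
# so a combination movement is several 1s in the same row.
Y = rng.integers(0, 2, size=(n_windows, 4))

clf = MultiOutputClassifier(LogisticRegression(max_iter=1000))
clf.fit(X, Y)

# One prediction yields a label per finger, so combinations not seen in
# training can in principle still be expressed at test time.
print(clf.predict(X[:1]))            # e.g. [[1 0 0 1]] = thumb + ring
```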

    Landslide susceptibility mapping using remote sensing data and geographic information system-based algorithms

    Whether they occur due to natural triggers or human activities, landslides lead to loss of life and damage to property, impacting infrastructure, road networks and buildings. A Landslide Susceptibility Map (LSM) provides policy and decision makers with valuable information. This study aims to detect landslide locations using Sentinel-1 data, the only freely available online radar imagery, and to map areas prone to landslides using a novel AB-ADTree algorithm in Cameron Highlands, Pahang, Malaysia. A total of 152 landslide locations were detected by integrating the Interferometric Synthetic Aperture Radar (InSAR) technique, Google Earth (GE) images and extensive field surveys. Of these data, 80% were employed for training the machine learning algorithms and the remaining 20% for validation. Seventeen triggering and conditioning factors, namely slope, aspect, elevation, distance to road, distance to river, proximity to fault, road density, river density, Normalized Difference Vegetation Index (NDVI), rainfall, land cover, lithology, soil type, curvature, profile curvature, Stream Power Index (SPI) and Topographic Wetness Index (TWI), were extracted from satellite imagery, a digital elevation model (DEM), and geological and soil maps. These factors were used to generate landslide susceptibility maps with a Logistic Regression (LR) model, a Logistic Model Tree (LMT), Random Forest (RF), an Alternating Decision Tree (ADTree), Adaptive Boosting (AdaBoost) and a novel hybrid of the ADTree and AdaBoost models, the AB-ADTree model. Validation was based on the area under the ROC curve (AUC) and the statistical measures of Positive Predictive Value (PPV), Negative Predictive Value (NPV), sensitivity, specificity, accuracy and Root Mean Square Error (RMSE). The results showed AUCs of 90%, 92%, 88%, 59%, 96% and 94% for the LR, LMT, RF, ADTree, AdaBoost and AB-ADTree algorithms, respectively. Non-parametric Friedman and Wilcoxon tests were also applied to assess the models' performance; the findings revealed that ADTree is inferior to the other models used in this study. Using a handheld Global Positioning System (GPS) receiver, field validation was performed for almost 20% (30 locations) of the detected landslide locations, and the results confirmed that these locations were correctly detected. In conclusion, the results of this study are applicable to hazard mitigation and regional planning.
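    The following is a rough, hedged analogue of the boosting component only: scikit-learn has no ADTree implementation, so a shallow decision tree stands in as the weak learner inside AdaBoost, evaluated with the same AUC/sensitivity/specificity measures the study reports. The data, the informative factor and the balanced non-landslide sample are synthetic assumptions.

```python
# Rough analogue of the boosting step: a shallow decision tree stands in
# for ADTree as the weak learner in AdaBoost. Data are synthetic.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(2)
X = rng.normal(size=(304, 17))       # 152 slide + 152 non-slide cells,
y = np.repeat([1, 0], 152)           # 17 conditioning factors
X[y == 1, 0] += 1.0                  # make one factor (say slope) informative

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=2)   # 80/20 as in the study

model = AdaBoostClassifier(DecisionTreeClassifier(max_depth=2), n_estimators=100)
model.fit(X_tr, y_tr)

auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
print(f"AUC={auc:.2f}  sensitivity={tp/(tp+fn):.2f}  specificity={tn/(tn+fp):.2f}")
```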

    Application of Satellite Remote Sensing to Water Quality and Pathogenic Bacteria Prediction in the Chesapeake Bay

    The Chesapeake Bay is home to an extensive in situ sampling campaign that has provided water quality measurements over multiple decades, aiding in the detection and regulation of environmental conditions that affect aquatic life, public health, and local economies. However, the current bi-monthly sampling can lack the temporal and spatial coverage needed for monitoring and modeling dynamic estuarine systems. While the time and cost of obtaining additional in situ samples can exceed available resources, satellite remote sensing has the potential to provide data at this higher temporal and spatial resolution. The objective of this dissertation is to investigate the use of satellite remote sensing in the Chesapeake Bay for both water quality monitoring and the prediction of a naturally occurring pathogenic bacterium, Vibrio parahaemolyticus, a leading cause of foodborne illness. The dissertation does this by exploring the use of multispectral information to improve satellite-derived total suspended solids concentrations and the potential for remotely sensed water quality products to predict V. parahaemolyticus in the Chesapeake Bay. In addition, the dissertation uses the application of remote sensing to V. parahaemolyticus prediction as a case study to present a prospective tool for communicating predictive model uncertainty to environmental management decision-makers and end-users. The work in this dissertation provides insights and recommendations that can aid the future development of operational models for water quality parameters or bacterial pathogens that incorporate remotely sensed data. As the effects of poor water quality are better understood and the incidence of Vibrio illness increases, improved operational models and uncertainty communication will become progressively more important for protecting public and ecosystem health.
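    As a loose illustration of the general approach (not the dissertation's actual algorithm), the sketch below regresses a water quality parameter, total suspended solids, on multispectral reflectance bands and reports cross-validated R²; the band count, coefficients and noise model are all invented for the example.

```python
# Loose illustration only: regress total suspended solids (TSS) on
# multispectral reflectance bands; all values are invented for the demo.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 200
Rrs = rng.uniform(0.001, 0.02, size=(n, 4))        # reflectance, 4 bands
# Assume the red band drives TSS, with multiplicative noise.
tss = 400 * Rrs[:, 2] * np.exp(rng.normal(0, 0.1, n))

scores = cross_val_score(LinearRegression(), Rrs, tss, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```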

    Recent advances in low-cost particulate matter sensor: calibration and application

    Particulate matter (PM) has been monitored routinely due to its negative effects on human health and atmospheric visibility. Standard gravimetric measurements and current commercial instruments for field measurements are still expensive and laborious. The high cost of conventional instruments typically limits the number of monitoring sites, which in turn undermines the accuracy of real-time mapping of sources and hotspots of air pollutants owing to insufficient spatial resolution. The new trends in PM concentration measurement are personalized portable devices for individual customers and the networking of large quantities of sensors to meet the demands of Big Data. Therefore, low-cost PM sensors have been studied extensively due to their price advantage and compact size, and they have been considered a good supplement to current monitoring sites for high spatio-temporal PM mapping. However, a major concern is the accuracy of these low-cost PM sensors. Multiple types of low-cost PM sensors and monitors were calibrated against reference instruments. All these units demonstrated high linearity against the reference instruments, with high R² values for different types of aerosols over a wide range of concentration levels. The question of whether low-cost PM monitors can be considered a substitute for conventional instruments is discussed, together with how to qualitatively describe the improvement in data quality due to calibration. A limitation of these sensors and monitors is that their outputs depend strongly on particle composition and size, resulting in up to a tenfold difference in sensor outputs. Optical characterization of low-cost PM sensors (ensemble measurement) was conducted by combining experimental results with Mie scattering theory, and the reasons for their dependence on PM composition and size distribution were studied. To improve the accuracy of mass concentration estimation, an expression for K as a function of the geometric mean diameter, geometric standard deviation, and refractive index is proposed. To eliminate the influence of the refractive index, we propose a new design of multi-wavelength sensor with a robust data inversion routine that estimates the PM size distribution and refractive index simultaneously. The utility of the networked system with improved sensitivity was demonstrated by deploying it in a woodworking shop. Data collected by the networked system were used to construct spatio-temporal PM concentration distributions using ordinary kriging and an artificial neural network model to elucidate particle generation and ventilation processes. Furthermore, for the outdoor environment, data reported by low-cost sensors were compared against satellite data. The remote sensing data could provide a daily calibration of these low-cost sensors; conversely, low-cost PM sensors could provide better accuracy in characterizing the microenvironment.
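    A minimal sketch of the calibration step described above, assuming a simple linear correction of raw low-cost readings against a collocated reference instrument; the bias, noise level and concentration range are simulated, not measured values.

```python
# Hedged sketch of sensor calibration: fit a linear correction of raw
# low-cost readings against a collocated reference; data are simulated.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(4)
reference = rng.uniform(5, 150, size=300)            # reference PM2.5, ug/m^3
raw = 0.6 * reference + 4 + rng.normal(0, 3, 300)    # biased, noisy sensor

cal = LinearRegression().fit(raw.reshape(-1, 1), reference)
corrected = cal.predict(raw.reshape(-1, 1))
print(f"slope={cal.coef_[0]:.2f}  intercept={cal.intercept_:.2f}  "
      f"R^2={r2_score(reference, corrected):.3f}")
```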

    Spatio-temporal forecasting of network data

    In the digital age, data are collected in unprecedented volumes on a plethora of networks. These data provide opportunities to develop our understanding of network processes by allowing the data to drive the methods, revealing new and often unexpected insights. To date, there has been extensive research into the structure and function of complex networks, but there is scope for improvement in modelling the spatio-temporal evolution of network processes in order to forecast future conditions. This thesis focusses on forecasting using data collected on road networks. Road traffic congestion is a serious and persistent problem in most major cities around the world, and it is the task of researchers and traffic engineers to make use of voluminous traffic data to help alleviate congestion. Recently, spatio-temporal models have been applied to traffic data, showing improvements over time series methods. Although progress has been made, challenges remain. Firstly, most existing methods perform well under typical conditions, but less well under atypical conditions. Secondly, existing spatio-temporal models have been applied to traffic data with high spatial resolution, and there has been little research into how to incorporate spatial information on spatially sparse sensor networks, where the dependency relationships between locations are uncertain. Thirdly, traffic data are characterised by high rates of missing values, and existing methods are generally poorly equipped to deal with this in a real-time setting. In this thesis, a local online kernel ridge regression model is developed that addresses these three issues, with application to forecasting travel times collected by automatic number plate recognition on London's road network. The model parameters can vary spatially and temporally, allowing it to better model the time-varying characteristics of traffic data and to deal with abnormal traffic situations. Methods are defined for linking the spatially sparse sensor network to the physical road network, providing an improved representation of the spatial relationships between sensor locations. The incorporation of the spatio-temporal neighbourhood enables the model to forecast effectively under missing data. The proposed model outperforms a range of benchmark models at forecasting under normal conditions and under various missing data scenarios.
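    The snippet below is an illustrative sketch of the core estimator, not the thesis's full spatio-temporal model: kernel ridge regression refitted on a rolling window of recent lagged observations, so the fitted parameters vary over time. The series, lag order, window length and kernel bandwidth are assumptions chosen for the synthetic data.

```python
# Illustrative sketch (not the thesis's model): kernel ridge regression
# refitted on a rolling window of lagged travel times, so the fitted
# parameters change over time. Series and hyperparameters are synthetic.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(5)
t = np.arange(600)
# Synthetic travel-time series with a daily cycle (96 x 15-min steps).
travel_time = 300 + 60 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 5, 600)

lags, window = 4, 200

def make_xy(series, lags):
    """Build lagged feature rows X[t] = (y[t-lags], ..., y[t-1]) -> y[t]."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    return X, series[lags:]

X, y = make_xy(travel_time, lags)
# Fit only on the most recent `window` samples (local, online-style refit);
# gamma is a kernel bandwidth chosen for this synthetic scale.
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-4)
model.fit(X[-window:], y[-window:])

x_next = travel_time[-lags:].reshape(1, -1)   # most recent lagged values
print("next-step forecast:", model.predict(x_next)[0])
```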