
    Short-Term Forecasting of Passenger Demand under On-Demand Ride Services: A Spatio-Temporal Deep Learning Approach

    Short-term passenger demand forecasting is of great importance to on-demand ride service platforms, which can incentivize vacant cars to move from over-supplied regions to over-demanded regions. Spatial, temporal, and exogenous dependencies must be considered simultaneously, however, which makes short-term passenger demand forecasting challenging. We propose a novel deep learning (DL) approach, named the fusion convolutional long short-term memory network (FCL-Net), to address these three dependencies within one end-to-end learning architecture. The model is stacked and fused from multiple convolutional long short-term memory (LSTM) layers, standard LSTM layers, and convolutional layers. The fusion of convolutional techniques with the LSTM network enables the proposed DL approach to better capture the spatio-temporal characteristics and correlations of the explanatory variables. A tailored, spatially aggregated random forest is employed to rank the importance of the explanatory variables, and the ranking is then used for feature selection. The proposed DL approach is applied to short-term forecasting of passenger demand on an on-demand ride service platform in Hangzhou, China. Experimental results, validated on real-world data provided by DiDi Chuxing, show that the FCL-Net achieves better predictive performance than traditional approaches, including both classical time-series prediction models and neural-network-based algorithms (e.g., the artificial neural network and LSTM). This paper is one of the first DL studies to forecast the short-term passenger demand of an on-demand ride service platform by examining spatio-temporal correlations. Comment: 39 pages, 10 figures
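The random-forest feature-ranking step described above can be illustrated with a plain random forest (a simplified stand-in for the paper's spatially aggregated variant; the feature names and synthetic data below are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical explanatory variables per region/time step:
# demand lags, weather, and a time-of-day encoding.
rng = np.random.default_rng(0)
feature_names = ["demand_lag1", "demand_lag2", "temperature", "hour_sin"]
n_samples = 500
X = rng.normal(size=(n_samples, len(feature_names)))
# Synthetic demand driven mostly by the first lag, weakly by temperature.
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=n_samples)

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
# Rank features by importance, then keep the top-ranked ones for the DL model.
ranking = sorted(zip(feature_names, rf.feature_importances_),
                 key=lambda item: -item[1])
selected = [name for name, _ in ranking[:2]]
```

In this sketch the impurity-based importances recover the two truly informative inputs, mirroring how the ranking is used for feature selection before training the FCL-Net.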

    Development of a machine learning based methodology for bridge health monitoring

    Thesis in the form of a compendium of publications. In recent years the scientific community has been developing new techniques in structural health monitoring (SHM) to identify damage in civil structures, especially in bridges. Bridge health monitoring (BHM) systems serve to reduce overall life-cycle maintenance costs for bridges, as their main objective is to prevent catastrophic failures and damage. In BHM using dynamic data, several problems arise in the post-processing of the vibration signals: (i) modal-based dynamic features such as natural frequencies, mode shapes, and damping are limited in locating damage, since they are based on the global response of the structure; (ii) noise is present in the measured vibration responses; (iii) existing algorithms for damage feature extraction are often applied inadequately because the non-linearity and non-stationarity of the recorded signals are neglected; (iv) environmental and operational conditions can also generate false damage detections in bridges; (v) traditional algorithms have drawbacks in processing the large amounts of data obtained from BHM. This thesis proposes new vibration-based parameters and methods focused on damage detection, localization, and quantification, using a robust mixed methodology that combines signal processing and machine learning methods to solve the identified problems. The increasing volume of bridge monitoring data makes it worthwhile to study the ability of advanced tools and systems to extract useful information from dynamic and static variables. In the field of Machine Learning (ML) and Artificial Intelligence (AI), powerful algorithms have been developed to handle problems where the amount of data is much larger (big data). The possibilities of ML techniques (unsupervised algorithms) are analyzed here for bridges, taking into account both operational and environmental conditions.
A critical literature review was performed, together with a deep study of the accuracy and performance of a set of algorithms for detecting damage in three real bridges and one numerical model. In the literature on vibration-based damage detection, several state-of-the-art methods do not consider the nature of the data and the characteristics of the applied excitation (possible non-linearity, non-stationarity, presence or absence of environmental and/or operational effects) or the noise level of the sensors. Moreover, most research uses modal-based damage features, which have some limitations. The majority of methods perform poor data normalization and do not properly account for operational and environmental variability. Likewise, the huge amount of data recorded requires automatic procedures with a proven capacity to reduce the possibility of false alarms. On the other hand, many investigations are limited because only numerical or laboratory cases are studied. Therefore, a methodology combining several algorithms is proposed to avoid these shortcomings. The conclusions show a robust methodology based on ML algorithms capable of detecting, localizing, and quantifying damage. It allows engineers to verify bridges and anticipate significant structural damage when it occurs. Moreover, the proposed non-modal parameters show their feasibility as damage features using ambient and forced vibrations. The Hilbert-Huang Transform (HHT), in conjunction with the Marginal Hilbert Spectrum and the Instantaneous Phase Difference, shows a great capability to analyze the non-linear and non-stationary response signals for damage identification under operational conditions.
The proposed strategy combines algorithms for signal processing (ICEEMDAN and HHT) and ML (k-means) to conduct damage detection and localization in bridges by using the traffic-induced vibration data in real-time operation.
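The combination of Hilbert-based signal features with k-means clustering can be sketched as follows (a minimal illustration with synthetic vibration records; the ICEEMDAN pre-decomposition step is omitted, and the "healthy"/"damaged" frequencies are invented stand-ins, not values from the thesis):

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.cluster import KMeans

# Synthetic vibration records: a "healthy" class at 30 Hz and a
# "damaged" class at 22 Hz (hypothetical frequency shift from damage).
fs = 1000
t = np.arange(0, 1, 1 / fs)

def hilbert_features(freq_hz, seed):
    rng = np.random.default_rng(seed)
    x = np.sin(2 * np.pi * freq_hz * t) + 0.05 * rng.normal(size=t.size)
    analytic = hilbert(x)                                 # analytic signal
    inst_phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)    # instantaneous frequency
    return [inst_freq.mean(), np.abs(analytic).mean()]    # freq + envelope features

X = np.array([hilbert_features(30, s) for s in range(10)]
             + [hilbert_features(22, s) for s in range(10)])
# Unsupervised damage detection: cluster the feature vectors.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```

Because the instantaneous-frequency feature separates the two conditions, the unsupervised clustering groups healthy and damaged records without any labels, which is the core idea behind the thesis's ML stage.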

    Adaptive Signal Decomposition Methods for Vibration Signals of Rotating Machinery

    Vibration-based condition monitoring and fault diagnosis are becoming more common in industry as ways to increase machine availability and reliability. Considerable research effort has recently been directed towards the development of adaptive signal processing methods for fault diagnosis. Two adaptive signal decomposition methods, the empirical mode decomposition (EMD) and the local mean decomposition (LMD), are widely used. This chapter summarizes the recent developments, mostly based on the authors' own work, and aims to provide a valuable reference for readers on the processing and analysis of vibration signals collected from rotating machinery.
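The core of EMD is the sifting step: interpolate envelopes through the local extrema and subtract their mean until an intrinsic mode function (IMF) remains. A minimal sketch (edge handling and stopping criteria are simplified relative to full EMD implementations):

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(x, t):
    """One EMD sifting step: subtract the mean of the upper and lower envelopes."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    upper = CubicSpline(t[maxima], x[maxima])(t)   # envelope through the maxima
    lower = CubicSpline(t[minima], x[minima])(t)   # envelope through the minima
    return x - (upper + lower) / 2.0

fs = 1000
t = np.arange(0, 1, 1 / fs)
# A fast 25 Hz tone riding on a slow 3 Hz tone, as in a machinery signal
# with a fault-related component superimposed on a slow trend.
x = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)

imf = x.copy()
for _ in range(5):            # a few sifting iterations extract the first IMF
    imf = sift_once(imf, t)
```

After sifting, the first IMF is dominated by the fastest oscillation (here the 25 Hz tone), while the slow component is left for subsequent IMFs; this is what makes EMD "adaptive", since no basis is fixed in advance.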

    A Hybrid Approach Based on Variational Mode Decomposition for Analyzing and Predicting Urban Travel Speed

    Predicting travel speeds on urban road networks is challenging due to uncertainty stemming from travel demand, geometric conditions, traffic signals, and other exogenous factors. This uncertainty appears as non-linearity, non-stationarity, and volatility in traffic data, and it also creates spatio-temporal heterogeneity of link travel speed through interaction with neighboring links. In this study, we propose a hybrid model using variational mode decomposition (VMD) to investigate and mitigate the uncertainty of urban travel speeds. VMD divides the travel speed data into orthogonal and oscillatory sub-signals, called modes. The regular components are extracted as the low-frequency modes, while the irregular components representing uncertainty are transformed into a combination of modes that is more predictable than the original signal. For prediction, VMD decomposes the travel speed data into modes, and these modes are predicted and summed to produce the predicted travel speed. Evaluation on urban road networks shows that the proposed hybrid model outperforms the benchmark models both in congested and in overall conditions. The performance improvement increases significantly on specific link-days, which are generally hard to predict. To explain the large variance of prediction performance across links and days, a correlation analysis between the properties of the modes and the performance of the model is conducted. The results show that the more of the non-daily variance is explained by the modes, the easier the speed is to predict. Based on these results, an interpretation of the correlation analysis and directions for future research are discussed.
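The decompose-predict-recombine pipeline can be sketched with a toy example. Here a moving average stands in for VMD purely to illustrate the pipeline shape (the data and forecast rules are synthetic, not the paper's models):

```python
import numpy as np

# Synthetic link speed: a daily-like cycle plus unpredictable noise.
rng = np.random.default_rng(1)
t = np.arange(400)
speed = 40 + 10 * np.sin(2 * np.pi * t / 96) + rng.normal(scale=2.0, size=t.size)

def moving_average(x, w=8):
    pad = np.pad(x, (w // 2, w - w // 2 - 1), mode="edge")
    return np.convolve(pad, np.ones(w) / w, mode="valid")

trend = moving_average(speed)     # regular, low-frequency component
residual = speed - trend          # irregular, high-frequency component

# One-step-ahead forecast per component: persistence for the trend,
# the historical mean (~0) for the residual; then sum the forecasts.
speed_hat = trend[:-1] + residual.mean()
mae_decomposed = np.mean(np.abs(speed_hat - speed[1:]))
mae_naive = np.mean(np.abs(speed[:-1] - speed[1:]))   # plain persistence baseline
```

Forecasting each component with a rule suited to it, then recombining, beats forecasting the raw noisy series directly; the paper applies the same idea with VMD modes and proper predictors per mode.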

    Automatic analysis and classification of cardiac acoustic signals for long term monitoring

    Objective: Cardiovascular diseases are the leading cause of death worldwide resulting in over 17.9 million deaths each year. Most of these diseases are preventable and treatable, but their progression and outcomes are significantly more positive with early-stage diagnosis and proper disease management. Among the approaches available to assist with the task of early-stage diagnosis and management of cardiac conditions, automatic analysis of auscultatory recordings is one of the most promising ones, since it could be particularly suitable for ambulatory/wearable monitoring. Thus, proper investigation of abnormalities present in cardiac acoustic signals can provide vital clinical information to assist long term monitoring. Cardiac acoustic signals, however, are very susceptible to noise and artifacts, and their characteristics vary largely with the recording conditions which makes the analysis challenging. Additionally, there are challenges in the steps used for automatic analysis and classification of cardiac acoustic signals. Broadly, these steps are the segmentation, feature extraction and subsequent classification of recorded signals using selected features. This thesis presents approaches using novel features with the aim to assist the automatic early-stage detection of cardiovascular diseases with improved performance, using cardiac acoustic signals collected in real-world conditions. Methods: Cardiac auscultatory recordings were studied to identify potential features to help in the classification of recordings from subjects with and without cardiac diseases. The diseases considered in this study for the identification of the symptoms and characteristics are the valvular heart diseases due to stenosis and regurgitation, atrial fibrillation, and splitting of fundamental heart sounds leading to additional lub/dub sounds in the systole or diastole interval of a cardiac cycle. 
The localisation of cardiac sounds of interest was performed using adaptive wavelet-based filtering in combination with the Shannon energy envelope and prior information about the fundamental heart sounds. This is a prerequisite step for the feature extraction and subsequent classification of recordings, leading to a more precise diagnosis. Localised segments of S1 and S2 sounds, and artifacts, were used to extract a set of perceptual and statistical features using the wavelet transform, homomorphic filtering, the Hilbert transform, and mel-scale filtering, which were then used to train an ensemble classifier to interpret S1 and S2 sounds. Once the sound peaks of interest were identified, features extracted from these peaks, together with the features used for the identification of S1 and S2 sounds, were used to develop an algorithm to classify the recorded signals. Overall, 99 features were extracted and statistically analysed using neighborhood component analysis (NCA) to identify the features with the greatest ability to classify recordings. The selected features were then used to train an ensemble classifier to classify abnormal recordings, and its hyperparameters were optimized to evaluate the performance of the trained classifier. Thus, a machine learning-based approach is presented for the automatic identification and classification of S1 and S2 sounds, and of normal and abnormal recordings, in real-world noisy recordings using a novel feature set. The validity of the proposed algorithm was tested using acoustic signals recorded in real-world, non-controlled environments at four auscultation sites (aortic valve, tricuspid valve, mitral valve, and pulmonary valve) from subjects with and without cardiac diseases, together with recordings from three large public databases. The performance of the methodology was evaluated in terms of classification accuracy (CA), sensitivity (SE), precision (P+), and F1 score.
Results: This thesis proposes four different algorithms to automatically classify, from cardiac acoustic signals: fundamental heart sounds (S1 and S2); recordings with normal fundamental sounds versus abnormal additional lub/dub sounds; normal versus abnormal recordings; and recordings with heart valve disorders, namely mitral stenosis (MS), mitral regurgitation (MR), mitral valve prolapse (MVP), aortic stenosis (AS), and murmurs. The results obtained from these algorithms were as follows:
• The algorithm to classify S1 and S2 sounds achieved an average SE of 91.59% and 89.78%, and an F1 score of 90.65% and 89.42%, in classifying S1 and S2, respectively. 87 features were extracted and statistically studied to identify the top 14 features with the best capability to classify S1, S2, and artifacts. The analysis showed that the most relevant features were those extracted using the Maximal Overlap Discrete Wavelet Transform (MODWT) and the Hilbert transform.
• The algorithm to classify normal fundamental heart sounds versus abnormal additional lub/dub sounds in the systole or diastole intervals of a cardiac cycle achieved an average SE of 89.15%, P+ of 89.71%, F1 of 89.41%, and CA of 95.11% on the test dataset from the PASCAL database. The top 10 features with the highest weights for classifying these recordings were also identified.
• Normal versus abnormal classification of recordings using the proposed algorithm achieved a mean CA of 94.172% and SE of 92.38% in classifying recordings from the different databases. Among the top 10 acoustic features identified, the deterministic energy of the sound peaks of interest and the instantaneous frequency extracted using the Hilbert-Huang transform achieved the highest weights.
• The machine learning-based approach proposed to classify recordings of heart valve disorders (AS, MS, MR, and MVP) achieved an average CA of 98.26% and SE of 95.83%. 99 acoustic features were extracted and their ability to differentiate these abnormalities was examined using weights obtained from neighborhood component analysis (NCA). The top 10 features with the greatest ability to classify these abnormalities across recordings from the different databases were also identified.
The achieved results demonstrate the ability of the algorithms to automatically identify and classify cardiac sounds. This work provides the basis for measuring many clinically useful attributes of cardiac acoustic signals and can potentially help in monitoring overall cardiac health over longer durations. The work presented in this thesis is the first of its kind to validate its results using both normal and pathological cardiac acoustic signals, recorded for a long continuous duration of 5 minutes at four different auscultation sites in non-controlled, real-world conditions.
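The Shannon energy envelope used for localising S1/S2 can be sketched on a toy phonocardiogram (the burst positions, widths, and window length below are illustrative assumptions, not the thesis's exact parameters):

```python
import numpy as np

def shannon_energy_envelope(x, fs, win_ms=20):
    """Smoothed Shannon energy, which emphasises mid-amplitude sound bursts."""
    x = x / (np.max(np.abs(x)) + 1e-12)           # normalise amplitude to [-1, 1]
    se = -(x ** 2) * np.log(x ** 2 + 1e-12)       # Shannon energy per sample
    w = max(1, int(fs * win_ms / 1000))
    return np.convolve(se, np.ones(w) / w, mode="same")  # moving-average smoothing

fs = 2000
t = np.arange(0, 1, 1 / fs)
# Toy phonocardiogram: two tone bursts standing in for S1 (0.1 s) and S2 (0.4 s).
x = np.zeros_like(t)
for center in (0.1, 0.4):
    burst = np.exp(-((t - center) ** 2) / (2 * 0.01 ** 2))
    x += burst * np.sin(2 * np.pi * 60 * t)

env = shannon_energy_envelope(x, fs)
peak = t[np.argmax(env)]   # strongest envelope peak lands on one of the bursts
```

The envelope suppresses low-amplitude noise while not letting the loudest samples dominate, so simple peak picking on it locates candidate S1/S2 segments for the feature-extraction stage.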

    Hybrid Advanced Optimization Methods with Evolutionary Computation Techniques in Energy Forecasting

    More accurate and precise energy demand forecasts are required when energy decisions are made in a competitive environment. Particularly in the big data era, forecasting models are often built from complex combinations of functions, and energy data are complicated, exhibiting seasonality, cyclicity, fluctuation, dynamic non-linearity, and so on. Such models have led to an over-reliance on informal judgment, and to higher expenses, when the characteristics and patterns of the data cannot be determined. Hybridizing optimization methods with superior evolutionary algorithms can provide important improvements via good parameter determination in the optimization process, which is of great assistance to energy decision-makers. This book aimed to attract researchers with an interest in the research areas described above. Specifically, it sought contributions to the development of hybrid optimization methods (e.g., quadratic programming techniques, chaotic mapping, fuzzy inference theory, quantum computing, etc.) combined with advanced algorithms (e.g., genetic algorithms, ant colony optimization, the particle swarm optimization algorithm, etc.) that have superior capabilities over traditional optimization approaches in overcoming embedded drawbacks, and the application of these advanced hybrid approaches to significantly improve forecasting accuracy.

    Signal Processing Using Non-invasive Physiological Sensors

    This book concerns non-invasive biomedical sensors for monitoring physiological parameters of the human body, for potential future therapies and healthcare solutions. Today, a critical factor in providing a cost-effective healthcare system is improving patients' quality of life and mobility, which can be achieved by developing non-invasive sensor systems that can be deployed at the point of care, used at home, or integrated into wearable devices for long-term data collection. Another factor that plays an integral part in a cost-effective healthcare system is the signal processing of the data recorded with non-invasive biomedical sensors. In this book, we aimed to attract researchers interested in the application of signal processing methods to different biomedical signals, such as the electroencephalogram (EEG), electromyogram (EMG), functional near-infrared spectroscopy (fNIRS), electrocardiogram (ECG), galvanic skin response, pulse oximetry, and photoplethysmogram (PPG). We encouraged new signal processing methods, as well as novel applications of existing methods to physiological signals, to help healthcare providers make better decisions.
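A common first processing step for such physiological signals is band-pass filtering to remove baseline wander and high-frequency noise. The sketch below is illustrative (the pulse frequency, noise terms, and filter band are assumptions, not any specific method from the book):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                                   # typical wearable sampling rate
t = np.arange(0, 10, 1 / fs)
clean = np.sin(2 * np.pi * 1.2 * t)        # ~72 bpm pulse wave stand-in
# Add baseline wander (0.1 Hz) and mains-like interference (50 Hz).
noisy = clean + 0.5 * np.sin(2 * np.pi * 0.1 * t) \
              + 0.2 * np.sin(2 * np.pi * 50 * t)

# Zero-phase band-pass: keep 0.5-8 Hz, where the pulse information lives.
b, a = butter(3, [0.5, 8], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, noisy)
```

Zero-phase filtering (`filtfilt`) matters for physiological signals because it preserves the timing of waveform features, which downstream detectors and classifiers rely on.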

    Automated mass spectrometry-based metabolomics data processing by blind source separation methods

    One of the major bottlenecks in metabolomics is converting raw data into biologically interpretable information. Moreover, mass spectrometry-based metabolomics generates large and complex datasets characterized by co-eluting compounds and experimental artifacts. The main objective of this thesis is to develop automated strategies based on blind source separation to improve the capabilities of current methods that tackle the limitations of the different steps of the metabolomics data processing workflow. A further objective is to develop tools capable of performing the entire metabolomics workflow for GC-MS, including pre-processing, spectral deconvolution, alignment, and identification. As a result, three new automated methods for spectral deconvolution based on blind source separation were developed. These methods were embedded into two computational tools that automatically convert raw data into biologically interpretable information and thus allow biological hypotheses to be resolved and new biological insights to be discovered.
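Blind source separation for spectral deconvolution can be sketched with independent component analysis on synthetic co-eluting compounds (an illustrative stand-in, not the thesis's exact methods; the spectra and mixing profile below are invented):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
mz = np.arange(200)                        # m/z axis (arbitrary units)

def peak(center, width=3):
    return np.exp(-((mz - center) ** 2) / (2 * width ** 2))

# Two hypothetical "pure" compound spectra.
s1 = peak(50) + 0.6 * peak(120)
s2 = peak(80) + 0.8 * peak(160)
S = np.vstack([s1, s2])

# Co-elution: three observed scans are different mixtures of the two spectra.
A = np.array([[0.9, 0.3], [0.4, 0.8], [0.2, 0.95]])
X = A @ S + 0.01 * rng.normal(size=(3, mz.size))

# Blind source separation: recover the pure spectra (up to scale and sign).
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X.T).T
```

Because mass spectra are sparse and statistically independent across compounds, ICA-style separation can pull apart co-eluting components without knowing either spectrum in advance, which is the premise of the deconvolution methods developed in the thesis.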