Training Echo State Networks with Regularization through Dimensionality Reduction
In this paper we introduce a new framework to train an Echo State Network to
predict real-valued time series. The method consists of projecting the output
of the internal layer of the network on a space with lower dimensionality,
before training the output layer to learn the target task. Notably, we enforce
a regularization constraint that leads to better generalization capabilities.
We evaluate the performance of our approach on several benchmarks, using
different techniques to train the readout of the network, achieving superior
predictive performance when using the proposed framework. Finally, we provide
insight into the effectiveness of the implemented mechanics through a
visualization of the trajectory in phase space, relying on the
methodologies of nonlinear time-series analysis. By applying our method to
well-known chaotic systems, we provide evidence that the lower-dimensional embedding
retains the dynamical properties of the underlying system better than the
full-dimensional internal states of the network.
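The pipeline the abstract describes can be sketched in a few lines of numpy: drive a random reservoir with the input series, project the collected internal states onto a low-dimensional subspace (PCA via SVD here, as one concrete choice of projection), and fit a ridge readout in the reduced space. All function names, sizes, and hyperparameters below are illustrative, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(u, n_res=100, rho=0.9, leak=0.3):
    """Drive a random leaky-tanh reservoir with input u; collect internal states."""
    W_in = rng.uniform(-0.5, 0.5, n_res)
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))  # rescale to spectral radius rho
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

def pca_project(X, d):
    """Project the states onto their top-d principal components."""
    Xc = X - X.mean(0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T

def ridge_fit(Z, y, lam=1e-6):
    """Ridge-regression readout trained in the reduced space."""
    return np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.1 * np.arange(400))
states = run_reservoir(u[:-1])
Z = pca_project(states, d=10)   # low-dimensional embedding of the reservoir
w = ridge_fit(Z, u[1:])
pred = Z @ w
```

Training the readout on the 10-dimensional projection rather than the full 100-dimensional state acts as the regularization constraint the abstract mentions: the readout can only exploit the dominant directions of the reservoir dynamics.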
Deep learning for time series classification: a review
Time Series Classification (TSC) is an important and challenging problem in
data mining. With the increase of time series data availability, hundreds of
TSC algorithms have been proposed. Among these methods, only a few have
considered Deep Neural Networks (DNNs) to perform this task. This is surprising
as deep learning has seen very successful applications in recent years. DNNs
have indeed revolutionized the field of computer vision especially with the
advent of novel deeper architectures such as Residual and Convolutional Neural
Networks. Apart from images, sequential data such as text and audio can also be
processed with DNNs to reach state-of-the-art performance for document
classification and speech recognition. In this article, we study the current
state-of-the-art performance of deep learning algorithms for TSC by presenting
an empirical study of the most recent DNN architectures for TSC. We give an
overview of the most successful deep learning applications in various time
series domains under a unified taxonomy of DNNs for TSC. We also provide an
open source deep learning framework to the TSC community where we implemented
each of the compared approaches and evaluated them on a univariate TSC
benchmark (the UCR/UEA archive) and 12 multivariate time series datasets. By
training 8,730 deep learning models on 97 time series datasets, we propose the
most exhaustive study of DNNs for TSC to date.
Comment: Accepted at Data Mining and Knowledge Discovery
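As a minimal illustration of the kind of architecture such a study compares, the forward pass of a small convolutional classifier for a univariate series can be written in plain numpy: 1-D convolution, ReLU, global average pooling, then a softmax over classes. This is an untrained toy sketch, not any specific network from the benchmark.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid 1-D convolution of a (T,) series with (n_filters, k) kernels."""
    n_f, k = kernels.shape
    windows = np.lib.stride_tricks.sliding_window_view(x, k)  # (T-k+1, k)
    return kernels @ windows.T                                # (n_filters, T-k+1)

def fcn_forward(x, kernels, W_out):
    """Conv -> ReLU -> global average pooling -> softmax over classes."""
    h = np.maximum(conv1d(x, kernels), 0)  # ReLU feature maps
    g = h.mean(axis=1)                     # one scalar per filter
    logits = W_out @ g
    e = np.exp(logits - logits.max())      # numerically stable softmax
    return e / e.sum()

# Toy univariate series and randomly initialized (untrained) weights.
x = np.sin(0.2 * np.arange(64))
kernels = rng.normal(size=(8, 5))  # 8 filters of width 5
W_out = rng.normal(size=(3, 8))    # 3 hypothetical classes
p = fcn_forward(x, kernels, W_out)
```

Global average pooling is what lets the same weights handle series of different lengths, one reason convolutional architectures transfer well to TSC.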
Time series kernel similarities for predicting Paroxysmal Atrial Fibrillation from ECGs
We tackle the problem of classifying Electrocardiography (ECG) signals with
the aim of predicting the onset of Paroxysmal Atrial Fibrillation (PAF). Atrial
fibrillation is the most common type of arrhythmia, but in many cases PAF
episodes are asymptomatic. Therefore, in order to help diagnose PAF, it is
important to design procedures for detecting and, more importantly, predicting
PAF episodes. We propose a method for predicting PAF events whose first step
consists of a feature extraction procedure that represents each ECG as a
multi-variate time series. Subsequently, we design a classification framework
based on kernel similarities for multi-variate time series, capable of handling
missing data. We consider different approaches to perform classification in the
original space of the multi-variate time series and in an embedding space,
defined by the kernel similarity measure. We achieve a classification accuracy
comparable with state-of-the-art methods, with the additional advantage of
detecting the PAF onset up to 15 minutes in advance.
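The overall scheme — represent each ECG as a multivariate time series, compute kernel similarities that tolerate missing data, then classify by similarity — can be sketched as follows. The features (per-variable NaN-aware mean and standard deviation), the RBF kernel, and the 1-NN rule are stand-ins chosen for brevity, not the kernels used in the paper.

```python
import numpy as np

def features(mts):
    """Per-variable mean and std of a (T, d) series, ignoring missing samples (NaNs)."""
    return np.concatenate([np.nanmean(mts, axis=0), np.nanstd(mts, axis=0)])

def rbf_kernel(a, b, gamma=0.5):
    """RBF similarity between two multivariate time series via their feature vectors."""
    d = features(a) - features(b)
    return np.exp(-gamma * d @ d)

def predict(train, labels, query):
    """1-NN in the similarity space: label of the most similar training series."""
    sims = [rbf_kernel(t, query) for t in train]
    return labels[int(np.argmax(sims))]

# Toy data: two training series from different regimes, one with a missing value.
rng = np.random.default_rng(0)
a = rng.normal(0, 1, (50, 2)); a[5, 0] = np.nan  # handled by the NaN-aware features
b = rng.normal(3, 1, (50, 2))
q = rng.normal(3, 1, (50, 2))                    # query drawn from b's regime
label = predict([a, b], ["no_PAF", "PAF"], q)
```

Because the kernel is computed from NaN-aware summaries, a series with gaps still gets a well-defined similarity to every other series, which is the property the classification framework needs.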
Time Series Cluster Kernel for Learning Similarities between Multivariate Time Series with Missing Data
Similarity-based approaches represent a promising direction for time series
analysis. However, many such methods rely on parameter tuning, and some have
shortcomings if the time series are multivariate (MTS), due to dependencies
between attributes, or the time series contain missing data. In this paper, we
address these challenges within the powerful context of kernel methods by
proposing the robust \emph{time series cluster kernel} (TCK). The approach
taken leverages the missing data handling properties of Gaussian mixture models
(GMM) augmented with informative prior distributions. An ensemble learning
approach is exploited to ensure robustness to parameters by combining the
clustering results of many GMMs to form the final kernel.
We evaluate the TCK on synthetic and real data and compare to other
state-of-the-art techniques. The experimental results demonstrate that the TCK
is robust to parameter choices, provides competitive results for MTS without
missing data and outstanding results for missing data.Comment: 23 pages, 6 figure
Development of a fusion adaptive algorithm for marine debris detection within the post-Sandy restoration framework
Recognition of marine debris represents a difficult task due to the extreme variability of the marine environment, the possible targets, and the variable skill levels of human operators. The range of potential targets is much wider than in similar fields of research such as mine hunting, localization of unexploded ordnance, or pipeline detection. To address this additional complexity, an adaptive algorithm is being developed that appropriately responds to changes in the environment and context.
The preliminary step is to apply the proper geometric and radiometric corrections to the collected data. Then, the core engine manages the fusion of a set of statistically- and physically-based algorithms, working at different levels (swath, beam, snippet, and pixel) and using both predictive-modeling (that is, a high-frequency acoustic backscatter model) and phenomenological (e.g., digital image processing techniques) approaches. The expected outcome is the reduction of inter-algorithmic cross-correlation and, thus, of the probability of false alarm. At this early stage, we provide a proof of concept showing outcomes from algorithms that dynamically adapt themselves to the depth and average backscatter level met in the surveyed environment, targeting marine debris (modeled as objects of about 1-m size).
The project relies on a modular software library, called Matador (Marine Target Detection and Object Recognition).
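The fusion logic described above — multiple detectors whose agreement lowers the false-alarm probability, with thresholds adapted to the average backscatter level — can be sketched as follows. The detector scores, the mean-plus-k-sigma thresholding, and the voting rule are illustrative assumptions, not Matador's actual algorithms.

```python
import numpy as np

def adaptive_threshold(score_map, k=3.0):
    """Threshold adapted to the average level of the surveyed scene."""
    return score_map.mean() + k * score_map.std()

def fuse_detections(score_maps, thresholds, min_votes=2):
    """Flag a pixel only when at least min_votes detectors agree, so a
    false alarm from any single algorithm is suppressed."""
    votes = sum((s > t).astype(int) for s, t in zip(score_maps, thresholds))
    return votes >= min_votes

# Toy 8x8 scene: one real target seen by both detectors,
# one spurious return seen by only one of them.
backscatter = np.zeros((8, 8))
backscatter[2, 3] = 5.0            # target-like return (both detectors)
spurious = np.zeros((8, 8))
spurious[6, 6] = 5.0               # false alarm in one detector only
stat_score = backscatter + spurious  # statistically-based detector (toy)
phys_score = backscatter             # physically-based detector (toy)
mask = fuse_detections(
    [stat_score, phys_score],
    [adaptive_threshold(stat_score), adaptive_threshold(phys_score)],
)
```

Requiring agreement between algorithms that rely on different cues (statistical vs. physical) is what reduces their effective cross-correlation: a spurious return rarely fools both at once.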