Neural activity classification with machine learning models trained on interspike interval series data
The flow of information through the brain is reflected by the activity
patterns of neural cells. Indeed, these firing patterns are widely used as
input data to predictive models that relate stimuli and animal behavior to the
activity of a population of neurons. However, relatively little attention was
paid to single neuron spike trains as predictors of cell or network properties
in the brain. In this work, we introduce an approach to neuronal spike train
data mining which enables effective classification and clustering of neuron
types and network activity states based on single-cell spiking patterns. This
approach is centered around applying state-of-the-art time series
classification/clustering methods to sequences of interspike intervals recorded
from single neurons. We demonstrate good performance of these methods in tasks
involving classification of neuron type (e.g. excitatory vs. inhibitory cells)
and/or neural circuit activity state (e.g. awake vs. REM sleep vs. nonREM sleep
states) on an open-access cortical spiking activity dataset.
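The core preprocessing step described above can be illustrated with a minimal sketch: converting spike timestamps into an interspike interval (ISI) series, which then serves as input to any standard time series classifier. The spike trains, the mean-ISI rule, and the 50 ms threshold below are illustrative assumptions, not details from the paper.

```python
# Sketch: spike timestamps -> interspike-interval (ISI) series -> toy classifier.
# All data and the 50 ms threshold are hypothetical, for illustration only.

def spikes_to_isi(spike_times):
    """Convert a sorted list of spike timestamps (s) into an ISI series (s)."""
    return [t2 - t1 for t1, t2 in zip(spike_times, spike_times[1:])]

def mean_isi(isi):
    return sum(isi) / len(isi)

def classify_by_rate(isi, threshold=0.05):
    """Toy stand-in for a real ISI-series classifier: fast-spiking
    (inhibitory-like) cells tend to have short ISIs."""
    return "inhibitory-like" if mean_isi(isi) < threshold else "excitatory-like"

# A regular-spiking train at ~10 Hz vs. a fast-spiking train at ~100 Hz
slow = [0.1 * i for i in range(20)]    # one spike every 100 ms
fast = [0.01 * i for i in range(200)]  # one spike every 10 ms

print(classify_by_rate(spikes_to_isi(slow)))  # excitatory-like
print(classify_by_rate(spikes_to_isi(fast)))  # inhibitory-like
```

In the paper's actual pipeline, the ISI sequence would be fed to a state-of-the-art time series classifier rather than a single summary statistic; the sketch only shows the data representation.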
Time Series Classification Using Images: The Case Of SAX-Like Transformation
This study concerns the classification of univariate time series. The essence of the approach is transforming time series into two-dimensional monochromatic images, which are then classified using convolutional neural networks. The transformation of a time series into an image is performed in two steps. First, the time series is turned into a string of symbols from an assumed alphabet using a SAX-like transformation; the length of the string is required to be the square of a natural number. Second, the string of symbols is turned into a square matrix whose side equals the square root of the string length. Each symbol of the matrix is then turned into a square block of pixels whose grey level is determined by the symbol. This operation results in an image (still square) composed of blocks of grey pixels. Finally, convolutional neural networks are employed to classify these images. An overall design process is presented, with a focus on investigating the two-step time-series-to-image transformation. Experimental studies involving publicly available data sets are reported, along with a comparative analysis.
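The two-step transformation above can be sketched compactly. This is a simplified illustration: it uses equal-width binning in place of the full SAX breakpoint scheme, and the alphabet size and pixel-block size are illustrative choices, not values from the paper.

```python
import math

# Sketch of the series-to-image transformation: discretize to symbols,
# reshape to a square matrix, expand each symbol to a grey pixel block.
# Equal-width binning stands in for proper SAX breakpoints.

def discretize(series, alphabet_size=4):
    """Map each value to a symbol 0..alphabet_size-1 by equal-width binning."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / alphabet_size or 1.0
    return [min(int((v - lo) / width), alphabet_size - 1) for v in series]

def to_square_matrix(symbols):
    """Reshape the symbol string (length must be a perfect square) row-wise."""
    n = math.isqrt(len(symbols))
    assert n * n == len(symbols), "string length must be a perfect square"
    return [symbols[i * n:(i + 1) * n] for i in range(n)]

def to_grey_image(matrix, alphabet_size=4, block=2):
    """Expand each symbol into a block x block patch of grey pixels (0-255)."""
    img = []
    for row in matrix:
        pixel_row = []
        for s in row:
            grey = int(255 * s / (alphabet_size - 1))
            pixel_row.extend([grey] * block)
        img.extend([pixel_row] * block)
    return img

series = [0.0, 0.2, 0.9, 1.0, 0.8, 0.5, 0.3, 0.1, 0.0]  # length 9 = 3^2
matrix = to_square_matrix(discretize(series))
image = to_grey_image(matrix)
print(len(image), len(image[0]))  # a 6x6 grey-level image
```

The resulting grid of grey values is what would then be fed to a CNN classifier; the CNN itself is omitted here.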
Timage -- A Robust Time Series Classification Pipeline
Time series are series of values ordered by time. This kind of data can be
found in many real-world settings. Classifying time series is a difficult task
and an active area of research. This paper investigates the use of transfer
learning in Deep Neural Networks and a 2D representation of time series known
as Recurrence Plots. In order to utilize the research done in the area of image
classification, where Deep Neural Networks have achieved very good results, we
use a Residual Neural Network architecture known as ResNet. As preprocessing
of time series is a major part of every time series classification pipeline,
the method proposed simplifies this step and requires only a few parameters. For
the first time, we propose a method for multi-time-series classification:
training a single network to classify all datasets in the archive. We are
among the first to evaluate the method on the latest 2018
release of the UCR archive, a well established time series classification
benchmarking dataset. Comment: ICANN19, 28th International Conference on Artificial Neural Networks
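The 2D representation this pipeline relies on can be sketched in a few lines: a Recurrence Plot turns a length-n series into an n x n binary image whose (i, j) entry marks whether states i and j are closer than a threshold. The threshold value and input signal below are illustrative assumptions.

```python
# Sketch of a Recurrence Plot: R[i][j] = 1 when states i and j are within
# eps of each other. Periodic signals produce characteristic textures,
# which is what makes the image amenable to CNN classification.

def recurrence_plot(series, eps=0.15):
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

series = [0.0, 0.5, 1.0, 0.5, 0.0, 0.5, 1.0]  # a toy periodic signal
rp = recurrence_plot(series)
for row in rp:
    print("".join("#" if v else "." for v in row))
```

For multidimensional state vectors one would replace the absolute difference with a Euclidean distance; the resulting image is what Timage feeds to a ResNet.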
Rethinking 1D-CNN for Time Series Classification: A Stronger Baseline
For the time series classification task using a 1D-CNN, the selection of kernel
size is critically important to ensure the model can capture salient signals at
the right scale from a long time series. Most existing work on 1D-CNNs
treats the kernel size as a hyper-parameter and tries to find the proper kernel
size through a grid search, which is time-consuming and inefficient. This
paper theoretically analyses how kernel size impacts the performance of 1D-CNN.
Considering the importance of kernel size, we propose a novel Omni-Scale 1D-CNN
(OS-CNN) architecture to capture the proper kernel size during the model
learning period. A specific design for kernel size configuration is developed,
which enables a small set of kernel-size options to cover a wide range of
receptive fields. The proposed OS-CNN method is evaluated on the UCR archive
with 85 datasets. The experiment results demonstrate that our method is a
stronger baseline in multiple performance indicators, including the critical
difference diagram, counts of wins, and average accuracy. We have also published
the experimental source code on GitHub (https://github.com/Wensi-Tang/OS-CNN/).
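The Omni-Scale idea above can be illustrated with a minimal sketch: instead of grid-searching a single kernel size, convolve with several kernel sizes in parallel and concatenate the resulting feature maps, so the network effectively covers many receptive fields at once. The kernel-size set and the untrained averaging kernels below are illustrative stand-ins for learned weights, not the paper's actual configuration.

```python
# Sketch of a multi-scale 1D convolution layer: one kernel per size, all
# applied in parallel, outputs concatenated. Uniform averaging kernels
# stand in for learned weights.

def conv1d(series, kernel):
    """Valid-mode 1D convolution (no padding)."""
    k = len(kernel)
    return [sum(series[i + j] * kernel[j] for j in range(k))
            for i in range(len(series) - k + 1)]

def omni_scale_features(series, kernel_sizes=(1, 2, 3, 5, 7)):
    """Apply one kernel per size and concatenate the output maps,
    mimicking a multi-branch layer with many receptive fields."""
    feats = []
    for k in kernel_sizes:
        kernel = [1.0 / k] * k  # placeholder for learned weights
        feats.extend(conv1d(series, kernel))
    return feats

series = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
feats = omni_scale_features(series)
print(len(feats))  # one output map per kernel size, concatenated
```

In the real OS-CNN the per-size weights are learned jointly, letting training itself discover which scales matter, rather than a grid search over a single fixed size.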
Benchmarking Multivariate Time Series Classification Algorithms
Time Series Classification (TSC) involves building predictive models for a
discrete target variable from ordered, real-valued attributes. Over recent
years, a new set of TSC algorithms has been developed which have made
significant improvement over the previous state of the art. The main focus has
been on univariate TSC, i.e. the problem where each case has a single series
and a class label. In reality, it is more common to encounter multivariate TSC
(MTSC) problems where multiple series are associated with a single label.
Despite this, much less consideration has been given to MTSC than the
univariate case. The UEA archive of 30 MTSC problems released in 2018 has made
comparison of algorithms easier. We review recently proposed bespoke MTSC
algorithms based on deep learning, shapelets and bag of words approaches. The
simplest approach to MTSC is to ensemble univariate classifiers over the
multivariate dimensions. We compare the bespoke algorithms to these dimension
independent approaches on the 26 of the 30 MTSC archive problems where the data
are all of equal length. We demonstrate that the independent ensemble of
HIVE-COTE classifiers is the most accurate, but that, unlike with univariate
classification, dynamic time warping is still competitive at MTSC. Comment: Data Min Knowl Disc (2020)
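The dimension-independent baseline described above can be sketched simply: fit one univariate classifier per dimension of the multivariate series, then combine the per-dimension predictions by majority vote. The paper ensembles HIVE-COTE classifiers; the toy 1-NN used here, and all data and labels, are illustrative stand-ins.

```python
from collections import Counter

# Sketch of dimension-independent MTSC: one univariate classifier
# (here a toy 1-NN with squared Euclidean distance) per dimension,
# combined by majority vote. Data are hypothetical.

def nn1_predict(train_series, train_labels, query):
    """1-nearest-neighbour on a single dimension."""
    dists = [sum((a - b) ** 2 for a, b in zip(s, query)) for s in train_series]
    return train_labels[dists.index(min(dists))]

def ensemble_predict(train_mts, train_labels, query_mts):
    """train_mts: list of cases, each a list of per-dimension series.
    query_mts: one case, as a list of per-dimension series."""
    votes = []
    for d in range(len(query_mts)):
        dim_train = [case[d] for case in train_mts]
        votes.append(nn1_predict(dim_train, train_labels, query_mts[d]))
    return Counter(votes).most_common(1)[0][0]

# Two 2-dimensional training cases and one query (hypothetical data)
train = [
    [[0, 0, 0], [1, 1, 1]],   # class "A"
    [[5, 5, 5], [9, 9, 9]],   # class "B"
]
labels = ["A", "B"]
query = [[0, 1, 0], [2, 1, 2]]
print(ensemble_predict(train, labels, query))  # A
```

Swapping the per-dimension 1-NN for a stronger univariate classifier such as HIVE-COTE gives the independent ensemble that the paper finds most accurate.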