Explaining Aviation Safety Incidents Using Deep Temporal Multiple Instance Learning
Although aviation accidents are rare, safety incidents occur more frequently
and require a careful analysis to detect and mitigate risks in a timely manner.
Analyzing safety incidents using operational data and producing event-based
explanations is invaluable to airline companies as well as to governing
organizations such as the Federal Aviation Administration (FAA) in the United
States. However, this task is challenging because of the complexity involved in
mining multi-dimensional heterogeneous time series data, the lack of
time-step-wise annotation of events in a flight, and the lack of scalable tools
to perform analysis over a large number of events. In this work, we propose a
precursor mining algorithm that identifies events in the multi-dimensional time
series that are correlated with the safety incident. Precursors are valuable to
systems health and safety monitoring and in explaining and forecasting safety
incidents. Current methods suffer from poor scalability to high-dimensional time series data and are inefficient at capturing temporal behavior. We propose an approach that combines multiple-instance learning (MIL) and deep recurrent neural networks (DRNN) to take advantage of MIL's ability to learn from weakly supervised data and DRNN's ability to model temporal behavior. We describe the
algorithm, the data, the intuition behind taking a MIL approach, and a
comparative analysis of the proposed algorithm with baseline models. We also discuss the application to a real-world aviation safety problem using data from a commercial airline company, examine the model's abilities and shortcomings, and close with remarks on possible deployment directions.
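As an illustration of how the weakly supervised setting described above can be exploited, the sketch below (a toy example in PyTorch, not the authors' implementation) treats each flight as a bag of time steps, scores every time step with an LSTM, and pools the scores so that only flight-level incident labels are needed; the per-time-step scores then act as precursor indicators. The class name `PrecursorMIL` and all dimensions are hypothetical.

```python
# Hypothetical sketch: multiple-instance learning over flight time series with
# a recurrent instance scorer. Not the paper's algorithm.
import torch
import torch.nn as nn

class PrecursorMIL(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64):
        super().__init__()
        self.rnn = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.instance_head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor):
        # x: (batch, time, features) multi-dimensional flight data
        h, _ = self.rnn(x)                                                    # per-time-step hidden states
        instance_scores = torch.sigmoid(self.instance_head(h)).squeeze(-1)    # (batch, time)
        # MIL pooling: a bag (flight) is positive if any time step looks like a precursor
        bag_scores, _ = instance_scores.max(dim=1)                            # (batch,)
        return bag_scores, instance_scores

model = PrecursorMIL(n_features=10)
flights = torch.randn(8, 200, 10)             # 8 flights, 200 time steps, 10 sensor channels
labels = torch.randint(0, 2, (8,)).float()    # weak, flight-level incident labels
bag_scores, precursor_scores = model(flights)
loss = nn.functional.binary_cross_entropy(bag_scores, labels)
loss.backward()
```

The max pooling encodes the standard MIL assumption that a bag is positive if at least one of its instances is; smoother pooling (e.g., noisy-or or log-sum-exp) is a common alternative when gradients through a hard max are too sparse.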
Exploring Interpretable LSTM Neural Networks over Multi-Variable Data
For recurrent neural networks trained on time series with target and
exogenous variables, in addition to accurate prediction, it is also desired to
provide interpretable insights into the data. In this paper, we explore the
structure of LSTM recurrent neural networks to learn variable-wise hidden
states, with the aim to capture different dynamics in multi-variable time
series and distinguish the contribution of variables to the prediction. With
these variable-wise hidden states, a mixture attention mechanism is proposed to
model the generative process of the target. Then we develop associated training
methods to jointly learn network parameters as well as variable and temporal importance w.r.t. the prediction of the target variable. Extensive experiments on real
datasets demonstrate enhanced prediction performance by capturing the dynamics
of different variables. Meanwhile, we evaluate the interpretation results both
qualitatively and quantitatively. The results show promise for the approach as an end-to-end framework for both forecasting and knowledge extraction over multi-variable data.
Comment: Accepted to the International Conference on Machine Learning (ICML), 2019
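A minimal sketch of the variable-wise idea, under the assumption of one small LSTM per input variable and a softmax mixture over per-variable predictions (a simplification, not the paper's exact architecture); the class name `VariableWiseLSTM` is hypothetical.

```python
# Hypothetical sketch: variable-wise hidden states plus a mixture attention
# over variables, written in PyTorch.
import torch
import torch.nn as nn

class VariableWiseLSTM(nn.Module):
    def __init__(self, n_vars: int, hidden_size: int = 16):
        super().__init__()
        self.rnns = nn.ModuleList([nn.LSTM(1, hidden_size, batch_first=True) for _ in range(n_vars)])
        self.heads = nn.ModuleList([nn.Linear(hidden_size, 1) for _ in range(n_vars)])
        self.attn = nn.ModuleList([nn.Linear(hidden_size, 1) for _ in range(n_vars)])

    def forward(self, x: torch.Tensor):
        # x: (batch, time, n_vars); each variable is run through its own LSTM
        preds, scores = [], []
        for v, (rnn, head, attn) in enumerate(zip(self.rnns, self.heads, self.attn)):
            h, _ = rnn(x[:, :, v:v + 1])      # variable-wise hidden states (batch, time, hidden)
            last = h[:, -1, :]                # final hidden state of variable v
            preds.append(head(last))          # per-variable prediction (batch, 1)
            scores.append(attn(last))         # per-variable attention logit (batch, 1)
        preds = torch.cat(preds, dim=1)                            # (batch, n_vars)
        weights = torch.softmax(torch.cat(scores, dim=1), dim=1)   # mixture attention over variables
        y_hat = (weights * preds).sum(dim=1, keepdim=True)         # attention-weighted forecast
        return y_hat, weights

model = VariableWiseLSTM(n_vars=3)
x = torch.randn(4, 50, 3)             # 4 series, 50 time steps, 3 variables
y_hat, importance = model(x)          # forecast plus per-variable importance weights
```

The learned mixture weights give a per-variable importance score, while the separate recurrences keep each variable's hidden state from being entangled with the others, which is what makes the interpretation variable-wise.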
A Neural Stochastic Volatility Model
In this paper, we show that the recent integration of statistical models with
deep recurrent neural networks provides a new way of formulating volatility
(the degree of variation of a time series) models that have been widely used in
time series analysis and prediction in finance. The model comprises a pair of
complementary stochastic recurrent neural networks: the generative network
models the joint distribution of the stochastic volatility process; the
inference network approximates the conditional distribution of the latent
variables given the observables. Our focus here is on the formulation of
temporal dynamics of volatility over time under a stochastic recurrent neural
network framework. Experiments on real-world stock price datasets demonstrate that the proposed model produces better volatility estimates and predictions, outperforming mainstream methods in terms of average negative log-likelihood, including deterministic models such as GARCH and its variants, the MCMC-based stochastic model \emph{stochvol}, and the Gaussian process volatility model \emph{GPVol}.
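A simplified sketch of the generative/inference pairing described above, assuming a recurrent inference network that approximates the posterior over latent states, a decoder that maps sampled latents to a per-step return variance, and training by an ELBO (Gaussian log-likelihood of returns minus a KL term); this is an illustration under those assumptions, not the paper's model, and names such as `NeuralVolatility` are hypothetical.

```python
# Hypothetical sketch: a recurrent latent-variable volatility model trained
# with a VAE-style objective, written in PyTorch.
import torch
import torch.nn as nn

class NeuralVolatility(nn.Module):
    def __init__(self, hidden_size: int = 32, latent_size: int = 4):
        super().__init__()
        self.inference_rnn = nn.LSTM(1, hidden_size, batch_first=True)
        self.to_mu = nn.Linear(hidden_size, latent_size)
        self.to_logvar = nn.Linear(hidden_size, latent_size)
        self.decoder = nn.Linear(latent_size, 1)   # latent state -> log variance of the return

    def forward(self, returns: torch.Tensor):
        # returns: (batch, time, 1) observed asset returns
        h, _ = self.inference_rnn(returns)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)    # reparameterised latent sample
        log_sigma2 = self.decoder(z)                                # per-step log variance (volatility)
        # Gaussian negative log-likelihood of zero-mean returns with variance exp(log_sigma2)
        nll = 0.5 * (log_sigma2 + returns.pow(2) / log_sigma2.exp()).mean()
        # KL divergence of q(z|x) from a standard normal prior
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).mean()
        return nll + kl, log_sigma2

model = NeuralVolatility()
returns = torch.randn(16, 100, 1) * 0.01     # 16 toy series of daily returns
loss, log_sigma2 = model(returns)
loss.backward()
volatility = (0.5 * log_sigma2).exp()        # predicted per-step volatility
```

The reparameterisation trick keeps the latent sampling differentiable, so the inference and generative parts can be trained jointly by gradient descent, in contrast to the MCMC-based baselines mentioned in the abstract.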
