Attend and Diagnose: Clinical Time Series Analysis using Attention Models
With the widespread adoption of electronic health records, there is
increased emphasis on predictive models that can effectively deal with clinical
time-series data. Powered by Recurrent Neural Network (RNN) architectures with
Long Short-Term Memory (LSTM) units, deep neural networks have achieved
state-of-the-art results in several clinical prediction tasks. Despite the
success of RNNs, their sequential nature prohibits parallelized computation,
making them inefficient, particularly when processing long sequences. Recently,
architectures based solely on attention mechanisms have shown remarkable
success in transduction tasks in NLP, while being computationally
superior. In this paper, for the first time, we utilize attention models for
clinical time-series modeling, thereby dispensing with recurrence entirely. We
develop the \textit{SAnD} (Simply Attend and Diagnose) architecture, which
employs a masked self-attention mechanism and uses positional encoding and
dense interpolation strategies for incorporating temporal order. Furthermore,
we develop a multi-task variant of \textit{SAnD} to jointly model multiple
diagnosis tasks. Using the recent MIMIC-III benchmark datasets, we
demonstrate that the proposed approach achieves state-of-the-art performance in
all tasks, outperforming LSTM models and classical baselines with
hand-engineered features.

Comment: AAAI 2018
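
For concreteness, the following is a minimal NumPy sketch of the dense
interpolation strategy mentioned in the abstract, which collapses a
variable-length sequence of per-step self-attention outputs into a fixed-size,
order-aware summary. The quadratic proximity weighting follows the standard
dense-interpolation algorithm; the function name, the interpolation factor
M = 12, and the toy dimensions are illustrative assumptions, not the authors'
released implementation.

    import numpy as np

    def dense_interpolation(step_embeddings, M):
        """Summarize a (T, d) sequence of per-step embeddings into an
        (M, d) order-aware representation via dense interpolation.

        Each step t is mapped to a relative position s = M * t / T and
        contributes to every slot m with a weight that decays
        quadratically with the distance |s - m|.
        """
        T, d = step_embeddings.shape
        u = np.zeros((M, d))
        for t in range(T):
            s = M * (t + 1) / T                    # relative position of step t
            for m in range(1, M + 1):
                w = (1.0 - abs(s - m) / M) ** 2    # quadratic proximity weight
                u[m - 1] += w * step_embeddings[t]
        return u

    # Example: compress 48 hourly measurements, each encoded as a
    # 64-dimensional self-attention output, into M = 12 slots.
    attn_outputs = np.random.randn(48, 64)
    summary = dense_interpolation(attn_outputs, M=12)
    print(summary.shape)  # (12, 64)

Because the output has a fixed (M, d) shape regardless of sequence length, it
can feed a standard classification head directly, which is what makes the
strategy attractive for variable-length clinical time series.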