Shape and Time Distortion Loss for Training Deep Time Series Forecasting Models
This paper addresses the problem of time series forecasting for non-stationary signals and multiple future steps prediction. To handle this challenging task, we introduce DILATE (DIstortion Loss including shApe and TimE), a new objective function for training deep neural networks. DILATE aims at accurately predicting sudden changes, and explicitly incorporates two terms supporting precise shape and temporal change detection. We introduce a differentiable loss function suitable for training deep neural nets, and provide a custom back-prop implementation for speeding up optimization. We also introduce a variant of DILATE, which provides a smooth generalization of temporally-constrained Dynamic Time Warping (DTW). Experiments carried out on various non-stationary datasets reveal the very good behaviour of DILATE compared to models trained with the standard Mean Squared Error (MSE) loss function, and also to DTW and variants. DILATE is also agnostic to the choice of the model, and we highlight its benefit for training fully connected networks as well as specialized recurrent architectures, showing its capacity to improve over state-of-the-art trajectory forecasting approaches.
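The abstract gives no code; as a rough illustration of the idea of a shape-plus-time loss, the NumPy sketch below combines a soft-DTW value as the shape term with a simple temporal term measured along the hard DTW alignment path. This is our own simplification, not the paper's method (DILATE uses a smooth, differentiable temporal penalty), and the function names are hypothetical.

```python
import numpy as np

def soft_min(values, gamma):
    """Smooth minimum: -gamma * log(sum(exp(-v / gamma)))."""
    v = -np.asarray(values, dtype=float) / gamma
    m = v.max()
    return -gamma * (m + np.log(np.exp(v - m).sum()))

def dilate_like_loss(y_pred, y_true, alpha=0.5, gamma=0.1):
    """alpha * shape term (soft-DTW) + (1 - alpha) * temporal term (sketch)."""
    n, m = len(y_pred), len(y_true)
    cost = (np.asarray(y_pred)[:, None] - np.asarray(y_true)[None, :]) ** 2
    # Forward dynamic program of soft-DTW: the shape term.
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            R[i, j] = cost[i - 1, j - 1] + soft_min(
                [R[i - 1, j], R[i, j - 1], R[i - 1, j - 1]], gamma)
    shape_term = R[n, m]
    # Temporal term: mean squared index gap along the hard DTW path,
    # a crude stand-in for DILATE's smooth temporal penalty.
    i, j, path = n, m, [(n - 1, m - 1)]
    while i > 1 or j > 1:
        _, i, j = min((R[i - 1, j - 1], i - 1, j - 1),
                      (R[i - 1, j], i - 1, j),
                      (R[i, j - 1], i, j - 1))
        path.append((i - 1, j - 1))
    temporal_term = np.mean([(a - b) ** 2 for a, b in path])
    return alpha * shape_term + (1 - alpha) * temporal_term
```

For identical sequences the loss is near zero; a constant vertical offset inflates the shape term, which is what distinguishes this family of losses from pure alignment measures.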
TILDE-Q: A Transformation Invariant Loss Function for Time-Series Forecasting
Time-series forecasting has attracted increasing attention in AI research due to its importance in solving real-world problems across domains such as energy, weather, traffic, and economics. As various types of data show, handling the drastic changes, temporal patterns, and shapes in sequential data that previous models predict poorly has become a pressing issue. This is because most time-series forecasting approaches minimize norm distances, such as mean absolute error (MAE) or mean squared error (MSE), as loss functions. These loss functions fail both to model temporal dynamics and to capture the shape of signals; moreover, they often cause models to misbehave and return results uncorrelated with the original time series. To be effective, a loss function should be invariant to the set of distortions between two time series rather than merely comparing exact values. In this paper, we propose a novel loss function, called TILDE-Q (Transformation Invariant Loss function with Distance EQuilibrium), that not only considers distortions in amplitude and phase but also allows models to capture the shape of time-series sequences. In addition, TILDE-Q supports modeling periodic and non-periodic temporal dynamics at the same time. We evaluate the effectiveness of TILDE-Q through extensive experiments under periodic and non-periodic data conditions, from naive models to state-of-the-art models. The results indicate that models trained with TILDE-Q outperform those trained with other metrics (e.g., MSE, dynamic time warping (DTW), temporal distortion index (TDI), and longest common subsequence (LCSS)).
Comment: 9-page paper, 2 pages of references, and a 7-page appendix. Submitted as a conference paper to ICLR 202
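The abstract does not spell out TILDE-Q's terms; as a hypothetical illustration of what "transformation invariance" can mean, the sketch below builds a distance that ignores constant amplitude shifts (by mean-centering) and circular time shifts (by comparing FFT magnitude spectra). The names and the weighting are our own assumptions, not the paper's formulation.

```python
import numpy as np

def amp_shift_invariant(a, b):
    # Invariant to a constant vertical offset: compare mean-centered signals.
    return np.mean(((a - a.mean()) - (b - b.mean())) ** 2)

def phase_shift_invariant(a, b):
    # Invariant to circular time shifts: FFT magnitude spectra are
    # unchanged when a sequence is rotated.
    return np.mean((np.abs(np.fft.rfft(a)) - np.abs(np.fft.rfft(b))) ** 2)

def tilde_q_like_loss(pred, target, alpha=0.5):
    # Hypothetical weighting of the two invariant terms.
    return (alpha * amp_shift_invariant(pred, target)
            + (1 - alpha) * phase_shift_invariant(pred, target))
```

With such terms, a forecast that reproduces the signal's shape but sits at a slightly wrong level or offset in time is penalized far less than under plain MSE.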
Epidemiological Prediction using Deep Learning
Accurate and real-time epidemic disease prediction plays a significant role in the health system and is of great importance for policy making, vaccine distribution, and disease control. Since the SIR model of McKendrick and Kermack in the early 1900s, researchers have developed various mathematical models to forecast the spread of disease. Despite these attempts, however, epidemic prediction has remained an ongoing scientific issue because current models either lack flexibility or show poor performance. Owing to the temporal and spatial aspects of epidemiological data, the problem fits into the category of time-series forecasting. To capture both aspects of the data, this paper proposes a combination of recent Deep Learning models and applies it to ILI (influenza-like illness) data in the United States. Specifically, a graph convolutional network (GCN) is used to capture the geographical features of the U.S. regions, and a gated recurrent unit (GRU) is used to capture the temporal dynamics of ILI. The results were compared with Deep Learning models proposed by other researchers, demonstrating that the proposed model outperforms previous methods.
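The paper's exact architecture is not given in this abstract; a minimal NumPy sketch of the general GCN-then-GRU pattern (symmetrically normalized graph convolution over the region graph at each time step, followed by a GRU cell per region) might look like the following, with all shapes, names, and the single-layer design our own assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(A_norm, X, W):
    # One graph-convolution step over the region graph: ReLU(A_norm X W).
    return np.maximum(0.0, A_norm @ X @ W)

def gru_step(x, h, p):
    # Standard GRU cell applied row-wise (one hidden state per region).
    z = sigmoid(x @ p["Wz"] + h @ p["Uz"])            # update gate
    r = sigmoid(x @ p["Wr"] + h @ p["Ur"])            # reset gate
    h_tilde = np.tanh(x @ p["Wh"] + (r * h) @ p["Uh"])
    return (1.0 - z) * h + z * h_tilde

def gcn_gru_forecast(A, X_seq, W_gcn, p):
    # X_seq: (time, regions, features). The GCN extracts spatial features
    # per step; the GRU accumulates them over time into a per-region state.
    A_norm = normalize_adj(A)
    h = np.zeros((X_seq.shape[1], p["Uz"].shape[0]))
    for X_t in X_seq:
        h = gru_step(gcn_layer(A_norm, X_t, W_gcn), h, p)
    return h
```

The final per-region hidden state would then feed a small output layer to predict the next ILI value for each region.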
On-line part deformation prediction based on deep learning
Deformation prediction is the basis of deformation control in manufacturing process planning. This paper presents an on-line part deformation prediction method using a deep learning model during the numerical control machining process, which differs from traditional methods based on finite element simulation of stress release prior to the actual machining process. A fourth-order tensor model is proposed to represent the continuous part geometric information, process information, and monitoring information, and is used as the input to the deep learning model. A deep learning framework with a Convolutional Neural Network and a Recurrent Neural Network has been constructed and trained on monitored deformation data and process information associated with interim part geometric information. The proposed method can be generalised to different parts with certain similarities and has the potential to provide a reference for an adaptive machining control strategy for reducing part deformation. The method was validated by actual machining experiments, and the results show that prediction accuracy is improved compared with existing methods. Furthermore, this paper shifts the difficult problem of residual stress measurement and off-line deformation prediction to the solution of on-line deformation prediction based on deformation monitoring data.
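The abstract does not specify the tensor layout; as a hypothetical sketch, the geometric, process, and monitoring information could each be rasterized to a (time, height, width) grid and stacked into a single fourth-order tensor for a CNN+RNN pipeline. The function name and channel ordering are our own illustration.

```python
import numpy as np

def build_input_tensor(geometry, process, monitoring):
    """Stack three (T, H, W) information grids into a (T, 3, H, W) tensor.

    Channel layout (hypothetical): 0 = interim part geometry,
    1 = process parameters, 2 = monitored deformation.
    """
    for grid in (process, monitoring):
        if grid.shape != geometry.shape:
            raise ValueError("all information grids must share one shape")
    return np.stack([geometry, process, monitoring], axis=1)
```

Each time slice of the resulting tensor is a multi-channel image for the CNN, while the leading time axis is what the RNN consumes.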