Deep Learning with Long Short-Term Memory for Time Series Prediction
Time series prediction can be generalized as a process that extracts useful
information from historical records and then determines future values. Learning
long-range dependencies that are embedded in time series is often an obstacle
for most algorithms, whereas Long Short-Term Memory (LSTM) solutions, as a
specific kind of scheme in deep learning, promise to effectively overcome the
problem. In this article, we first give a brief introduction to the structure
and forward propagation mechanism of the LSTM model. Then, aiming at reducing
the considerable computing cost of LSTM, we put forward the Random Connectivity
LSTM (RCLSTM) model and test it by predicting traffic and user mobility in
telecommunication networks. Compared to LSTM, RCLSTM is formed via stochastic
connectivity between neurons, a notable departure in how neural network
architectures are formed. In this way, the RCLSTM model exhibits a certain
level of sparsity, which leads to an appealing decrease in computational
complexity and makes the RCLSTM model more applicable to latency-stringent
application scenarios. In the field of telecommunication networks, the
prediction of traffic series and mobility traces can directly benefit from
this improvement, as we further demonstrate that the prediction accuracy of
RCLSTM is comparable to that of the conventional LSTM regardless of the number
of training samples or the length of the input sequences.
Comment: 9 pages, 5 figures, 14 references
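The random-connectivity idea described above can be sketched as applying fixed binary masks to the LSTM weight matrices, so that only a fraction of the neuron-to-neuron connections carry nonzero weights. This is a minimal sketch, assuming a mask sampled uniformly at random; the paper's exact sampling scheme, sizes, and variable names are not given here and are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_connectivity_mask(shape, connectivity):
    """Binary mask keeping roughly `connectivity` fraction of weights."""
    return (rng.random(shape) < connectivity).astype(float)

def sparse_lstm_step(x, h, c, W, U, b, mask_W, mask_U):
    """One LSTM step with masked (sparse) input and recurrent weights.
    W and U stack the four gates (input, forget, output, candidate)."""
    z = (W * mask_W) @ x + (U * mask_U) @ h + b
    n = len(h)
    i, f, o = (1 / (1 + np.exp(-z[k * n:(k + 1) * n])) for k in range(3))
    g = np.tanh(z[3 * n:])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

n_in, n_hid = 8, 16
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)
mask_W = random_connectivity_mask(W.shape, 0.35)  # keep ~35% of connections
mask_U = random_connectivity_mask(U.shape, 0.35)

h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):                       # run a short input sequence
    x = rng.standard_normal(n_in)
    h, c = sparse_lstm_step(x, h, c, W, U, b, mask_W, mask_U)
print(h.shape)  # (16,)
```

Because the masks are fixed before training, the masked weights never need to be stored or updated, which is where the reduction in computation comes from.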
Traffic Prediction Based on Random Connectivity in Deep Learning with Long Short-Term Memory
Traffic prediction plays an important role in evaluating the performance of
telecommunication networks and has attracted intense research interest. A
significant number of algorithms and models have been put forward to analyse
traffic data and make predictions. In the recent big-data era, deep learning
has been exploited to mine the profound information hidden in the data. In
particular, Long Short-Term Memory (LSTM), a kind of Recurrent Neural Network
(RNN) scheme, has attracted considerable attention due to its capability of
processing the long-range dependency embedded in sequential traffic data.
However, LSTM has a considerable computational cost, which cannot be tolerated
in tasks with stringent latency requirements. In this paper, we propose a deep
learning model based on LSTM, called Random Connectivity LSTM (RCLSTM).
Compared to the conventional LSTM, RCLSTM makes a notable change in the
formation of the neural network: neurons are connected in a stochastic manner
rather than being fully connected. As a result, the RCLSTM possesses a certain
intrinsic sparsity, with many neural connections absent (in contrast to full
connectivity), which reduces both the number of parameters to be trained and
the computational cost. We apply the RCLSTM to predict traffic and validate
that the RCLSTM with only 35% neural connectivity still shows satisfactory
performance. As we gradually add training samples, the performance of RCLSTM
approaches that of the baseline LSTM. Moreover, for input traffic sequences of
sufficient length, the RCLSTM exhibits even better prediction accuracy than
the baseline LSTM.
Comment: 6 pages, 9 figures
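To see why 35% connectivity cuts the training cost, a quick back-of-the-envelope count of trainable weights in one LSTM layer helps; the layer sizes below are assumed for illustration and are not from the paper.

```python
# Count trainable weights in one LSTM layer at a given connectivity level.
# Layer sizes are illustrative, not taken from the paper.
n_in, n_hid = 64, 128
full_weights = 4 * n_hid * (n_in + n_hid)   # four gates, input + recurrent
biases = 4 * n_hid
connectivity = 0.35                          # fraction of connections kept

sparse_weights = int(full_weights * connectivity)
print(full_weights + biases)                 # 98816 parameters when dense
print(sparse_weights + biases)               # 34918 parameters at 35%
```

The parameter count, and hence the per-step multiply-accumulate cost, scales roughly linearly with the connectivity fraction.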
A Long Short-Term Memory Recurrent Neural Network Framework for Network Traffic Matrix Prediction
Network Traffic Matrix (TM) prediction is defined as the problem of
estimating future network traffic from previously observed network traffic
data. It is widely used in network planning, resource management and network
security. Long Short-Term Memory (LSTM) is a specific recurrent neural network
(RNN) architecture that is well-suited to learning from experience to
classify, process and predict time series with time lags of unknown size.
LSTMs have been shown to model temporal sequences and their long-range
dependencies more accurately than conventional RNNs. In this paper, we propose
an LSTM RNN framework for predicting short- and long-term TMs in large
networks. By validating our framework on real-world data from the GEANT
network, we show that our LSTM models converge quickly and give
state-of-the-art TM prediction performance for relatively small-sized models.
Comment: Submitted for peer review. arXiv admin note: text overlap with
arXiv:1402.1128 by other authors
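The supervised setup behind this kind of LSTM forecasting can be sketched by unrolling a traffic series into (past-window, next-value) pairs. The window length and the synthetic link-load series below are assumptions standing in for one entry of a real traffic matrix.

```python
import numpy as np

def make_windows(series, window):
    """Turn a 1-D traffic series into (inputs, targets) for one-step-ahead
    forecasting: X[i] holds `window` past values, y[i] the value that follows."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

# Synthetic daily-periodic link load standing in for one traffic-matrix entry.
t = np.arange(200)
noise = 0.1 * np.random.default_rng(1).standard_normal(200)
series = 10 + 3 * np.sin(2 * np.pi * t / 24) + noise

X, y = make_windows(series, window=12)
print(X.shape, y.shape)  # (188, 12) (188,)
```

Each row of `X` would be fed to the recurrent model as one input sequence, with the corresponding entry of `y` as the prediction target.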
NeuTM: A Neural Network-based Framework for Traffic Matrix Prediction in SDN
This paper presents NeuTM, a framework for network Traffic Matrix (TM)
prediction based on Long Short-Term Memory Recurrent Neural Networks (LSTM
RNNs). TM prediction is defined as the problem of estimating the future
network traffic matrix from previously observed network traffic data. It is
widely used in network planning, resource management and network security.
Long Short-Term Memory (LSTM) is a specific recurrent neural network (RNN)
architecture that is well-suited to learning from data and classifying or
predicting time series with time lags of unknown size. LSTMs have been shown
to model long-range dependencies more accurately than conventional RNNs. NeuTM
is an LSTM RNN-based framework for predicting TMs in large networks. By
validating our framework on real-world data from the GEANT network, we show
that our model converges quickly and gives state-of-the-art TM prediction
performance.
Comment: Submitted to NOMS18. arXiv admin note: substantial text overlap with
arXiv:1705.0569
Heteroscedastic Gaussian processes for uncertainty modeling in large-scale crowdsourced traffic data
Accurately modeling traffic speeds is a fundamental part of efficient
intelligent transportation systems. Nowadays, with the widespread deployment of
GPS-enabled devices, it has become possible to crowdsource the collection of
speed information to road users (e.g. through mobile applications or dedicated
in-vehicle devices). Despite its rather wide spatial coverage, crowdsourced
speed data also brings very important challenges, such as the highly variable
measurement noise in the data due to a variety of driving behaviors and sample
sizes. When not properly accounted for, this noise can severely compromise any
application that relies on accurate traffic data. In this article, we propose
the use of heteroscedastic Gaussian processes (HGP) to model the time-varying
uncertainty in large-scale crowdsourced traffic data. Furthermore, we develop a
HGP conditioned on sample size and traffic regime (SRC-HGP), which makes use of
sample size information (probe vehicles per minute) as well as previously
observed speeds, in order to more accurately model the uncertainty in observed
speeds. Using 6 months of crowdsourced traffic data from Copenhagen, we
empirically show that the proposed heteroscedastic models produce
significantly better predictive distributions when compared to current
state-of-the-art methods for both speed imputation and short-term forecasting
tasks.
Comment: 22 pages, Transportation Research Part C: Emerging Technologies
(Elsevier)
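The core heteroscedastic idea can be sketched with standard GP regression in which each observation carries its own noise variance (a diagonal noise matrix instead of a shared sigma^2), here tied to the number of probe vehicles. This is a minimal sketch under assumed kernel, hyperparameters, and noise model; the paper's SRC-HGP additionally conditions the noise process on sample size and traffic regime rather than fixing it.

```python
import numpy as np

def rbf(a, b, ls=1.0, var=1.0):
    """Squared-exponential kernel between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def hgp_predict(x, y, noise_var, x_star, ls=1.0, var=1.0):
    """GP posterior with input-dependent (heteroscedastic) observation noise:
    each training point has its own noise variance on the diagonal."""
    K = rbf(x, x, ls, var) + np.diag(noise_var)
    Ks = rbf(x_star, x, ls, var)
    Kss = rbf(x_star, x_star, ls, var)
    mean = Ks @ np.linalg.solve(K, y)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
n_probes = rng.integers(1, 20, size=50)   # probe vehicles per minute
noise_var = 1.0 / n_probes                # fewer probes -> noisier speed sample
y = np.sin(x) + rng.standard_normal(50) * np.sqrt(noise_var)

mean, var_f = hgp_predict(x, y, noise_var, np.array([2.5, 7.5]))
print(mean.shape)  # (2,)
```

Points backed by many probe vehicles get a small diagonal entry and thus pull the posterior mean more strongly than sparsely observed, noisy points.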
Short-Term Forecasting of Passenger Demand under On-Demand Ride Services: A Spatio-Temporal Deep Learning Approach
Short-term passenger demand forecasting is of great importance to the
on-demand ride service platform, which can incentivize vacant cars moving from
over-supply regions to over-demand regions. Such forecasting is challenging,
however, because spatial, temporal, and exogenous dependences must be
considered simultaneously. We
propose a novel deep learning (DL) approach, named the fusion convolutional
long short-term memory network (FCL-Net), to address these three dependences
within one end-to-end learning architecture. The model is stacked and fused by
multiple convolutional long short-term memory (LSTM) layers, standard LSTM
layers, and convolutional layers. The fusion of convolutional techniques and
the LSTM network enables the proposed DL approach to better capture the
spatio-temporal characteristics and correlations of explanatory variables. A
tailored spatially aggregated random forest is employed to rank the importance
of the explanatory variables. The ranking is then used for feature selection.
The proposed DL approach is applied to the short-term forecasting of passenger
demand under an on-demand ride service platform in Hangzhou, China.
Experimental results, validated on real-world data provided by DiDi Chuxing,
show that the FCL-Net achieves better predictive performance than traditional
approaches including both classical time-series prediction models and neural
network based algorithms (e.g., artificial neural network and LSTM). This paper
is one of the first DL studies to forecast the short-term passenger demand of
an on-demand ride service platform by examining the spatio-temporal
correlations.
Comment: 39 pages, 10 figures
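A much-simplified sketch of the fusion idea above: a convolution extracts spatial features from each demand-grid snapshot, and an LSTM then models the temporal sequence of those features. The actual FCL-Net stacks and fuses convolutional LSTM layers end to end; here convolution and LSTM are simply chained, and all sizes, kernels, and the random data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2-D cross-correlation via explicit loops (single channel)."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def lstm_step(x, h, c, W, U, b):
    """One plain LSTM step over a flattened feature vector."""
    z = W @ x + U @ h + b
    n = len(h)
    i, f, o = (1 / (1 + np.exp(-z[k * n:(k + 1) * n])) for k in range(3))
    g = np.tanh(z[3 * n:])
    c = f * c + i * g
    return o * np.tanh(c), c

# A sequence of 8x8 demand grids (e.g. ride requests per cell per interval).
T, grid, n_hid = 6, 8, 16
kernel = rng.standard_normal((3, 3)) * 0.1
feat_dim = (grid - 2) ** 2                    # flattened 6x6 conv output
W = rng.standard_normal((4 * n_hid, feat_dim)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(T):
    frame = rng.random((grid, grid))          # one demand snapshot
    feats = np.maximum(conv2d(frame, kernel), 0).ravel()  # conv + ReLU
    h, c = lstm_step(feats, h, c, W, U, b)    # temporal modeling

print(h.shape)  # (16,) -- final state, fed to a demand-prediction readout
```

The convolution captures correlations between neighbouring regions within one time step, while the recurrent state carries information across time steps, which is the combination the FCL-Net is designed to exploit.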
Short-term Demand Forecasting for Online Car-hailing Services using Recurrent Neural Networks
Short-term traffic flow prediction is one of the crucial issues in
intelligent transportation systems, which are an important part of smart
cities. Accurate predictions enable both drivers and passengers to make
better decisions about their travel routes, departure times and travel origin
selection, which can be helpful in traffic management. Multiple models and
algorithms based on time-series prediction and machine learning have been
applied to this issue and have achieved acceptable results. Recently, the
availability of sufficient data and computational power has motivated us to
improve prediction accuracy via deep-learning approaches. Recurrent neural
networks have become one of the most popular methods for time-series
forecasting; however, given the variety of these networks, the question of
which type is most appropriate for this task remains open. In this paper, we
use three kinds of recurrent neural networks, namely simple RNN units, GRUs
and LSTMs, to predict short-term traffic flow. The dataset from TAP30
Corporation is used for building the models and comparing the RNNs with
several well-known models, such as DEMA, LASSO and XGBoost. The results show
that all three types of RNNs outperform the others; however, simpler RNNs such
as simple recurrent units and GRUs perform better than LSTMs in terms of
accuracy and training time.
Comment: arXiv admin note: text overlap with arXiv:1706.06279,
arXiv:1804.04176 by other authors
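The accuracy-versus-training-time trade-off among the three cell types is partly explained by their gate counts: a simple RNN has one weight block, a GRU three (update, reset, candidate), an LSTM four (input, forget, output, candidate). A quick per-layer parameter comparison, with layer sizes assumed for illustration:

```python
# Per-layer trainable parameters for the three recurrent cell types.
# gates = number of weight blocks: 1 (simple RNN), 3 (GRU), 4 (LSTM).
def rnn_params(n_in, n_hid, gates):
    return gates * (n_hid * (n_in + n_hid) + n_hid)  # weights + biases

n_in, n_hid = 32, 64  # illustrative layer sizes
counts = {name: rnn_params(n_in, n_hid, g)
          for name, g in [("simple RNN", 1), ("GRU", 3), ("LSTM", 4)]}
print(counts)
# {'simple RNN': 6208, 'GRU': 18624, 'LSTM': 24832}
```

At equal hidden size, an LSTM layer trains four times as many parameters as a simple RNN layer, which is consistent with the longer training times reported above.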