
    Traffic Prediction Based on Random Connectivity in Deep Learning with Long Short-Term Memory

    Full text link
    Traffic prediction plays an important role in evaluating the performance of telecommunication networks and attracts intense research interest. A significant number of algorithms and models have been put forward to analyse traffic data and make predictions. In the recent big data era, deep learning has been exploited to mine the profound information hidden in the data. In particular, Long Short-Term Memory (LSTM), one kind of Recurrent Neural Network (RNN) scheme, has attracted much attention due to its capability of processing the long-range dependencies embedded in sequential traffic data. However, LSTM has a considerable computational cost, which cannot be tolerated in tasks with stringent latency requirements. In this paper, we propose a deep learning model based on LSTM, called Random Connectivity LSTM (RCLSTM). Compared to the conventional LSTM, RCLSTM makes a notable change to the formation of the neural network: neurons are connected in a stochastic manner rather than fully connected. The RCLSTM thus has a certain intrinsic sparsity, with many neural connections absent (in contrast to full connectivity), which reduces both the number of parameters to be trained and the computational cost. We apply the RCLSTM to traffic prediction and validate that, even with only 35% neural connectivity, it still shows satisfactory performance. As we gradually add training samples, the performance of the RCLSTM moves increasingly closer to that of the baseline LSTM. Moreover, for input traffic sequences of sufficient length, the RCLSTM exhibits prediction accuracy even superior to that of the baseline LSTM. Comment: 6 pages, 9 figures
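    The core idea is to fix a random binary mask over the LSTM weight matrices so that only a fraction of the connections exist. Below is a minimal PyTorch sketch of that idea; the class name, the 35% default density, and the initialization are illustrative assumptions, not the authors' reference code.

```python
import torch
import torch.nn as nn

class RCLSTMCell(nn.Module):
    """LSTM cell whose weights are sparsified by fixed random binary masks,
    so only a fraction of the neural connections exist (sketch only)."""

    def __init__(self, input_size, hidden_size, connectivity=0.35):
        super().__init__()
        # One weight block per gate: input, forget, cell candidate, output.
        self.w_ih = nn.Parameter(torch.randn(4 * hidden_size, input_size) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(4 * hidden_size, hidden_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(4 * hidden_size))
        # Fixed random masks: each connection is kept with probability `connectivity`.
        self.register_buffer("m_ih", (torch.rand_like(self.w_ih) < connectivity).float())
        self.register_buffer("m_hh", (torch.rand_like(self.w_hh) < connectivity).float())

    def forward(self, x, state):
        h, c = state
        gates = (x @ (self.w_ih * self.m_ih).t()
                 + h @ (self.w_hh * self.m_hh).t() + self.bias)
        i, f, g, o = gates.chunk(4, dim=-1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c
```

    Masked connections never contribute to the forward pass, so the number of effective parameters, and hence the computational cost, scales with the chosen connectivity.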

    Deep Learning with Long Short-Term Memory for Time Series Prediction

    Full text link
    Time series prediction can be generalized as a process that extracts useful information from historical records and then determines future values. Learning the long-range dependencies embedded in time series is often an obstacle for most algorithms, whereas Long Short-Term Memory (LSTM) solutions, as a specific kind of deep learning scheme, promise to effectively overcome the problem. In this article, we first give a brief introduction to the structure and forward propagation mechanism of the LSTM model. Then, aiming to reduce the considerable computing cost of LSTM, we put forward the Random Connectivity LSTM (RCLSTM) model and test it by predicting traffic and user mobility in telecommunication networks. In contrast to LSTM, RCLSTM is formed via stochastic connectivity between neurons, a notable departure in how the neural network architecture is constructed. In this way, the RCLSTM model exhibits a certain level of sparsity, which leads to an appealing decrease in computational complexity and makes the RCLSTM model more applicable in latency-stringent application scenarios. In the field of telecommunication networks, the prediction of traffic series and mobility traces could directly benefit from this improvement, as we further demonstrate that the prediction accuracy of the RCLSTM is comparable to that of the conventional LSTM no matter how we change the number of training samples or the length of the input sequences. Comment: 9 pages, 5 figures, 14 references
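    For reference, the LSTM forward propagation that the article introduces follows the standard gate equations (stated here in their common textbook form, not copied from the paper):

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) &\quad&\text{(input gate)}\\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) &&\text{(forget gate)}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) &&\text{(output gate)}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) &&\text{(cell candidate)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t &&\text{(cell state)}\\
h_t &= o_t \odot \tanh(c_t) &&\text{(hidden state)}
\end{aligned}
```

    RCLSTM keeps these equations but randomly removes entries of the W and U matrices, which is where the sparsity and the cost reduction come from.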

    Forecasting Network Traffic: A Survey and Tutorial with Open-Source Comparative Evaluation

    Get PDF
    This paper presents a review of the literature on network traffic prediction, while also serving as a tutorial on the topic. We examine works based on autoregressive moving average models, such as ARMA, ARIMA and SARIMA, as well as works based on Artificial Neural Network approaches, such as RNN, LSTM, GRU, and CNN. In all cases, we provide a complete and self-contained presentation of the mathematical foundations of each technique, which allows the reader to gain a full understanding of how the different proposed methods operate. Further, we perform numerical experiments based on real data sets, which allows the various approaches to be compared directly in terms of fitting quality and computational cost. We make our code publicly available, so that readers can readily access a wide range of forecasting tools, and possibly use them as benchmarks for more advanced solutions.
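    To illustrate the classical end of the spectrum this survey covers, here is a minimal ARIMA forecasting sketch using statsmodels; the synthetic series, the (2, 1, 2) order, and the train/test split are illustrative assumptions, not taken from the paper's evaluation.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic "traffic" series: slow trend + daily-like cycle + noise.
rng = np.random.default_rng(0)
t = np.arange(500)
traffic = 10 + 0.01 * t + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, 500)

train, test = traffic[:480], traffic[480:]
model = ARIMA(train, order=(2, 1, 2)).fit()  # p, d, q chosen for illustration
forecast = model.forecast(steps=len(test))

mae = np.abs(forecast - test).mean()
print(f"20-step-ahead MAE: {mae:.3f}")
```

    Seasonal variants (SARIMA) and the surveyed neural models plug into the same fit/forecast/score loop, which is what makes a side-by-side comparison of fitting quality and computational cost straightforward.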

    Deep Learning for Network Traffic Monitoring and Analysis (NTMA): A Survey

    Get PDF
    Modern communication systems and networks, e.g., Internet of Things (IoT) and cellular networks, generate a massive and heterogeneous amount of traffic data. In such networks, the traditional network management techniques for monitoring and data analytics face challenges, e.g., accuracy and the effective processing of big data in a real-time fashion. Moreover, the pattern of network traffic, especially in cellular networks, shows very complex behavior because of various factors, such as device mobility and network heterogeneity. Deep learning has been efficiently employed to facilitate analytics and knowledge discovery in big data systems by recognizing hidden and complex patterns. Motivated by these successes, researchers in the field of networking apply deep learning models to Network Traffic Monitoring and Analysis (NTMA) applications, e.g., traffic classification and prediction. This paper provides a comprehensive review of applications of deep learning in NTMA. We first provide fundamental background relevant to our review. Then, we give an insight into the confluence of deep learning and NTMA, and review deep learning techniques proposed for NTMA applications. Finally, we discuss key challenges, open issues, and future research directions for using deep learning in NTMA applications.

    STG2Seq: Spatial-temporal Graph to Sequence Model for Multi-step Passenger Demand Forecasting

    Full text link
    Multi-step passenger demand forecasting is a crucial task in on-demand vehicle sharing services. However, predicting passenger demand over multiple time horizons is generally challenging due to the nonlinear and dynamic spatial-temporal dependencies. In this work, we propose to model multi-step citywide passenger demand prediction based on a graph and use a hierarchical graph convolutional structure to capture both spatial and temporal correlations simultaneously. Our model consists of three parts: 1) a long-term encoder to encode historical passenger demands; 2) a short-term encoder to derive the next-step prediction for generating multi-step predictions; 3) an attention-based output module to model the dynamic temporal and channel-wise information. Experiments on three real-world datasets show that our model consistently outperforms many baseline methods and state-of-the-art models. Comment: 7 pages
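    STG2Seq's hierarchical encoders and attention module build on the basic graph-convolution step H' = sigma(A_hat · H · W) over a city-region graph. The sketch below shows only that building block in PyTorch; the region count, feature sizes, and random adjacency are hypothetical, and the paper's hierarchical/attention structure is omitted.

```python
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    """One graph-convolution step H' = ReLU(A_hat @ H @ W): each region
    aggregates its neighbors' demand features through the adjacency matrix."""

    def __init__(self, in_feats, out_feats):
        super().__init__()
        self.lin = nn.Linear(in_feats, out_feats, bias=False)

    def forward(self, h, adj):
        # adj: (n_regions, n_regions), row-normalized, with self-loops.
        return torch.relu(adj @ self.lin(h))

# Toy usage: 10 city regions, each with 8 historical demand features.
n = 10
adj = torch.eye(n) + torch.rand(n, n).round()  # hypothetical graph + self-loops
adj = adj / adj.sum(dim=1, keepdim=True)       # row-normalize
demand = torch.rand(n, 8)
out = GraphConv(8, 16)(demand, adj)            # -> (10, 16) spatial features
```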

    A Quantitative Comparison of Algorithmic and Machine Learning Network Flow Throughput Prediction

    Get PDF
    Applications ranging from video meetings, live streaming, and video games to autonomous vehicle operations and algorithmic trading rely heavily on low-latency communication to operate optimally. A solution to fully support this growing demand for low latency is called dual-queue active queue management (AQM). Dual-queue AQM's functionality is reduced without network traffic throughput prediction. Perhaps due to the current popularity of machine learning, there is a trend to adopt machine learning models over traditional algorithmic throughput prediction approaches without empirical support. This study tested the effectiveness of machine learning, as compared to time series forecasting algorithms, in predicting per-flow network traffic throughput on two separate datasets. It was hypothesized that a machine learning model would surpass the accuracy of an autoregressive integrated moving average algorithm when predicting future per-flow network throughput, as measured by the mean absolute difference between actual and predicted values on two independent datasets created by sampling network traffic. Autoregressive integrated moving average (ARIMA), a deep neural network (DNN) architecture, and a long short-term memory (LSTM) neural network architecture were used to predict future network throughput in the two datasets. Dataset one was used to establish the initial performance benchmarks; findings were replicated with a second dataset. The results showed that all three models performed well, and ANOVA failed to demonstrate a statistically significant advantage of machine learning over the algorithmic model: for dataset one, F = 0.138 and p = 0.983; for dataset two, F = 0.087 and p = 0.994. The coefficient of determination was used to test model fit in the two datasets; R-squared ranged from 0.971 to 0.983 for the machine learning models and from 0.759 to 0.963 for the algorithmic model. These findings show no evidence of a significant advantage in applying machine learning to per-flow throughput prediction in the two datasets that were tested. While machine learning has been a popular approach to throughput prediction, the effort and complexity of building such systems may instead warrant the use of algorithmic forecasting models in rapid prototyping environments. Whether these findings can be generalized to more extensive and variable datasets is a question for future research.
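    The study's headline metric is the mean absolute difference (MAE) between actual and predicted per-flow throughput. A tiny sketch of that comparison follows; the throughput samples and predictions are made-up numbers for illustration only.

```python
import numpy as np

def mae(actual, predicted):
    """Mean absolute error: the comparison metric used in the study."""
    return np.abs(np.asarray(actual) - np.asarray(predicted)).mean()

# Hypothetical per-flow throughput samples (Mbps) and two models' predictions.
actual     = np.array([94.1, 95.7, 93.8, 96.2, 95.0])
arima_pred = np.array([93.8, 95.2, 94.4, 95.9, 95.3])
lstm_pred  = np.array([94.5, 95.5, 93.5, 96.0, 94.8])

print(f"ARIMA MAE: {mae(actual, arima_pred):.3f}")
print(f"LSTM  MAE: {mae(actual, lstm_pred):.3f}")
```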

    From statistical- to machine learning-based network traffic prediction

    Get PDF
    Nowadays, due to the exponential and continuous expansion of new paradigms such as the Internet of Things (IoT), the Internet of Vehicles (IoV) and 6G, the world is witnessing a tremendous and sharp increase in network traffic. In such large-scale, heterogeneous, and complex networks, the volume of transferred data (big data) is considered a challenge that causes various networking inefficiencies. To overcome these challenges, various techniques have been introduced to monitor the performance of networks, collectively called Network Traffic Monitoring and Analysis (NTMA). Network Traffic Prediction (NTP) is a significant subfield of NTMA which focuses mainly on predicting the future network load and its behavior. NTP techniques can generally be realized in two ways, that is, statistical- and Machine Learning (ML)-based. In this paper, we provide a study of existing NTP techniques by reviewing, investigating, and classifying the recent relevant works conducted in this field. Additionally, we discuss the challenges and future directions of NTP, showing how ML and statistical techniques can be used to address them.