NeuTM: A Neural Network-based Framework for Traffic Matrix Prediction in SDN
This paper presents NeuTM, a framework for network Traffic Matrix (TM)
prediction based on Long Short-Term Memory Recurrent Neural Networks (LSTM
RNNs). TM prediction is defined as the problem of estimating the future network
traffic matrix from previously observed network traffic data. It is
widely used in network planning, resource management and network security. Long
Short-Term Memory (LSTM) is a specific recurrent neural network (RNN)
architecture that is well-suited to learn from data and classify or predict
time series with time lags of unknown size. LSTMs have been shown to model
long-range dependencies more accurately than conventional RNNs. NeuTM is an
LSTM-RNN-based framework for predicting TMs in large networks. By validating our
framework on real-world data from the GEANT network, we show that our model
converges quickly and gives state-of-the-art TM prediction performance.
Comment: Submitted to NOMS18. arXiv admin note: substantial text overlap with
arXiv:1705.0569
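As a rough illustration of the kind of model the abstract describes (not the authors' code), the following is a minimal sketch of a single LSTM cell in NumPy, applied to a toy one-dimensional traffic window; all names, dimensions, and the random initialization are invented for the example:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W: (4H, D), U: (4H, H), b: (4H,).
    Gate order: input, forget, output, candidate."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    o = sigmoid(z[2*H:3*H])      # output gate
    g = np.tanh(z[3*H:4*H])      # candidate cell state
    c = f * c_prev + i * g       # new cell state mixes old memory and new input
    h = o * np.tanh(c)           # new hidden state
    return h, c

def lstm_forward(xs, W, U, b, H):
    """Run a sequence through the cell; return the final hidden state."""
    h = np.zeros(H)
    c = np.zeros(H)
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
    return h

# Toy example: encode a short window of traffic values into a hidden state,
# from which a downstream layer would predict the next value.
rng = np.random.default_rng(0)
D, H = 1, 8
W = rng.standard_normal((4*H, D)) * 0.1
U = rng.standard_normal((4*H, H)) * 0.1
b = np.zeros(4*H)
window = [np.array([v]) for v in [0.2, 0.5, 0.3, 0.7]]
h_final = lstm_forward(window, W, U, b, H)
print(h_final.shape)  # (8,)
```

The forget gate `f` is what lets the cell retain information across long lags, which is why LSTMs handle the long-range dependencies the abstract mentions better than plain RNNs.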
Prediction in cultured cortical neural networks
Theory suggests that networks of neurons may predict their input. Prediction may underlie most aspects of information processing and is believed to be involved in motor and cognitive control and decision-making. Retinal cells have been shown to be capable of predicting visual stimuli, and there is some evidence for prediction of input in the visual cortex and hippocampus. However, there is no proof that the ability to predict is a generic feature of neural networks. We investigated whether random in vitro neuronal networks can predict stimulation, and how prediction is related to short- and long-term memory. To answer these questions, we applied two different stimulation modalities. Focal electrical stimulation has been shown to induce long-term memory traces, whereas global optogenetic stimulation did not. We used mutual information to quantify how much activity recorded from these networks reduces the uncertainty of upcoming stimuli (prediction) or recent past stimuli (short-term memory). Cortical neural networks did predict future stimuli, with the majority of all predictive information provided by the immediate network response to the stimulus. Interestingly, prediction strongly depended on short-term memory of recent sensory inputs during focal as well as global stimulation. However, prediction required less short-term memory during focal stimulation. Furthermore, the dependency on short-term memory decreased during 20 h of focal stimulation, when long-term connectivity changes were induced. These changes are fundamental for long-term memory formation, suggesting that, besides short-term memory, the formation of long-term memory traces may play a role in efficient prediction.
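The mutual-information measure used here can be sketched in a few lines of plain Python. This is a generic plug-in estimator from paired samples, not the authors' analysis pipeline; the binary stimulus/response sequences are invented toy data:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * n * n / (px * py) == p(x,y) / (p(x) * p(y))
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Response perfectly determined by the upcoming stimulus: 1 bit of prediction.
stim     = [0, 1, 0, 1, 0, 1, 0, 1]
response = [0, 1, 0, 1, 0, 1, 0, 1]
print(mutual_information(response, stim))  # 1.0

# Response unrelated to the stimulus: 0 bits.
print(mutual_information([0, 0, 1, 1, 0, 0, 1, 1], stim))  # 0.0
```

Applied to recorded activity versus upcoming stimuli this quantifies prediction; versus past stimuli it quantifies short-term memory, as in the abstract.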
A Novel Distributed Representation of News (DRNews) for Stock Market Predictions
In this study, a novel Distributed Representation of News (DRNews) model is
developed and applied in deep learning-based stock market predictions. With the
merit of integrating contextual information and cross-documental knowledge, the
DRNews model creates news vectors that describe both the semantic information
and potential linkages among news events through an attributed news network.
Two stock market prediction tasks, namely the short-term stock movement
prediction and stock crises early warning, are implemented in the framework of
the attention-based Long Short-Term Memory (LSTM) network. It is suggested that
DRNews substantially enhances the results of both tasks compared with five
baseline news embedding models. Further, the attention mechanism suggests
that short-term stock trends and stock market crises are both influenced by
daily news, with the former demonstrating more critical responses to
information related to the stock market {\em per se}, whilst the latter draws
more concern over the banking sector and economic policies.
Comment: 25 page
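The attention mechanism the abstract refers to can be illustrated with a minimal dot-product attention sketch in NumPy. This is a generic construction, not the DRNews model; the news vectors, query, and dimensions are all invented for the example:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())      # shift for numerical stability
    return e / e.sum()

def attend(news_vectors, query):
    """Score each news vector against a query state (e.g. an LSTM hidden
    state), normalise with softmax, and return the weighted context vector."""
    scores = news_vectors @ query        # dot-product scores, shape (T,)
    weights = softmax(scores)            # attention weights, sum to 1
    context = weights @ news_vectors     # (D,) weighted combination
    return context, weights

rng = np.random.default_rng(1)
news = rng.standard_normal((5, 4))   # 5 daily news vectors, dimension 4
query = rng.standard_normal(4)
context, weights = attend(news, query)
print(weights.sum())  # 1.0 (up to floating point)
```

Inspecting `weights` shows which news items the model attends to, which is how the abstract's observations about market-related versus banking-sector news could be read off a trained model.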
Community Detection and Growth Potential Prediction from Patent Citation Networks
The scoring of patents is useful for technology management analysis.
Practical patent scoring therefore requires citation network clustering and
the prediction of future citations. In this paper, we propose a community
detection method using Node2vec, and, to analyze growth potential, we compare
three time-series analysis methods: Long Short-Term Memory (LSTM), the ARIMA
model, and the Hawkes process. In our experiments, we were able to identify
common technical points in the clusters found by Node2vec. Furthermore, we
found that the prediction accuracy of the ARIMA model was higher than that of
the other models.
Comment: arXiv admin note: text overlap with arXiv:1607.00653 by other authors
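As a toy illustration of the autoregressive family the abstract's ARIMA model belongs to, the following sketch fits a bare AR(1) coefficient by least squares and iterates one-step forecasts of a citation count series. This is not the paper's model (ARIMA also includes differencing and moving-average terms); the synthetic series is invented:

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x_t = phi * x_{t-1}."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast(series, phi, steps):
    """Iterated one-step forecasts of future values."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return out

# Synthetic decaying yearly citation counts following x_t = 0.5 * x_{t-1}.
series = [64.0, 32.0, 16.0, 8.0, 4.0]
phi = fit_ar1(series)
print(phi)                       # 0.5
print(forecast(series, phi, 2))  # [2.0, 1.0]
```

Comparing such forecasts against held-out citation counts is one way the predictive accuracy of ARIMA, LSTM, and Hawkes-process models could be ranked.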
Recognizing and Curating Photo Albums via Event-Specific Image Importance
Automatic organization of personal photos is a problem with many real-world
applications, and can be divided into two main tasks: recognizing the event
type of the photo collection, and selecting interesting images from the
collection. In this paper, we attempt to solve both tasks simultaneously:
album-wise event recognition and image-wise importance prediction. We
collected an album dataset with both event type labels and image importance
labels, refined from the existing CUFED dataset. We propose a hybrid system
consisting of three parts: a siamese network-based event-specific image
importance predictor, a Convolutional Neural Network (CNN) that recognizes the
event type, and a Long Short-Term Memory (LSTM)-based sequence-level event
recognizer. We propose an iterative updating procedure for event type and image
importance score prediction. We experimentally verified that image importance
score prediction and event type recognition can each help the performance of
the other.
Comment: Accepted as oral in BMVC 201
A Long Short-Term Memory Recurrent Neural Network Framework for Network Traffic Matrix Prediction
Network Traffic Matrix (TM) prediction is defined as the problem of
estimating the future network traffic matrix from previously observed network
traffic data. It is widely used in network planning, resource management and
network security. Long Short-Term Memory (LSTM) is a specific recurrent neural
network (RNN) architecture that is well-suited to learn from experience to
classify, process and predict time series with time lags of unknown size. LSTMs
have been shown to model temporal sequences and their long-range dependencies
more accurately than conventional RNNs. In this paper, we propose an LSTM RNN
framework for predicting short- and long-term Traffic Matrices (TMs) in large
networks. By validating our framework on real-world data from GEANT network, we
show that our LSTM models converge quickly and give state-of-the-art TM
prediction performance for relatively small-sized models.
Comment: Submitted for peer review. arXiv admin note: text overlap with
arXiv:1402.1128 by other authors