TACT: A Transfer Actor-Critic Learning Framework for Energy Saving in Cellular Radio Access Networks
Recent works have validated the possibility of improving energy efficiency in
radio access networks (RANs) by dynamically turning some base stations (BSs)
on or off. In this paper, we extend the research on BS switching operations,
which should match traffic load variations. Instead of depending on dynamic
traffic loads, which remain quite challenging to forecast precisely, we first
formulate the traffic variations as a Markov decision process. Then, in order
to proactively minimize the energy consumption of RANs, we design a BS
switching scheme based on a reinforcement learning framework. Furthermore, to
avoid the underlying curse of dimensionality in reinforcement learning, we
propose a transfer actor-critic algorithm (TACT), which utilizes learning
expertise transferred from historical periods or neighboring regions, and
prove its convergence. Finally, we evaluate the proposed scheme through
extensive simulations under various practical configurations and show that
the TACT algorithm provides a performance jumpstart and demonstrates the
feasibility of significant energy efficiency improvement at the expense of a
tolerable delay penalty.
Comment: 11 figures, 30 pages; accepted in IEEE Transactions on Wireless Communications, 2014.
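The MDP-plus-actor-critic idea in the abstract above can be illustrated with a minimal, self-contained sketch. Everything here is a toy stand-in, not the paper's actual TACT formulation: the state and action spaces, the reward shape, and the transferred prior are all hypothetical. The sketch only shows the generic mechanics the abstract names — a critic's TD error driving a softmax actor, and a "transfer" warm start obtained by seeding the actor's preferences from another region's policy.

```python
import math
import random

random.seed(0)

# Toy MDP (illustrative only): states are discretized traffic-load levels,
# actions are the number of active base stations.
N_STATES, N_ACTIONS = 4, 3
ALPHA_V, ALPHA_P, GAMMA = 0.1, 0.1, 0.9

def step(state, action):
    # Energy cost grows with active BSs; a delay penalty applies when the
    # load level exceeds the switched-on capacity. Purely illustrative.
    energy = action + 1
    delay = max(0, state - action) * 2.0
    reward = -(energy + delay)
    next_state = random.randrange(N_STATES)  # random traffic fluctuation
    return next_state, reward

def softmax(prefs):
    m = max(prefs)
    e = [math.exp(p - m) for p in prefs]
    s = sum(e)
    return [x / s for x in e]

def run(theta, episodes=2000):
    """Tabular actor-critic: TD-error critic, softmax-preference actor."""
    v = [0.0] * N_STATES
    state, total = 0, 0.0
    for _ in range(episodes):
        probs = softmax(theta[state])
        action = random.choices(range(N_ACTIONS), probs)[0]
        nxt, r = step(state, action)
        td = r + GAMMA * v[nxt] - v[state]      # critic: TD error
        v[state] += ALPHA_V * td                # critic update
        for a in range(N_ACTIONS):              # actor: policy-gradient step
            grad = (1.0 if a == action else 0.0) - probs[a]
            theta[state][a] += ALPHA_P * td * grad
        total += r
        state = nxt
    return total / episodes

# Cold start: uniform action preferences.
cold = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
# Transfer: preferences seeded from a neighboring region's learned policy
# (here a hand-made prior favoring "active BSs roughly match the load").
warm = [[-abs(a - s) * 2.0 for a in range(N_ACTIONS)] for s in range(N_STATES)]

print("cold-start avg reward:", round(run(cold), 2))
print("transfer   avg reward:", round(run(warm), 2))
```

The transferred preferences only change the starting point of learning; both runs use the same update rule, which is what lets the warm start deliver a jumpstart without altering the converged behavior.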
Two-tier Spatial Modeling of Base Stations in Cellular Networks
The Poisson point process (PPP) has been widely adopted as an efficient model
for the spatial distribution of base stations (BSs) in cellular networks.
However, real BS deployments are rarely completely random, owing to the impact
of the environment on actual site planning. In particular, for multi-tier
heterogeneous cellular networks, operators have to place different BSs
according to local coverage and capacity requirements, and the diversity of BS
functions may result in different spatial patterns on each networking tier. In
this paper, we consider a two-tier scenario consisting of macrocell and
microcell BSs. By analyzing the two tiers separately and applying both
classical statistics and network performance as evaluation metrics, we obtain
an accurate spatial model of the BS deployment for each tier. First, we verify
that the PPP is inaccurate for modeling the BS locations of either macrocells
or microcells. Specifically, we find that the first tier, with macrocell BSs,
is dispersed and can be precisely modeled by a Strauss point process, while a
Matern cluster process captures the aggregative nature of the second tier very
well. These statistical models coincide with the inherent properties of
macrocell and microcell BSs, respectively, thus providing a new perspective on
the relationship between the spatial structure and operational functions of BSs.
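The two point-process models named above can be illustrated with a stdlib-only simulation. This sketch draws a homogeneous PPP and a Matern cluster process on the unit square and compares their mean nearest-neighbor distances; all intensities and radii are arbitrary toy values, and the repulsive Strauss process is omitted because sampling it requires MCMC.

```python
import math
import random

random.seed(42)

AREA = 1.0  # unit-square observation window (illustrative)

def poisson_rv(lam):
    # Knuth's method for drawing a Poisson-distributed count.
    l, k, p = math.exp(-lam), 0, 1.0
    while p > l:
        k += 1
        p *= random.random()
    return k - 1

def ppp(intensity):
    """Homogeneous Poisson point process on the unit square."""
    n = poisson_rv(intensity * AREA)
    return [(random.random(), random.random()) for _ in range(n)]

def matern_cluster(parent_intensity, mean_children, radius):
    """Matern cluster process: Poisson parents, Poisson numbers of children
    placed uniformly in a disc around each parent (edge effects ignored)."""
    pts = []
    for px, py in ppp(parent_intensity):
        for _ in range(poisson_rv(mean_children)):
            r = radius * math.sqrt(random.random())
            t = 2 * math.pi * random.random()
            pts.append((px + r * math.cos(t), py + r * math.sin(t)))
    return pts

def mean_nn_distance(pts):
    """Average nearest-neighbor distance, a simple clustering statistic."""
    total = 0.0
    for i, (x, y) in enumerate(pts):
        total += min(math.hypot(x - u, y - v)
                     for j, (u, v) in enumerate(pts) if j != i)
    return total / len(pts)

uniform = ppp(200)
clustered = matern_cluster(20, 10, 0.03)
# Clustered, microcell-like points sit much closer to their neighbors.
print(round(mean_nn_distance(uniform), 4), round(mean_nn_distance(clustered), 4))
```

Under a PPP the mean nearest-neighbor distance is about 0.5/sqrt(intensity), so a clustered pattern with the same point count shows a clearly smaller value — the kind of gap the classical statistics in the paper exploit.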
Traffic Prediction Based on Random Connectivity in Deep Learning with Long Short-Term Memory
Traffic prediction plays an important role in evaluating the performance of
telecommunication networks and has attracted intense research interest. A
significant number of algorithms and models have been put forward to analyze
traffic data and make predictions. In the recent big data era, deep learning
has been exploited to mine the profound information hidden in the data. In
particular, Long Short-Term Memory (LSTM), a kind of recurrent neural network
(RNN), has attracted a lot of attention due to its capability of processing
the long-range dependencies embedded in sequential traffic data. However, LSTM
has a considerable computational cost, which cannot be tolerated in tasks with
stringent latency requirements. In this paper, we propose a deep learning
model based on LSTM, called Random Connectivity LSTM (RCLSTM). In contrast to
the conventional LSTM, RCLSTM forms its neural network by connecting neurons
in a stochastic manner rather than fully. The resulting intrinsic sparsity
leaves many neural connections absent, which reduces both the number of
parameters to be trained and the computational cost. We apply the RCLSTM to
traffic prediction and validate that it still performs satisfactorily with as
little as 35% of the neural connections. As training samples are gradually
added, the performance of RCLSTM approaches that of the baseline LSTM.
Moreover, for input traffic sequences of sufficient length, the RCLSTM
achieves even higher prediction accuracy than the baseline LSTM.
Comment: 6 pages, 9 figures.
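The random-connectivity idea can be sketched with a hand-rolled LSTM cell whose weight entries are zeroed out at random so that only a chosen fraction of connections survives. This is a minimal illustration, not the paper's actual RCLSTM training setup: the layer sizes, initialization, synthetic input signal, and the omission of bias terms are all simplifications.

```python
import math
import random

random.seed(7)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def make_masked_weights(rows, cols, connectivity):
    """Weight matrix whose entries are kept with probability `connectivity`;
    connectivity = 1.0 recovers a fully connected LSTM layer."""
    return [[random.gauss(0, 0.1) if random.random() < connectivity else 0.0
             for _ in range(cols)] for _ in range(rows)]

def lstm_step(x, h, c, W):
    """One LSTM step; W holds the four gate matrices applied to [x; h].
    Biases are omitted to keep the sketch short."""
    z = x + h                                    # concatenated input vector
    gates = [[sum(w * v for w, v in zip(row, z)) for row in Wg] for Wg in W]
    i = [sigmoid(v) for v in gates[0]]           # input gate
    f = [sigmoid(v) for v in gates[1]]           # forget gate
    o = [sigmoid(v) for v in gates[2]]           # output gate
    g = [math.tanh(v) for v in gates[3]]         # candidate cell state
    c = [fv * cv + iv * gv for fv, cv, iv, gv in zip(f, c, i, g)]
    h = [ov * math.tanh(cv) for ov, cv in zip(o, c)]
    return h, c

IN, HIDDEN, CONNECTIVITY = 1, 8, 0.35   # keep ~35% of connections
W = [make_masked_weights(HIDDEN, IN + HIDDEN, CONNECTIVITY) for _ in range(4)]

# Run the sparse cell over a toy traffic-like sequence.
h, c = [0.0] * HIDDEN, [0.0] * HIDDEN
for t in range(20):
    x = [math.sin(0.3 * t)]                      # synthetic traffic sample
    h, c = lstm_step(x, h, c, W)

kept = sum(1 for Wg in W for row in Wg for w in row if w != 0.0)
total = 4 * HIDDEN * (IN + HIDDEN)
print("fraction of connections kept:", round(kept / total, 2))
```

Because absent connections are exactly zero, each gate's matrix-vector product needs proportionally fewer multiplications, which is the source of the computational saving described above.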
Deep Learning with Long Short-Term Memory for Time Series Prediction
Time series prediction can be generalized as a process that extracts useful
information from historical records and then determines future values.
Learning the long-range dependencies embedded in time series is often an
obstacle for most algorithms, whereas Long Short-Term Memory (LSTM) solutions,
a specific kind of deep learning scheme, promise to overcome the problem
effectively. In this article, we first give a brief introduction to the
structure and forward propagation mechanism of the LSTM model. Then, aiming to
reduce the considerable computational cost of LSTM, we put forward the Random
Connectivity LSTM (RCLSTM) model and test it by predicting traffic and user
mobility in telecommunication networks. In contrast to LSTM, RCLSTM is formed
via stochastic connectivity between neurons, a significant departure from the
conventional fully connected architecture of neural networks. In this way, the
RCLSTM model exhibits a certain level of sparsity, which leads to an appealing
decrease in computational complexity and makes the model more applicable in
latency-stringent application scenarios. In the field of telecommunication
networks, the prediction of traffic series and mobility traces could directly
benefit from this improvement, as we further demonstrate that the prediction
accuracy of RCLSTM is comparable to that of the conventional LSTM no matter
how we change the number of training samples or the length of input sequences.
Comment: 9 pages, 5 figures, 14 references.
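For reference, the forward propagation mechanism mentioned above is, in the standard LSTM formulation (with $\sigma$ the logistic sigmoid, $\odot$ the element-wise product, and $[h_{t-1}, x_t]$ the concatenation of the previous hidden state and the current input):

```latex
\begin{aligned}
f_t &= \sigma\bigl(W_f\,[h_{t-1}, x_t] + b_f\bigr) \\
i_t &= \sigma\bigl(W_i\,[h_{t-1}, x_t] + b_i\bigr) \\
o_t &= \sigma\bigl(W_o\,[h_{t-1}, x_t] + b_o\bigr) \\
\tilde{c}_t &= \tanh\bigl(W_c\,[h_{t-1}, x_t] + b_c\bigr) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```

In the RCLSTM variant described above, the weight matrices $W_f, W_i, W_o, W_c$ are made sparse by randomly removing a fraction of their connections, which shrinks every matrix-vector product in these equations.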
Characterizing Spatial Patterns of Base Stations in Cellular Networks
The topology of base stations (BSs) in cellular networks, which serves as a
basis for networking performance analysis, differs markedly from the
traditional hexagonal grid or square lattice model, thus stimulating a
fundamental rethinking. Recently, stochastic geometry based models, especially
the Poisson point process (PPP), have attracted ever-increasing popularity for
modeling the BS deployment of cellular networks, owing to their tractability
and capability of capturing non-uniformity. In this study, a detailed
comparison between common stochastic models and real BS locations is
performed. The results indicate that the PPP fails to precisely characterize
either urban or rural BS deployments. Furthermore, the topology of the real
data in both regions is examined and distinguished by statistical methods
according to the point interaction trends it exhibits. By comparing the
corresponding real data with aggregative point process models as well as
repulsive point process models, we verify that the capacity-centric deployment
in urban areas can be modeled by typical aggregative processes such as the
Matern cluster process, while the coverage-centric deployment in rural areas
can be modeled by representative
- …
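The statistical distinction between aggregative and repulsive point interaction can be made concrete with Ripley's K function, a standard summary statistic for point patterns. This stdlib-only sketch uses synthetic points and a naive, edge-uncorrected estimator, not the paper's data or exact methodology: for a PPP, K(r) is approximately pi*r^2; clustered (aggregative) patterns lie above that benchmark, and dispersed (repulsive) ones below it.

```python
import math
import random

random.seed(3)

def ripley_k(points, r, area=1.0):
    """Naive (edge-uncorrected) estimate of Ripley's K at radius r: the
    average number of further points within r of a typical point, divided
    by the intensity."""
    n = len(points)
    lam = n / area
    count = 0
    for i, (x, y) in enumerate(points):
        for j, (u, v) in enumerate(points):
            if i != j and math.hypot(x - u, y - v) <= r:
                count += 1
    return count / (n * lam)

# Uniform (PPP-like) pattern vs. a clustered pattern on the unit square.
uniform = [(random.random(), random.random()) for _ in range(300)]
clustered = []
for _ in range(30):                         # 30 cluster centers
    cx, cy = random.random(), random.random()
    for _ in range(10):                     # 10 points tightly around each
        clustered.append((cx + random.gauss(0, 0.01),
                          cy + random.gauss(0, 0.01)))

r = 0.05
print("pi*r^2      :", round(math.pi * r * r, 5))
print("K(uniform)  :", round(ripley_k(uniform, r), 5))
print("K(clustered):", round(ripley_k(clustered, r), 5))
```

Comparing an empirical K estimate against the pi*r^2 benchmark is one simple way to classify a BS layout as capacity-centric (clustered) or coverage-centric (dispersed) before fitting a specific process.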