Context-Aware Resource Allocation in Cellular Networks
We propose a resource allocation architecture for cellular
networks. The architecture combines content-aware, time-aware and
location-aware resource allocation for next generation broadband wireless
systems. The architecture ensures content-aware resource allocation by
prioritizing users of real-time applications over users of delay-tolerant
applications when allocating resources. It enables time-aware resource allocation via
traffic-dependent pricing that varies during different hours of day (e.g. peak
and off-peak traffic hours). Additionally, location-aware resource allocation
is integrable in this architecture by including carrier aggregation of various
frequency bands. Context-aware resource allocation is an optimal and
flexible architecture that can be readily implemented in practical cellular
networks. We highlight the advantages of the proposed network architecture with
a discussion of future research directions for context-aware resource
allocation architecture. We also provide experimental results to illustrate a
general proof of concept for this new architecture.
Comment: (c) 2015 IEEE. Personal use of this material is permitted. Permission
from IEEE must be obtained for all other uses, in any current or future
media, including reprinting/republishing this material for advertising or
promotional purposes, creating new collective works, for resale or
redistribution to servers or lists, or reuse of any copyrighted component of
this work in other works.
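The content-aware prioritization and time-of-day pricing described in this abstract can be illustrated with a toy sketch. This is not the paper's algorithm: the resource-block counts, tariffs, and the peak-hour window below are hypothetical values chosen for the example.

```python
# Illustrative sketch (hypothetical parameters): content-aware allocation
# serves real-time (RT) users before delay-tolerant (DT) users, and a
# time-aware tariff is higher during an assumed peak window of 8-22 h.

def price_per_block(hour):
    """Time-aware pricing: higher tariff during (assumed) peak hours."""
    return 2.0 if 8 <= hour < 22 else 1.0

def allocate(users, total_blocks):
    """Content-aware allocation: grant RT users their demand first,
    then DT users, until the resource blocks run out."""
    rt_first = sorted(users, key=lambda u: u["type"] != "RT")
    grants = {}
    for u in rt_first:
        granted = min(u["demand"], total_blocks)
        grants[u["id"]] = granted
        total_blocks -= granted
    return grants

users = [
    {"id": "dt1", "type": "DT", "demand": 6},
    {"id": "rt1", "type": "RT", "demand": 4},
    {"id": "rt2", "type": "RT", "demand": 3},
]
grants = allocate(users, total_blocks=8)
print(grants)  # RT users fully served; the DT user gets the remainder
```

With 8 blocks available, both RT users receive their full demand (4 and 3 blocks) and the DT user gets the single remaining block.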
Traffic-Driven Spectrum Allocation in Heterogeneous Networks
Next generation cellular networks will be heterogeneous with dense deployment
of small cells in order to deliver high data rate per unit area. Traffic
variations are more pronounced in a small cell, which in turn lead to more
dynamic interference to other cells. It is crucial to adapt radio resource
management to traffic conditions in such a heterogeneous network (HetNet). This
paper studies the optimization of spectrum allocation in HetNets on a
relatively slow timescale based on average traffic and channel conditions
(typically over seconds or minutes). Specifically, in a cluster with base
transceiver stations (BTSs), the optimal partition of the spectrum into
segments is determined, corresponding to all possible spectrum reuse patterns
in the downlink. Each BTS's traffic is modeled using a queue with Poisson
arrivals, the service rate of which is a linear function of the combined
bandwidth of all assigned spectrum segments. With the system average packet
sojourn time as the objective, a convex optimization problem is first
formulated, where it is shown that the optimal allocation divides the spectrum
into at most segments. A second, refined model is then proposed to address
queue interactions due to interference, where the corresponding optimal
allocation problem admits an efficient suboptimal solution. Both allocation
schemes attain the entire throughput region of a given network. Simulation
results show the two schemes perform similarly in the heavy-traffic regime, in
which case they significantly outperform both the orthogonal allocation and the
full-frequency-reuse allocation. The refined allocation shows the best
performance under all traffic conditions.
Comment: 13 pages, 11 figures, accepted for publication by JSAC-HC.
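The queueing model in this abstract (Poisson arrivals, service rate linear in assigned bandwidth) can be illustrated with a toy two-BTS instance: each BTS is treated as an M/M/1 queue, and a one-dimensional search picks the bandwidth split that minimizes the system average packet sojourn time. The arrival rates and total bandwidth are hypothetical, and the grid search stands in for the paper's convex-optimization machinery.

```python
# Toy two-BTS version of the slow-timescale model: BTS i is an M/M/1 queue
# with service rate c_i * b_i, where b_i is its assigned bandwidth. We split
# a total band W between two orthogonal BTSs to minimize the system average
# sojourn time. All rates are hypothetical illustration values.

def avg_sojourn(b1, W, lam=(4.0, 2.0), c=(1.0, 1.0)):
    """System average sojourn time via Little's law; returns inf if
    either queue is unstable (service rate <= arrival rate)."""
    b = (b1, W - b1)
    n_total = 0.0
    for lam_i, c_i, b_i in zip(lam, c, b):
        mu = c_i * b_i
        if mu <= lam_i:
            return float("inf")
        n_total += lam_i / (mu - lam_i)  # mean number in queue i
    return n_total / sum(lam)

def best_split(W, steps=10_000):
    """Grid search over the convex objective (adequate for a 1-D sketch)."""
    candidates = (W * k / steps for k in range(1, steps))
    return min(candidates, key=lambda b1: avg_sojourn(b1, W))

W = 10.0
b1 = best_split(W)
print(round(b1, 2), round(avg_sojourn(b1, W), 3))
```

As expected, the busier BTS (arrival rate 4 vs. 2) receives the larger bandwidth share, and any split that leaves a queue unstable is rejected outright.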
Nearly Optimal Resource Allocation for Downlink OFDMA in 2-D Cellular Networks
In this paper, we propose a resource allocation algorithm for the downlink of
sectorized two-dimensional (2-D) OFDMA cellular networks assuming statistical
Channel State Information (CSI) and fractional frequency reuse. The proposed
algorithm can be implemented in a distributed fashion without the need for any
central controller. Its performance is analyzed assuming fast-fading
Rayleigh channels and Gaussian distributed multicell interference. We show that
the transmit power of this simple algorithm tends, as the number of users grows
to infinity, to the same limit as the minimal power required to satisfy all
users' rate requirements, i.e., the proposed resource allocation algorithm is
asymptotically optimal. As a byproduct of this asymptotic analysis, we
characterize a relevant value of the reuse factor that only depends on an
average state of the network.
Comment: submitted to IEEE Transactions on Wireless Communications.
Benchmarking Practical RRM Algorithms for D2D Communications in LTE Advanced
Device-to-device (D2D) communication integrated into cellular networks is a
means to take advantage of the proximity of devices and allow for reusing
cellular resources and thereby to increase the user bitrates and the system
capacity. However, when D2D communication (also called Long Term Evolution
(LTE) Direct in the 3rd Generation Partnership Project) in cellular spectrum is
supported, there is a need to revisit and modify the existing radio resource
management (RRM) and power control (PC) techniques to realize the potential of
the proximity and reuse gains and to limit the interference at the cellular
layer. In this paper, we examine the performance of the flexible LTE PC
toolbox and benchmark it against a utility optimal iterative scheme. We find that
the open loop PC scheme of LTE performs well for cellular users both in terms
of the used transmit power levels and the achieved
signal-to-interference-and-noise-ratio (SINR) distribution. However, the
performance of the D2D users as well as the overall system throughput can be
boosted by the utility optimal scheme, because the utility maximizing scheme
takes better advantage of both the proximity and the reuse gains. Therefore, in
this paper we propose a hybrid PC scheme, in which cellular users employ the
open loop path compensation method of LTE, while D2D users use the utility
optimizing distributed PC scheme. In order to protect the cellular layer, the
hybrid scheme allows for limiting the interference caused by the D2D layer at
the cost of having a small impact on the performance of the D2D layer. To
ensure feasibility, we limit the number of iterations to a practically feasible
level. We make the point that the hybrid scheme is not only near optimal, but
it also allows for a distributed implementation for the D2D users, while
preserving the LTE PC scheme for the cellular users.
Comment: 30 pages, submitted for review April-2013. See also: G. Fodor, M.
Johansson, D. P. Demia, B. Marco, and A. Abrardo, A joint power control and
resource allocation algorithm for D2D communications, KTH, Automatic Control,
Tech. Rep., 2012, QC 20120910,
http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-10205
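The open-loop power control scheme this abstract refers to is based on fractional path-loss compensation: roughly, P_tx = min(P_max, P0 + 10*log10(M) + alpha * PL) in dBm, where M is the number of allocated resource blocks and alpha in [0, 1] sets how much of the path loss PL is compensated. A minimal sketch of that rule follows; the numeric parameter values are hypothetical, not taken from the paper.

```python
# Sketch of LTE-style open-loop fractional path-loss compensation.
# P0, alpha, and the power cap below are hypothetical example values.
import math

def open_loop_tx_power(path_loss_db, n_rb=1, p0_dbm=-80.0,
                       alpha=0.8, p_max_dbm=23.0):
    """Open-loop transmit power (dBm): fractional compensation of the
    path loss, clipped at the device's maximum transmit power."""
    p = p0_dbm + 10.0 * math.log10(n_rb) + alpha * path_loss_db
    return min(p_max_dbm, p)

# A cell-edge user (high path loss) hits the power cap, while a
# cell-center user transmits well below it, limiting interference.
print(open_loop_tx_power(140.0))  # -80 + 0.8*140 = 32 -> capped at 23.0 dBm
print(open_loop_tx_power(90.0))   # -80 + 0.8*90 = -8.0 dBm
```

The hybrid scheme proposed in the paper keeps this simple rule for cellular users while D2D pairs run the iterative utility-optimizing update.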
Interference Management Based on RT/nRT Traffic Classification for FFR-Aided Small Cell/Macrocell Heterogeneous Networks
Cellular networks constantly lag behind the bandwidth needed to
support growing high-data-rate demands. The system needs to efficiently
allocate its frequency spectrum such that the spectrum utilization can be
maximized while ensuring the quality of service (QoS) level. Owing to the
coexistence of different types of traffic (e.g., real-time (RT) and
non-real-time (nRT)) and different types of networks (e.g., small cell and
macrocell), ensuring the QoS level for different types of users becomes a
challenging issue in wireless networks. Fractional frequency reuse (FFR) is an
effective approach for increasing spectrum utilization and reducing
interference effects in orthogonal frequency division multiple access networks.
In this paper, we propose a new FFR scheme in which bandwidth allocation is
based on RT/nRT traffic classification. We consider the coexistence of small
cells and macrocells. After applying the FFR technique in the macrocells, the
remaining frequency bands are efficiently allocated among the small cells
overlaid by each macrocell. In our proposed scheme, total frequency-band allocations for
different macrocells are decided on the basis of the traffic intensity. The
transmitted power levels for different frequency bands are controlled based on
the level of interference from a nearby frequency band. Frequency bands with a
lower level of interference are assigned to the RT traffic to ensure a higher
QoS level for the RT traffic. RT traffic calls in macrocell networks are also
given a higher priority than nRT traffic calls to ensure a low
call-blocking rate. Performance analyses show significant improvement under the
proposed scheme compared with conventional FFR schemes.
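The interference-based band assignment described in this abstract can be sketched simply: rank the frequency bands by measured interference and hand the least-interfered ones to RT traffic. The band labels and interference levels (dBm) below are hypothetical.

```python
# Toy sketch of the RT/nRT band-assignment rule: bands with the lowest
# measured interference go to real-time (RT) traffic, the rest to
# non-real-time (nRT) traffic. Values are hypothetical.

def split_bands(interference_dbm, n_rt):
    """Assign the n_rt least-interfered bands to RT traffic."""
    ranked = sorted(interference_dbm, key=interference_dbm.get)
    return ranked[:n_rt], ranked[n_rt:]

bands = {"B1": -95.0, "B2": -110.0, "B3": -102.0, "B4": -88.0}
rt, nrt = split_bands(bands, n_rt=2)
print(rt, nrt)  # ['B2', 'B3'] ['B1', 'B4']
```

Here the two quietest bands (B2 at -110 dBm and B3 at -102 dBm) are reserved for RT traffic, matching the scheme's goal of a higher QoS level for RT users.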
Cooperative Interference Control for Spectrum Sharing in OFDMA Cellular Systems
This paper studies cooperative schemes for inter-cell interference
control in orthogonal frequency-division multiple-access (OFDMA) cellular
systems. The downlink transmission in a simplified two-cell system is examined,
where both cells simultaneously access the same frequency band using OFDMA. The
joint power and subcarrier allocation over the two cells is investigated for
maximizing their sum throughput with both centralized and decentralized
implementations. Particularly, the decentralized allocation is achieved via a
new cooperative interference control approach, whereby the two cells
independently implement resource allocation to maximize individual throughput
in an iterative manner, subject to a set of mutual interference power
constraints. Simulation results show that the proposed decentralized resource
allocation schemes achieve the system throughput close to that by the
centralized scheme, and provide substantial throughput gains over existing
schemes.
Comment: To appear in ICC201
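The decentralized approach in this abstract, where each cell iteratively maximizes its own throughput subject to mutual interference constraints, can be sketched with a best-response loop. In this sketch a per-subcarrier power cap stands in for the paper's interference power constraints, all channel gains are hypothetical, and the paper's exact update rule differs in detail.

```python
# Best-response sketch of decentralized two-cell allocation: each cell in
# turn water-fills its downlink power over the shared subcarriers, treating
# the other cell's current interference as noise, with a per-subcarrier
# power cap as a stand-in for mutual interference constraints.

def waterfill(gains, interference, p_total, p_cap, noise=1.0):
    """Capped water-filling via bisection on the water level."""
    inv = [(noise + i) / g for g, i in zip(gains, interference)]
    lo, hi = 0.0, p_total + max(inv)
    for _ in range(60):
        mu = (lo + hi) / 2
        p = [min(p_cap, max(0.0, mu - v)) for v in inv]
        if sum(p) > p_total:
            hi = mu
        else:
            lo = mu
    return p

def best_response_rounds(g_direct, g_cross, p_total=4.0, p_cap=2.0, rounds=20):
    """Alternate per-cell updates; g_cross[a][k] is the gain from cell a
    to the other cell's user on subcarrier k."""
    K = len(g_direct[0])
    p = [[p_total / K] * K for _ in range(2)]
    for _ in range(rounds):
        for a in range(2):
            b = 1 - a
            interf = [p[b][k] * g_cross[b][k] for k in range(K)]
            p[a] = waterfill(g_direct[a], interf, p_total, p_cap)
    return p

p = best_response_rounds([[2.0, 1.0, 0.5], [0.5, 1.0, 2.0]],
                         [[0.2, 0.2, 0.2], [0.2, 0.2, 0.2]])
```

Each cell ends up concentrating power on its strongest subcarriers while the cap bounds the interference it can inflict on its neighbor, mirroring the iterative individual-throughput maximization the abstract describes.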