759 research outputs found
Interference Management Based on RT/nRT Traffic Classification for FFR-Aided Small Cell/Macrocell Heterogeneous Networks
Cellular networks constantly lag behind the bandwidth needed to support
growing high-data-rate demands. The system needs to efficiently
allocate its frequency spectrum such that the spectrum utilization can be
maximized while ensuring the quality of service (QoS) level. Owing to the
coexistence of different types of traffic (e.g., real-time (RT) and
non-real-time (nRT)) and different types of networks (e.g., small cell and
macrocell), ensuring the QoS level for different types of users becomes a
challenging issue in wireless networks. Fractional frequency reuse (FFR) is an
effective approach for increasing spectrum utilization and reducing
interference effects in orthogonal frequency division multiple access networks.
In this paper, we propose a new FFR scheme in which bandwidth allocation is
based on RT/nRT traffic classification. We consider the coexistence of small
cells and macrocells. After applying the FFR technique in macrocells, the remaining
frequency bands are efficiently allocated among the small cells overlaid by a
macrocell. In our proposed scheme, total frequency-band allocations for
different macrocells are decided on the basis of the traffic intensity. The
transmitted power levels for different frequency bands are controlled based on
the level of interference from a nearby frequency band. Frequency bands with a
lower level of interference are assigned to the RT traffic to ensure a higher
QoS level for the RT traffic. RT traffic calls in macrocell networks are also
given higher priority than nRT traffic calls to ensure a low call-blocking
rate. Performance analyses show significant improvement under the proposed
scheme compared with conventional FFR schemes.
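The core allocation rule, quiet bands to RT calls first, can be sketched as follows. This is an illustrative reading of the scheme, not the paper's algorithm: the band names, interference values, and call identifiers are all assumed for the example.

```python
# Sketch of the band-assignment idea: bands with lower measured
# interference go to real-time (RT) calls first; all names and
# values below are illustrative, not taken from the paper.

def assign_bands(bands, rt_calls, nrt_calls):
    """bands: list of (band_id, interference_level) tuples.
    Returns {call_id: band_id}, giving RT calls the quietest bands."""
    ordered = sorted(bands, key=lambda b: b[1])  # quietest band first
    queue = rt_calls + nrt_calls                 # RT calls get strict priority
    allocation = {}
    for call, (band_id, _) in zip(queue, ordered):
        allocation[call] = band_id
    return allocation

bands = [("B1", 0.7), ("B2", 0.2), ("B3", 0.5)]
alloc = assign_bands(bands, rt_calls=["rt1"], nrt_calls=["n1", "n2"])
# rt1 receives B2, the band with the least interference
```

In a real system the interference levels would come from measurements of nearby frequency bands, as the abstract describes.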
Hybrid Spectrum Allocation Scheme in Wireless Cellular Networks
Mobile services have seen a major upswing driven by bandwidth-hungry applications, leading to higher data-rate requirements on wireless networks. Spectrum, the most precious resource in the wireless industry, is therefore of keen interest. Various spectrum-assignment and frequency-reuse schemes have been proposed in the literature. However, future networks call for dynamic schemes that adapt to spatio-temporal variation in the environment. We thus present a hybrid spectrum-assignment scheme that adapts its allocation strategy to the user distribution in the system. Results show that the proposed dynamic spectrum-assignment strategy improves spectrum utilization, thereby providing a higher data rate for the users.
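One simple way to "adapt allocation to user distribution" is to switch reuse strategy when too many users sit at the cell edge. The rule, threshold, and strategy names below are assumptions for illustration only; the abstract does not specify the hybrid scheme's switching criterion.

```python
def choose_strategy(edge_users, total_users, threshold=0.4):
    """Pick a reuse strategy from the user distribution.
    Illustrative rule: many cell-edge users favour partitioned reuse
    (better edge SINR); few favour full reuse (better utilization).
    The 0.4 threshold is an assumed tuning parameter."""
    edge_fraction = edge_users / total_users if total_users else 0.0
    return "partitioned_reuse" if edge_fraction > threshold else "full_reuse"

choose_strategy(10, 100)  # few edge users  -> "full_reuse"
choose_strategy(60, 100)  # many edge users -> "partitioned_reuse"
```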
A Novel Multiobjective Cell Switch-Off Framework for Cellular Networks
Cell Switch-Off (CSO) is recognized as a promising approach to reduce the
energy consumption in next-generation cellular networks. However, CSO poses
serious challenges not only from the resource allocation perspective but also
from the implementation point of view. Indeed, CSO represents a difficult
optimization problem due to its NP-complete nature. Moreover, there are a
number of important practical limitations in the implementation of CSO schemes,
such as the need for minimizing the real-time complexity and the number of
on-off/off-on transitions and CSO-induced handovers. This article introduces a
novel approach to CSO based on multiobjective optimization that makes use of
the statistical description of the service demand (known by operators). In
addition, downlink and uplink coverage criteria are included and a comparative
analysis between different models to characterize intercell interference is
also presented to shed light on their impact on CSO. The framework
distinguishes itself from other proposals in two ways: 1) The number of
on-off/off-on transitions as well as handovers are minimized, and 2) the
computationally-heavy part of the algorithm is executed offline, which makes
its implementation feasible. The results show that the proposed scheme achieves
substantial energy savings in small cell deployments where service demand is
not uniformly distributed, without compromising the Quality-of-Service (QoS) or
requiring heavy real-time processing.
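Because the heavy part runs offline, the multiobjective step can be as direct as enumerating on/off patterns for a small cluster and keeping the Pareto-optimal ones for energy versus unserved demand. This toy sketch illustrates that idea only; the paper's objectives also include transitions, handovers, and coverage criteria, and the numbers here are made up.

```python
from itertools import product

def pareto_cso(cell_power, cell_capacity, demand):
    """Offline brute force over cell on/off patterns, keeping the
    Pareto front for (energy used, demand left unserved).
    Toy illustration of the multiobjective CSO idea."""
    candidates = []
    for pattern in product([0, 1], repeat=len(cell_power)):
        energy = sum(p * w for p, w in zip(pattern, cell_power))
        capacity = sum(p * c for p, c in zip(pattern, cell_capacity))
        unserved = max(0.0, demand - capacity)
        candidates.append((pattern, energy, unserved))
    # Keep patterns not dominated in both objectives by any other pattern
    return [c for c in candidates
            if not any(o[1] <= c[1] and o[2] <= c[2] and
                       (o[1] < c[1] or o[2] < c[2]) for o in candidates)]

front = pareto_cso(cell_power=[10, 10, 10], cell_capacity=[5, 5, 5], demand=8)
# all-on (energy 30) is dominated: two cells already serve the demand
```

An operator would precompute such fronts per demand profile and simply look up the pattern at run time, which is what keeps the real-time complexity low.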
Joint Routing and STDMA-based Scheduling to Minimize Delays in Grid Wireless Sensor Networks
In this report, we study the issue of delay optimization and energy
efficiency in grid wireless sensor networks (WSNs). We focus on STDMA (Spatial
Reuse TDMA) scheduling, where a predefined cycle is repeated, and where each
node has fixed transmission opportunities during specific slots (defined by
colors). We assume a STDMA algorithm that takes advantage of the regularity of
grid topology to also provide a spatially periodic coloring ("tiling" of the
same color pattern). In this setting, the key challenges are: 1) minimizing the
average routing delay by ordering the slots in the cycle, and 2) being energy
efficient. Our work follows two directions: first, the baseline performance is
evaluated when nothing specific is done and the colors are randomly ordered in
the STDMA cycle. Then, we propose a solution, ORCHID, that deliberately
constructs an efficient STDMA schedule. It proceeds in two steps. In the first
step, ORCHID starts from a colored grid and builds a hierarchical routing based
on these colors. In the second step, ORCHID builds a color ordering, by
considering jointly both routing and scheduling so as to ensure that any node
will reach a sink in a single STDMA cycle. We study the performance of these
solutions by means of simulations and modeling. Results show the excellent
performance of ORCHID in terms of delays and energy compared to a shortest path
routing that uses the delay as a heuristic. We also present the adaptation of
ORCHID to general networks under the SINR interference model.
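The ordering step can be pictured with a simplification: if colors are ordered so that nodes farther from the sink transmit earlier in the cycle, every hop of a route fires after the previous one, and a packet can traverse its whole route within a single STDMA cycle. This is a simplified reading of ORCHID's joint routing/ordering step; the depth-per-color mapping is an assumed input, not the paper's construction.

```python
def order_slots(depth_of_color):
    """Order transmit colors so deeper nodes (more hops to the sink)
    get earlier slots in the STDMA cycle, letting a packet reach the
    sink in one cycle. Simplified sketch of the ordering idea."""
    return sorted(depth_of_color, key=lambda c: -depth_of_color[c])

# Colors tagged with routing depth (hops to the sink):
order_slots({"red": 2, "green": 0, "blue": 1})  # -> ['red', 'blue', 'green']
```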
A SON Solution for Sleeping Cell Detection Using Low-Dimensional Embedding of MDT Measurements
Automatic detection of cells which are in outage has been identified as one of the key use cases for Self-Organizing Networks (SON) for emerging and future generations of cellular systems. A special case of cell outage, referred to as a Sleeping Cell (SC), remains particularly challenging to detect in state-of-the-art SON because in this case the cell goes into outage, or may perform poorly, without triggering an alarm for the Operation and Maintenance (O&M) entity. Consequently, no SON compensation function can be launched unless the SC situation is detected via drive tests or through complaints registered by the affected customers. In this paper, we present a novel solution to address this problem that makes use of minimization of drive test (MDT) measurements recently standardized by 3GPP and NGMN. To overcome the processing-complexity challenge, the MDT measurements are projected to a low-dimensional space using the multidimensional scaling method. We then apply state-of-the-art k-nearest-neighbor and local-outlier-factor-based anomaly detection models together with the pre-processed MDT measurements to profile the network behaviour and to detect SCs. Our numerical results show that our proposed solution can automate the SC detection process with 93% accuracy.
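The detection principle can be sketched with a plain k-NN distance score on the embedded measurement points: samples far from their nearest neighbours are flagged as anomalous. This is a minimal stand-in for the paper's k-NN and local-outlier-factor detectors, assuming the MDT reports have already been embedded in 2-D; the points below are synthetic.

```python
import math

def knn_anomaly_scores(points, k=2):
    """Score each embedded MDT sample by the mean distance to its k
    nearest neighbours; large scores flag anomalies (a possible
    sleeping cell). Simple k-NN sketch, not the paper's full detector."""
    scores = []
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        scores.append(sum(dists[:k]) / k)
    return scores

normal = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]
points = normal + [(5.0, 5.0)]  # synthetic outlier mimicking degraded reports
scores = knn_anomaly_scores(points)
# the outlier's score is far larger than any normal sample's
```

In the paper's pipeline, multidimensional scaling produces the low-dimensional points, and the local outlier factor refines this raw distance score by comparing each sample's density with its neighbours'.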
A cell outage management framework for dense heterogeneous networks
In this paper, we present a novel cell outage management (COM) framework for heterogeneous networks with split control and data planes-a candidate architecture for meeting future capacity, quality-of-service, and energy efficiency demands. In such an architecture, the control and data functionalities are not necessarily handled by the same node. The control base stations (BSs) manage the transmission of control information and user equipment (UE) mobility, whereas the data BSs handle UE data. An implication of this split architecture is that an outage to a BS in one plane has to be compensated by other BSs in the same plane. Our COM framework addresses this challenge by incorporating two distinct cell outage detection (COD) algorithms to cope with the idiosyncrasies of both data and control planes. The COD algorithm for control cells leverages the relatively larger number of UEs in the control cell to gather large-scale minimization-of-drive-test report data and detects an outage by applying machine learning and anomaly detection techniques. To improve outage detection accuracy, we also investigate and compare the performance of two anomaly-detecting algorithms, i.e., k-nearest-neighbor- and local-outlier-factor-based anomaly detectors, within the control COD. On the other hand, for data cell COD, we propose a heuristic Grey-prediction-based approach, which can work with the small number of UE in the data cell, by exploiting the fact that the control BS manages UE-data BS connectivity and by receiving a periodic update of the received signal reference power statistic between the UEs and data BSs in its coverage. The detection accuracy of the heuristic data COD algorithm is further improved by exploiting the Fourier series of the residual error that is inherent to a Grey prediction model. Our COM framework integrates these two COD algorithms with a cell outage compensation (COC) algorithm that can be applied to both planes. 
Our COC solution utilizes an actor-critic-based reinforcement learning algorithm, which optimizes the capacity and coverage of the identified outage zone in a plane, by adjusting the antenna gain and transmission power of the surrounding BSs in that plane. The simulation results show that the proposed framework can detect both data and control cell outage and compensate for the detected outage in a reliable manner.
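The Grey prediction at the heart of the data-plane COD is the standard GM(1,1) model, which fits a first-order Grey differential equation to a short series and extrapolates one step ahead. The sketch below is a minimal GM(1,1) one-step predictor, e.g. for an RSRP statistic; it omits the Fourier residual correction the paper adds, and the test series is synthetic.

```python
import math

def gm11_next(series):
    """One-step GM(1,1) Grey prediction of the next value of a short
    positive series. Minimal stand-in for the data-plane COD
    predictor; no Fourier residual correction is applied here."""
    n = len(series)
    x1 = [sum(series[:i + 1]) for i in range(n)]          # cumulative sum
    z = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]  # mean sequence
    y = series[1:]
    # Least-squares fit of x0(k) = -a*z(k) + b (normal equations by hand)
    m = n - 1
    sz, szz = sum(z), sum(v * v for v in z)
    sy, szy = sum(y), sum(v * w for v, w in zip(z, y))
    a = -(m * szy - sz * sy) / (m * szz - sz * sz)
    b = (sy + a * sz) / m
    # Whitenization solution: predicted cumulative value at step n+1,
    # then difference back to the original series
    x1_next = (series[0] - b / a) * math.exp(-a * n) + b / a
    return x1_next - x1[-1]

gm11_next([1.0, 1.1, 1.21, 1.331])  # close to the true next term 1.4641
```

GM(1,1) works well on short, near-exponential series, which is why it suits data cells that serve only a handful of UEs and therefore yield few measurement reports.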
- …