Intercell interference mitigation in long term evolution (LTE) and LTE-advanced
University of Technology Sydney, Faculty of Engineering and Information Technology. Bandwidth is one of the limited resources in Long Term Evolution (LTE) and LTE-Advanced (LTE-A) networks; new resource allocation techniques such as frequency reuse are therefore needed to increase capacity in LTE and LTE-A. However, using the same frequency in adjacent cells severely degrades system performance because of increased intercell interference, so intercell interference management is critical to improving the performance of cellular mobile networks. This thesis aims to mitigate intercell interference in downlink LTE and LTE-A networks.
The first part of this thesis introduces a new intercell interference coordination scheme to mitigate downlink intercell interference in a macrocell-macrocell scenario, based on user priority and a fuzzy logic system (FLS). An FLS is an expert system that maps inputs to outputs using "IF...THEN" rules and an aggregation method; the final output is then obtained through a defuzzification step. Since this thesis aims to mitigate interference in downlink LTE networks, the inputs of the FLS are selected from important metrics such as throughput and signal-to-interference-plus-noise ratio (SINR). Simulation results demonstrate the efficacy of the proposed scheme in improving system performance in terms of cell throughput, cell-edge throughput and delay when compared with a reuse factor of one.
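The FLS pipeline described above (fuzzification, "IF...THEN" rules, aggregation, defuzzification) can be sketched in a few lines. This is a generic, minimal Mamdani-style example; the thesis's actual rule base, membership functions and output scale are not reproduced here, so the rules and numbers below are illustrative assumptions only.

```python
# Minimal fuzzy-inference sketch: two normalised inputs (throughput, SINR)
# mapped to a scheduling priority. Rules and singleton outputs are assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def user_priority(throughput, sinr):
    """Map normalised throughput and SINR in [0, 1] to a priority in [0, 1]."""
    # Fuzzification: "low" and "high" sets for each input metric.
    thr_low, thr_high = tri(throughput, -1, 0, 1), tri(throughput, 0, 1, 2)
    sinr_low, sinr_high = tri(sinr, -1, 0, 1), tri(sinr, 0, 1, 2)

    # IF...THEN rules (min for AND), each firing toward an output singleton.
    rules = [
        (min(thr_low, sinr_low), 0.9),    # starved user, poor channel -> high priority
        (min(thr_low, sinr_high), 0.7),
        (min(thr_high, sinr_low), 0.4),
        (min(thr_high, sinr_high), 0.1),  # well-served user -> low priority
    ]

    # Aggregation + defuzzification: weighted average of output singletons.
    num = sum(w * y for w, y in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

A user with low throughput and low SINR is assigned a high priority, while a well-served user is de-prioritised, which is the qualitative behaviour an interference-coordination scheduler needs.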
Thereafter, heterogeneous networks (HetNets), which are used to increase the coverage and capacity of the system, are studied. The next part of this thesis focuses on the picocell, one of the most important low-power nodes in HetNets, which can efficiently improve overall system capacity and coverage. However, new challenges arise in intercell interference management in the macrocell-picocell scenario. Three enhanced intercell interference coordination (eICIC) schemes are proposed in this thesis to mitigate the interference problem. In the first scheme, a dynamic cell range expansion (CRE) approach is combined with dynamic almost blank subframes (ABS) using a fuzzy logic system. In the second scheme, a fuzzy Q-learning (FQL) approach is used to find the optimum ABS and CRE offset values for both full-buffer traffic and video streaming traffic; in FQL, the FLS is combined with Q-learning to select the best consequent part of each FLS rule. In the third proposed eICIC scheme, the best locations of the ABSs in each frame are determined using a genetic algorithm such that the requirements of video streaming traffic can be met. Simulation results show that the system performance can be improved through the proposed schemes.
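The Q-learning ingredient of the second scheme can be illustrated with a toy tabular learner that picks an (ABS ratio, CRE offset) pair against a noisy reward. This is not the thesis's FQL algorithm (which learns the consequent of each fuzzy rule); the action set and reward model below are stand-in assumptions.

```python
import random

# Toy bandit-style Q-learning sketch for selecting an (ABS ratio, CRE offset)
# pair. The reward model is hypothetical: it assumes that muting 25% of
# subframes with a 6 dB CRE offset gives the best utility.

ACTIONS = [(abs_ratio, cre_db) for abs_ratio in (0.125, 0.25, 0.5)
           for cre_db in (0, 3, 6, 9)]

def reward(action):
    """Noisy stand-in utility, peaked at (0.25, 6) by construction."""
    abs_ratio, cre_db = action
    base = 1.0 - abs(abs_ratio - 0.25) * 2 - abs(cre_db - 6) / 12
    return base + random.uniform(-0.05, 0.05)

def learn(episodes=5000, alpha=0.1, eps=0.1, seed=0):
    """Epsilon-greedy, stateless Q update; returns the learned best action."""
    random.seed(seed)
    q = {a: 0.0 for a in ACTIONS}
    for _ in range(episodes):
        a = random.choice(ACTIONS) if random.random() < eps else max(q, key=q.get)
        q[a] += alpha * (reward(a) - q[a])
    return max(q, key=q.get)
```

With enough exploration, the learner settles on the action the toy reward favours; in the thesis the same mechanism is driven by measured network utility rather than a synthetic function.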
Finally, the optimum CRE offset value and the required number of ABSs are mathematically formulated based on the outage probability, ergodic rate and minimum required throughput of users, using stochastic geometry tools. The result is an analytical formula that provides a good initial estimate and a simple way to analyse the impact of system parameters on the CRE offset value and the number of ABSs.
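The closed-form stochastic-geometry result is not reproduced in the abstract. As a back-of-envelope illustration of the kind of estimate involved, one can bound the number of ABSs per 10-subframe LTE frame needed for range-expanded users to reach a minimum throughput, assuming they are scheduled only in protected subframes; all quantities here are illustrative assumptions, not the thesis's derivation.

```python
import math

# Back-of-envelope sketch: number of almost blank subframes (ABS) per frame
# needed so that picocell range-expanded (CRE) users meet a minimum
# throughput, assuming they are served only during protected subframes.

def required_abs(num_cre_users, r_min_mbps, rate_per_abs_mbps, frame_len=10):
    """rate_per_abs_mbps: aggregate picocell rate available to CRE users in
    one protected subframe, averaged over the frame."""
    demand = num_cre_users * r_min_mbps          # aggregate demand (Mbps)
    per_abs = rate_per_abs_mbps / frame_len      # frame-averaged supply per ABS
    n = math.ceil(demand / per_abs)
    return min(n, frame_len - 1)                 # keep at least 1 normal subframe
```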
5G uplink interference simulations, analysis and solutions: The case of pico cells dense deployment
The launch of the new mobile network technology has paved the way for advanced and more productive industrial applications based on the high-speed, low-latency services offered by 5G. One of the key success factors of the 5G network is the available diversity of cell deployment modes and the flexibility of radio resource allocation based on users' needs. Picocells will become an important part of the future of 5G, as they increase capacity and improve network coverage at a low deployment cost. In addition, the short-range wireless transmission of this type of cell uses little energy and will enable dense Internet of Things applications. In this contribution, we present the advantages and characteristics of picocells in 5G networks. Then, we present a simulation study of the impact of interference on uplink transmission in the case of dense picocell deployment. Finally, we propose a solution for interference avoidance between picocells that also allows flexible management of the bands allocated to uplink users according to user density and bandwidth demand.
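The flexible band management idea can be sketched as a load-proportional partition: neighbouring picocells receive disjoint uplink resource-block ranges sized by their user density, so they do not interfere with each other. This is a generic sketch of that idea, not the paper's actual algorithm; the function and numbers are assumptions.

```python
# Hypothetical sketch of density-based uplink band splitting: divide the
# available resource blocks (RBs) among neighbouring picocells in proportion
# to their user counts, so adjacent cells transmit on disjoint bands.

def split_band(total_rbs, users_per_cell):
    """Return disjoint (start, end) RB ranges, one per cell, sized by load."""
    total_users = sum(users_per_cell)
    shares = [round(total_rbs * u / total_users) for u in users_per_cell]
    shares[-1] = total_rbs - sum(shares[:-1])   # absorb rounding in last cell
    ranges, start = [], 0
    for n in shares:
        ranges.append((start, start + n))
        start += n
    return ranges
```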
A Framework for Uplink Intercell Interference Modeling with Channel-Based Scheduling
This paper presents a novel framework for modeling the uplink intercell
interference (ICI) in a multiuser cellular network. The proposed framework
assists in quantifying the impact of various fading channel models and
state-of-the-art scheduling schemes on the uplink ICI. Firstly, we derive a
semianalytical expression for the distribution of the location of the scheduled
user in a given cell considering a wide range of scheduling schemes. Based on
this, we derive the distribution and moment generating function (MGF) of the
uplink ICI considering a single interfering cell. Consequently, we determine
the MGF of the cumulative ICI observed from all interfering cells and derive
explicit MGF expressions for three typical fading models. Finally, we utilize
the obtained expressions to evaluate important network performance metrics such
as the outage probability, ergodic capacity, and average fairness numerically.
Monte-Carlo simulation results are provided to demonstrate the efficacy of the
derived analytical expressions.
Comment: IEEE Transactions on Wireless Communications, 2013.
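The key step of combining per-cell results into the cumulative ICI can be shown concretely: for independent interfering cells, the MGF of the sum is the product of per-cell MGFs, and for Rayleigh fading the per-cell interference power is exponential, giving a closed form. The cell powers below are illustrative numbers, not values from the paper.

```python
import math
import random

# MGF product step: with independent interferers, the MGF E[exp(-s*I)] of the
# cumulative ICI I is the product of per-cell MGFs. For Rayleigh fading each
# per-cell interference power is exponential, so the per-cell MGF is closed-form.

def mgf_exp(s, mean_power):
    """MGF E[exp(-s * I_k)] of exponential interference with the given mean."""
    return 1.0 / (1.0 + s * mean_power)

def mgf_cumulative(s, mean_powers):
    """MGF of the sum of independent per-cell interference terms."""
    out = 1.0
    for p in mean_powers:
        out *= mgf_exp(s, p)
    return out

def mgf_monte_carlo(s, mean_powers, n=200_000, seed=1):
    """Monte-Carlo estimate of the same quantity, for validation."""
    random.seed(seed)
    acc = 0.0
    for _ in range(n):
        i_total = sum(random.expovariate(1.0 / p) for p in mean_powers)
        acc += math.exp(-s * i_total)
    return acc / n
```

The Monte-Carlo estimate matching the analytic product mirrors the paper's use of simulations to validate the derived expressions.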
TD-SCDMA Relay Networks
PhDWhen this research was started, TD-SCDMA (Time Division Synchronous Code
Division Multiple Access) was still in the research/ development phase, but
now, at the time of writing this thesis, it is in commercial use in 10 large cities in
China including Beijing and Shang Hai. In all of these cities HSDPA is enabled.
The roll-out of the commercial deployment is progressing fast with installations
in another 28 cities being underway now.
However, during the pre-commercial TD-SCDMA trial in China, which started
in 2006, some interference problems were noticed, especially in the
network planning and initialization phases. Interference is always an issue in
any network, and the goal of the work reported in this thesis is to improve
network coverage and capacity in the presence of interference.
Based on an analysis of TD-SCDMA issues and how network interference arises,
this thesis proposes two enhancements to the network in addition to the
standard N-frequency technique. These are (i) the introduction of the concentric
circle cell concept and (ii) the addition of a relay network that makes use of
other users at the cell boundary. This overall approach not only improves
resilience to interference but also increases network coverage without adding
more Node Bs.
Based on the cell planning parameters from the research, TD-SCDMA HSDPA
services in dense urban areas and non-HSDPA services in rural areas were
simulated to investigate the impact on network performance of introducing the
relay network into a TD-SCDMA network.
The results for HSDPA applications show significant improvement in the TD-SCDMA
relay network, in terms of both network capacity and network interference,
compared to standard TD-SCDMA networks. The results for non-HSDPA
services show that although the network capacity is unchanged after
adding the relay network (due to the code limitation in TD-SCDMA), the
TD-SCDMA relay network has better interference performance and greater
coverage.
Review on Radio Resource Allocation Optimization in LTE/LTE-Advanced using Game Theory
Recently, there has been a growing trend toward applying game theory (GT) to various engineering fields in order to solve optimization problems with different competing entities/contributors/players. Research in the fourth generation (4G) wireless network field has also exploited this theory to overcome long term evolution (LTE) challenges such as resource allocation, which is one of the most important research topics. In fact, an efficient design of resource allocation schemes is the key to higher performance. However, the standard does not specify the optimization approach for radio resource management, so it was left open for study. This paper presents a survey of existing game-theory-based solutions for the 4G-LTE radio resource allocation problem and its optimization.
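A minimal example makes the game-theoretic framing concrete: two cells each choose one of two subbands, sharing a subband halves each cell's utility, and best-response iteration reaches the Nash equilibrium in which the cells use disjoint subbands. This is a generic textbook-style illustration, not a scheme from any of the surveyed papers.

```python
# Toy two-player resource allocation game: each cell picks subband 0 or 1.
# Sharing a subband halves the utility (mutual interference); best-response
# dynamics converge to the interference-free Nash equilibrium.

def utility(my_band, other_band):
    return 0.5 if my_band == other_band else 1.0

def best_response(other_band):
    """Choose the band maximizing utility given the other cell's choice."""
    return max((0, 1), key=lambda b: utility(b, other_band))

def play(start=(0, 0), rounds=5):
    """Alternating best-response iteration from a given starting profile."""
    a, b = start
    for _ in range(rounds):
        a = best_response(b)
        b = best_response(a)
    return a, b
```

Starting from both cells on band 0, the dynamics settle on (1, 0): the cells split the spectrum, and neither can improve by deviating, which is exactly the equilibrium notion the surveyed schemes exploit.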
Energy-Efficient Scheduling and Power Allocation in Downlink OFDMA Networks with Base Station Coordination
This paper addresses the problem of energy-efficient resource allocation in
the downlink of a cellular OFDMA system. Three definitions of the energy
efficiency are considered for system design, accounting for both the radiated
and the circuit power. User scheduling and power allocation are optimized
across a cluster of coordinated base stations with a constraint on the maximum
transmit power (either per subcarrier or per base station). The asymptotic
noise-limited regime is discussed as a special case. The performance of both
an isolated and a non-isolated cluster of coordinated base stations is examined
in the numerical experiments. Results show that the maximization of the energy
efficiency is approximately equivalent to the maximization of the spectral
efficiency for small values of the maximum transmit power, while there is a
wide range of values of the maximum transmit power for which a moderate
reduction of the data rate provides a large saving in terms of dissipated
energy. Also, the performance gap among the considered resource allocation
strategies reduces as the out-of-cluster interference increases.
Comment: to appear in IEEE Transactions on Wireless Communications.
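The rate-versus-energy trade-off described above can be reproduced with one of the standard bits-per-joule definitions of energy efficiency, counting both radiated power (scaled by amplifier efficiency) and a fixed circuit power. All numbers below are illustrative assumptions, not the paper's system parameters.

```python
import math

# Energy efficiency as bits per joule on a single link: Shannon rate divided
# by total consumed power (radiated power over amplifier efficiency, plus a
# fixed circuit power). Channel gain, noise and powers are illustrative.

def rate(p, gain=1e-7, noise=1e-13, bw=1e6):
    """Shannon rate in bit/s for transmit power p (watts)."""
    return bw * math.log2(1.0 + p * gain / noise)

def energy_efficiency(p, circuit_power=0.1, amp_eff=0.35):
    """Bits per joule: rate over total consumed power."""
    return rate(p) / (p / amp_eff + circuit_power)

def ee_optimal_power(p_max, steps=10_000):
    """Grid search for the EE-maximizing power in (0, p_max]."""
    grid = [p_max * (i + 1) / steps for i in range(steps)]
    return max(grid, key=energy_efficiency)
```

Consistent with the paper's finding, the EE-optimal power sits far below the maximum: backing off from full power sacrifices some rate but saves much more energy.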
Resource Allocation for Network-Integrated Device-to-Device Communications Using Smart Relays
With an increasing number of autonomous heterogeneous devices in future mobile
networks, an efficient resource allocation scheme is required to maximize
network throughput and achieve higher spectral efficiency. In this paper, the
performance of network-integrated device-to-device (D2D) communication is
investigated, where D2D traffic is carried through relay nodes. An optimization
problem is formulated for allocating radio resources to maximize the end-to-end
rate while satisfying the QoS requirements of cellular and D2D user equipment
under a total power constraint. Numerical results show that there is a distance
threshold beyond which relay-assisted D2D communication significantly improves
network performance compared to direct communication between D2D peers.
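The distance-threshold effect can be reproduced with a toy link-budget model: a half-duplex relay placed midway splits the path into two short hops but halves the usable time, so it only pays off once the direct link is long enough. The path-loss exponent and reference SNR are illustrative assumptions, not the paper's model.

```python
import math

# Toy comparison of direct D2D vs. two-hop relayed D2D. Rates are in
# bit/s/Hz; the relay sits midway and operates half-duplex (half the time
# on each hop). SNR_1M is an assumed reference SNR at 1 m distance.

ALPHA = 3.5          # path-loss exponent (urban-like assumption)
SNR_1M = 1e8         # reference SNR at 1 m

def link_rate(d):
    return math.log2(1.0 + SNR_1M * d ** -ALPHA)

def direct_rate(d):
    return link_rate(d)

def relay_rate(d):
    # Two equal hops of length d/2, half-duplex penalty of 1/2.
    return 0.5 * link_rate(d / 2.0)

def distance_threshold(d_max=1000.0, step=1.0):
    """Smallest distance at which the relayed path beats the direct link."""
    d = step
    while d < d_max:
        if relay_rate(d) > direct_rate(d):
            return d
        d += step
    return None
```

At short range the half-duplex penalty dominates and the direct link wins; beyond the threshold the path-loss saving of two short hops dominates, matching the qualitative result reported above.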
Benchmarking Practical RRM Algorithms for D2D Communications in LTE Advanced
Device-to-device (D2D) communication integrated into cellular networks is a
means to take advantage of the proximity of devices and allow for reusing
cellular resources and thereby to increase the user bitrates and the system
capacity. However, when D2D (in the 3rd Generation Partnership Project also
called Long Term Evolution (LTE) Direct) communication in cellular spectrum is
supported, there is a need to revisit and modify the existing radio resource
management (RRM) and power control (PC) techniques to realize the potential of
the proximity and reuse gains and to limit the interference at the cellular
layer. In this paper, we examine the performance of the flexible LTE PC tool
box and benchmark it against a utility optimal iterative scheme. We find that
the open loop PC scheme of LTE performs well for cellular users both in terms
of the used transmit power levels and the achieved
signal-to-interference-and-noise-ratio (SINR) distribution. However, the
performance of the D2D users as well as the overall system throughput can be
boosted by the utility optimal scheme, because the utility maximizing scheme
takes better advantage of both the proximity and the reuse gains. Therefore, in
this paper we propose a hybrid PC scheme, in which cellular users employ the
open loop path compensation method of LTE, while D2D users use the utility
optimizing distributed PC scheme. In order to protect the cellular layer, the
hybrid scheme allows for limiting the interference caused by the D2D layer at
the cost of having a small impact on the performance of the D2D layer. To
ensure feasibility, we limit the number of iterations to a practically feasible
level. We make the point that the hybrid scheme is not only near optimal, but
it also allows for a distributed implementation for the D2D users, while
preserving the LTE PC scheme for the cellular users.
Comment: 30 pages, submitted for review April 2013. See also: G. Fodor, M. Johansson, D. P. Demia, B. Marco, and A. Abrardo, "A joint power control and resource allocation algorithm for D2D communications," KTH, Automatic Control, Tech. Rep., 2012, http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-10205
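The open loop scheme that the hybrid proposal retains for cellular users can be written down compactly. The sketch below is a simplified form of LTE fractional uplink power control (closed loop corrections and format-dependent terms omitted): transmit power in dBm is P = min(P_max, P0 + 10·log10(M) + α·PL), where M is the number of allocated resource blocks, PL the path loss in dB, and α ∈ [0, 1] the path-loss compensation factor. The default parameter values are illustrative.

```python
import math

# Simplified LTE open loop fractional power control for the uplink.
# alpha < 1 only partially compensates path loss, which limits the
# interference that cell-edge users cause to neighbouring cells.

def ol_tx_power_dbm(path_loss_db, num_rbs=1, p0_dbm=-80.0,
                    alpha=0.8, p_max_dbm=23.0):
    p = p0_dbm + 10.0 * math.log10(num_rbs) + alpha * path_loss_db
    return min(p_max_dbm, p)
```

With α = 0.8, a user at 140 dB path loss is already capped at the 23 dBm maximum, while full compensation (α = 1) would demand even more power; this is the power-limiting behaviour the benchmarking above examines.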
Inter cell interference modeling and analysis in long term evolution (LTE)
Long Term Evolution (LTE) is a promising standard for data rate and system capacity (coverage) in the history of wireless communication. Intercell interference (ICI) is a dominant impairment in LTE systems. Interference modeling and analysis have typically focused on the first tier only, but considering the ICI arising from the second tier is crucial for communication system standards and implementations. In this research work, interference modeling is proposed and analyzed for the first two tiers under various channel environments. The work is analytically quantified with respect to standard parameters and system models using Matlab and other supporting tools. Link-level simulation results demonstrate that ICI beyond the first tier becomes significant in urban environments and that smaller cell sizes increase the ICI power in LTE.
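The relative weight of the second tier is easy to estimate with textbook hexagonal geometry: six first-tier co-channel cells at inter-site distance D, and twelve second-tier cells (six at √3·D and six at 2·D), with simple distance-based path loss. This geometry and the exponent values are standard assumptions for illustration, not the paper's system model.

```python
import math

# Rough two-tier downlink ICI estimate on a hexagonal grid: 6 first-tier
# interferers at distance d, 12 second-tier interferers (6 at sqrt(3)*d,
# 6 at 2*d), with power-law path loss of exponent alpha.

def tier_powers(d, alpha, p_tx=1.0):
    tier1 = 6 * p_tx * d ** -alpha
    tier2 = (6 * p_tx * (math.sqrt(3) * d) ** -alpha
             + 6 * p_tx * (2 * d) ** -alpha)
    return tier1, tier2

def second_tier_fraction(d, alpha):
    """Fraction of total ICI contributed by the second tier."""
    t1, t2 = tier_powers(d, alpha)
    return t2 / (t1 + t2)
```

In this simple model the fraction is independent of the cell size and grows as the path-loss exponent falls, confirming that there are regimes in which ignoring the second tier noticeably underestimates the ICI.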