Context-aware Cluster Based Device-to-Device Communication to Serve Machine Type Communications
Billions of Machine Type Communication (MTC) devices are foreseen to be
deployed in the next ten years, potentially opening a new market for
next-generation wireless networks. However, MTC applications have different
characteristics and requirements compared with the services provided by legacy
cellular networks. For instance, an MTC device sporadically needs to
transmit a small data packet containing information generated by sensors. At
the same time, due to the massive deployment of MTC devices, it is inefficient
to charge their batteries manually, and thus a long battery life is required
for MTC devices. In this sense, legacy networks designed to serve human-driven
traffic in real time cannot support MTC efficiently. In order to improve the
availability and battery life of MTC devices, context-aware device-to-device
(D2D) communication is exploited in this paper. By applying D2D communication,
some MTC users can serve as relays for other MTC users who experience bad
channel conditions. Moreover, signaling schemes are designed to enable the
collection of context information and to support the proposed D2D communication
scheme. Finally, a system-level simulator is implemented to evaluate the
performance of the proposed technologies, and a large performance gain is shown
by the numerical results.
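The relay idea above can be sketched as a simple context-aware assignment in which devices with a poor direct link attach to the nearest device with a good link. Everything below (the SNR threshold, positions and values) is invented for illustration and is not the paper's actual scheme.

```python
# Hypothetical sketch: each MTC device reports its channel quality as context
# information, and devices with a poor direct link are served via the nearest
# device with a good link acting as a D2D relay. Threshold, positions and SNR
# values are invented for illustration.
GOOD_SNR_DB = 10.0  # assumed threshold for a usable direct link

def assign_relays(devices):
    """devices: list of dicts with 'id', 'snr_db' and 1-D 'pos'."""
    good = [d for d in devices if d["snr_db"] >= GOOD_SNR_DB]
    assignments = {}
    for d in devices:
        if d["snr_db"] >= GOOD_SNR_DB or not good:
            assignments[d["id"]] = None         # transmit directly
        else:                                   # relay via nearest good device
            relay = min(good, key=lambda g: abs(g["pos"] - d["pos"]))
            assignments[d["id"]] = relay["id"]
    return assignments

devices = [
    {"id": "a", "snr_db": 15.0, "pos": 0.0},
    {"id": "b", "snr_db": 3.0, "pos": 1.0},
    {"id": "c", "snr_db": 12.0, "pos": 5.0},
]
print(assign_relays(devices))  # {'a': None, 'b': 'a', 'c': None}
```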
Quantifying Potential Energy Efficiency Gain in Green Cellular Wireless Networks
Conventional cellular wireless networks were designed with the purpose of
providing high throughput for the user and high capacity for the service
provider, without any provisions for energy efficiency. As a result, these
networks have an enormous carbon footprint. In this paper, we describe the
sources of the inefficiencies in such networks. First, we present results of
studies on how large a carbon footprint such networks generate. We also discuss
the expected growth in mobile traffic, which will increase this carbon
footprint even further. We then discuss specific sources of inefficiency and
potential sources of improvement at the physical layer as well as at higher
layers of the communication protocol hierarchy. In particular, considering
that most of the energy inefficiency in cellular wireless networks is at the
base stations, we discuss multi-tier networks and point to the potential of
exploiting mobility patterns in order to use base station energy judiciously.
We then investigate potential methods to reduce this inefficiency and quantify
their individual contributions. By considering the combination of all potential
gains, we conclude that an improvement in energy consumption in cellular
wireless networks by two orders of magnitude, or even more, is possible.
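As a back-of-the-envelope illustration of how independent savings can compound to two orders of magnitude, consider multiplying per-mechanism gain factors. The factors below are hypothetical placeholders, not figures from the paper.

```python
# Illustrative arithmetic only: if independent energy-saving mechanisms
# multiply, hypothetical per-mechanism gain factors (placeholders, not
# figures from the paper) compound to two orders of magnitude.
gains = {
    "base-station sleep modes": 4.0,    # assumed factor
    "multi-tier / small cells": 5.0,    # assumed factor
    "hardware and PA efficiency": 5.0,  # assumed factor
}
combined = 1.0
for factor in gains.values():
    combined *= factor
print(combined)  # 100.0, i.e. a two-orders-of-magnitude reduction
```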
Dynamic Time-domain Duplexing for Self-backhauled Millimeter Wave Cellular Networks
Millimeter wave (mmW) bands between 30 and 300 GHz have attracted
considerable attention for next-generation cellular networks due to the vast
quantities of available spectrum and the possibility of very high-dimensional
antenna arrays. However, a key issue in these systems is range: mmW signals
are extremely vulnerable to shadowing and suffer from poor high-frequency
propagation. Multi-hop relaying is therefore a natural technology for such
systems to improve cell range and cell-edge rates without the addition of
wired access points. This paper studies the problem of scheduling for a simple
infrastructure cellular relay system where communication between wired base
stations and user equipment follows a hierarchical tree structure through
fixed relay nodes. Such a system builds naturally on existing cellular mmW
backhaul by adding mmW on the access links. A key feature of the proposed
system is that TDD duplexing selections can be made on a link-by-link basis
due to directional isolation from other links. We devise an efficient, greedy
algorithm for centralized scheduling that maximizes network utility by jointly
optimizing the duplexing schedule and resource allocation for dense,
relay-enhanced OFDMA/TDD mmW networks. The proposed algorithm can dynamically
adapt to loading, channel conditions and traffic demands. Significant
throughput gains and improved resource utilization offered by our algorithm
over static, globally-synchronized TDD patterns are demonstrated through
simulations based on empirically-derived channel models at 28 GHz.
Comment: IEEE Workshop on Next Generation Backhaul/Fronthaul Networks -
BackNets 201
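The greedy, utility-maximizing flavour of such a scheduler can be sketched roughly as follows. This is a simplified illustration (a proportional-fair utility, with conflicts only between links sharing a node), not the paper's actual algorithm.

```python
import math

# A simplified greedy scheduler (not the paper's exact algorithm): in each
# TDD slot, links are added in order of marginal utility gain, skipping any
# link that shares a node with an already-scheduled link (directional
# isolation is assumed to remove all other conflicts). Network utility is
# proportional-fair: the sum of log(accumulated rate) over links.
def greedy_schedule(links, n_slots):
    """links: dict link_id -> (tx_node, rx_node, rate). Returns per-slot lists."""
    served = {lid: 1e-9 for lid in links}        # epsilon avoids log(0)
    schedule = []
    for _ in range(n_slots):
        busy, chosen = set(), []

        def gain(lid):           # marginal utility of one more slot for lid
            rate = links[lid][2]
            return math.log(served[lid] + rate) - math.log(served[lid])

        for lid in sorted(links, key=gain, reverse=True):
            tx, rx, _ = links[lid]
            if tx not in busy and rx not in busy:
                chosen.append(lid)
                busy.update((tx, rx))
                served[lid] += links[lid][2]
        schedule.append(chosen)
    return schedule

# two-hop tree (BS -> RN -> UEs) plus an independent direct link
links = {
    "bs-rn": ("BS", "RN", 2.0),
    "rn-ue1": ("RN", "UE1", 1.0),
    "rn-ue2": ("RN", "UE2", 1.0),
    "bs2-ue3": ("BS2", "UE3", 1.5),
}
schedule = greedy_schedule(links, 3)
print(schedule)
```

Note how the backhaul link is served first (it starves everything downstream otherwise), after which the proportional-fair gain shifts priority to the access links, while the isolated direct link is scheduled in every slot.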
Network Formation Games Among Relay Stations in Next Generation Wireless Networks
The introduction of relay station (RS) nodes is a key feature in next
generation wireless networks such as 3GPP's Long Term Evolution-Advanced
(LTE-Advanced), or the forthcoming IEEE 802.16j WiMAX standard. This paper
presents, using game theory, a novel approach for the formation of the tree
architecture that connects the RSs and their serving base station (BS) in the
\emph{uplink} of next-generation wireless multi-hop systems. Unlike the
existing literature, which has mainly focused on performance analysis, we
propose a distributed algorithm for studying the \emph{structure} and
\emph{dynamics} of the network. We formulate a network formation game among
the RSs whereby each RS aims to maximize a cross-layer utility function that
takes into account the benefit from cooperative transmission, in terms of
reduced bit error rate, and the costs in terms of the delay due to multi-hop
transmission. For forming the tree structure, a distributed myopic algorithm
is devised. Using the proposed algorithm, each RS can individually select the
path that connects it to the BS through other RSs while optimizing its
utility. We show the convergence of the algorithm to a Nash tree network, and
we study how the RSs can adapt the network's topology to environmental changes
such as mobility or the deployment of new mobile stations. Simulation results
show that the proposed algorithm yields significant gains in terms of average
utility per mobile station, which is at least 17.1% better relative to the
case with no RSs and reaches up to 40.3% improvement compared to a
nearest-neighbor algorithm (for a network with 10 RSs). The results also show
that the average number of hops does not exceed 3, even for a network with up
to 25 RSs.
Comment: IEEE Transactions on Communications, vol. 59, no. 9, pp. 2528-2542,
September 201
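A myopic best-response dynamic of this kind can be sketched as follows. The utility model (a link benefit minus a per-hop delay cost) and all numbers are illustrative stand-ins, not the paper's cross-layer utility.

```python
# A minimal sketch of a myopic network-formation dynamic (illustrative; the
# paper's cross-layer utility is not reproduced): each RS repeatedly
# re-selects the parent (BS or another RS) that maximises its own utility,
# modelled here as a link benefit minus a per-hop delay cost. The iteration
# stops at a fixed point, i.e. a Nash tree where no RS gains by deviating alone.
HOP_COST = 1.0  # assumed delay cost per hop

def hops_to_bs(parent, node):
    hops = 0
    while node != "BS":
        node = parent[node]
        hops += 1
    return hops

def form_tree(rs_nodes, link_benefit):
    """link_benefit[(a, b)]: benefit of RS a attaching to parent b."""
    parent = {rs: "BS" for rs in rs_nodes}      # start as a star on the BS
    changed = True
    while changed:
        changed = False
        for rs in rs_nodes:
            def on_path(p):                     # would attaching to p form a cycle?
                while p != "BS":
                    if p == rs:
                        return True
                    p = parent[p]
                return False

            def utility(p):                     # one hop on top of p's own path
                depth = 0 if p == "BS" else hops_to_bs(parent, p)
                return link_benefit[(rs, p)] - HOP_COST * (depth + 1)

            candidates = ["BS"] + [r for r in rs_nodes if r != rs]
            best = max((p for p in candidates if not on_path(p)), key=utility)
            if best != parent[rs]:
                parent[rs] = best
                changed = True
    return parent

benefit = {
    ("R1", "BS"): 5.0, ("R1", "R2"): 1.0,
    ("R2", "BS"): 1.0, ("R2", "R1"): 4.0,
}
tree = form_tree(["R1", "R2"], benefit)
print(tree)  # {'R1': 'BS', 'R2': 'R1'}: R2 relays through R1
```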
A survey of self organisation in future cellular networks
This article surveys the literature of the last decade on the emerging field of self organisation as applied to wireless cellular communication networks. Self organisation has been extensively studied and applied in ad hoc networks, wireless sensor networks and autonomic computer networks; however, in the context of wireless cellular networks, this is the first attempt to put the various efforts in perspective in the form of a tutorial/survey. We provide a comprehensive survey of the existing literature, projects and standards in self organising cellular networks. Additionally, we aim to present a clear understanding of this active research area, identifying a clear taxonomy and guidelines for the design of self organising mechanisms. We compare the strengths and weaknesses of existing solutions and highlight the key research areas for further development. This paper serves as a guide and a starting point for anyone willing to delve into research on self organisation in wireless cellular communication networks.
Energy Efficiency and Interference Management in Long Term Evolution-Advanced Networks
Doctoral Degree. University of KwaZulu-Natal, Durban.
Cellular networks are continuously undergoing rapid and extraordinary evolution to overcome
technological challenges. The fourth generation (4G) or Long Term Evolution-Advanced
(LTE-Advanced) networks offer improvements in performance through increase in network density,
while allowing self-organisation and self-healing. The LTE-Advanced architecture is heterogeneous,
consisting of different radio access technologies (RATs), such as macrocells, small cells and
cooperative relay nodes (RNs), which have various capabilities and coexist in the same geographical
coverage area. These network improvements come with different challenges that affect users' quality
of service (QoS) and network performance. These challenges include interference management, high
energy consumption and poor coverage of marginal users. Hence, developing mitigation schemes for
these identified challenges is the focus of this thesis.
The exponential growth of mobile broadband data usage and poor network performance at the cell
edges result in a large increase in energy consumption for both base stations (BSs) and users.
This is due to improper RN placement or deployment, which creates severe inter-cell and intra-cell
interference in the network. It is therefore necessary to investigate appropriate RN placement
techniques which offer efficient coverage extension while reducing energy consumption and mitigating
interference in LTE-Advanced femtocell networks. This work proposes an energy-efficient and optimal
RN placement (EEORNP) algorithm, based on a greedy algorithm, to ensure improved and effective
coverage extension. The performance of the proposed algorithm is investigated in terms of coverage
percentage and the number of RNs needed to cover marginalised users, and is found to outperform
other RN placement schemes.
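A greedy placement heuristic in this spirit can be sketched as follows. The EEORNP specifics are not reproduced here; the candidate sites, user positions and coverage radius are invented for illustration.

```python
# Sketch of a greedy coverage heuristic in the spirit of greedy RN placement
# (the EEORNP specifics are not reproduced): at each step, place a relay at
# the candidate site covering the most still-uncovered marginal users, until
# all users are covered or candidate sites run out.
def greedy_rn_placement(candidate_sites, users, radius):
    covered, placed = set(), []

    def in_range(site, user):
        return (site[0] - user[0]) ** 2 + (site[1] - user[1]) ** 2 <= radius ** 2

    while len(covered) < len(users) and len(placed) < len(candidate_sites):
        best = max(
            (s for s in candidate_sites if s not in placed),
            key=lambda s: sum(1 for i, u in enumerate(users)
                              if i not in covered and in_range(s, u)),
        )
        newly = {i for i, u in enumerate(users)
                 if i not in covered and in_range(best, u)}
        if not newly:
            break                # remaining users are unreachable from any site
        placed.append(best)
        covered |= newly
    return placed, covered

sites = [(0, 0), (4, 0), (8, 0)]
users = [(0, 1), (1, 0), (4, 1), (8, 0)]
placed, covered = greedy_rn_placement(sites, users, radius=1.5)
print(placed)  # [(0, 0), (4, 0), (8, 0)] covers all four users
```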
Transceiver design has gained importance as one of the effective tools of interference
management. Centralised transceiver design techniques have been used to improve network
performance in LTE-Advanced networks in terms of mean square error (MSE), bit error rate (BER)
and sum-rate. However, centralised transceiver design techniques are neither effective nor
computationally feasible for distributed cooperative heterogeneous networks, which are the systems
considered in this thesis. This work proposes decentralised transceiver designs based on
least-square (LS) and minimum MSE (MMSE) pilot-aided channel estimation for interference
management in uplink LTE-Advanced femtocell networks. The decentralised transceiver algorithms are
designed for the femtocells, the macrocell user equipments (MUEs), the RNs and the cell-edge
macrocell UEs (CUEs) in half-duplex cooperative relaying systems. The BER performance of the
proposed algorithms, including the effect of channel estimation, is investigated.
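For reference, LS and MMSE pilot-aided estimators can be sketched for a single-tap channel y = h·x + n. The thesis addresses the full MIMO relay setting; the channel, pilot length and noise values below are illustrative.

```python
import numpy as np

# Textbook sketch of pilot-aided LS and MMSE channel estimation for a
# single-tap channel y = h*x + n (the thesis treats the full MIMO relay case;
# the values here are illustrative).
rng = np.random.default_rng(0)
h = 0.8 + 0.6j                       # true channel tap
x = np.ones(64, dtype=complex)       # known pilot symbols
noise_var = 0.1
n = np.sqrt(noise_var / 2) * (rng.standard_normal(64)
                              + 1j * rng.standard_normal(64))
y = h * x + n

# LS: invert the pilots, using no statistical prior on the channel
h_ls = np.vdot(x, y) / np.vdot(x, x)

# MMSE: regularise by noise power over an (assumed) channel prior power
prior_power = 1.0                    # assumed E[|h|^2]
h_mmse = np.vdot(x, y) / (np.vdot(x, x) + noise_var / prior_power)

print(abs(h - h_ls), abs(h - h_mmse))  # both estimation errors are small
```

The MMSE estimator shrinks the LS estimate toward zero; at low SNR this trades a small bias for reduced estimation variance, which is why it is the usual choice when channel statistics are available.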
Finally, energy efficiency (EE) optimisation is investigated in half-duplex multi-user
multiple-input multiple-output (MU-MIMO) relay systems. The EE optimisation is divided into
sub-optimal EE problems due to the distributed architecture of the MU-MIMO relay systems. The
decentralised approach is employed to design the transceivers of the MUEs, CUEs, RNs and
femtocells for the different sub-optimal EE problems. The EE objective functions are formulated as
convex optimisation problems subject to QoS and transmit power constraints in the case of perfect
channel state information (CSI). The non-convexity of the formulated EE optimisation problems is
surmounted by introducing the EE-parameter subtractive function into each proposed algorithm.
These EE parameters are updated using Dinkelbach's algorithm. The EE optimisation of the proposed
algorithms is achieved after finding the optimal transceivers, where the unknown interference
terms in the transmit signals are designed under the zero-forcing (ZF) assumption and estimation
errors are taken into account to improve the EE performance. With the aid of simulation results,
the performance of the proposed decentralised schemes is evaluated in terms of average EE and
found to be better than that of existing algorithms.
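Dinkelbach's iteration, which underlies the EE-parameter update mentioned above, can be sketched generically: solve the subtractive problem max over p of R(p) - lam*P(p), update lam to the achieved ratio, and repeat until the subtractive optimum reaches zero. The rate and power models below are illustrative, not the thesis's.

```python
import math

# Generic sketch of Dinkelbach's iteration for maximising an energy-efficiency
# ratio EE(p) = R(p) / P(p): repeatedly solve the subtractive problem
# max_p R(p) - lam * P(p), then update lam to the achieved ratio. The
# rate/power models below are illustrative, not the thesis's.
def rate(p):
    return math.log2(1.0 + 10.0 * p)       # assumed SNR gain of 10

def power(p):
    return p + 0.5                          # transmit power + fixed circuit power

def dinkelbach(p_max, tol=1e-6):
    lam = 0.0
    for _ in range(100):
        # inner subtractive problem, solved here by a simple grid search
        grid = [i * p_max / 1000 for i in range(1, 1001)]
        p_star = max(grid, key=lambda p: rate(p) - lam * power(p))
        f = rate(p_star) - lam * power(p_star)
        lam = rate(p_star) / power(p_star)  # Dinkelbach update
        if abs(f) < tol:
            break
    return p_star, lam                      # optimal power and EE (bits/J/Hz)

p_opt, ee = dinkelbach(p_max=1.0)
print(p_opt, ee)
```

Each iteration drives the subtractive optimum f toward zero from above, and at the fixed point lam equals the maximum EE, which is what makes the fractional (non-convex) problem solvable through a sequence of convex subtractive ones.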