Decentralized random energy allocation for massive non-orthogonal code-division multiple access
© 2019 IEEE. This work studies the spectral efficiency achievable when a very large number of terminals are connected simultaneously to a central node (uplink) through independent and identically distributed flat-fading channels. Assuming that terminals have only statistical channel state information (CSI), the optimal random transmitted-energy allocation is formulated for a non-orthogonal direct-sequence code-division multiple access (DS-CDMA) system in which all users transmit with the same modulation and error-correcting code and the receiver implements successive interference cancellation (SIC). Focusing on low-power terminals, the optimization imposes constraints on both the average and peak per-user transmitted energy. Simulations reveal that a limited number of random energy levels, whose number is determined by the channel power-gain variance, is sufficient to approach the maximum spectral efficiency that would be obtained by directly optimizing the received energy profile.
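As a hedged illustration of the receiver side described above (not code from the paper: the two-user BPSK setup, the energy values, and the helper name `sic_decode` are assumptions made for this sketch), successive interference cancellation decodes the strongest-energy user first, subtracts its reconstructed contribution, and repeats:

```python
import numpy as np

# Illustrative SIC sketch (assumed setup, not the paper's system model):
# users share one resource at distinct received-energy levels and are
# decoded strongest-first on a noise-free BPSK channel.
rng = np.random.default_rng(0)

def sic_decode(y, energies, alphabet=(-1.0, 1.0)):
    """Decode users in descending received-energy order, subtracting
    each decoded signal from the residual before the next user."""
    order = np.argsort(energies)[::-1]   # strongest first
    residual = y.copy()
    decoded = {}
    for u in order:
        amp = np.sqrt(energies[u])
        # Hard BPSK decision, treating remaining users as noise.
        est = np.where(residual >= 0, alphabet[1], alphabet[0])
        decoded[u] = est
        residual = residual - amp * est
    return decoded

# Two users with well-separated energies over 8 symbol slots.
energies = np.array([4.0, 0.25])
bits = rng.choice([-1.0, 1.0], size=(2, 8))
y = np.sqrt(energies[0]) * bits[0] + np.sqrt(energies[1]) * bits[1]
decoded = sic_decode(y, energies)
```

Well-separated received-energy levels are what make the strongest-first ordering reliable, which is why the (random) energy profile is the quantity being optimized in the abstract above.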
Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G
The next wave of wireless technologies is proliferating in connecting things
among themselves as well as to humans. In the era of the Internet of things
(IoT), billions of sensors, machines, vehicles, drones, and robots will be
connected, making the world around us smarter. The IoT will encompass devices
that must wirelessly communicate a diverse set of data gathered from the
environment for myriad new applications. The ultimate goal is to extract
insights from this data and develop solutions that improve quality of life and
generate new revenue. Providing large-scale, long-lasting, reliable, and near
real-time connectivity is the major challenge in enabling a smart connected
world. This paper provides a comprehensive survey on existing and emerging
communication solutions for serving IoT applications in the context of
cellular, wide-area, as well as non-terrestrial networks. Specifically,
wireless technology enhancements for providing IoT access in fifth-generation
(5G) and beyond cellular networks, and communication networks over the
unlicensed spectrum are presented. Aligned with the main key performance
indicators of 5G and beyond 5G networks, we investigate solutions and standards
that enable energy efficiency, reliability, low latency, and scalability
(connection density) of current and future IoT networks. The solutions include
grant-free access and channel coding for short-packet communications,
non-orthogonal multiple access, and on-device intelligence. Further, a vision
of new paradigm shifts in communication networks in the 2030s is provided, and
the integration of the associated new technologies like artificial
intelligence, non-terrestrial networks, and new spectra is elaborated. Finally,
future research directions toward beyond-5G IoT networks are pointed out.
Hybrid generalized non-orthogonal multiple access for the 5G wireless networks.
Master of Science in Computer Engineering. University of KwaZulu-Natal. Durban, 2018.

The deployment of 5G networks will bring increased capacity, spectral efficiency, massive connectivity, and lower latency to wireless networks. These networks will nonetheless still face the challenges of resource and power optimization, improving spectrum efficiency, and energy optimization, among others.
Furthermore, standardized technologies to mitigate these challenges still need to be developed and are a challenge in themselves. In the current predecessor LTE-A networks, the orthogonal frequency-division multiple access (OFDMA) scheme is used as the baseline multiple access scheme. It allows users to
be served orthogonally in either time or frequency to alleviate narrowband interference and impulse
noise. Furthermore, the spectrum limitations of orthogonal multiple access (OMA) schemes have resulted in the development of non-orthogonal multiple access (NOMA) schemes to enable 5G networks to
achieve high spectral efficiency and high data rates. NOMA schemes non-orthogonally co-multiplex different users on the same resource elements (REs) (i.e., time-frequency resources, OFDMA subcarriers, or spreading codes) in the power domain (PD) or code domain (CD) at the transmitter, and separate them at the receiver by applying multi-user detection (MUD) algorithms. The currently developed NOMA schemes, referred to as generalized-NOMA (G-NOMA) technologies, include Interleaver Division Multiple Access (IDMA), Sparse Code Multiple Access (SCMA), Low-Density Spreading Multiple Access (LDSMA), the Multi-User Shared Access (MUSA) scheme, and Pattern Division Multiple Access (PDMA). These protocols are still under refinement; their performance and applicability have not been thoroughly investigated. The first part of this work
undertakes a thorough investigation and analysis of the performance of the existing G-NOMA
schemes and their applicability.
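To make the code-domain idea behind schemes such as SCMA and LDSMA concrete, the following minimal sketch (the 4-by-6 signature matrix is a commonly used textbook example, not taken from this thesis) shows how low-density spreading maps six users onto four subcarriers, i.e., a 150% overloading factor:

```python
import numpy as np

# Hedged sketch of low-density spreading (illustrative signature matrix,
# assumed for this example): rows are subcarriers, columns are users;
# a 1 means the user spreads part of its symbol onto that subcarrier.
F = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 0, 0, 1, 1, 0],
    [0, 1, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1],
])

users, resources = F.shape[1], F.shape[0]
overload = users / resources        # 6 users on 4 resources -> 1.5
per_user = F.sum(axis=0)            # each user occupies only 2 subcarriers
per_tone = F.sum(axis=1)            # each subcarrier carries only 3 users
```

The sparsity (each subcarrier sees only a few users) is what keeps message-passing multi-user detection tractable despite the overloading.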
Generally, G-NOMA schemes achieve overloading through non-orthogonal spectrum resource allocation, which enables massive connectivity of users and devices and offers improved system spectral efficiency. Like any other technology, G-NOMA schemes need to be improved to further harvest their benefits in 5G networks, leading to the requirement for hybrid G-NOMA (HG-NOMA) schemes. The second part of this work develops an HG-NOMA scheme to alleviate the 5G challenges of resource allocation, inter- and cross-tier interference management, and energy efficiency. This work develops and investigates the performance of an energy-efficient HG-NOMA
resource allocation scheme for a two-tier heterogeneous network that alleviates the cross-tier
interference and improves the system throughput via spectrum resource optimization. By considering
the combinatorial problem of resource pattern assignment and power allocation, the HG-NOMA scheme enables a new transmission policy that allows more than two macro-user equipment (MUEs) and femto-user equipment (FUEs) to be co-multiplexed on the same time-frequency RE, increasing the spectral efficiency. The performance of the developed model is shown to be superior to that of the PD-NOMA and OFDMA schemes.
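A minimal sketch of the power-domain comparison implied above (the channel gains, power split, and function names are illustrative assumptions, not the thesis's system model): two-user downlink rates with SIC at the strong user, versus an equal-bandwidth OFDMA split of the same total power:

```python
import numpy as np

# Hedged two-user downlink sketch (assumed gains and power split):
# PD-NOMA superposes both users on the full band; OFDMA splits it.
def noma_rates(g_strong, g_weak, p_strong, p_weak, noise=1.0):
    # Weak user decodes its own signal, treating the strong user's as noise.
    r_weak = np.log2(1 + p_weak * g_weak / (p_strong * g_weak + noise))
    # Strong user cancels the weak user's signal via SIC, then decodes.
    r_strong = np.log2(1 + p_strong * g_strong / noise)
    return r_strong, r_weak

def ofdma_rates(g_strong, g_weak, p_strong, p_weak, noise=1.0):
    # Each user gets half the bandwidth (hence half the noise power).
    r_strong = 0.5 * np.log2(1 + p_strong * g_strong / (0.5 * noise))
    r_weak = 0.5 * np.log2(1 + p_weak * g_weak / (0.5 * noise))
    return r_strong, r_weak

args = dict(g_strong=10.0, g_weak=1.0, p_strong=0.2, p_weak=0.8)
rs_n, rw_n = noma_rates(**args)
rs_o, rw_o = ofdma_rates(**args)
```

With disparate channel gains, the NOMA sum rate exceeds the OFDMA sum rate in this setup, which is the overloading gain the thesis builds on.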
A Tutorial on Interference Exploitation via Symbol-Level Precoding: Overview, State-of-the-Art and Future Directions
Interference is traditionally viewed as a performance-limiting factor in wireless communication systems, to be minimized or mitigated. Nevertheless, a recent line of work has shown that by manipulating the interfering signals such that they add up constructively at the receiver side, known interference can be made beneficial and can further improve system performance in a variety of wireless scenarios, achieved by symbol-level precoding (SLP). This paper aims to provide a tutorial on interference exploitation techniques from the perspective of precoding design in multi-antenna wireless communication systems, beginning with the classification of constructive interference (CI) and destructive interference (DI). The definition of CI is presented and the corresponding mathematical characterization is formulated for popular modulation types, based on which optimization-based precoding techniques are discussed. In addition, the extension of CI precoding to other application scenarios, as well as to hardware efficiency, is described. Proof-of-concept testbeds are demonstrated for the potential practical implementation of CI precoding, and finally a list of open problems and practical challenges is presented to inspire and motivate further research directions in this area.
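The constructive/destructive classification discussed in the tutorial can be sketched for PSK as a simple decision-region test (the helper `is_constructive`, its inputs, and the test points are illustrative assumptions, not the tutorial's formulation):

```python
import numpy as np

# Hedged sketch of the PSK constructive-interference (CI) condition:
# rotate the noiseless received point so the intended symbol lies on the
# positive real axis, then check it stays inside the decision sector.
def is_constructive(received, intended_symbol, half_angle):
    """True if `received` lies in the intended M-PSK decision region:
    |Im(z)| <= Re(z) * tan(half_angle) with Re(z) > 0 after rotation."""
    z = received * np.conj(intended_symbol) / abs(intended_symbol)
    return bool(abs(z.imag) <= z.real * np.tan(half_angle) and z.real > 0)

# QPSK: each decision region spans pi/2, so the half-angle is pi/4.
s = np.exp(1j * np.pi / 4)                    # intended QPSK symbol
ci = is_constructive(1.8 * s, s, np.pi / 4)   # pushed deeper along s: CI
di = is_constructive(-0.5 * s, s, np.pi / 4)  # opposite direction: DI
```

SLP designs the transmit vector per symbol so that multi-user interference lands in the constructive region for every receiver, turning interference power into useful signal power.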