Signal Processing and Learning for Next Generation Multiple Access in 6G
Wireless communication systems to date primarily rely on the orthogonality of
resources to facilitate the design and implementation, from user access to data
transmission. Emerging applications and scenarios in the sixth generation (6G)
wireless systems will require massive connectivity and transmission of a deluge
of data, which calls for more flexibility in the design concept that goes
beyond orthogonality. Furthermore, recent advances in signal processing and
learning have attracted considerable attention, as they provide promising
approaches to various complex and previously intractable problems of signal
processing in many fields. This article provides an overview of research
efforts to date in the field of signal processing and learning for
next-generation multiple access (NGMA), with an emphasis on massive random
access and non-orthogonal multiple access. The promising interplay with new
technologies and the challenges in learning-based NGMA are discussed.
Five Facets of 6G: Research Challenges and Opportunities
Whilst the fifth-generation (5G) systems are being rolled out across the
globe, researchers have turned their attention to the exploration of radical
next-generation solutions. At this early evolutionary stage we survey five main
research facets of this field, namely {\em Facet~1: next-generation
architectures, spectrum and services, Facet~2: next-generation networking,
Facet~3: Internet of Things (IoT), Facet~4: wireless positioning and sensing,
as well as Facet~5: applications of deep learning in 6G networks.} In this
paper, we have provided a critical appraisal of the literature of promising
techniques ranging from the associated architectures, networking, applications
as well as designs. We have portrayed a plethora of heterogeneous architectures
relying on cooperative hybrid networks supported by diverse access and
transmission mechanisms. The vulnerabilities of these techniques are also
addressed and carefully considered to highlight the most promising
future research directions. Additionally, we have listed a rich suite of
learning-driven optimization techniques. We conclude by observing the
evolutionary paradigm-shift that has taken place from pure single-component
bandwidth-efficiency, power-efficiency or delay-optimization towards
multi-component designs, as exemplified by the twin-component ultra-reliable
low-latency mode of the 5G system. We advocate a further evolutionary step
towards multi-component Pareto optimization, which requires the exploration of
the entire Pareto front of all optimal solutions, where none of the components
of the objective function may be improved without degrading at least one of the
other components.
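The Pareto-front notion invoked above can be made concrete with a minimal sketch, assuming a toy design space where each candidate is scored on three illustrative components (bandwidth efficiency, power efficiency, inverse delay), all to be maximized; the candidate values are placeholders, not results from the paper.

```python
# Minimal sketch of multi-component Pareto optimization over a toy
# design space; all objective components are to be maximized.

def dominates(a, b):
    """True if solution a is at least as good as b in every component
    and strictly better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated solutions: no component can be
    improved without degrading at least one other component."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical candidates: (bandwidth eff., power eff., 1/delay).
candidates = [(3.0, 0.5, 0.2), (2.0, 0.9, 0.3), (1.0, 0.4, 0.1), (3.0, 0.5, 0.3)]
front = pareto_front(candidates)
# (3.0, 0.5, 0.2) and (1.0, 0.4, 0.1) are dominated and drop out.
```

A single-component optimizer would pick one extreme point; the front instead exposes the full set of defensible trade-offs between the components.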
Infinite Factorial Finite State Machine for Blind Multiuser Channel Estimation
New communication standards need to deal with machine-to-machine
communications, in which users may start or stop transmitting at any time in an
asynchronous manner. Thus, the number of users is an unknown and time-varying
parameter that needs to be accurately estimated in order to properly recover
the symbols transmitted by all users in the system. In this paper, we address
the problem of joint channel parameter and data estimation in a multiuser
communication channel in which the number of transmitters is not known. For
that purpose, we develop the infinite factorial finite state machine model, a
Bayesian nonparametric model based on the Markov Indian buffet that allows for
an unbounded number of transmitters with arbitrary channel length. We propose
an inference algorithm that makes use of slice sampling and particle Gibbs with
ancestor sampling. Our approach is fully blind as it does not require a prior
channel estimation step, prior knowledge of the number of transmitters, or any
signaling information. Our experimental results, loosely based on the LTE
random access channel, show that the proposed approach can effectively recover
the data-generating process for a wide range of scenarios, with varying number
of transmitters, number of receivers, constellation order, channel length, and
signal-to-noise ratio.
Comment: 15 pages, 15 figures
6G White Paper on Machine Learning in Wireless Communication Networks
The focus of this white paper is on machine learning (ML) in wireless
communications. 6G wireless communication networks will be the backbone of the
digital transformation of societies by providing ubiquitous, reliable, and
near-instant wireless connectivity for humans and machines. Recent advances in
ML research has enabled a wide range of novel technologies such as
self-driving vehicles and voice assistants. Such innovation is possible as a
result of the availability of advanced ML models, large datasets, and high
computational power. On the other hand, the ever-increasing demand for
connectivity will require a lot of innovation in 6G wireless networks, and ML
tools will play a major role in solving problems in the wireless domain. In
this paper, we provide an overview of the vision of how ML will impact
wireless communication systems. We first give an overview of the ML methods
that have the highest potential to be used in wireless networks. Then, we
discuss the problems that can be solved by using ML in various layers of the
network such as the physical layer, medium access layer, and application layer.
Zero-touch optimization of wireless networks using ML is another interesting
aspect that is discussed in this paper. Finally, at the end of each section,
important research questions that the section aims to answer are presented.
Frequency synchronization in multiuser OFDM-IDMA systems.
Thesis (M.Sc.Eng.)-University of KwaZulu-Natal, Durban, 2013.
Various multiuser schemes have been proposed to efficiently utilize the available bandwidth while ensuring acceptable service delivery and flexibility. Multicarrier CDMA became an attractive solution to the major challenges confronting wireless communication systems. However, the scheme is plagued with multiple access interference (MAI), which causes conspicuous performance deterioration at the receiver. A low-complexity multiuser scheme called Interleave Division Multiple Access (IDMA) was recently proposed as a capable solution to this drawback of the multicarrier CDMA scheme. A combined OFDM-IDMA scheme was later introduced to enhance the performance of the earlier IDMA scheme. The multicarrier IDMA scheme therefore combats inter-symbol interference (ISI) and MAI effectively over multipath channels with low complexity, while ensuring better cellular performance, a high diversity order, and spectral efficiency.
Major studies on the OFDM-IDMA scheme focus only on the implementation of the scheme in a perfect scenario, where there are no synchronization errors in the system. Like other multicarrier schemes, however, OFDM-IDMA suffers from carrier frequency offset (CFO) errors, which are inherent in the OFDM technique. This research work therefore examines and analyzes the effect of synchronization errors on the performance of the new OFDM-based hybrid scheme called OFDM-IDMA. The OFDM-IDMA system developed is designed such that the cyclic prefix duration of the OFDM component is longer than the maximum channel delay spread of the multipath channel model used. This effectively eliminates ISI as well as timing offsets in the system. Since little work has hitherto been done to address the deteriorating effect of synchronization errors on the OFDM-IDMA system, this research work focuses on the more challenging issue of carrier frequency synchronization at the uplink.
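The inter-carrier interference that CFO causes in any OFDM-based scheme can be seen in a minimal numerical sketch; the 8-subcarrier symbol, the pure-Python DFT, and the offset of 0.25 subcarrier spacings are all illustrative choices, not parameters from the thesis.

```python
import cmath

def idft(X):
    """Inverse DFT: map subcarrier symbols to time-domain samples."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def dft(x):
    """Forward DFT: recover the per-subcarrier values at the receiver."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def apply_cfo(x, eps):
    """Rotate time-domain samples by a CFO of eps subcarrier spacings."""
    N = len(x)
    return [x[n] * cmath.exp(2j * cmath.pi * eps * n / N) for n in range(N)]

# One toy OFDM symbol: a single active subcarrier (k = 2) out of N = 8.
N = 8
X = [0] * N
X[2] = 1
x = idft(X)

# Without CFO the energy stays on subcarrier 2; with CFO it leaks
# into the other bins -- this leakage is the ICI.
Y_clean = dft(apply_cfo(x, 0.0))
Y_cfo = dft(apply_cfo(x, 0.25))
ici_clean = sum(abs(Y_clean[k]) ** 2 for k in range(N) if k != 2)
ici_cfo = sum(abs(Y_cfo[k]) ** 2 for k in range(N) if k != 2)
```

In a multiuser uplink each user arrives with its own residual CFO, so this leakage accumulates across users, which is what the synchronization algorithms below are designed to mitigate.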
A linear MMSE-based synchronization algorithm is proposed and implemented. The proposed algorithm is a non-data-aided method that focuses on mitigating the inter-carrier interference (ICI) induced by the residual CFOs of concurrent users in the multicarrier system. However, to obtain improved system performance, the Kernel Least Mean Square (KLMS) algorithm and the normalized KLMS are proposed, implemented, and effectively adapted to combat the degrading influence of carrier frequency offset errors on the OFDM-IDMA scheme. The KLMS synchronization algorithm, which executes the conventional Least Mean Square (LMS) algorithm in the kernel space, utilizes the modulated input signal in the implementation of the kernel function, thereby enhancing the efficacy of the algorithm and the overall output of the multicarrier system.
The algorithms are applied in a Rayleigh fading multipath channel with varying mobile speeds to verify their effectiveness and to demonstrate their influence on the performance of the system in a practical scenario. The implemented algorithms are also compared to ascertain which offers better and more efficient system performance. Computer simulations of the bit-error performance of the algorithms are presented to verify their respective influence on the overall output of the multicarrier system. Simulation results of the algorithms in both slow-fading and fast-fading multipath scenarios are documented as well.
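The KLMS idea described above — running the LMS update in a reproducing-kernel Hilbert space by growing a dictionary of past inputs — can be sketched minimally as follows. The Gaussian kernel, the step size, the kernel width, and the toy learning task are all illustrative assumptions, not the thesis's actual configuration.

```python
import math

class KLMS:
    """Kernel Least Mean Square sketch: each new sample becomes a kernel
    center whose coefficient is the LMS-style step times the error."""

    def __init__(self, step=0.2, width=0.5):
        self.step = step      # LMS step size (learning rate)
        self.width = width    # Gaussian kernel bandwidth
        self.centers = []     # stored input vectors
        self.alphas = []      # per-center coefficients

    def kernel(self, u, v):
        d2 = sum((a - b) ** 2 for a, b in zip(u, v))
        return math.exp(-d2 / (2 * self.width ** 2))

    def predict(self, u):
        return sum(a * self.kernel(c, u)
                   for a, c in zip(self.alphas, self.centers))

    def update(self, u, d):
        e = d - self.predict(u)            # instantaneous error
        self.centers.append(u)             # allocate a new kernel center
        self.alphas.append(self.step * e)  # LMS-style coefficient update
        return e

# Hypothetical usage: learn a smooth nonlinear mapping online.
model = KLMS(step=0.2, width=0.5)
errors = [abs(model.update((math.sin(0.3 * i), math.sin(0.3 * i + 0.3)),
                           math.sin(0.3 * i + 0.6)))
          for i in range(200)]
```

The nonlinearity lives entirely in the kernel, which is why the scheme can track effects (such as residual CFO distortion) that a purely linear LMS filter models poorly; the price is a dictionary that grows with every sample unless sparsification is added.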
Analysis and Design of Non-Orthogonal Multiple Access (NOMA) Techniques for Next Generation Wireless Communication Systems
The current surge in wireless connectivity, anticipated to amplify significantly in future wireless technologies, brings a new wave of users. Given the impracticality of endlessly expanding bandwidth, there is a pressing need for communication techniques that efficiently serve this burgeoning user base with limited resources. Multiple Access (MA) techniques, notably Orthogonal Multiple Access (OMA), have long addressed bandwidth constraints. However, with escalating user numbers, OMA's orthogonality becomes limiting for emerging wireless technologies. Non-Orthogonal Multiple Access (NOMA), employing superposition coding, serves more users within the same bandwidth as OMA by allocating different power levels to users, whose superimposed signals can then be separated at the receiver, typically via successive interference cancellation (SIC), thus offering superior spectral efficiency and massive connectivity.
This thesis examines the integration of NOMA techniques with cooperative relaying, EXtrinsic Information Transfer (EXIT) chart analysis, and deep learning for enhancing 6G and beyond communication systems. The adopted methodology aims to optimize the systems' performance, spanning from bit-error rate (BER) versus signal-to-noise ratio (SNR) to overall system efficiency and data rates. In the cooperative relaying context, NOMA notably improved diversity gains, demonstrating the superiority of combining NOMA with cooperative relaying over NOMA alone. With EXIT chart analysis, NOMA achieved low BER at mid-range SNR and achieved optimal user fairness in the power-allocation stage. Additionally, employing a trained neural network enhanced signal detection for NOMA in the deep learning scenario, yielding simpler signal detection that addresses NOMA's complex-receiver problem.
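The power-domain superposition-and-SIC mechanism underlying NOMA can be sketched for two BPSK users on a noiseless channel; the 0.8/0.2 power split and the BPSK alphabet are illustrative assumptions, not the thesis's system model.

```python
import math

def noma_tx(s_far, s_near, p_far=0.8, p_near=0.2):
    """Superpose two BPSK symbols (+1/-1); the far (weak-channel)
    user is given the larger power share."""
    return math.sqrt(p_far) * s_far + math.sqrt(p_near) * s_near

def detect_far(y):
    """Far user decodes directly, treating the near user's signal as noise."""
    return 1 if y >= 0 else -1

def detect_near_sic(y, p_far=0.8):
    """Near user first decodes and cancels the far user's signal (SIC),
    then detects its own symbol from the residual."""
    s_far_hat = detect_far(y)
    residual = y - math.sqrt(p_far) * s_far_hat
    return 1 if residual >= 0 else -1

# All four symbol combinations are recovered in the noiseless case.
for s_far in (-1, 1):
    for s_near in (-1, 1):
        y = noma_tx(s_far, s_near)
        assert detect_far(y) == s_far
        assert detect_near_sic(y) == s_near
```

The power gap between the two users is exactly what makes SIC work; shrinking it improves the near user's rate but erodes the far user's detection margin, which is why power allocation and user fairness appear together in the thesis.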
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks hold substantial potential for supporting
a broad range of complex, compelling applications in both military and civilian
fields, where the users are able to enjoy high-rate, low-latency, low-cost and
reliable information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making because of the
complex heterogeneous nature of the network structures and wireless services.
Machine learning (ML) algorithms have achieved great success in supporting big data
analytics, efficient parameter estimation and interactive decision making.
Hence, in this article, we review the thirty-year history of ML by elaborating
on supervised learning, unsupervised learning, reinforcement learning and deep
learning. Furthermore, we investigate their employment in the compelling
applications of wireless networks, including heterogeneous networks (HetNets),
cognitive radios (CR), Internet of things (IoT), machine to machine networks
(M2M), and so on. This article aims to assist readers in understanding the
motivation and methodology of the various ML algorithms, so as to invoke them
for hitherto unexplored services and scenarios of future wireless networks.
Comment: 46 pages, 22 figures
A Tutorial on Decoding Techniques of Sparse Code Multiple Access
Sparse Code Multiple Access (SCMA) is a disruptive code-domain non-orthogonal multiple access (NOMA) scheme for enabling future massive machine-type communication networks. As an evolved variant of code division multiple access (CDMA), multiple users in SCMA are separated by assigning distinctive sparse codebooks (CBs). Efficient multiuser detection is carried out at the receiver by the message passing algorithm (MPA), which exploits the sparsity of the CBs to achieve error performance approaching that of the maximum-likelihood receiver. Despite numerous research efforts in recent years, a comprehensive one-stop tutorial on SCMA covering the background, basic principles, and new advances is still missing, to the best of our knowledge. To fill this gap and stimulate further research, we provide a holistic introduction to the principles of SCMA encoding, CB design, and MPA-based decoding in a self-contained manner. Aiming to push the limits of SCMA, we present a survey of advanced decoding techniques with brief algorithmic descriptions, as well as several promising directions.
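The sparse mapping that MPA exploits can be illustrated with the commonly cited 4-resource / 6-user factor graph, where each user occupies 2 resources and each resource is shared by 3 users. This is only a structural sketch: the integer "symbols" are placeholders for codebook entries, and no actual MPA detection is implemented.

```python
# Toy SCMA mapping sketch: F[k][j] = 1 if user j is active on resource k.
# Each column (user) has weight 2; each row (resource) has weight 3.
F = [
    [1, 1, 1, 0, 0, 0],
    [1, 0, 0, 1, 1, 0],
    [0, 1, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1],
]

def user_codeword(j, symbol):
    """User j's sparse codeword: its symbol placed on its active resources
    (placeholder values standing in for real codebook entries)."""
    return [symbol * F[k][j] for k in range(len(F))]

def superpose(codewords):
    """Sum the users' sparse codewords resource-by-resource, as seen
    by the receiver before MPA detection."""
    return [sum(cw[k] for cw in codewords) for k in range(len(F))]

# Each resource observes only 3 of the 6 users -- this limited collision
# degree is what keeps per-resource MPA marginalization tractable.
y = superpose([user_codeword(j, j + 1) for j in range(6)])
```

Because every resource node sees only 3 colliding users instead of all 6, the per-resource search space at the detector shrinks from exponential in the total user count to exponential in the (small, fixed) collision degree, which is the core complexity advantage of sparsity-aware MPA decoding.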