User Activity Detection and Channel Estimation for Grant-Free Random Access in LEO Satellite-Enabled Internet-of-Things
With recent advances in dense low-earth orbit (LEO) constellations, LEO
satellite networks have become a promising solution for providing global
coverage for Internet-of-Things (IoT) services. Confronted with the sporadic
transmission from randomly activated IoT devices, we consider the random access
(RA) mechanism, and propose a grant-free RA (GF-RA) scheme to reduce the access
delay to the mobile LEO satellites. A Bernoulli-Rician message passing with
expectation maximization (BR-MP-EM) algorithm is proposed for this
terrestrial-satellite GF-RA system to address the user activity detection (UAD)
and channel estimation (CE) problem. This BR-MP-EM algorithm is divided into
two stages. In the inner iterations, the Bernoulli messages and Rician messages
are updated for the joint UAD and CE problem. Based on the output of the inner
iterations, the expectation maximization (EM) method is employed in the outer
iterations to update the hyper-parameters related to the channel impairments.
Finally, simulation results show the UAD and CE accuracy of the proposed
BR-MP-EM algorithm, as well as its robustness against channel impairments.
Comment: 14 pages, 9 figures, accepted by the IEEE Internet of Things Journal
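The two-stage structure described above — posterior message updates in the inner loop, EM hyper-parameter refinement in the outer loop — can be illustrated on a toy scalar model. The sketch below is not the BR-MP-EM algorithm itself: it assumes a simplified per-device pseudo-observation r_k = a_k g_k + w_k with Bernoulli activity a_k, Gaussian channel gain g_k, and Gaussian noise, and all function names are hypothetical.

```python
import numpy as np

def gauss(x, var):
    """Zero-mean Gaussian pdf evaluated elementwise at x."""
    return np.exp(-x ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def activity_posterior(r, lam, gamma, sigma2):
    """Posterior activity probability p(a_k = 1 | r_k) under the toy model
    r_k = a_k g_k + w_k, with a_k ~ Bern(lam), g_k ~ N(0, gamma),
    w_k ~ N(0, sigma2)."""
    num = lam * gauss(r, gamma + sigma2)
    return num / (num + (1 - lam) * gauss(r, sigma2))

def em_refine_activity_rate(r, lam0, gamma, sigma2, n_iter=20):
    """Outer EM loop: the M-step for the Bernoulli prior replaces the
    activity rate by the mean of the current posterior probabilities."""
    lam = lam0
    for _ in range(n_iter):
        post = activity_posterior(r, lam, gamma, sigma2)
        lam = post.mean()
    return lam, post
```

Even from a poor initial guess (e.g. lam0 = 0.5), the estimated rate converges toward the empirical activity level, mirroring how the outer EM iterations of the paper track unknown hyper-parameters.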
LEO Satellite-Enabled Grant-Free Random Access with MIMO-OTFS
This paper investigates joint channel estimation and device activity
detection in the LEO satellite-enabled grant-free random access systems with
large differential delay and Doppler shift. In addition, the multiple-input
multiple-output (MIMO) with orthogonal time-frequency space modulation (OTFS)
is utilized to combat the dynamics of the terrestrial-satellite link. To
simplify the computation process, we estimate the channel tensor in parallel
along the delay dimension. Then, the deep learning and expectation-maximization
approach are integrated into the generalized approximate message passing with
cross-correlation--based Gaussian prior to capture the channel sparsity in the
delay-Doppler-angle domain and learn the hyperparameters. Finally, active
devices are detected by computing the energy of the estimated channel.
Simulation results demonstrate that the proposed algorithms outperform
conventional methods.
Comment: This paper has been accepted for presentation at the IEEE GLOBECOM 2022. arXiv admin note: text overlap with arXiv:2202.1305
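The final step — declaring a device active when the energy of its estimated channel exceeds a threshold — is simple to make concrete. The sketch below is a generic illustration, not the paper's detector; the array shape and function name are assumptions.

```python
import numpy as np

def energy_activity_detection(H_hat, threshold):
    """Declare device k active iff the energy of its estimated channel,
    summed over all (delay, Doppler, angle) taps, exceeds the threshold.
    H_hat: complex array of shape (num_devices, num_taps)."""
    energy = np.sum(np.abs(H_hat) ** 2, axis=1)
    return energy > threshold
```

Inactive devices contribute only residual estimation noise, so their channel energy stays far below that of active devices, which makes the threshold easy to set in practice.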
Active Terminal Identification, Channel Estimation, and Signal Detection for Grant-Free NOMA-OTFS in LEO Satellite Internet-of-Things
This paper investigates the massive connectivity of low Earth orbit (LEO)
satellite-based Internet-of-Things (IoT) for seamless global coverage. We
propose to integrate the grant-free non-orthogonal multiple access (GF-NOMA)
paradigm with the emerging orthogonal time frequency space (OTFS) modulation to
accommodate the massive IoT access, and mitigate the long round-trip latency
and severe Doppler effect of terrestrial-satellite links (TSLs). On this basis,
we put forward a two-stage successive active terminal identification (ATI) and
channel estimation (CE) scheme as well as a low-complexity multi-user signal
detection (SD) method. Specifically, at the first stage, the proposed training
sequence aided OTFS (TS-OTFS) data frame structure facilitates the joint ATI
and coarse CE, whereby both the traffic sparsity of terrestrial IoT terminals
and the sparse channel impulse response are leveraged for enhanced performance.
Moreover, based on the single Doppler shift property for each TSL and sparsity
of delay-Doppler domain channel, we develop a parametric approach to further
refine the CE performance. Finally, a least square based parallel time domain
SD method is developed to detect the OTFS signals with relatively low
complexity. Simulation results demonstrate the superiority of the proposed
methods over the state-of-the-art solutions in terms of ATI, CE, and SD
performance confronted with the long round-trip latency and severe Doppler
effect.
Comment: 20 pages, 9 figures, accepted by IEEE Transactions on Wireless Communications
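The least-squares detector applied block by block in the time domain can be sketched as follows. This is a generic LS sketch under an assumed per-block linear model y_b = H_b x_b + n_b, not the paper's implementation; the names are illustrative.

```python
import numpy as np

def ls_detect_blocks(H_blocks, y_blocks):
    """Least-squares symbol estimates, computed independently for each
    time-domain block (so the blocks can be processed in parallel):
    x_hat_b = argmin_x ||y_b - H_b x||^2."""
    return [np.linalg.lstsq(H, y, rcond=None)[0]
            for H, y in zip(H_blocks, y_blocks)]
```

Because each block is solved independently, the complexity scales with the block size rather than the full frame length, which is the source of the "relatively low complexity" claimed above.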
Quasi-Synchronous Random Access for Massive MIMO-Based LEO Satellite Constellations
Low earth orbit (LEO) satellite constellation-enabled communication networks
are expected to be an important part of many Internet of Things (IoT)
deployments due to their unique advantage of providing seamless global
coverage. In this paper, we investigate the random access problem in massive
multiple-input multiple-output-based LEO satellite systems, where the
multi-satellite cooperative processing mechanism is considered. Specifically,
at edge satellite nodes, we conceive a training sequence padded multi-carrier
system to overcome the issue of imperfect synchronization, where the training
sequence is utilized to detect the devices' activity and estimate their
channels. Considering the inherent sparsity of terrestrial-satellite links and
the sporadic traffic feature of IoT terminals, we utilize the orthogonal
approximate message passing-multiple measurement vector algorithm to estimate
the delay coefficients and user terminal activity. To further utilize the
structure of the receive array, a two-dimensional estimation of signal
parameters via rotational invariance technique is performed for enhancing
channel estimation. Finally, at the central server node, we propose a majority
voting scheme to enhance activity detection by aggregating backhaul information
from multiple satellites. Moreover, multi-satellite cooperative linear data
detection and multi-satellite cooperative Bayesian dequantization data
detection are proposed to cope with perfect and quantized backhaul,
respectively. Simulation results verify the effectiveness of our proposed
schemes in terms of channel estimation, activity detection, and data detection
for quasi-synchronous random access in satellite systems.
Comment: 38 pages, 16 figures. This paper has been accepted by the IEEE JSAC SI on 3GPP Technologies: 5G-Advanced and Beyond. Copyright may be transferred without notice, after which this version may no longer be accessible.
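The majority-voting fusion step at the central server node is straightforward to sketch. The snippet below is a minimal illustration under the assumption that each satellite backhauls a hard (binary) activity decision per device; it is not the paper's exact scheme.

```python
import numpy as np

def majority_vote(decisions):
    """Fuse per-satellite binary activity decisions at the central server:
    a device is declared active iff a strict majority of satellites voted
    for it. decisions: boolean array, shape (num_satellites, num_devices)."""
    decisions = np.asarray(decisions)
    return decisions.sum(axis=0) > decisions.shape[0] / 2
```

Aggregating independent per-satellite decisions suppresses occasional misdetections at any single edge node, which is why the voting step improves activity detection.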
Five Facets of 6G: Research Challenges and Opportunities
Whilst the fifth-generation (5G) systems are being rolled out across the
globe, researchers have turned their attention to the exploration of radical
next-generation solutions. At this early evolutionary stage we survey five main
research facets of this field, namely {\em Facet~1: next-generation
architectures, spectrum and services, Facet~2: next-generation networking,
Facet~3: Internet of Things (IoT), Facet~4: wireless positioning and sensing,
as well as Facet~5: applications of deep learning in 6G networks.} In this
paper, we have provided a critical appraisal of the literature of promising
techniques ranging from the associated architectures, networking, applications
as well as designs. We have portrayed a plethora of heterogeneous architectures
relying on cooperative hybrid networks supported by diverse access and
transmission mechanisms. The vulnerabilities of these techniques are also
addressed and carefully considered to highlight the most promising future
research directions. Additionally, we have listed a rich suite of
learning-driven optimization techniques. We conclude by observing the
evolutionary paradigm-shift that has taken place from pure single-component
bandwidth-efficiency, power-efficiency or delay-optimization towards
multi-component designs, as exemplified by the twin-component ultra-reliable
low-latency mode of the 5G system. We advocate a further evolutionary step
towards multi-component Pareto optimization, which requires the exploration of
the entire Pareto front of all optimal solutions, where none of the components
of the objective function may be improved without degrading at least one of the
other components.
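The closing definition — a solution is Pareto-optimal when no objective component can be improved without degrading another — is easy to make concrete. The sketch below filters a finite candidate set down to its non-dominated (Pareto-front) members, assuming every objective is to be minimised; it is a didactic illustration, not part of the surveyed work.

```python
def pareto_front(points):
    """Return the non-dominated points of a finite candidate set, with all
    objectives to be minimised. q dominates p iff q is <= p in every
    component and strictly < in at least one."""
    def dominates(q, p):
        return (all(qi <= pi for qi, pi in zip(q, p))
                and any(qi < pi for qi, pi in zip(q, p)))
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

For a twin-component design such as (delay, power), each surviving point represents a distinct trade-off: moving along the front lowers one component only at the cost of the other.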
Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G
The next wave of wireless technologies is proliferating in connecting things
among themselves as well as to humans. In the era of the Internet of things
(IoT), billions of sensors, machines, vehicles, drones, and robots will be
connected, making the world around us smarter. The IoT will encompass devices
that must wirelessly communicate a diverse set of data gathered from the
environment for myriad new applications. The ultimate goal is to extract
insights from this data and develop solutions that improve quality of life and
generate new revenue. Providing large-scale, long-lasting, reliable, and near
real-time connectivity is the major challenge in enabling a smart connected
world. This paper provides a comprehensive survey on existing and emerging
communication solutions for serving IoT applications in the context of
cellular, wide-area, as well as non-terrestrial networks. Specifically,
wireless technology enhancements for providing IoT access in fifth-generation
(5G) and beyond cellular networks, and communication networks over the
unlicensed spectrum are presented. Aligned with the main key performance
indicators of 5G and beyond 5G networks, we investigate solutions and standards
that enable energy efficiency, reliability, low latency, and scalability
(connection density) of current and future IoT networks. The solutions include
grant-free access and channel coding for short-packet communications,
non-orthogonal multiple access, and on-device intelligence. Further, a vision
of new paradigm shifts in communication networks in the 2030s is provided, and
the integration of the associated new technologies like artificial
intelligence, non-terrestrial networks, and new spectra is elaborated. Finally,
future research directions toward beyond 5G IoT networks are pointed out.
Comment: Submitted for review to IEEE CS&
Compressive Sensing-Based Grant-Free Massive Access for 6G Massive Communication
The advent of the sixth-generation (6G) of wireless communications has given
rise to the necessity to connect vast quantities of heterogeneous wireless
devices, which requires advanced system capabilities far beyond existing
network architectures. In particular, such massive communication has been
recognized as a prime driver that can empower the 6G vision of future
ubiquitous connectivity, supporting Internet of Human-Machine-Things for which
massive access is critical. This paper surveys the most recent advances toward
massive access in both academic and industry communities, focusing primarily on
the promising compressive sensing-based grant-free massive access paradigm. We
first specify the limitations of existing random access schemes and reveal that
the practical implementation of massive communication relies on a dramatically
different random access paradigm from the current ones mainly designed for
human-centric communications. Then, a compressive sensing-based grant-free
massive access roadmap is presented, where the evolutions from single-antenna
to large-scale antenna array-based base stations, from single-station to
cooperative massive multiple-input multiple-output systems, and from unsourced
to sourced random access scenarios are detailed. Finally, we discuss the key
challenges and open issues to shed light on the potential future research
directions of grant-free massive access.
Comment: Accepted by the IEEE IoT Journal
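The compressive-sensing formulation behind grant-free access — a received pilot matrix observing a sparse activity vector, y = A x + n with only a few nonzero entries in x — can be illustrated with a textbook greedy recovery routine. The sketch below is plain orthogonal matching pursuit, not any specific algorithm surveyed in the paper, and the known-sparsity stopping rule is an assumption.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal matching pursuit for y ~= A x with x sparse: greedily add
    the pilot column most correlated with the residual, then re-fit the
    selected columns by least squares."""
    residual, support = y.copy(), []
    for _ in range(sparsity):
        k = int(np.argmax(np.abs(A.T @ residual)))
        support.append(k)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat, sorted(support)
```

Because only a handful of devices are active at once, far fewer pilot observations than devices suffice for recovery — the core reason compressive sensing fits the sporadic-traffic regime.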
NB-IoT via Non-Terrestrial Networks
Massive Internet of Things is expected to play a crucial role in Beyond 5G (B5G) wireless communication systems, offering seamless connectivity among heterogeneous devices without human intervention. However, the exponential proliferation of smart devices and IoT networks, relying solely on terrestrial networks, may not fully meet the demanding IoT requirements in terms of bandwidth and connectivity, especially in areas where terrestrial infrastructures are not economically viable.
To unleash the full potential of 5G and B5G networks and enable seamless connectivity everywhere, the 3GPP envisions the integration of Non-Terrestrial Networks (NTNs) into the terrestrial ones starting from Release 17. However, this integration process requires modifications to the 5G standard to ensure reliable communications despite typical satellite channel impairments.
In this framework, this thesis aims at proposing techniques at the Physical and Medium Access Control layers that require minimal adaptations in the current NB-IoT standard via NTN. Thus, firstly the satellite impairments are evaluated and, then, a detailed link budget analysis is provided.
Following, analyses at the link and the system levels are conducted. In the former case, a novel algorithm leveraging time-frequency analysis is proposed to detect orthogonal preambles and estimate the signals’ arrival time. Besides, the effects of collisions on the detection probability and Bit Error Rate are investigated and Non-Orthogonal Multiple Access approaches are proposed in the random access and data phases.
The system analysis evaluates the performance of random access in case of congestion. Various access parameters are tested in different satellite scenarios, and the performance is measured in terms of access probability and time required to complete the procedure. Finally, a heuristic algorithm is proposed to jointly design the access and data phases, determining the number of satellite passages, the Random Access Periodicity, and the number of uplink repetitions that maximize the system's spectral efficiency.
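The link-level task of detecting a known preamble and estimating its arrival time can be illustrated with the standard sliding-correlation baseline. The thesis's actual algorithm relies on time-frequency analysis, so the sketch below is only a conventional reference point, and the names and threshold value are assumptions.

```python
import numpy as np

def sliding_correlation_toa(rx, preamble, threshold):
    """Baseline time-of-arrival estimation: correlate the received samples
    with the known preamble at every lag; the lag with the largest
    normalised correlation magnitude is the estimated arrival time,
    accepted only if it exceeds the detection threshold."""
    L = len(preamble)
    norm = np.vdot(preamble, preamble).real
    corr = np.array([abs(np.vdot(preamble, rx[t:t + L])) / norm
                     for t in range(len(rx) - L + 1)])
    t_hat = int(np.argmax(corr))
    return corr[t_hat] >= threshold, t_hat
```

With orthogonal (low cross-correlation) preambles, the peak at the true lag stands well above the sidelobes even in noise, which is what makes the arrival-time estimate usable for timing advance.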