Taming and Leveraging Directionality and Blockage in Millimeter Wave Communications
To cope with the challenge of high-rate data transmission, millimeter wave (mmWave) communication is one potential solution. Its short wavelengths have ushered in the era of directional mobile communication, and this semi-optical mode of communication requires revolutionary thinking. To assist the research and evaluate various algorithms, we built a motion-sensitive mmWave testbed with two degrees of freedom for environmental sensing and general wireless communication.

The first part of this thesis contains two approaches to maintaining the connection in mmWave mobile communication. The first seeks to solve the beam tracking problem using the motion sensors within the mobile device. A tracking algorithm is given and integrated into the tracking protocol. Detailed experiments and numerical simulations compare several compensation schemes against an optical benchmark and demonstrate the efficiency of the overhead reduction. The second strategy, multi-connectivity, attempts to mitigate intermittent connections during roaming. Taking advantage of the properties of rateless erasure codes, a fountain-code-based multi-connectivity mechanism is proposed to increase link reliability with a simplified backhaul mechanism. Simulations on a multi-link channel record demonstrate the efficiency and robustness of our system design.

The second topic in this thesis explores various techniques for blockage mitigation. A fast, heartbeat-like channel with heavy blockage loss, caused by the propellers, is identified in a mmWave Unmanned Aerial Vehicle (UAV) communication experiment. These blockage patterns are detected through Holm's procedure, framed as a multi-time-series edge-detection problem. To reduce the blockage effect, an adaptive modulation and coding scheme is designed. The simulation results show that it can greatly improve throughput given appropriately predicted patterns.
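The blockage-pattern detection above relies on Holm's step-down procedure for multiple hypothesis testing. A minimal sketch of the family-wise error control step (the per-sample p-values and the edge-detection framing are assumptions; only the Holm correction itself is shown):

```python
import numpy as np

def holm_reject(p_values, alpha=0.05):
    """Holm's step-down procedure: return a boolean array marking which
    hypotheses are rejected at family-wise error rate alpha."""
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)          # indices of p-values, smallest first
    reject = np.zeros(m, dtype=bool)
    for k, idx in enumerate(order):
        # compare the (k+1)-th smallest p-value against alpha / (m - k)
        if p[idx] <= alpha / (m - k):
            reject[idx] = True
        else:
            break                  # stop at the first non-rejection
    return reject
```

Applied per time series, the rejected hypotheses would mark the samples where a blockage edge is declared.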
Last but not least, the blockage of directional communication also appears as a blessing, because the geometric information and blockage events of ancillary signal paths can be utilized to predict the blockage timing for the current transmission path. A geometric model and a prediction algorithm are derived to resolve the blockage time and initiate proactive handovers. An experiment provides solid evidence of the multi-path properties, and numerical simulation demonstrates the efficiency of the proposed algorithm.
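The idea of predicting the serving path's blockage from an ancillary path can be illustrated with a toy constant-velocity model: once a blocker crosses the ancillary path, the serving path's blockage instant follows from the path separation and the blocker speed. The perpendicular-crossing geometry and all names here are illustrative assumptions, not the thesis's actual model:

```python
def predict_blockage_time(t_ancillary_blocked, d_paths, v_blocker):
    """Predict when the serving path will be blocked, assuming a blocker
    moving at constant speed v_blocker (m/s) perpendicularly across two
    parallel paths separated by d_paths (m), having blocked the
    ancillary path at time t_ancillary_blocked (s)."""
    if v_blocker <= 0:
        raise ValueError("blocker speed must be positive")
    return t_ancillary_blocked + d_paths / v_blocker
```

A handover would then be initiated shortly before the predicted instant.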
A Comprehensive Investigation of Beam Management Through Conventional and Deep Learning Approach
The 5G spectrum uses cutting-edge technology that delivers high data rates, low latency, increased capacity, and high spectrum utilization. To cater to these requirements, various technologies are available, such as Multiple Access Technology (MAT), Multiple-Input Multiple-Output (MIMO) technology, millimeter-wave (mmWave) technology, Non-Orthogonal Multiple Access (NOMA), and Simultaneous Wireless Information and Power Transfer (SWIPT). Of all the available technologies, mmWave is prominent as it provides favorable opportunities for 5G. Millimeter wave is capable of providing high data rates, i.e., 10 Gbit/s, and a tremendous amount of raw bandwidth is available (around 250 GHz), an attractive characteristic of the mmWave band for relieving mobile data traffic congestion in the low-frequency bands. Its high frequencies (30-300 GHz) enable very high speeds, and its very short wavelengths (1-10 mm) allow compact component sizes. It will provide throughput of up to 20 Gbps, and its narrow beams increase security and reduce interference. When the main beams of the transmitter and receiver are not aligned properly, communication is impaired; beam management is one solution to this problem, forming a strong communication link between transmitter and receiver. This paper aims to address challenges in beam management and proposes a framework for its realization. Towards the same, the paper initially introduces various challenges in beam management. Building an effective beam management system for a moving user involves various steps, such as beam selection, beam tracking, beam alignment, and beamforming. Hence, the subsequent sections of the paper illustrate various beam management procedures in mmWave using conventional methods as well as deep learning techniques.
The paper also presents a case study on the framework's implementation using the above-mentioned techniques in mmWave communication. Glimpses of future research directions are also detailed in the final sections. Such beam management techniques, when used for mmWave technology, will enable building fast, efficient, and capable 5G networks.
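As a concrete illustration of the beam selection step mentioned above, a conventional exhaustive sweep simply measures the received signal strength (RSS) for every transmit/receive beam pair and keeps the best one. A minimal sketch (the RSS matrix and function name are assumptions):

```python
import numpy as np

def select_beam_pair(rss):
    """Exhaustive beam sweep: rss[i, j] is the measured received signal
    strength for transmit beam i and receive beam j. Return the index
    pair with the highest RSS."""
    rss = np.asarray(rss)
    i, j = np.unravel_index(np.argmax(rss), rss.shape)
    return int(i), int(j)
```

Deep-learning approaches aim to predict this pair from context (e.g., location) instead of sweeping all pairs, cutting the measurement overhead.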
A Vision and Framework for the High Altitude Platform Station (HAPS) Networks of the Future
A High Altitude Platform Station (HAPS) is a network node that operates in
the stratosphere at an altitude of around 20 km and is instrumental for
providing communication services. Precipitated by technological innovations in
the areas of autonomous avionics, array antennas, solar panel efficiency
levels, and battery energy densities, and fueled by flourishing industry
ecosystems, the HAPS has emerged as an indispensable component of
next-generations of wireless networks. In this article, we provide a vision and
framework for the HAPS networks of the future supported by a comprehensive and
state-of-the-art literature review. We highlight the unrealized potential of
HAPS systems and elaborate on their unique ability to serve metropolitan areas.
The latest advancements and promising technologies in the HAPS energy and
payload systems are discussed. The integration of the emerging Reconfigurable
Smart Surface (RSS) technology in the communications payload of HAPS systems
for providing a cost-effective deployment is proposed. A detailed overview of
the radio resource management in HAPS systems is presented along with
synergistic physical layer techniques, including Faster-Than-Nyquist (FTN)
signaling. Numerous aspects of handoff management in HAPS systems are
described. The notable contributions of Artificial Intelligence (AI) in HAPS,
including machine learning in the design, topology management, handoff, and
resource allocation aspects are emphasized. The extensive overview of the
literature we provide is crucial for substantiating our vision that depicts the
expected deployment opportunities and challenges in the next 10 years
(next-generation networks), as well as in the subsequent 10 years
(next-next-generation networks).

Comment: To appear in IEEE Communications Surveys & Tutorials
Space-Air-Ground Integrated 6G Wireless Communication Networks: A Review of Antenna Technologies and Application Scenarios
A review of technological solutions and advances in the framework of a Vertical Heterogeneous Network (VHetNet) integrating satellite, airborne, and terrestrial networks is presented. The disruptive features and challenges offered by a fruitful cooperation among these segments within ubiquitous and seamless wireless connectivity are described. The available technologies and the key research directions for achieving global wireless coverage by considering all these layers are thoroughly discussed. Emphasis is placed on the available antenna systems in the satellite, airborne, and ground layers, highlighting their strengths and weaknesses and providing some interesting research trends. A summary of the most suitable application scenarios for future 6G wireless communications is finally illustrated.
A Comprehensive Overview on 5G-and-Beyond Networks with UAVs: From Communications to Sensing and Intelligence
Due to the advancements in cellular technologies and the dense deployment of
cellular infrastructure, integrating unmanned aerial vehicles (UAVs) into the
fifth-generation (5G) and beyond cellular networks is a promising solution to
achieve safe UAV operation as well as enabling diversified applications with
mission-specific payload data delivery. In particular, 5G networks need to
support three typical usage scenarios, namely, enhanced mobile broadband
(eMBB), ultra-reliable low-latency communications (URLLC), and massive
machine-type communications (mMTC). On the one hand, UAVs can be leveraged as
cost-effective aerial platforms to provide ground users with enhanced
communication services by exploiting their high cruising altitude and
controllable maneuverability in three-dimensional (3D) space. On the other
hand, providing such communication services simultaneously for both UAV and
ground users poses new challenges due to the need for ubiquitous 3D signal
coverage as well as the strong air-ground network interference. Besides the
requirement of high-performance wireless communications, the ability to support
effective and efficient sensing as well as network intelligence is also
essential for 5G-and-beyond 3D heterogeneous wireless networks with coexisting
aerial and ground users. In this paper, we provide a comprehensive overview of
the latest research efforts on integrating UAVs into cellular networks, with an
emphasis on how to exploit advanced techniques (e.g., intelligent reflecting
surface, short packet transmission, energy harvesting, joint communication and
radar sensing, and edge intelligence) to meet the diversified service
requirements of next-generation wireless systems. Moreover, we highlight
important directions for further investigation in future work.

Comment: Accepted by IEEE JSAC
Five Facets of 6G: Research Challenges and Opportunities
Whilst the fifth-generation (5G) systems are being rolled out across the
globe, researchers have turned their attention to the exploration of radical
next-generation solutions. At this early evolutionary stage we survey five main
research facets of this field, namely Facet 1: next-generation
architectures, spectrum and services; Facet 2: next-generation networking;
Facet 3: the Internet of Things (IoT); Facet 4: wireless positioning and
sensing; and Facet 5: applications of deep learning in 6G networks. In this
paper, we have provided a critical appraisal of the literature of promising
techniques ranging from the associated architectures, networking, applications
as well as designs. We have portrayed a plethora of heterogeneous architectures
relying on cooperative hybrid networks supported by diverse access and
transmission mechanisms. The vulnerabilities of these techniques are also
addressed and carefully considered to highlight the most promising
future research directions. Additionally, we have listed a rich suite of
learning-driven optimization techniques. We conclude by observing the
evolutionary paradigm-shift that has taken place from pure single-component
bandwidth-efficiency, power-efficiency or delay-optimization towards
multi-component designs, as exemplified by the twin-component ultra-reliable
low-latency mode of the 5G system. We advocate a further evolutionary step
towards multi-component Pareto optimization, which requires the exploration of
the entire Pareto front of all optimal solutions, where none of the components
of the objective function may be improved without degrading at least one of the
other components.
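The multi-component Pareto optimization advocated above amounts to retaining every solution that cannot be improved in one objective without degrading another. A minimal sketch of extracting such a Pareto front from candidate solutions (all objectives assumed to be minimized, e.g. delay and power):

```python
def pareto_front(points):
    """Return the points not dominated by any other point, where every
    objective is to be minimized. points: list of equal-length tuples.
    q dominates p if q is no worse in every objective and strictly
    better in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p)))
            and any(q[k] < p[k] for k in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

For example, with (delay, power) candidates (1, 3), (2, 2), (3, 1), and (3, 3), the last is dominated and the first three form the front.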
Contextual Beamforming: Exploiting Location and AI for Enhanced Wireless Telecommunication Performance
The pervasive nature of wireless telecommunication has made it the foundation
for mainstream technologies like automation, smart vehicles, virtual reality,
and unmanned aerial vehicles. As these technologies experience widespread
adoption in our daily lives, ensuring the reliable performance of cellular
networks in mobile scenarios has become a paramount challenge. Beamforming, an
integral component of modern mobile networks, enables spatial selectivity and
improves network quality. However, many beamforming techniques are iterative,
introducing unwanted latency to the system. In recent times, there has been a
growing interest in leveraging mobile users' location information to expedite
beamforming processes. This paper explores the concept of contextual
beamforming, discussing its advantages, disadvantages and implications.
Notably, the study presents an impressive 53% improvement in signal-to-noise
ratio (SNR) from implementing the maximum ratio transmission (MRT) adaptive
beamforming algorithm compared to scenarios without beamforming. It further
elucidates how MRT contributes to
contextual beamforming. The importance of localization in implementing
contextual beamforming is also examined. Additionally, the paper delves into
the use of artificial intelligence schemes, including machine learning and deep
learning, in implementing contextual beamforming techniques that leverage user
location information. Based on the comprehensive review, the results suggest
that the combination of MRT and Zero forcing (ZF) techniques, alongside deep
neural networks (DNN) employing Bayesian Optimization (BO), represents the most
promising approach for contextual beamforming. Furthermore, the study discusses
the future potential of programmable switches, such as Tofino, in enabling
location-aware beamforming.
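The MRT scheme discussed above matches the transmit precoder to the channel, so the effective gain equals the channel's squared norm. A minimal NumPy sketch (the single-user MISO setup is an assumption; the paper's 53% SNR figure comes from its own evaluation, not this toy):

```python
import numpy as np

def mrt_snr_gain(h):
    """SNR gain of maximum ratio transmission over a unit-power link:
    with precoder w = conj(h) / ||h||, the effective channel gain
    |h . w|^2 equals ||h||^2 (coherent combining across antennas)."""
    h = np.asarray(h, dtype=complex)
    w = h.conj() / np.linalg.norm(h)  # MRT precoder, unit norm
    return float(abs(h @ w) ** 2)     # equals ||h||^2
```

Location information speeds this up by narrowing the channel estimation needed to obtain h in the first place.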