Partially Blind Handovers for mmWave New Radio Aided by Sub-6 GHz LTE Signaling
For a base station that supports cellular communications in both the sub-6 GHz LTE
and millimeter wave (mmWave) bands, we propose a supervised machine learning
algorithm to improve the success rate of handovers between the two radio
frequencies, using prior sub-6 GHz and mmWave channel measurements within a
temporal window.
The main contributions of our paper are to 1) introduce partially blind
handovers, 2) employ machine learning to perform handover success predictions
from sub-6 GHz to mmWave frequencies, and 3) show that this machine
learning-based algorithm, combined with partially blind handovers, can improve
the handover success rate in a realistic network setup of colocated cells.
Simulation results show improvement in handover success rates for our proposed
algorithm compared to standard handover algorithms.
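
To make the approach concrete, here is a minimal Python sketch (not the
authors' implementation) of a supervised handover-success predictor trained on
a temporal window of prior sub-6 GHz and mmWave channel measurements. The
window length, feature layout, synthetic labels, and decision threshold are
all assumptions made for illustration.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
WINDOW = 8        # assumed number of past measurement instants per band
N_SAMPLES = 2000  # synthetic training set size

# Features: [sub-6 GHz RSRP window | mmWave RSRP window], dBm-like values.
sub6 = rng.normal(-85.0, 6.0, size=(N_SAMPLES, WINDOW))
mmwave = rng.normal(-95.0, 10.0, size=(N_SAMPLES, WINDOW))
X = np.hstack([sub6, mmwave])

# Toy label: a handover "succeeds" when the recent mmWave trend is strong
# enough; in the paper the label would come from observed handover outcomes.
y = (mmwave[:, -3:].mean(axis=1) > -95.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"handover-success prediction accuracy: {clf.score(X_te, y_te):.3f}")

# A partially blind handover would be attempted only when the predicted
# success probability clears a threshold, skipping some mmWave measurements.
p_success = clf.predict_proba(X_te[:1])[0, 1]
print("attempt mmWave handover" if p_success > 0.7 else "stay on sub-6 GHz")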
Deep Q-Learning for Self-Organizing Networks Fault Management and Radio Performance Improvement
We propose an algorithm to automate fault management in an outdoor cellular
network, using deep reinforcement learning (RL) to counter wireless
impairments. This algorithm enables the cellular network cluster to self-heal
by allowing RL to learn how to improve the downlink
signal-to-interference-plus-noise ratio (SINR) through exploration and
exploitation of various alarm corrective actions. The main contributions of
this paper are to 1) introduce a deep RL-based fault-handling algorithm that
self-organizing networks can implement in polynomial runtime and 2) show that
this fault management method can improve the radio link performance in a
realistic network setup. Simulation results show that our
proposed algorithm learns an action sequence to clear alarms and improve the
performance in the cellular cluster better than existing algorithms, even
against the randomness of the network fault occurrences and user movements.
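
For illustration only, the following Python sketch applies deep Q-learning to
a toy alarm-clearing environment; it is not the paper's implementation. The
state encoding (a binary alarm vector), the reward (an SINR-like gain for
clearing a raised alarm), and every hyperparameter are assumptions.

import random
import torch
import torch.nn as nn

N_ALARMS = 4          # assumed number of fault alarms in the cluster
N_ACTIONS = N_ALARMS  # corrective action i attempts to clear alarm i

def step(state, action):
    # Toy dynamics: clearing a raised alarm yields +1 (SINR gain), else -0.1.
    next_state = state.clone()
    if state[action] == 1.0:
        next_state[action] = 0.0
        return next_state, 1.0
    return next_state, -0.1

q_net = nn.Sequential(nn.Linear(N_ALARMS, 32), nn.ReLU(),
                      nn.Linear(32, N_ACTIONS))
opt = torch.optim.Adam(q_net.parameters(), lr=1e-2)
gamma, eps = 0.9, 0.2

for episode in range(300):
    # Random initial fault pattern, mimicking random alarm occurrences.
    state = torch.tensor([random.random() < 0.5 for _ in range(N_ALARMS)],
                         dtype=torch.float32)
    for _ in range(N_ALARMS):
        # Epsilon-greedy exploration over corrective actions.
        if random.random() < eps:
            action = random.randrange(N_ACTIONS)
        else:
            action = int(q_net(state).argmax())
        next_state, reward = step(state, action)
        # One-step temporal-difference target for the chosen action.
        with torch.no_grad():
            target = reward + gamma * q_net(next_state).max()
        loss = (q_net(state)[action] - target) ** 2
        opt.zero_grad(); loss.backward(); opt.step()
        state = next_state

print("learned greedy corrective action per single-alarm state:")
for i in range(N_ALARMS):
    s = torch.zeros(N_ALARMS); s[i] = 1.0
    print(f"alarm {i} raised -> take action {int(q_net(s).argmax())}")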
Blockage Prediction for Mobile UE in RIS-assisted Wireless Networks: A Deep Learning Approach
Transmitted signals in wireless networks may degrade considerably before
reaching the receiver under blockage conditions, so the reliability of a link
can hinge on blockages between the communicating nodes. Reconfigurable
Intelligent Surfaces (RISs) can reflect incident signals at different
reflection angles and can therefore counter the blockage effect by optimally
reflecting transmitted signals toward the receiving nodes, improving the
wireless network's performance. With this motivation, this paper formulates a
RIS-aided wireless communication problem from a base station (BS) to a mobile
user equipment (UE). The BS is equipped with an RGB camera. We use the RGB
camera at the BS together with the RIS panel to improve the system's
performance, while accounting for multipath signal propagation and the Doppler
spread experienced by the mobile UE. First, the RGB camera is used to detect
the presence of the UE when there is no blockage. When this fails, the
RIS-assisted gain takes over and is used to determine whether the UE is
"present but blocked" or "absent". We formulate this as a ternary
classification problem whose goal is to maximize the probability of detecting
UE communication blockage. Using a deep learning model, an 18-layer residual
network (ResNet-18), we find the optimal probability of predicting the
blockage status from a given RGB image and RIS-assisted data rate. Extensive
simulation results reveal that our proposed RIS panel-assisted model improves
blockage prediction accuracy by over 38% compared to the baseline scheme.
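
As an illustrative sketch under assumed interfaces (not the authors' exact
architecture), the Python code below wires an 18-layer residual network
(ResNet-18) image backbone to a linear head that fuses the RIS-assisted data
rate and outputs the three blockage classes; the class names and fusion layout
are assumptions.

import torch
import torch.nn as nn
from torchvision.models import resnet18

CLASSES = ["UE present, no blockage", "UE present but blocked", "UE absent"]

class BlockagePredictor(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = resnet18(weights=None)  # randomly initialized ResNet-18
        backbone.fc = nn.Identity()        # keep the 512-d image embedding
        self.backbone = backbone
        self.head = nn.Linear(512 + 1, len(CLASSES))  # image features + rate

    def forward(self, image, ris_rate):
        feat = self.backbone(image)                 # (B, 512) image embedding
        fused = torch.cat([feat, ris_rate], dim=1)  # append scalar data rate
        return self.head(fused)                     # (B, 3) class logits

model = BlockagePredictor()
image = torch.randn(2, 3, 224, 224)  # dummy RGB frames from the BS camera
rate = torch.rand(2, 1)              # dummy normalized RIS-assisted rates
logits = model(image, rate)
for i in logits.argmax(dim=1).tolist():
    print("predicted:", CLASSES[i])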