Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks
Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges, including compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one feature common to legacy networks, regardless of their degree of heterogeneity and cell density: tight coupling between the control and data planes. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review the various proposals presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges of network densification, namely energy efficiency, system-level capacity maximization, interference management and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted at wide scale in legacy networks and thus remain a hallmark of 5G: coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey on SARC, CoMP and D2D. Most importantly, it provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201
Improving next-generation wireless network performance and reliability with deep learning
A rudimentary question is often raised: can machine learning in general, or deep learning in particular, add anything to the well-established field of wireless communications, which has been evolving for close to a century? While the use of deep learning-based methods is likely to help build intelligent wireless solutions, this use becomes particularly challenging for the lower layers of the wireless communication stack. The introduction of the fifth generation of wireless communications (5G) has triggered the demand for "network intelligence" to support its promises of very high data rates and extremely low latency. Consequently, 5G wireless operators are faced with the challenges of network complexity, diversification of services, and personalized user experience. Industry standards have created enablers (such as the network data analytics function), but these enablers focus on post-mortem analysis at higher stack layers and have a periodicity on the time scale of seconds (or larger). The goal of this dissertation is to show a solution to these challenges and how a data-driven approach using deep learning could add to the field of wireless communications. In particular, I propose intelligent predictive and prescriptive abilities to boost reliability and eliminate performance bottlenecks in 5G cellular networks and beyond, show contributions that justify the value of deep learning in wireless communications across several different layers, and offer in-depth analysis and comparisons with baselines and industry standards.
First, to improve multi-antenna network reliability against wireless impairments with power control and interference coordination for both packetized voice and beamformed data bearers, I propose a joint beamforming, power control, and interference coordination algorithm based on deep reinforcement learning. This algorithm uses a string of bits and logic operations to enable simultaneous actions to be performed by the reinforcement learning agent; a joint reward function is also proposed. I compare the performance of my proposed algorithm with the brute-force approach and show that similar performance is achievable but with faster run time as the number of transmit antennas increases.
Second, to enhance the performance of coordinated multipoint, I propose the use of deep learning binary classification to learn a surrogate function that triggers a second transmission stream, instead of depending on the popular signal-to-interference-plus-noise measurement quantity. This surrogate function improves the users' sum rate by focusing on the pre-logarithmic terms in the sum-rate formula, which have a larger impact on this rate.
Third, the performance of band switching can be improved without the need for full channel estimation. My proposal of using deep learning to classify the quality of two frequency bands prior to granting a band switch leads to a significant improvement in users' throughput, owing to the elimination of the industry-standard measurement gap requirement: a period of silence during which no data is sent to the users so that they can measure the frequency bands before switching.
In this dissertation, a group of algorithms for downlink wireless network performance and reliability is proposed. My results show that the introduction of user coordinates enhances the accuracy of the predictions made with deep learning. Also, the choice of signal-to-interference-plus-noise ratio as the optimization objective may not always be the best choice for improving user throughput rates. Further, exploiting the spatial correlation of channels in different frequency bands can improve certain network procedures without the need for perfect knowledge of the per-band channel state information. Hence, an understanding of these results helps develop novel solutions for enhancing these wireless networks at a much smaller time scale than today's industry standards.
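The first contribution's joint action design (a string of bits driving simultaneous sub-actions) can be illustrated compactly. The sketch below is a plain-Python toy, not the dissertation's actual encoding: it packs beamforming, power control, and interference coordination choices into bit fields of one discrete action index that a deep Q-network could select; the field names and widths are assumptions.

```python
# Sketch: packing a joint (beamforming, power control, interference
# coordination) decision into one discrete RL action index via bit fields.
# Field names and widths are illustrative assumptions.

BF_BITS = 3   # 8 beamforming codebook entries (assumed)
PC_BITS = 2   # 4 power-control steps (assumed)
IC_BITS = 1   # interference coordination on/off (assumed)

ACTION_SPACE_SIZE = 2 ** (BF_BITS + PC_BITS + IC_BITS)  # 64 joint actions

def encode_action(bf: int, pc: int, ic: int) -> int:
    """Pack the three sub-actions into one action index."""
    return bf | (pc << BF_BITS) | (ic << (BF_BITS + PC_BITS))

def decode_action(action: int):
    """Split one action index back into its three sub-actions."""
    bf = action & ((1 << BF_BITS) - 1)               # low bits: beam index
    pc = (action >> BF_BITS) & ((1 << PC_BITS) - 1)  # middle bits: power step
    ic = (action >> (BF_BITS + PC_BITS)) & 1         # top bit: coordination
    return bf, pc, ic

# A deep Q-network with ACTION_SPACE_SIZE outputs can then trigger all
# three decisions simultaneously with a single argmax:
assert decode_action(encode_action(5, 2, 1)) == (5, 2, 1)
```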
Cooperative Uplink Inter-Cell Interference (ICI) Mitigation in 5G Networks
In order to support the new paradigm shift in fifth generation (5G) mobile communication, radically different network architectures, associated technologies, and network operation algorithms need to be developed compared with existing fourth generation (4G) cellular solutions. The evolution toward 5G mobile networks will be characterised by an increasing number of wireless devices, increasing device and service complexity, and the requirement to access mobile services ubiquitously.
To realise the dramatic increase in data rates in particular, research is focused on improving the capacity of current, Long Term Evolution (LTE)-based 4G network standards before radical changes, which could include acquiring additional spectrum, are exploited. The LTE network has a frequency reuse factor of one, so neighbouring cells/sectors use the same spectrum, making cell-edge users vulnerable to heavy inter-cell interference in addition to other impairments such as fading and path loss. In this direction, this thesis focuses on improving the performance of cell-edge users in LTE and LTE-Advanced networks, initially by implementing a new Coordinated Multi-Point (CoMP) technique that uses smart antennas to mitigate cell-edge interference in the uplink of future 5G networks. A novel cooperative uplink inter-cell interference mitigation algorithm based on joint reception at the base station using receiver adaptive beamforming is then investigated. Subsequently, interference mitigation in a heterogeneous environment for Device-to-Device (D2D) communication underlaying a cellular network is investigated as the enabling technology for maximising resource block (RB) utilisation in emerging 5G networks. Direct communication between nearby items of User Equipment (UE) is explored as a means of achieving higher data rates with maximum RB utilisation (since the technology reuses cellular RBs simultaneously) while taking some load off the evolved Node B (eNodeB). Simulation results show that the proximity and transmission power of D2D transmission yield high performance gains for D2D receivers, demonstrated to be better than those of cellular UEs with better channel conditions or in close proximity to the eNodeB. Finally, it is demonstrated that applying a novel receiver beamforming technique to reduce interference from D2D users, as an extension of the above, can further enhance network performance; a sketch of the underlying receive-beamforming idea follows.
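As a rough illustration of the receiver adaptive beamforming investigated here, the following NumPy sketch computes textbook MMSE/MVDR combining weights at a multi-antenna base station to suppress one interfering (e.g., D2D) stream. The array size, channels, and noise level are synthetic assumptions; this is a generic formulation, not the thesis's exact algorithm.

```python
import numpy as np

# Minimal MMSE/MVDR receive-beamforming sketch: suppress one interferer
# while receiving the desired uplink user. Channels are assumed known,
# flat-fading, and randomly drawn for illustration.

rng = np.random.default_rng(0)
M = 8                                                # receive antennas (assumed)
h_des = rng.standard_normal(M) + 1j * rng.standard_normal(M)  # desired channel
h_int = rng.standard_normal(M) + 1j * rng.standard_normal(M)  # D2D interferer
noise_var = 0.1

# Interference-plus-noise covariance matrix.
R = np.outer(h_int, h_int.conj()) + noise_var * np.eye(M)

# MVDR/MMSE weights are proportional to R^{-1} h_des; this direction
# maximises the post-combining SINR.
w = np.linalg.solve(R, h_des)

sinr = (np.abs(w.conj() @ h_des) ** 2
        / (np.abs(w.conj() @ h_int) ** 2 + noise_var * np.linalg.norm(w) ** 2))
print(f"post-combining SINR: {10 * np.log10(sinr):.1f} dB")
```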
To be able to develop the aforementioned technologies and evaluate the performance of new algorithms in emerging network scenarios, a beyond-state-of-the-art LTE system-level simulator (SLS) was implemented. The new simulator includes Multiple-Input Multiple-Output (MIMO) antenna functionalities, comprehensive channel models (such as the Wireless World Initiative New Radio II, i.e. WINNER II, model) and adaptive modulation and coding schemes to accurately emulate the LTE and LTE-A network standards.
A Survey on Multi-AP Coordination Approaches over Emerging WLANs: Future Directions and Open Challenges
Recent advancements in wireless local area network (WLAN) technology include IEEE 802.11be and 802.11ay, commonly known as Wi-Fi 7 and WiGig, respectively. The goal of these developments is to provide Extremely High Throughput (EHT) and low latency to meet the demands of future applications such as 8K video, augmented and virtual reality, the Internet of Things, telesurgery, and other emerging technologies. IEEE 802.11be includes new features such as 320 MHz bandwidth, multi-link operation, Multi-User Multi-Input Multi-Output, orthogonal frequency-division multiple access, and Multiple-Access Point (multi-AP) coordination (MAP-Co) to achieve EHT. With the increase in the number of overlapping APs and in inter-AP interference, researchers have focused on studying MAP-Co approaches for coordinated transmission in IEEE 802.11be, making MAP-Co a key feature of future WLANs. Moreover, similar issues may arise in EHF-band WLANs, particularly for standards beyond IEEE 802.11ay, which has prompted researchers to investigate the implementation of MAP-Co over future 802.11ay WLANs. Thus, in this article, we provide a comprehensive review of state-of-the-art MAP-Co features and their shortcomings with respect to emerging WLANs. Finally, we discuss several novel future directions and open challenges
for MAP-Co.
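One coordination flavour reviewed under MAP-Co is coordinated spatial reuse among overlapping APs. As a hedged toy illustration, the sketch below greedily assigns transmission slots over an assumed AP conflict graph so that no two overlapping APs share a slot; the topology and the greedy rule are illustrative assumptions, not an IEEE 802.11be mechanism.

```python
# Toy coordinated-reuse sketch: greedy graph colouring over an assumed
# AP conflict graph (AP index -> overlapping neighbours), so mutually
# interfering APs never transmit in the same slot.

overlaps = {
    0: {1, 2},
    1: {0, 2},
    2: {0, 1, 3},
    3: {2},
}

def assign_slots(overlaps):
    """Each AP takes the lowest slot unused by any already-assigned
    overlapping neighbour."""
    slot = {}
    for ap in sorted(overlaps):
        taken = {slot[n] for n in overlaps[ap] if n in slot}
        s = 0
        while s in taken:
            s += 1
        slot[ap] = s
    return slot

print(assign_slots(overlaps))  # -> {0: 0, 1: 1, 2: 2, 3: 0}
```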
Neural network design for intelligent mobile network optimisation
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
Mobile network users' demands for data services are increasing exponentially, due to two main factors: the evolution of smartphones and their applications, and emerging technologies such as the Internet of Things and smart cities, which keep pumping more data into the network, even though most of the data routed in current mobile networks is non-live data. This growth in demand makes it necessary for mobile network operators to keep improving their networks, by adding hardware, increasing resources, or a combination of both. Radio resources are strictly limited by spectrum licensing and availability, so efficient spectrum utilisation is a major goal for both network operators and developers. Simultaneous and multiple channel access, and adding more cells to the network, are ways of increasing the data exchanged between network nodes. The current 4G mobile system is based on Orthogonal Frequency Division Multiple Access (OFDMA), and inter-cell interference degrades link quality at the cell edge; with the introduction of the heterogeneity concept to LTE in Release 10 of the 3GPP, the handover process became even more complex. To mitigate inter-cell interference at the cell edge, coordinated multipoint and carrier aggregation techniques are utilised for dual connectivity.
This work focuses on designing and proposing features to improve network performance and sustainability, comprising distributed small cells for data-only transmission, performance evaluation of handover schemes at the cell edge with dual connectivity, and Artificial Intelligence (AI) for load balancing and prediction. In the proposed design, the data and controls of the Small eNodeB (SeNodeB) are processed at the network edge using a Mobile Edge Computing (MEC) server, and the SeNodeBs are used to boost the services provided to users. The concept of caching data has also been investigated, with caching units implemented at different network levels. The proposed system and resource management are simulated using the OPNET modeller and evaluated through multiple scenarios with and without full load. The UE is reconfigured to accommodate dual connectivity, with two separate connections for uplink and downlink: the uplink connection to the macro cell is maintained, while the downlink is dedicated to the small cells when content is requested from the cache. The results clearly show that the proposed system decreases latency, while the total throughput delivered by the network is greatly improved when SeNodeBs are deployed; rising throughput increases overall capacity, which allows better services to be provided to existing users or more users to join and benefit from the network. Handover improvement is also considered in this work: with the help of two AI entities, better handover performance is achieved.
Balanced load over the SeNodeBs results in less frequent handover. The proposed load balancer is based on an artificial neural network clustering model with a self-organizing map as a hidden layer; it is trained to forecast the network condition and learns to reduce the number of handovers, especially for UEs at the cell edge, by performing only necessary ones and avoiding handovers to the macro cell in the downlink direction. The examined handovers concern the downlink when routing non-live video stored in the small cell's cache, and a reduction in frequent handovers was achieved when running the balancer. Staying with handover, another way to preserve and utilise network resources is to predict handovers before they occur and allocate the required data in the target SeNodeB. The predictor entity in the proposed system architecture combines the features of a Radial Basis Function (RBF) neural network and a neural network time-series tool to create and update a prediction list from the system's collected data and learn to predict the next SeNodeB to associate with. The prediction entity is simulated using MATLAB, and the results show that the system was able to deliver up to 92% correct handover predictions, leading to an overall throughput improvement of 75%.
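To make the predictor idea concrete, here is a minimal self-contained sketch of an RBF network mapping a UE's recent positions to the index of the next serving SeNodeB. The training data, centres, and widths are synthetic illustrative assumptions; the thesis's actual MATLAB model and its reported accuracy are not reproduced here.

```python
import numpy as np

# Toy RBF-network next-cell predictor: Gaussian hidden layer over fixed
# centres, linear output layer fitted by regularised least squares.
# All data below are synthetic placeholders.

rng = np.random.default_rng(1)

def rbf_features(X, centres, gamma):
    """Gaussian RBF hidden layer: one activation per centre."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

# Features: a UE's last two 2-D positions (4 values); labels: index of
# the next serving cell among 3 candidate SeNodeBs.
X_train = rng.uniform(0.0, 100.0, size=(200, 4))
y_train = rng.integers(0, 3, size=200)

centres = X_train[rng.choice(len(X_train), size=20, replace=False)]
gamma = 1e-3
H = rbf_features(X_train, centres, gamma)

# One-hot targets; ridge-regularised least-squares output weights.
T = np.eye(3)[y_train]
W = np.linalg.solve(H.T @ H + 1e-6 * np.eye(len(centres)), H.T @ T)

def predict_next_cell(x):
    """Score the candidate SeNodeBs and return the most likely one."""
    return int(np.argmax(rbf_features(x[None, :], centres, gamma) @ W))

print("predicted next SeNodeB:", predict_next_cell(X_train[0]))
```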