Millimeter-wave Wireless LAN and its Extension toward 5G Heterogeneous Networks
Millimeter-wave (mmw) frequency bands, especially the 60 GHz unlicensed band, are
considered a promising solution for gigabit short-range wireless
communication systems. IEEE standard 802.11ad, also known as WiGig, is
standardized for the usage of the 60 GHz unlicensed band for wireless local
area networks (WLANs). By using this mmw WLAN, multi-Gbps rate can be achieved
to support bandwidth-intensive multimedia applications. Exhaustive search along
with beamforming (BF) is usually used to overcome 60 GHz channel propagation
loss and accomplish data transmissions in such mmw WLANs. Because of the short
transmission range and high susceptibility to path blocking, multiple mmw
access points (APs) should be deployed to fully cover a typical target
environment in future high-capacity multi-Gbps WLANs. Therefore, coordination
among mmw APs is highly needed to overcome packet collisions resulting from
uncoordinated exhaustive-search BF and to increase the total capacity of mmw
WLANs. In this paper, we first present the current status of mmw WLANs, together
with our developed WiGig AP prototype. Then, we highlight the great need for coordinated
transmissions among mmw APs as a key enabler for future high capacity mmw
WLANs. Two different types of coordinated mmw WLAN architecture are introduced.
One is a distributed antenna architecture realizing centralized
coordination, while the other relies on autonomous coordination with the assistance
of legacy Wi-Fi signaling. Moreover, two heterogeneous network (HetNet)
architectures are also introduced to efficiently extend the coordinated mmw
WLANs for use in future 5th Generation (5G) cellular networks. (Comment: 18 pages, 24 figures, accepted, invited paper)
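As a rough illustration of the exhaustive-search beamforming the abstract refers to, the sketch below scores every transmit/receive sector pair and keeps the strongest one. The array sizes and the random link gains are purely illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-pair link gains for one AP/station link: rows are TX
# sectors at the AP, columns are RX sectors at the station. In a real
# 802.11ad sector sweep these would come from measured probe frames.
n_tx, n_rx = 32, 8
link_gain_db = rng.normal(-70.0, 10.0, size=(n_tx, n_rx))

def exhaustive_beam_search(gains):
    """Probe every TX/RX sector pair and return the strongest one.

    Cost is O(n_tx * n_rx) probes, which is why uncoordinated sweeps
    from neighbouring APs can collide with each other.
    """
    best = np.unravel_index(np.argmax(gains), gains.shape)
    return best, float(gains[best])

(best_tx, best_rx), best_gain = exhaustive_beam_search(link_gain_db)
```

The linear cost in the product of sector counts is what motivates the coordination schemes discussed in the paper: without it, overlapping sweeps from multiple APs probe the air simultaneously.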
Improving next-generation wireless network performance and reliability with deep learning
A rudimentary question is often raised: can machine learning in general, or deep learning in particular, add to the well-established field of wireless communications, which has been evolving for close to a century? While the use of deep learning-based methods is likely to help build intelligent wireless solutions, this use becomes particularly challenging for the lower layers of the wireless communication stack. The introduction of the fifth generation of wireless communications (5G) has triggered the demand for “network intelligence” to support its promises of very high data rates and extremely low latency. Consequently, 5G wireless operators are faced with the challenges of network complexity, diversification of services, and personalized user experience. Industry standards have created enablers (such as the network data analytics function), but these enablers focus on post-mortem analysis at higher stack layers and have a periodicity on the time scale of seconds (or larger). The goal of this dissertation is to present a solution to these challenges and to show how a data-driven approach using deep learning can add to the field of wireless communications. In particular, I propose intelligent predictive and prescriptive abilities to boost reliability and eliminate performance bottlenecks in 5G cellular networks and beyond, show contributions that justify the value of deep learning in wireless communications across several different layers, and offer in-depth analysis and comparisons with baselines and industry standards. First, to improve multi-antenna network reliability against wireless impairments with power control and interference coordination for both packetized voice and beamformed data bearers, I propose a joint beamforming, power control, and interference coordination algorithm based on deep reinforcement learning.
This algorithm uses a string of bits and logic operations to enable simultaneous actions to be performed by the reinforcement learning agent. Consequently, a joint reward function is also proposed. I compare the performance of my proposed algorithm with the brute-force approach and show that similar performance is achievable but with a faster run time as the number of transmit antennas increases. Second, to enhance the performance of coordinated multipoint, I propose the use of deep learning binary classification to learn a surrogate function that triggers a second transmission stream, instead of depending on the popular signal-to-interference-plus-noise ratio (SINR) measurement. This surrogate function improves the users' sum rate by focusing on the pre-logarithmic terms in the sum-rate formula, which have a larger impact on this rate. Third, the performance of band switching can be improved without the need for full channel estimation. My proposal of using deep learning to classify the quality of two frequency bands prior to granting the band switch leads to a significant improvement in users' throughput. This is due to the elimination of the industry-standard measurement gap requirement: a period of silence where no data is sent to the users so they can measure the frequency bands before switching. In this dissertation, a group of algorithms for downlink wireless network performance and reliability is proposed. My results show that the introduction of user coordinates enhances the accuracy of the predictions made with deep learning. Also, the choice of the signal-to-interference-plus-noise ratio as the optimization objective may not always be the best way to improve user throughput rates. Further, exploiting the spatial correlation of channels in different frequency bands can improve certain network procedures without the need for perfect knowledge of the per-band channel state information.
Hence, an understanding of these results helps develop novel solutions for enhancing these wireless networks at a much smaller time scale than today's industry standards.
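The bit-string encoding of simultaneous actions mentioned above can be sketched roughly as follows. The field widths (`PC_BITS`, `BF_BITS`) and the helper names are my own illustrative assumptions, not the dissertation's actual design:

```python
# Sketch: pack the power-control and beamforming sub-actions of one joint
# RL action into a single integer using bit fields and logic operations.
PC_BITS = 2   # 2 bits -> 4 power-control steps (assumed width)
BF_BITS = 3   # 3 bits -> 8 beamforming codebook entries (assumed width)

def encode_action(pc: int, bf: int) -> int:
    """Pack both sub-actions into one joint action index."""
    assert 0 <= pc < (1 << PC_BITS) and 0 <= bf < (1 << BF_BITS)
    return (bf << PC_BITS) | pc

def decode_action(action: int):
    """Unpack a joint action index into its sub-actions via bit masks."""
    pc = action & ((1 << PC_BITS) - 1)               # low bits: power control
    bf = (action >> PC_BITS) & ((1 << BF_BITS) - 1)  # next bits: beam index
    return pc, bf

# the joint action space has 2 ** (PC_BITS + BF_BITS) = 32 entries
pc, bf = decode_action(encode_action(3, 5))
```

This lets a single agent with a flat discrete action space of size `2 ** (PC_BITS + BF_BITS)` effectively choose both sub-actions at once, which is the point of the joint encoding.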
A survey of machine learning techniques applied to self-organizing cellular networks
In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. In order for future networks to overcome current limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison between the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also provides future research directions and new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain and fully enable the concept of SON in the near future.
Towards Wireless Virtualization for 5G Cellular Systems
Although it has been defined as one of the most promising key enabling technologies for the forthcoming fifth-generation cellular networks, wireless virtualization still has several challenges remaining to be addressed. Among those, resource allocation, which decides how to embed the different wireless virtual networks onto the underlying physical infrastructure, is the one receiving the most attention. This project aims at finding the optimal resource allocation for each virtual network, in terms of channel resources, power levels, and radio access technologies, so that the data rate requested by each virtual network can be guaranteed and the global throughput efficiency can be maximized.
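A minimal sketch of the kind of allocation problem the project describes: physical channels are first assigned greedily so that each virtual network (VN) meets its requested rate, and leftover channels go to whichever VN extracts the most rate from them. All rates, demands, and the greedy heuristic itself are illustrative assumptions, not the project's actual algorithm:

```python
# Toy greedy embedding of virtual networks onto physical channels.
rates = {            # rate (Mb/s) each VN would get on each channel (assumed)
    "VN1": [10, 4, 6, 2],
    "VN2": [3, 9, 5, 8],
}
demand = {"VN1": 12, "VN2": 9}   # guaranteed rate per VN (assumed)

def allocate(rates, demand):
    n_ch = len(next(iter(rates.values())))
    got = {vn: 0.0 for vn in rates}
    assign = {}
    # Phase 1: cover guarantees; each VN grabs its best still-free channels.
    for vn in rates:
        for ch in sorted(range(n_ch), key=lambda c: -rates[vn][c]):
            if got[vn] >= demand[vn]:
                break
            if ch not in assign:
                assign[ch] = vn
                got[vn] += rates[vn][ch]
    # Phase 2: leftover channels maximize global throughput.
    for ch in range(n_ch):
        if ch not in assign:
            vn = max(rates, key=lambda v: rates[v][ch])
            assign[ch] = vn
            got[vn] += rates[vn][ch]
    return assign, got

assign, got = allocate(rates, demand)
```

A real formulation would be a constrained optimization over channels, power, and radio access technologies; the two-phase greedy pass above only mirrors its structure (guarantees first, throughput second).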
Study and application of machine learning techniques to the deployment of services on 5G optical networks
The vision of the future 5G corresponds to a highly heterogeneous network at different levels; the increase in the number of service requests for 5G networks imposes several technical challenges. In the 5G context, several machine learning-based approaches have been demonstrated in recent years as useful tools to ease network management, considering that different unexpected events could prevent services from being satisfied at the moment they are requested. Such approaches are usually referred to as cognitive network management. Many parameters inside the 5G network affect each layer of the network; the virtualization and abstraction of services are a crucial part of satisfactory service deployment, with the monitoring and control of the different planes being the two keys of cognitive network management. In this project, a simulated data collector has been implemented and several machine learning-based approaches have been studied. This way, possible future performance can be predicted, giving the system the ability to change the initial parameters and to adapt the network to future demands.
D4.3 Final Report on Network-Level Solutions
Research activities in METIS reported in this document focus on proposing solutions
to the network-level challenges of future wireless communication networks. A large variety of scenarios is considered, and a set of technical concepts is proposed to serve the needs envisioned for 2020 and beyond.
This document provides the final findings on several network-level aspects and groups of
solutions that are considered essential for designing future 5G solutions. Specifically, it
elaborates on:
- Interference management and resource allocation schemes
- Mobility management and robustness enhancements
- Context-aware approaches
- D2D and V2X mechanisms
- Technology components focused on clustering
- Dynamic reconfiguration enablers
These novel network-level technology concepts are evaluated against requirements defined
by METIS for future 5G systems. Moreover, functional enablers which can support the
solutions mentioned above are proposed.
We find that the network-level solutions and technology components developed during the course of METIS complement the lower-layer technology components and thereby effectively contribute to meeting 5G requirements and targets.
Aydin, O.; Valentin, S.; Ren, Z.; Botsov, M.; Lakshmana, TR.; Sui, Y.; Sun, W.... (2015). D4.3 Final Report on Network-Level Solutions. http://hdl.handle.net/10251/7675
Scaling up virtual MIMO systems
Multiple-input multiple-output (MIMO) systems are a mature technology that has been incorporated
into current wireless broadband standards to improve channel capacity and link
reliability. Nevertheless, due to the continuously increasing demand for wireless data traffic, new
strategies must be adopted. Very large MIMO antenna arrays represent a paradigm shift in
terms of theory and implementation, where the use of tens or hundreds of antennas provides
significant improvements in throughput and radiated energy efficiency compared to single-antenna
setups. Since design constraints limit the number of usable antennas, virtual systems can
be seen as a promising technique due to their ability to mimic and exploit the gains of multi-antenna
systems by means of wireless cooperation. Considering these arguments, in this work,
energy-efficient coding and network design for large virtual MIMO systems are presented.
Firstly, a cooperative virtual MIMO (V-MIMO) system that uses a large multi-antenna transmitter
and implements compress-and-forward (CF) relay cooperation is investigated. Since
constructing a reliable codebook is the most computationally complex task performed by the
relay nodes in CF cooperation, reduced-complexity quantisation techniques are introduced. The
analysis focuses on the block error probability (BLER) and the computational complexity of
the uniform scalar quantiser (U-SQ) and the Lloyd-Max algorithm (LM-SQ). Numerical results
show that the LM-SQ is simpler to design and can achieve a BLER performance comparable to
the optimal vector quantiser. Furthermore, due to its low complexity, the U-SQ could be considered
particularly suitable for very large wireless systems.
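The Lloyd-Max design contrasted above with the uniform scalar quantiser can be sketched as the classic alternating update below. The Gaussian source, level count, and iteration budget are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
samples = rng.normal(size=10_000)   # assumed source seen at the relay

def lloyd_max(x, levels=4, iters=50):
    """Lloyd-Max scalar quantiser: alternate nearest-level partitioning
    with centroid (conditional mean) updates of the representation levels."""
    reps = np.quantile(x, (np.arange(levels) + 0.5) / levels)  # ordered init
    for _ in range(iters):
        edges = (reps[:-1] + reps[1:]) / 2      # midpoint decision thresholds
        idx = np.digitize(x, edges)             # map samples to levels
        for k in range(levels):
            if np.any(idx == k):
                reps[k] = x[idx == k].mean()    # centroid update
    return reps

def uniform_sq(x, levels=4):
    """Uniform scalar quantiser over the sample range: simpler, cheaper."""
    lo, hi = x.min(), x.max()
    step = (hi - lo) / levels
    return lo + step * (np.arange(levels) + 0.5)

def mse(x, reps):
    """Mean squared error under nearest-level quantisation."""
    q = reps[np.argmin(np.abs(x[:, None] - reps[None, :]), axis=1)]
    return float(np.mean((x - q) ** 2))
```

For a non-uniform source like this one, the Lloyd-Max levels track the density and yield a lower distortion than the uniform grid, at the cost of the iterative design, mirroring the complexity/performance trade-off the thesis analyses.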
Even though very large MIMO systems enhance the spectral efficiency of wireless networks,
this comes at the expense of a linear increase in power consumption due to the multiple
radio frequency chains required to support the antennas. Thus, the energy efficiency and throughput
of the cooperative V-MIMO system are analysed and the impact of the imperfect channel state
information (CSI) on the system’s performance is studied. Finally, a power allocation algorithm
is implemented to reduce the total power consumption. Simulation results show that
wireless cooperation between users is more energy efficient than using a high-modulation-order
transmission, and that the larger the number of transmit antennas, the lower the impact of
imperfect CSI on the system's performance.
Finally, the application of cooperative systems is extended to wireless self-backhauling heterogeneous
networks, where the decode-and-forward (DF) protocol is employed to provide a
cost-effective and reliable backhaul. The associated trade-offs for a heterogeneous network
with inhomogeneous user distributions are investigated through the use of sleeping strategies.
Three different policies for switching off base stations are considered: random, load-based,
and greedy algorithms. The probability of coverage for the random and load-based sleeping policies
is derived. Moreover, an energy efficient base station deployment and operation approach
is presented. Numerical results show that the average number of base stations required to support
the traffic load at peak-time can be reduced by using the greedy algorithm for base station
deployment, and that highly clustered networks exhibit a smaller average serving distance and
thus a better probability of coverage.
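A toy version of a greedy base-station sleeping policy in the spirit described above: repeatedly switch off the least-loaded station while the remaining stations' capacity still covers the total traffic. The loads, the capacity model, and the specific heuristic are assumptions for illustration only:

```python
# Toy greedy sleeping policy for base stations (BS).
loads = [0.2, 0.5, 0.1, 0.7, 0.3]   # normalised traffic per BS (assumed)
capacity = 1.0                       # normalised capacity of one BS (assumed)

def greedy_sleep(loads, capacity):
    """Return (active, asleep) BS indices after greedy switch-off."""
    active = list(range(len(loads)))
    total = sum(loads)
    asleep = []
    while len(active) > 1:
        cand = min(active, key=lambda i: loads[i])   # least-loaded candidate
        # sleep it only if the survivors can still absorb the whole load
        if (len(active) - 1) * capacity >= total:
            active.remove(cand)
            asleep.append(cand)
        else:
            break
    return active, asleep

active, asleep = greedy_sleep(loads, capacity)
```

The policies analysed in the thesis additionally account for coverage geometry and user distribution; this sketch only captures the core greedy idea of trading off energy (fewer active stations) against a capacity constraint.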