Goodbye, ALOHA!
©2016 IEEE. The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to its high impact on the overall performance of wireless communications. The majority of research activities in this field deal with variations of protocols based on ALOHA, either with or without listen-before-talk, i.e., carrier sense multiple access (CSMA). These protocols operate well under low traffic loads and a low number of simultaneous devices, but suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by these, we identify a family of simple algorithms based on distributed queueing (DQ), which can operate for an infinite number of devices generating any traffic load and pattern. A description of the DQ mechanism is provided, and the most relevant existing studies of DQ applied in different scenarios are described. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, a description of the very first demo of DQ for its use in the IoT is also included.
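The congestion behaviour that the abstract attributes to ALOHA-based protocols can be illustrated with the classic slotted-ALOHA throughput formula S = G·e^(−G). This is a textbook result sketched here for illustration, not an analysis taken from the paper itself:

```python
import math

def slotted_aloha_throughput(G):
    """Expected successful transmissions per slot for slotted ALOHA
    under an aggregate Poisson offered load of G packets per slot:
    S = G * exp(-G)."""
    return G * math.exp(-G)

# Throughput peaks at G = 1 with S = 1/e (about 0.368 packets/slot)
# and then collapses as the offered load grows -- the congestion
# behaviour that motivates alternatives such as distributed queueing.
for G in (0.5, 1.0, 2.0, 4.0):
    print(f"G = {G}: S = {slotted_aloha_throughput(G):.3f}")
```

The monotone decline beyond G = 1 is why ALOHA-style MACs degrade as the number of IoT devices grows.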
A survey of machine learning techniques applied to self organizing cellular networks
This paper surveys the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks. For future networks to overcome the limitations of current cellular systems, more intelligence needs to be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines are proposed on when to choose each ML algorithm for each SON function. Lastly, this work provides future research directions and discusses the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networking domain, fully enabling the concept of SON in the near future.
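As an illustration of the kind of ML technique such surveys cover, the sketch below shows a minimal tabular Q-learning update, a reinforcement-learning approach commonly applied to SON use cases such as mobility load balancing. The state space, action space, and reward are hypothetical and not taken from the survey:

```python
import random

# Hypothetical SON setting: states are coarse cell-load levels,
# actions adjust a handover offset up or down.
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2  # learning rate, discount, exploration
states = range(3)   # e.g. load: low / medium / high
actions = range(2)  # e.g. 0 = lower offset, 1 = raise offset
Q = {(s, a): 0.0 for s in states for a in actions}

def choose_action(s):
    """Epsilon-greedy action selection."""
    if random.random() < EPS:
        return random.choice(list(actions))
    return max(actions, key=lambda a: Q[(s, a)])

def update(s, a, reward, s_next):
    """Standard Q-learning temporal-difference update."""
    best_next = max(Q[(s_next, a2)] for a2 in actions)
    Q[(s, a)] += ALPHA * (reward + GAMMA * best_next - Q[(s, a)])
```

A SON controller would observe a load state, pick an action, measure a reward (e.g. reduced dropped handovers), and call `update` each cycle.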
Cooperation techniques between LTE in unlicensed spectrum and Wi-Fi towards fair spectral efficiency
On the road towards 5G, a proliferation of Heterogeneous Networks (HetNets) is expected. Sensor networks are of great importance in this new wireless era, as they allow interaction with the environment. Additionally, the establishment of the Internet of Things (IoT) has enormously increased the number of interconnected devices and, consequently, the already massive wirelessly transmitted traffic. The exponential growth of wireless traffic is pushing the wireless community to investigate solutions that maximally exploit the available spectrum. Recently, the 3rd Generation Partnership Project (3GPP) announced standards that permit the operation of Long Term Evolution (LTE) in the unlicensed spectrum in addition to the exclusive use of the licensed spectrum owned by a mobile operator. Alternatively, leading wireless technology developers are examining standalone LTE operation in the unlicensed spectrum without any involvement of a mobile operator. In this article, we present a classification of different techniques that can be applied to co-located LTE and Wi-Fi networks. To date, Wi-Fi is the most widely used wireless technology in the unlicensed spectrum. A review of the current state of the art further reveals the lack of cooperation schemes among co-located networks that could lead to more efficient usage of the available spectrum. This article fills this gap in the literature by conceptually describing different classes of cooperation between LTE and Wi-Fi. For each class, we provide a detailed presentation of possible cooperation techniques that can achieve spectral efficiency in a fair manner.
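Fairness between co-located LTE and Wi-Fi networks can be quantified in several ways; one standard metric, chosen here purely for illustration and not mandated by the article, is Jain's fairness index over the per-network throughputs:

```python
def jain_index(throughputs):
    """Jain's fairness index over a list of per-network throughputs.
    Ranges from 1/n (one network gets everything) to 1.0 (perfectly
    equal shares)."""
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

# Hypothetical example: LTE-U and Wi-Fi throughputs in Mbps before
# and after a cooperation scheme redistributes airtime.
before = [90.0, 10.0]   # LTE dominates the channel
after = [55.0, 45.0]    # closer to an even split
print(jain_index(before), jain_index(after))
```

A cooperation scheme that raises this index while preserving aggregate throughput would count as "fair spectral efficiency" in the sense the abstract describes.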
Outage probability analysis of Co-Tier interference in heterogeneous network
In Heterogeneous Networks (HetNets), femtocells (HeNBs) are deployed by the telecommunications industry to provide extensive indoor coverage as well as capacity. These HeNBs are Customer Premises Equipment (CPE) that often operate co-channel with the macrocell (MeNB), causing Co-Tier Interference (CTI) in OFDMA systems. CTI in OFDMA systems can lead to throughput degradation and service disruption. Because of the rapidly varying nature of the Rayleigh fading channel, achieving satisfactory performance is challenging: the signal-to-interference-plus-noise ratio (SINR) is random, which makes the achievable capacity a random variable as well. This paper derives expressions for the outage probability, and a hybrid Genetic Algorithm with biogeography-based dynamic subcarrier allocation (HGBBDSA) is implemented to reduce it. The outage probability expression is obtained from the moment-generating function of the total SINR at the receiver end. Simulation results demonstrate that HGBBDSA can reduce the outage by 45% compared with existing methods.
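The outage analysis above rests on the statistics of the SINR. As a simplified, hedged illustration (interference-free Rayleigh fading only, not the paper's MGF-based multi-cell derivation), the outage probability has the closed form P_out = 1 − exp(−γ_th/γ̄), which a quick Monte Carlo simulation can confirm:

```python
import math
import random

def outage_rayleigh(snr_mean_db, snr_th_db):
    """Closed-form outage probability for a Rayleigh-faded link with
    no interference: the instantaneous SNR is exponentially
    distributed, so P_out = P(gamma < gamma_th) = 1 - exp(-gamma_th / gamma_mean)."""
    g_mean = 10 ** (snr_mean_db / 10)
    g_th = 10 ** (snr_th_db / 10)
    return 1 - math.exp(-g_th / g_mean)

def outage_monte_carlo(snr_mean_db, snr_th_db, trials=100_000):
    """Empirical check: draw exponential SNR samples and count outages."""
    g_mean = 10 ** (snr_mean_db / 10)
    g_th = 10 ** (snr_th_db / 10)
    fails = sum(random.expovariate(1 / g_mean) < g_th for _ in range(trials))
    return fails / trials

print(outage_rayleigh(10, 0))      # 10 dB mean SNR, 0 dB threshold
print(outage_monte_carlo(10, 0))   # should agree closely
```

The paper's MGF-of-total-SINR approach generalizes this idea to the case where the denominator also contains co-tier interference terms.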
LTE-Advanced radio access enhancements: A survey
Long Term Evolution Advanced (LTE-Advanced) is the next step in LTE evolution and allows operators to improve network performance and service capabilities through smooth deployment of new techniques and technologies. LTE-Advanced uses new features on top of the existing LTE standards to provide a better user experience and higher throughputs. Some of the most significant features introduced in LTE-Advanced are carrier aggregation, enhancements in heterogeneous networks, coordinated multipoint transmission and reception, enhanced multiple-input multiple-output usage, and the deployment of relay nodes in the radio network. These features mainly aim to enhance the radio access part of the cellular network. This survey article presents an overview of the key radio access features and functionalities of the LTE-Advanced radio access network, supported by simulation results. We also provide a detailed review of the literature together with a rich list of references for each of the features. An LTE-Advanced roadmap and the latest updates and trends in LTE markets are also presented.
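As a back-of-envelope illustration of carrier aggregation: LTE-Advanced can aggregate up to five component carriers of up to 20 MHz each, for 100 MHz of total bandwidth. The helper function and the spectral-efficiency figure below are ours, for illustration only, not values taken from the survey:

```python
def aggregated_peak_rate_mbps(cc_bandwidths_mhz, spectral_eff_bps_hz):
    """Rough peak rate from carrier aggregation: total aggregated
    bandwidth times an assumed spectral efficiency.
    LTE-Advanced allows up to 5 component carriers of up to 20 MHz."""
    assert len(cc_bandwidths_mhz) <= 5
    assert all(b <= 20 for b in cc_bandwidths_mhz)
    # MHz * (bit/s)/Hz = Mbit/s
    return sum(cc_bandwidths_mhz) * spectral_eff_bps_hz

# Five 20 MHz carriers at an assumed 15 bit/s/Hz -> 1500 Mbit/s.
print(aggregated_peak_rate_mbps([20, 20, 20, 20, 20], 15.0))
```

Real peak rates depend on MIMO rank, coding, and overhead; this arithmetic only shows how aggregated bandwidth scales the achievable rate.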
Scalable RAN Virtualization in Multi-Tenant LTE-A Heterogeneous Networks (Extended version)
Cellular communications are evolving to facilitate the current and expected increasing needs of Quality of Service (QoS), high data rates, and diversity of offered services. Towards this direction, Radio Access Network (RAN) virtualization aims at providing solutions for mapping virtual network elements onto radio resources of the existing physical network. This paper proposes the Resources nEgotiation for NEtwork Virtualization (RENEV) algorithm, suitable for application in Heterogeneous Networks (HetNets) in Long Term Evolution-Advanced (LTE-A) environments consisting of a macro evolved NodeB (eNB) overlaid with small cells. By exploiting Radio Resource Management (RRM) principles, RENEV achieves slicing and on-demand delivery of resources. Leveraging the multi-tenancy approach, radio resources are transferred in terms of physical radio Resource Blocks (RBs) among multiple heterogeneous base stations, interconnected via the X2 interface. The main target is to deal with traffic variations in the geographical dimension. All signaling design considerations under the current Third Generation Partnership Project (3GPP) LTE-A architecture are also investigated. Analytical studies and simulation experiments are conducted to evaluate RENEV in terms of network throughput as well as its additional signaling overhead. Moreover, we show that RENEV can be applied independently on top of already proposed schemes for RAN virtualization to improve their performance. The results indicate that significant merits are achieved both from the network's and the users' perspective, and that RENEV is a scalable solution for different numbers of small cells.
Comment: 40 pages (including Appendices). Accepted for publication in the IEEE Transactions on Vehicular Technology.
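The resource-transfer idea behind such RAN virtualization schemes can be sketched as follows. This is a hypothetical greedy policy for illustration only, not the RENEV algorithm itself; the function name, data layout, and allocation rule are ours:

```python
def negotiate_rbs(cells, demand):
    """Toy model of inter-cell resource negotiation: a tenant needs
    `demand` physical Resource Blocks (RBs); neighbouring base
    stations (e.g. reachable over X2) contribute from their free
    RBs, most lightly loaded first.

    cells:  dict mapping base-station name -> free RBs
    Returns dict mapping base-station name -> RBs contributed.
    """
    grant = {}
    # Visit cells with the most spare capacity first.
    for name in sorted(cells, key=cells.get, reverse=True):
        take = min(cells[name], demand)
        if take:
            grant[name] = take
            demand -= take
        if demand == 0:
            break
    return grant

# A macro eNB and two small cells serve a tenant needing 60 RBs.
print(negotiate_rbs({"macro": 50, "sc1": 20, "sc2": 5}, 60))
```

The actual RENEV algorithm additionally accounts for 3GPP signaling over X2 and multi-tenant ownership of the transferred RBs; the sketch only conveys the slicing-on-demand intuition.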