Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks
Soaring capacity and coverage demands dictate that future cellular networks
need to soon migrate towards ultra-dense networks. However, network
densification comes with a host of challenges that include compromised energy
efficiency, complex interference management, cumbersome mobility management,
burdensome signaling overheads, and higher backhaul costs. Interestingly, most
of the problems that beleaguer network densification stem from one feature
common to legacy networks: tight coupling between the control and data planes,
regardless of their degree of heterogeneity and cell density. Consequently, in
the wake of 5G, a control and data plane separation architecture (SARC) has
recently been conceived as a promising paradigm with the potential to address
most of the aforementioned challenges. In this article, we review the various
proposals presented in the literature so far to enable SARC.
More specifically, we analyze how, and to what degree, various SARC proposals
address the four main challenges of network densification, namely energy
efficiency, system-level capacity maximization, interference management, and
mobility management. We then focus on two salient features of future cellular
networks that have not yet been adopted at wide scale in legacy networks and
thus remain a hallmark of 5G: coordinated multipoint (CoMP) and
device-to-device (D2D) communications. After providing the necessary background
on CoMP and D2D, we analyze how SARC can act as a major enabler for both in the
context of 5G. This article thus serves as both a tutorial and an up-to-date
survey on SARC, CoMP, and D2D. Most importantly, it provides an extensive
outlook on the challenges and opportunities that lie at the crossroads of these
three mutually entangled emerging technologies.
Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201
Device-to-Device Communication in 5G Cellular Networks
Owing to the unprecedented and continuous growth in the number of connected users and networked devices, next-generation 5G cellular networks are envisaged to support an enormous number of simultaneously connected users and devices, with access to numerous services and applications, by providing highly improved data rates, higher capacity, lower end-to-end latency, and improved spectral efficiency at lower power consumption. D2D communication underlaying cellular networks has been proposed as a key component of 5G: it provides efficient spectrum reuse for improved spectral efficiency, and it exploits the proximity between devices for reduced latency, improved user throughput, and reduced power consumption. Although D2D communication underlaying cellular networks holds great promise, unlike the conventional cellular network architecture it raises new design issues and technical challenges that must be addressed for proper implementation of the technology, including new device-discovery procedures, physical-layer architecture, and radio resource management schemes. This thesis explores the potential of D2D communication as an underlay to 5G cellular networks and focuses on efficient interference management through mode selection, resource allocation, and power control. In this work, a joint admission control, resource allocation, and power control scheme was implemented for D2D communication underlaying 5G cellular networks; the performance of the system was evaluated and compared with similar schemes.
Comparison of vertical handover decision-based techniques in heterogeneous networks
Industry leaders are currently setting out standards for 5G networks, projected for 2020 or even sooner. Future-generation networks will be heterogeneous in nature because no single network type can optimally meet all the rapid changes in customer demands. Heterogeneous networks are typically characterized by a mixed network architecture, base stations of varying transmission power, diverse transmission solutions, and the deployment of multiple radio access technologies. In heterogeneous networks, the process by which a mobile node successfully switches from one radio access technology to another to maintain quality-of-service continuity is termed vertical handover (or vertical handoff). Dropped active calls, or cases where mobile users experience discontinuity of service, can be attributed to delayed handover or an outright unsuccessful handover procedure. This dissertation analyses the performance of a fuzzy-based VHO algorithm in an integrated Wi-Fi, WiMAX, UMTS, and LTE network using the OMNeT++ discrete event simulator. A loose-coupling network architecture is adopted, and the simulation results are analysed and compared for the two major categories of handover decision basis: multiple-criteria and single-criterion methods. The key performance indices from the simulations showed better overall throughput, a lower call-drop rate, and shorter handover duration for the multiple-criteria decision method compared to the single-criterion technique. This work also touches on current trends, challenges in the area of seamless handover, and initiatives for future networks (next-generation heterogeneous networks).
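The single-criterion versus multiple-criteria distinction compared in this dissertation can be sketched in a few lines. The candidate networks, metric values, and weights below are invented for illustration (the dissertation's fuzzy-based scheme is far more elaborate); the sketch only shows why the two decision families can pick different targets:

```python
# Hypothetical sketch of the two vertical-handover decision families:
# a single-criterion rule (signal strength only) versus a weighted
# multiple-criteria score. Values and weights are illustrative only,
# and the raw, unnormalized score is a deliberate simplification.

def single_criterion(networks):
    """Pick the candidate with the strongest received signal (RSS in dBm)."""
    return max(networks, key=lambda n: n["rss"])["name"]

def multi_criteria(networks, weights):
    """Pick the candidate with the best weighted score over several metrics.

    Larger-is-better metrics (rss, bandwidth) add to the score;
    smaller-is-better metrics (delay, cost) subtract from it.
    """
    def score(n):
        return (weights["rss"] * n["rss"]
                + weights["bandwidth"] * n["bandwidth"]
                - weights["delay"] * n["delay"]
                - weights["cost"] * n["cost"])
    return max(networks, key=score)["name"]

candidates = [
    {"name": "Wi-Fi", "rss": -60, "bandwidth": 54,  "delay": 40, "cost": 1},
    {"name": "LTE",   "rss": -75, "bandwidth": 100, "delay": 20, "cost": 5},
    {"name": "WiMAX", "rss": -80, "bandwidth": 70,  "delay": 30, "cost": 3},
]
weights = {"rss": 0.3, "bandwidth": 0.4, "delay": 0.2, "cost": 0.1}

print(single_criterion(candidates))         # strongest signal wins
print(multi_criteria(candidates, weights))  # balances several QoS factors
```

With these invented numbers the single-criterion rule selects the strongest-signal network while the weighted score selects a different one, which is the kind of divergence the compared simulation scenarios expose.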
Will SDN be part of 5G?
For many, this is no longer a valid question and the case is considered
settled with SDN/NFV (Software Defined Networking/Network Function
Virtualization) providing the inevitable innovation enablers solving many
outstanding management issues regarding 5G. However, given the monumental task
of softwarization of radio access network (RAN) while 5G is just around the
corner and some companies have started unveiling their 5G equipment already,
the concern is very realistic that we may only see some point solutions
involving SDN technology instead of a fully SDN-enabled RAN. This survey paper
identifies all important obstacles in the way and looks at the state of the art
of the relevant solutions. This survey is different from the previous surveys
on SDN-based RAN as it focuses on the salient problems and discusses solutions
proposed both within and outside the SDN literature. Our main focus is on
fronthaul, backward compatibility, the supposedly disruptive nature of SDN
deployment, business cases and monetization of SDN-related upgrades, the
latency of general-purpose processors (GPP), and the additional security
vulnerabilities that softwarization brings to the RAN. We have also provided a
summary of the architectural developments in the SDN-based RAN landscape, as
not all work can be covered under the focused issues. This paper provides a
comprehensive survey of the state of the art of SDN-based RAN and clearly
points out the gaps in the technology.
Comment: 33 pages, 10 figures
Towards fostering the role of 5G networks in the field of digital health
A typical healthcare system increasingly relies on patient monitoring, vital-signs sensors, and other medical devices, and healthcare has moved from the traditional central hospital out to scattered patients. Healthcare systems benefit from emerging technology innovations such as fifth-generation (5G) communication infrastructure, the internet of things (IoT), machine learning (ML), and artificial intelligence (AI). Healthcare providers use IoT capabilities to support patients with smart appliances that improve the level of care they receive. These IoT smart-healthcare devices produce massive data volumes, so it is crucial to use very high-speed communication networks such as 5G wireless technology, whose increased communication bandwidth, improved data-transmission efficiency, and reduced communication delay and latency help meet the precise requirements of healthcare big-data applications. The adoption of 5G in smart healthcare networks accommodates a growing number of IoT devices while improving network performance. This paper reviews distinctive aspects of internet of medical things (IoMT) and 5G architectures, both present and future, which can improve patient healthcare in the near future.
Small cells deployment for traffic handling in centralized heterogeneous network
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
As the next phase of mobile technology, 5G comes with a new vision characterized by a connected society, in which everything will be effectively connected, providing a variety of services and diverse business models that require more than just higher data rates and more capacity, targeting new kinds of ultra-reliable and flexible connections. However, next-generation applications, services, and use cases will have extreme variation in requirements, which in turn amplifies the demand on network resources. Therefore, 5G will require a whole new design that takes efficient resource management and utilisation into consideration. An observation made throughout this research is that the demand for more capacity, reduced latency, and increased density is common to many next-generation use cases. This inescapably implies that small cells are an ideal solution for next-generation application requirements, provided that the necessary storage and computing resources are distributed closer to the actual user. In this context, this research proposed an architecture for a centralised heterogeneous network, consisting of macro and small cells with storage and computing resources, all controlled by centralized functionality embedded within a gateway at the edge of the network. Compared to the basic network, the proposed solutions have been shown to enhance overall system performance. This involves extending the system by adding small cells to serve dedicated services to User Equipment (UE) with dual connectivity from a local server, which reduces overall system delay while increasing overall system throughput.
The added centralized mobility management was proven capable of tracing the mobility of UEs within the system's coverage by keeping one connection with the main cell while moving between small cells, improving handover delay by 11% without service interruptions. Finally, the proposed slicing model demonstrated the system's ability to provide different levels of service to users based on different Quality of Service (QoS) requirements, and to differentiate between applications without affecting the performance of other services, benefiting from a more flexible infrastructure than the traditional network. In addition, a 50% improvement in CPU utilization was observed. In such an architecture, capacity can be added exactly where and when it is needed, coverage problems can be directly addressed, and higher throughput, lower latency, and efficient mobility management can be achieved through efficient resource management and distribution, one of the key factors in the deployment of next-generation mobile network systems.
Towards UAV Assisted 5G Public Safety Network
Ensuring ubiquitous mission-critical public safety communications (PSC) to all the first responders in the public safety network is crucial at an emergency site. The first responders heavily rely on mission-critical PSC to save lives, property, and national infrastructure during a natural or human-made emergency. The recent advancements in LTE/LTE-Advanced/5G mobile technologies supported by unmanned aerial vehicles (UAV) have great potential to revolutionize PSC.
However, limited spectrum allocation for LTE-based PSC demands improved channel capacity and spectral efficiency. An additional challenge in designing an LTE-based PSC network is achieving at least 95% coverage of the geographical area and human population at broadband rates. The coverage requirement and efficient spectrum use in the PSC network can be realized through dense deployment of small cells (both terrestrial and aerial). However, dense deployment of small cells in an air-ground heterogeneous network (AG-HetNet) brings several challenges. The main challenges addressed in this research work are the integration of UAVs as both aerial users and aerial base-stations, the mitigation of inter-cell interference, capacity and coverage enhancement, and the optimization of aerial base-station deployment locations.
First, LTE signals were investigated using NS-3 simulation and a software-defined radio experiment to gain knowledge of the quality of service experienced by the user equipment (UE). Using this understanding, a two-tier LTE-Advanced AG-HetNet with macro base-stations and unmanned aerial base-stations (UABS) is designed, while considering time-domain inter-cell interference coordination techniques. We maximize the capacity of this AG-HetNet in the case of damaged PSC infrastructure by jointly optimizing the inter-cell interference parameters and UABS locations using a meta-heuristic genetic algorithm (GA) and a brute-force technique. Finally, considering the latest 3GPP specifications, a more realistic three-tier LTE-Advanced AG-HetNet is proposed, with macro base-stations, pico base-stations, and ground UEs as terrestrial nodes, and UABS and aerial UEs as aerial nodes. Using meta-heuristic techniques such as the GA and an elitist harmony search algorithm based on the GA, critical network elements such as energy efficiency, inter-cell interference parameters, and UABS locations are jointly optimized to maximize the capacity and coverage of the AG-HetNet.
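The joint optimization above covers interference parameters as well as UABS placement; as a much-simplified sketch of the genetic-algorithm part alone, the loop below evolves UABS positions to maximize ground-user coverage. All constants (service-area size, coverage radius, population sizes, mutation rate) are assumptions for illustration, not values from this work:

```python
import math
import random

# Illustrative GA sketch: place a few unmanned aerial base-stations (UABS)
# to maximise the fraction of ground users within an assumed coverage radius.
# Area, radius, and GA parameters are invented for the example.

AREA = 1000.0    # side of the square service area, metres (assumption)
RADIUS = 250.0   # assumed UABS coverage radius, metres
N_UABS = 3
N_USERS = 60

def fitness(placement, users):
    """Fraction of users within RADIUS of at least one UABS."""
    covered = sum(
        1 for ux, uy in users
        if any(math.hypot(ux - x, uy - y) <= RADIUS for x, y in placement)
    )
    return covered / len(users)

def evolve(users, pop_size=30, generations=40, seed=1):
    """Elitist GA over candidate placements (lists of (x, y) UABS positions)."""
    rng = random.Random(seed)
    rand_point = lambda: (rng.uniform(0, AREA), rng.uniform(0, AREA))
    pop = [[rand_point() for _ in range(N_UABS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, users), reverse=True)
        elite = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, N_UABS)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:               # mutation: relocate one UABS
                child[rng.randrange(N_UABS)] = rand_point()
            children.append(child)
        pop = elite + children                   # elitism: best always survive
    return max(pop, key=lambda p: fitness(p, users))

rng = random.Random(0)
users = [(rng.uniform(0, AREA), rng.uniform(0, AREA)) for _ in range(N_USERS)]
best = evolve(users)
print(f"coverage: {fitness(best, users):.0%}")
```

Because the elite half of each generation is carried over unchanged, the best placement's fitness can never decrease across generations, which is the same elitist property the thesis exploits in its GA-based harmony search variant.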
Neural network design for intelligent mobile network optimisation
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
Mobile network users' demands for data services are increasing exponentially, due to two main factors: the evolution of smartphones and their applications, and emerging technologies such as the internet of things and smart cities, which keep pumping more data into the network, even though most of the data routed in current mobile networks is non-live data. This increasing demand makes it necessary for mobile network operators to keep improving their networks, by adding hardware, increasing resources, or a combination of both. Radio resources are strictly limited by spectrum licensing and availability, so efficient spectrum utilization is a major goal for both network operators and developers. Simultaneous and multiple channel access, and adding more cells to the network, are ways to increase the data exchanged between network nodes. The current 4G mobile system is based on Orthogonal Frequency Division Multiple Access (OFDMA), and inter-cell interference degrades link quality at the cell edge; with the introduction of the heterogeneity concept to LTE in Release 10 of the 3GPP specifications, the handover process became even more complex. To mitigate inter-cell interference at the cell edge, coordinated multipoint and carrier aggregation techniques are utilized for dual connectivity. This work focuses on designing and proposing features that improve network performance and sustainability, comprising distributed small cells for data-only transmission, performance evaluation of handover schemes at the cell edge with dual connectivity, and artificial-intelligence technology for load balancing and prediction.
In the proposed model design, the data and controls of the Small eNodeB (SeNodeB) are processed at the network edge using a Mobile Edge Computing (MEC) server, and the SeNodeBs are used to boost the services provided to users; the concept of caching data has also been investigated, with caching units implemented at different network levels. The proposed system and resource management are simulated using the OPNET modeller and evaluated through multiple scenarios with and without full load. The UE is reconfigured to accommodate dual connectivity, with two separate connections for uplink and downlink: the uplink is maintained to the macro cell, while the downlink is dedicated to small cells when content is requested from the cache. The results clearly show that the proposed system can decrease latency, while the total throughput delivered by the network is highly improved when SeNodeBs are deployed; rising throughput increases overall capacity, which leads to better services for existing users or allows more users to join and benefit from the network. Handover improvement is also considered in this work: with the help of two Artificial Intelligence (AI) entities, better handover performance is achieved. A balanced load over the SeNodeBs results in less frequent handovers. The proposed load balancer is based on an artificial-neural-network clustering model with a self-organizing map as a hidden layer; it is trained to forecast network conditions and learns to reduce the number of handovers, especially for UEs at the cell edge, by performing only necessary ones and avoiding handovers to the macro cell in the downlink direction. The examined handovers concern downlinks when routing non-live video stored in the small cells' cache, and a reduction in frequent handovers was achieved when running the balancer.
Staying in the handover orbit, another way to preserve and utilize network resources is to predict handovers before they occur and allocate the required data in the target SeNodeB. The predictor entity in the proposed system architecture combines the features of a Radial Basis Function neural network and a neural-network time-series tool to create and update a prediction list from the system's collected data, learning to predict the next SeNodeB to associate with. The prediction entity is simulated using MATLAB, and the results show that the system was able to deliver up to 92% correct handover predictions, which led to an overall throughput improvement of 75%.
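The thesis realises this predictor with a Radial Basis Function neural network and a time-series tool in MATLAB; as a deliberately simpler stand-in, the sketch below learns first-order transition frequencies from collected handover traces and predicts the most likely next cell. The cell IDs and traces are invented for the example, and this frequency counter is not the thesis's method, only an illustration of the idea of learning the next SeNodeB from history:

```python
from collections import Counter, defaultdict

# Illustrative next-cell predictor: count observed cell-to-cell handovers
# and predict the most frequent successor of the current cell.
# Cell IDs and traces are invented for this sketch.

class NextCellPredictor:
    def __init__(self):
        # transitions[current_cell] -> Counter of observed next cells
        self.transitions = defaultdict(Counter)

    def train(self, trace):
        """Count cell-to-cell handovers in an observed trace of cell IDs."""
        for current, nxt in zip(trace, trace[1:]):
            self.transitions[current][nxt] += 1

    def predict(self, current):
        """Return the most frequently observed successor, or None if unseen."""
        successors = self.transitions.get(current)
        if not successors:
            return None
        return successors.most_common(1)[0][0]

predictor = NextCellPredictor()
for trace in [["S1", "S2", "S3", "S1"],
              ["S1", "S2", "S4"],
              ["S3", "S1", "S2", "S3"]]:
    predictor.train(trace)

print(predictor.predict("S1"))  # "S2": the most frequent successor of S1
print(predictor.predict("S2"))  # "S3": seen twice, versus "S4" once
```

As in the thesis's architecture, the prediction list can then be used to pre-place cached content at the predicted SeNodeB before the handover actually happens.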