
    Millimeter-wave Evolution for 5G Cellular Networks

    Triggered by the explosion of mobile traffic, 5G (5th Generation) cellular networks require an evolution that raises the system rate to 1000 times that of current systems within 10 years. Motivated by this common problem, several studies have sought to integrate mm-wave access into current cellular networks as multi-band heterogeneous networks in order to exploit the ultra-wideband nature of the mm-wave band. The authors of this paper have proposed a comprehensive architecture for cellular networks with mm-wave access, in which mm-wave small cell base stations and a conventional macro base station are connected to a Centralized-RAN (C-RAN) to operate the system effectively: user-plane/control-plane splitting enables power-efficient seamless handover as well as centralized resource control, including dynamic cell structuring that matches the limited coverage of mm-wave access to high-traffic user locations. In this paper, to demonstrate the effectiveness of the proposed 5G cellular networks with mm-wave access, a system-level simulation is conducted using an expected future traffic model, a measurement-based mm-wave propagation model, and a centralized cell association algorithm that exploits the C-RAN architecture. The numerical results show that the proposed network can realize a system rate 1000 times higher than the current network within 10 years, which is not achieved by small cells using the commonly considered 3.5 GHz band. Furthermore, the paper also gives the latest status of mm-wave devices and regulations to show the feasibility of using mm-wave in 5G systems. Comment: 17 pages, 12 figures, accepted for publication in IEICE Transactions on Communications (Mar. 2015).
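    The abstract does not spell out the centralized cell association algorithm, so the following is only a minimal sketch of what a C-RAN-side controller of this kind might do: assign each user to the macro cell or to an in-range mm-wave small cell according to an estimated per-user rate. All cell parameters, names, and the rate-based selection rule are illustrative assumptions, not the paper's method.

        import math
        import random

        # Minimal sketch (not the paper's algorithm): a centralized controller that
        # assigns each user to the macro or a mm-wave small cell by estimated rate.
        # Coverage radii, bandwidths, and spectral efficiencies are illustrative only.

        MACRO = {"name": "macro", "bw_hz": 20e6, "radius_m": 500.0, "se_bps_hz": 2.0}
        SMALL_CELLS = [
            {"name": f"mmw_sc{i}", "bw_hz": 1e9, "radius_m": 50.0, "se_bps_hz": 3.0,
             "pos": (random.uniform(-400, 400), random.uniform(-400, 400))}
            for i in range(4)
        ]

        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])

        def associate(users):
            """Assign each user to the cell with the highest estimated per-user rate,
            approximating equal sharing of a cell's bandwidth among attached users."""
            assignment = {}
            load = {MACRO["name"]: 0, **{sc["name"]: 0 for sc in SMALL_CELLS}}
            for u in users:
                candidates = [MACRO]  # macro assumed to cover every user
                candidates += [sc for sc in SMALL_CELLS
                               if dist(u, sc["pos"]) <= sc["radius_m"]]
                # estimated rate this user would get if it joined the candidate cell
                best = max(candidates,
                           key=lambda c: c["bw_hz"] * c["se_bps_hz"] / (load[c["name"]] + 1))
                assignment[u] = best["name"]
                load[best["name"]] += 1
            return assignment, load

        users = [(random.uniform(-400, 400), random.uniform(-400, 400)) for _ in range(30)]
        assignment, load = associate(users)
        print("users per cell:", load)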

    An Innovative RAN Architecture for Emerging Heterogeneous Networks: The Road to the 5G Era

    The global demand for mobile-broadband data services has experienced phenomenal growth over the last few years, driven by the rapid proliferation of smart devices such as smartphones and tablets. This growth is expected to continue unabated, as mobile data traffic is predicted to grow anywhere from 20 to 50 times over the next 5 years. Exacerbating the problem is that this unprecedented surge in smartphone usage, characterized by frequent short on/off connections and mobility, generates heavy signaling traffic load in the network (signaling storms). This consumes a disproportionate amount of network resources, compromising network throughput and efficiency, and in extreme cases can cause Third-Generation (3G) or 4G (Long-Term Evolution (LTE) and LTE-Advanced (LTE-A)) cellular networks to crash. As the conventional approaches of improving spectral efficiency and/or allocating additional spectrum are fast approaching their theoretical limits, there is a growing consensus that current 3G and 4G (LTE/LTE-A) cellular radio access technologies (RATs) won't be able to meet the anticipated growth in mobile traffic demand. To address these challenges, the wireless industry and standardization bodies have initiated a roadmap for the transition from 4G to 5G cellular technology, with a key objective of increasing capacity by 1000x by 2020. Even though the technology hasn't been invented yet, the hype around 5G networks has begun to bubble. The emerging consensus is that 5G is not a single technology, but rather a synergistic collection of interworking technical innovations and solutions that collectively address the challenge of traffic growth. The core emerging ingredients that are widely considered the key enabling technologies to realize the envisioned 5G era, listed in order of importance, are: 1) heterogeneous networks (HetNets); 2) flexible backhauling; 3) efficient traffic offload techniques; and 4) self-organizing networks (SONs). The anticipated solutions delivered by efficient interworking/integration of these enabling technologies are not simply about throwing more resources and/or spectrum at the challenge. The envisioned solution, rather, requires radically different cellular RAN and mobile core architectures that efficiently and cost-effectively deploy and manage radio resources as well as offload mobile traffic from the overloaded core network. The main objective of this thesis is to address the key techno-economic challenges facing the transition from current Fourth-Generation (4G) cellular technology to the 5G era by proposing a novel high-risk, revolutionary direction for the design and implementation of the envisioned 5G cellular networks. The ultimate goal is to explore the potential and viability of cost-effectively meeting the 1000x capacity challenge while continuing to provide an adequate mobile broadband experience to users.
Specifically, this work proposes and devises a novel PON-based HetNet mobile backhaul RAN architecture that: 1) holistically addresses the key techno-economic hurdles facing the implementation of the envisioned 5G cellular technology, specifically the backhauling and signaling challenges; and 2) enables, for the first time to the best of our knowledge, the support of efficient ground-breaking mobile data and signaling offload techniques, which significantly enhance the performance of both the HetNet-based RAN and LTE-A's core network (Evolved Packet Core (EPC) per the 3GPP standard), ensure that core network equipment is used more productively, and moderate the evolving 5G signaling growth and optimize its impact. To address the backhauling challenge, we propose a cost-effective fiber-based small cell backhaul infrastructure, which leverages existing fibered and powered facilities associated with a PON-based fiber-to-the-node/home (FTTN/FTTH) residential access network. Due to the sharing of existing valuable fiber assets, the proposed PON-based backhaul architecture, in which the small cells are collocated with existing FTTN remote terminals (optical network units (ONUs)), is much more economical than conventional point-to-point (PTP) fiber backhaul designs. A fully distributed ring-based EPON architecture is utilized here as the fiber-based HetNet backhaul. The techno-economic merits of the proposed PON-based FTTx access HetNet RAN architecture versus the traditional 4G LTE-A RAN are thoroughly examined and quantified. Specifically, we quantify the techno-economic merits of the proposed PON-based HetNet backhaul by comparing its performance against a conventional fiber-based PTP backhaul architecture as a benchmark. It is shown that the purposely selected ring-based PON architecture, along with the supporting distributed control plane, enables the proposed PON-based FTTx RAN architecture to support several salient networking features that collectively and significantly enhance the overall performance of both the HetNet-based RAN and the 4G LTE-A core (EPC) compared to the typical fiber-based PTP backhaul architecture, in terms of handoff capability, signaling overhead, overall network throughput and latency, and QoS support. It is also shown that the proposed HetNet-based RAN architecture is not only capable of providing the typical macro-cell offloading gain (RAN gain) but can also provide a ground-breaking EPC offloading gain. The simulation results indicate that the overall capacity of the proposed HetNet scales with the number of deployed small cells, thanks to LTE-A's advanced interference management techniques. For example, if there are 10 deployed outdoor small cells for every macrocell in the network, the overall capacity gain is approximately 10-11x over a macro-only network. To reach the 1000x capacity goal, numerous small cells including 3G, 4G, and WiFi (femtos, picos, metros, relays, remote radio heads, distributed antenna systems) need to be deployed indoors and outdoors, at all possible venues (residences and enterprises).
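As a rough illustration of the quoted 10-11x figure, the sketch below shows the back-of-the-envelope scaling argument: with full spectrum reuse and effective interference management, area capacity grows roughly linearly with the number of deployed cells. The numbers are placeholders, not the thesis's simulation parameters.

    # Back-of-the-envelope sketch (not the thesis's simulation): under ideal
    # spectrum reuse and LTE-A interference management, area capacity scales
    # roughly linearly with the number of deployed cells.  Numbers are illustrative.

    macro_capacity_mbps = 100.0     # assumed capacity of a single macrocell
    small_cells_per_macro = 10
    reuse_efficiency = 1.0          # 1.0 = perfect reuse; <1.0 models residual interference

    hetnet_capacity = macro_capacity_mbps * (1 + small_cells_per_macro * reuse_efficiency)
    gain = hetnet_capacity / macro_capacity_mbps
    print(f"approximate capacity gain over macro-only: {gain:.1f}x")  # ~11x for 10 small cells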

    Impact of propagation model on capacity in small-cell networks

    This work evaluates the impact of different path loss models on the capacity of small cell (SC) networks, including the relationship between cell size and capacity. We compare four urban path loss models: the urban/vehicular and pedestrian test environments from the ITU-R M.1225 Report, and the two-slope Micro Urban Line-of-Sight (LoS) and Non-Line-of-Sight (NLoS) models from the ITU-R M.2135 Report. We show that when using the ITU-R two-slope model, which considers the existence of a break-point in the behaviour of the path loss, for coverage distances R up to the break-point distance divided by the reuse factor, the supported cell throughput, Rb-sup, is much lower than expected when traditional single-slope models are assumed. For R longer than dBP/rcc, Rb-sup increases with R, whereas it is steady or decreases with R when the traditional single-slope propagation models are used. We conclude that the two-slope propagation model yields a significantly lower throughput per square km than a traditional one-slope model if and only if the cell radius is small.
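    For readers unfamiliar with the break-point idea, the following is a minimal, generic dual-slope path loss sketch: a free-space-like exponent before the break-point distance and a steeper exponent beyond it, compared with a single-slope model. The exponents and the 1 m reference loss are illustrative and are not the exact ITU-R M.2135 coefficients.

        import math

        # Generic two-slope (dual-slope) path loss sketch; exponents and the 1 m
        # reference loss are illustrative, not the exact ITU-R M.2135 coefficients.

        C = 3.0e8  # speed of light, m/s

        def breakpoint_distance(h_bs_m, h_ut_m, fc_hz):
            """Classical break-point distance d_BP = 4*h_bs*h_ut*fc/c."""
            return 4.0 * h_bs_m * h_ut_m * fc_hz / C

        def two_slope_pl_db(d_m, fc_hz, h_bs_m=10.0, h_ut_m=1.5, n1=2.0, n2=4.0):
            """Slope n1 before the break point, steeper slope n2 after it."""
            d_bp = breakpoint_distance(h_bs_m, h_ut_m, fc_hz)
            pl_1m = 20 * math.log10(4 * math.pi * fc_hz / C)  # reference loss at 1 m
            if d_m <= d_bp:
                return pl_1m + 10 * n1 * math.log10(d_m)
            return (pl_1m + 10 * n1 * math.log10(d_bp)
                    + 10 * n2 * math.log10(d_m / d_bp))

        def one_slope_pl_db(d_m, fc_hz, n=3.5):
            pl_1m = 20 * math.log10(4 * math.pi * fc_hz / C)
            return pl_1m + 10 * n * math.log10(d_m)

        fc = 3.5e9
        for d in (20, 50, 100, 200, 400):
            print(d, "m :", round(two_slope_pl_db(d, fc), 1), "dB two-slope,",
                  round(one_slope_pl_db(d, fc), 1), "dB one-slope")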

    MM-Wave HetNet in 5G and beyond Cellular Networks Reinforcement Learning Method to improve QoS and Exploiting Path Loss Model

    This paper considers high-density heterogeneous networks (HetNets), which are among the most promising technologies for the fifth generation (5G) cellular network. Since 5G will coexist with previous-generation systems for a long time, those systems will need customization and updates. We examine the merits and drawbacks of legacy and Q-Learning (QL)-based adaptive resource allocation systems, and compare various methods and schemes to evaluate solutions for the future generation. Microwave macro cells based on technologies such as Long-Term Evolution (LTE) eNodeBs (eNBs) and multimedia communications wireless technology (MC) are used to enable extra-high capacity, and are where these schemes are most likely to be deployed. This paper also presents four scenarios for 5G mm-Wave implementation, including proposed system architectures. The QL algorithm allocates optimal power to the small cell base station (SBS) so as to satisfy the minimum required capacity of macro cell user equipment (MUEs) and small cell user equipment (SUEs) and thereby provide quality of service (QoS). The challenges of dense HetNets and the massive backhaul traffic they generate are also discussed, and a cluster-based core HetNet design aimed at reducing backhaul traffic is presented. According to our findings, mm-Wave HetNets and MEC can be useful in a wide range of applications, including ultra-high data rate and low latency communications in 5G and beyond. Using the NYUSIM channel model simulator, we also examine the directional power delay profile with received signal power, path loss, and path loss exponent (PLE) for both LOS and NLOS, with uniform linear array (ULA) 2x2 and 64x16 antenna configurations at the 38 GHz and 73 GHz mmWave bands. The simulation results show the performance of several path loss models in the mmWave and sub-6 GHz bands. The path loss of the close-in (CI) model at mmWave bands is higher than that of the free-space and two-ray path loss models because it accounts for all shadowing and reflection effects between transmitter and receiver. We also compare the proposed method to existing models such as Amiri, Su, Alsobhi, Iqbal, and greedy (non-adaptive), and find that it not only enhances the MUE and SUE minimum capacities and reduces BT complexity, but also establishes a new minimum QoS threshold. Future 6G research directions are also discussed. Our simulation findings further show that, in a hybrid heterogeneous network, decoupling is more visible when the dual-slope path loss model is employed, which enhances system performance in terms of coverage and data rate.
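    The close-in (CI) model mentioned above has a simple standard form, PL(f, d) = FSPL(f, 1 m) + 10*n*log10(d) + X_sigma. The sketch below illustrates it at the two bands named in the abstract (38 GHz and 73 GHz); the path loss exponents and shadowing value are placeholder assumptions, not the paper's NYUSIM configuration.

        import math
        import random

        # Sketch of the close-in (CI) free-space reference distance path loss model
        # used in mmWave studies: PL(f, d) = FSPL(f, 1 m) + 10*n*log10(d) + X_sigma.
        # The PLE (n) and shadowing sigma below are illustrative placeholders.

        def fspl_1m_db(f_ghz):
            """Free-space path loss at the 1 m reference distance."""
            return 32.4 + 20 * math.log10(f_ghz)

        def ci_path_loss_db(f_ghz, d_m, ple, sigma_db=0.0):
            shadowing = random.gauss(0.0, sigma_db) if sigma_db > 0 else 0.0
            return fspl_1m_db(f_ghz) + 10 * ple * math.log10(d_m) + shadowing

        # Example: compare 38 GHz and 73 GHz with LOS-like and NLOS-like exponents.
        for f in (38.0, 73.0):
            for label, n in (("LOS", 2.0), ("NLOS", 3.2)):
                print(f"{f} GHz {label}: {ci_path_loss_db(f, 100.0, n):.1f} dB at 100 m")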

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges, including compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads, and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks: the tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges in network densification, namely energy efficiency, system-level capacity maximization, interference management, and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted in legacy networks at wide scale and thus remain a hallmark of 5G: coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves both as a tutorial and as an up-to-date survey on SARC, CoMP, and D2D. Most importantly, the article provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies. Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    Towards UAV Assisted 5G Public Safety Network

    Ensuring ubiquitous mission-critical public safety communications (PSC) for all first responders in the public safety network is crucial at an emergency site. First responders heavily rely on mission-critical PSC to save lives, property, and national infrastructure during a natural or human-made emergency. The recent advancements in LTE/LTE-Advanced/5G mobile technologies supported by unmanned aerial vehicles (UAVs) have great potential to revolutionize PSC. However, limited spectrum allocation for LTE-based PSC demands improved channel capacity and spectral efficiency. An additional challenge in designing an LTE-based PSC network is achieving at least 95% coverage of the geographical area and human population with broadband rates. The coverage requirement and efficient spectrum use in the PSC network can be realized through the dense deployment of small cells (both terrestrial and aerial). However, there are several challenges with the dense deployment of small cells in an air-ground heterogeneous network (AG-HetNet). The main challenges addressed in this research work are integrating UAVs as both aerial users and aerial base-stations, mitigating inter-cell interference, enhancing capacity and coverage, and optimizing the deployment locations of aerial base-stations. First, LTE signals were investigated using NS-3 simulations and a software-defined radio experiment to gain knowledge of the quality of service experienced by the user equipment (UE). Using this understanding, a two-tier LTE-Advanced AG-HetNet with macro base-stations and unmanned aerial base-stations (UABS) is designed, while considering time-domain inter-cell interference coordination techniques. We maximize the capacity of this AG-HetNet in the case of a damaged PSC infrastructure by jointly optimizing the inter-cell interference parameters and UABS locations using a meta-heuristic genetic algorithm (GA) and the brute-force technique. Finally, considering the latest 3GPP specifications, a more realistic three-tier LTE-Advanced AG-HetNet is proposed with macro base-stations, pico base-stations, and ground UEs as terrestrial nodes, and UABS and aerial UEs as aerial nodes. Using meta-heuristic techniques such as the GA and an elitist harmony search algorithm based on the GA, the critical network elements such as energy efficiency, inter-cell interference parameters, and UABS locations are all jointly optimized to maximize the capacity and coverage of the AG-HetNet.
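    As an illustration of how a meta-heuristic GA can place aerial base-stations, the sketch below evolves UABS positions to maximize the number of covered ground users within an assumed coverage radius. The fitness function, radius, and GA parameters are placeholder assumptions, not the thesis's implementation or its interference-aware objective.

        import math
        import random

        # Illustrative GA sketch (not the thesis's implementation): place a few UABS
        # so that as many ground users as possible fall within an assumed coverage radius.

        AREA = 1000.0          # side of the square deployment area, metres
        N_UABS = 3
        COVER_RADIUS = 200.0
        USERS = [(random.uniform(0, AREA), random.uniform(0, AREA)) for _ in range(100)]

        def fitness(positions):
            """Number of users within COVER_RADIUS of at least one UABS."""
            return sum(1 for u in USERS
                       if any(math.hypot(u[0] - p[0], u[1] - p[1]) <= COVER_RADIUS
                              for p in positions))

        def random_individual():
            return [(random.uniform(0, AREA), random.uniform(0, AREA)) for _ in range(N_UABS)]

        def mutate(ind, step=100.0):
            i = random.randrange(N_UABS)
            x, y = ind[i]
            ind = list(ind)
            ind[i] = (min(max(x + random.uniform(-step, step), 0), AREA),
                      min(max(y + random.uniform(-step, step), 0), AREA))
            return ind

        def crossover(a, b):
            cut = random.randrange(1, N_UABS)
            return a[:cut] + b[cut:]

        population = [random_individual() for _ in range(30)]
        for generation in range(50):
            population.sort(key=fitness, reverse=True)
            elite = population[:10]                 # elitism: keep the best placements
            children = []
            while len(children) < 20:
                child = crossover(random.choice(elite), random.choice(elite))
                if random.random() < 0.3:
                    child = mutate(child)
                children.append(child)
            population = elite + children

        best = max(population, key=fitness)
        print("covered users:", fitness(best), "of", len(USERS))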

    ENERGY EFFICIENCY VIA HETEROGENEOUS NETWORK

    The mobile telecommunication industry is growing at a phenomenal rate. On a daily basis, there is a continuous inflow of mobile users and sophisticated devices into the mobile network. This has triggered a meteoric rise in mobile traffic, forcing network operators to embark on a series of projects to increase the capacity and coverage of mobile networks in line with growing traffic demands. A corollary to this development is the momentous rise in energy bills for mobile operators and the emission of a significant amount of CO2 into the atmosphere. This has become worrisome to the extent that regulatory bodies and environmentalists are calling for the adoption of more “green” operation to curtail these challenges. Green communication is an all-inclusive approach that champions the cause of overall network improvement, reduction in energy consumption, and mitigation of carbon emissions. Heterogeneous networks emerged as a means of fulfilling the vision of green communication. A heterogeneous network is a blend of low-power nodes overlaid on macrocells to offload traffic from the macrocell and enhance the quality of service of cell-edge users. Heterogeneous networks seek to boost the performance of LTE-Advanced beyond its present limit and, at the same time, reduce energy consumption in mobile wireless networks. In this thesis, we explore the potential of heterogeneous networks in enhancing the energy efficiency of mobile wireless networks. The simulation process uses a co-deployment of macrocells and picocells in a cluster (hot spot) scenario and a normal scenario. Finally, we compare the performance of each scenario using the Cell Energy Efficiency and the Area Energy Efficiency as our performance metrics.
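    The two metrics named at the end of the abstract are commonly defined as bits delivered per Joule consumed (cell energy efficiency) and that quantity normalized by the covered area (area energy efficiency). The sketch below uses these common definitions with placeholder numbers; the thesis's exact formulas and parameters may differ.

        # Sketch of the two metrics, under commonly used definitions:
        #   cell energy efficiency = delivered bits / consumed energy        [bit/J]
        #   area energy efficiency = cell energy efficiency / covered area   [bit/J/km^2]

        def cell_energy_efficiency(throughput_bps, power_w):
            """Bits delivered per Joule consumed."""
            return throughput_bps / power_w

        def area_energy_efficiency(throughput_bps, power_w, area_km2):
            return cell_energy_efficiency(throughput_bps, power_w) / area_km2

        # Illustrative macro-only vs macro+pico comparison (numbers are placeholders).
        macro = {"tput": 100e6, "power": 1000.0, "area": 3.14 * 0.5 ** 2}  # ~500 m radius
        pico  = {"tput": 30e6,  "power": 15.0}

        ee_macro_only = cell_energy_efficiency(macro["tput"], macro["power"])
        ee_hetnet = cell_energy_efficiency(macro["tput"] + 4 * pico["tput"],
                                           macro["power"] + 4 * pico["power"])
        aee_macro_only = area_energy_efficiency(macro["tput"], macro["power"], macro["area"])

        print(f"macro only : {ee_macro_only / 1e3:.1f} kbit/J "
              f"({aee_macro_only / 1e3:.1f} kbit/J/km^2)")
        print(f"macro+picos: {ee_hetnet / 1e3:.1f} kbit/J")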

    Impact of considering the ITU-R two slope propagation model in the system capacity trade-off for LTE-A HetNets with small cells

    This work aims at understanding and evaluating the impact of using different path loss models in the optimization trade-off of small cell (SC) networks. In LTE-A, the more realistic the propagation models are, the more efficient radio and network optimization becomes. In this work we compare four urban path loss models: the urban/vehicular and pedestrian test environments from the ITU-R M.1225 Report, as well as the two-slope Micro Urban Line-of-Sight (LoS) and Non-Line-of-Sight (NLoS) models from the ITU-R M.2135 Report. The two-slope model considers the existence of a breakpoint in the behaviour of the path loss and yields a significantly lower throughput per square km than a traditional one-slope model if and only if the cell radius is small (coverage distances, R, up to the breakpoint distance divided by the reuse pattern).