
    Improved Handover Through Dual Connectivity in 5G mmWave Mobile Networks

    The millimeter wave (mmWave) bands offer the possibility of orders of magnitude greater throughput for fifth generation (5G) cellular systems. However, since mmWave signals are highly susceptible to blockage, channel quality on any one mmWave link can be extremely intermittent. This paper implements a novel dual connectivity protocol that enables mobile user equipment (UE) devices to maintain physical layer connections to 4G and 5G cells simultaneously. A novel uplink control signaling system combined with a local coordinator enables rapid path switching in the event of failures on any one link. This paper provides the first comprehensive end-to-end evaluation of handover mechanisms in mmWave cellular systems. The simulation framework includes detailed measurement-based channel models to realistically capture the spatial dynamics of blocking events, as well as the full details of the MAC, RLC and transport protocols. Compared to conventional handover mechanisms, the study reveals significant benefits of the proposed method across several metrics.
    Comment: 16 pages, 13 figures, to appear in the 2017 IEEE JSAC Special Issue on Millimeter Wave Communications for Future Mobile Networks
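
    Below is a minimal sketch, in Python, of the path-switching idea this abstract describes: a local coordinator receives per-link quality reports (delivered over uplink control signaling) and steers traffic to the surviving or stronger link. The class names, the report format, and the 3 dB switching margin are illustrative assumptions, not the paper's actual protocol.

```python
# Sketch: a local coordinator that switches the serving path between a 4G
# and a 5G (mmWave) link based on per-link quality reports. Illustrative
# only; thresholds and message formats are assumptions.
from dataclasses import dataclass

@dataclass
class LinkReport:
    name: str
    sinr_db: float      # reported channel quality
    available: bool     # False if the link is blocked / in outage

class LocalCoordinator:
    """Switches the downlink path when the serving link fails or degrades."""

    def __init__(self, switch_margin_db: float = 3.0):
        self.switch_margin_db = switch_margin_db  # hysteresis to avoid ping-pong
        self.serving = None

    def select_path(self, reports):
        usable = [r for r in reports if r.available]
        if not usable:
            return None  # total outage: no link to serve on
        best = max(usable, key=lambda r: r.sinr_db)
        serving_alive = any(r.name == self.serving and r.available for r in reports)
        if self.serving is None or not serving_alive:
            self.serving = best.name          # serving link failed: switch fast
        else:
            current = next(r for r in reports if r.name == self.serving)
            if best.sinr_db > current.sinr_db + self.switch_margin_db:
                self.serving = best.name      # switch only past the margin
        return self.serving

coord = LocalCoordinator()
print(coord.select_path([LinkReport("lte", 12.0, True),
                         LinkReport("mmwave", 25.0, True)]))   # -> mmwave
print(coord.select_path([LinkReport("lte", 12.0, True),
                         LinkReport("mmwave", -5.0, False)]))  # -> lte (blockage)
```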

    Performance Comparison of Dual Connectivity and Hard Handover for LTE-5G Tight Integration in mmWave Cellular Networks

    MmWave communications are expected to play a major role in the fifth generation of mobile networks. They offer a potential multi-gigabit throughput and an ultra-low radio latency, but at the same time suffer from high isotropic pathloss and a coverage area much smaller than that of LTE macrocells. In order to address these issues, highly directional beamforming and a very high-density deployment of mmWave base stations have been proposed. This Thesis aims to improve the reliability and performance of the 5G network by studying its tight and seamless integration with the current LTE cellular network. In particular, the LTE base stations can provide a coverage layer for 5G mobile terminals, because they operate on microwave frequencies, which are less sensitive to blockage and have a lower pathloss. This document is a copy of the Master's Thesis carried out by Mr. Michele Polese under the supervision of Dr. Marco Mezzavilla and Prof. Michele Zorzi. It proposes an LTE-5G tight integration architecture, based on mobile terminals' dual connectivity to the LTE and 5G radio access networks, and evaluates the new network procedures needed to support it. Moreover, this architecture is implemented in the ns-3 simulator, and a thorough simulation campaign is conducted in order to evaluate its performance with respect to the baseline of handover between LTE and 5G.
    Comment: Master's Thesis carried out by Mr. Michele Polese under the supervision of Dr. Marco Mezzavilla and Prof. Michele Zorzi
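
    The coverage-layer argument can be made concrete with the standard free-space pathloss formula: at the same distance, a 28 GHz mmWave carrier loses roughly 23 dB more than a 2 GHz LTE carrier, before any blockage loss is counted. A short sketch follows; the Friis formula is standard, but the specific carrier frequencies and distances are illustrative, not the thesis' channel model.

```python
# Sketch: free-space pathloss gap between a microwave (LTE) and a mmWave
# (5G) carrier, motivating LTE as a coverage layer. Frequencies/distances
# are illustrative assumptions.
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space pathloss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

for d in (50, 100, 200):
    lte = fspl_db(2.0e9, d)     # LTE microwave carrier
    mmw = fspl_db(28.0e9, d)    # 5G mmWave carrier
    print(f"{d:4d} m: LTE {lte:6.1f} dB, mmWave {mmw:6.1f} dB, "
          f"gap {mmw - lte:4.1f} dB")
```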

    Performance Comparison of Dual Connectivity and Hard Handover for LTE-5G Tight Integration

    Communications at frequencies above 10 GHz (the mmWave band) are expected to play a major role in the next generation of cellular networks (5G), because of the potential multi-gigabit, ultra-low latency performance of this technology. mmWave frequencies, however, suffer from very high isotropic pathloss, which may result in cells with a much smaller coverage area than current LTE macrocells. High directionality techniques will be used to improve signal quality and extend the coverage area, along with a high-density deployment of mmWave base stations (BS). However, when propagation conditions are harsh and it is difficult to provide high-quality coverage with mmWave BSs, it is necessary to rely on previous-generation LTE base stations, which use lower frequencies (900 MHz - 3.5 GHz) that are less sensitive to blockage and experience lower pathloss. In order to provide ultra-reliable services to mobile users, there is a need for network architectures that tightly and seamlessly integrate the LTE and mmWave Radio Access Technologies. In this paper we present two possible alternatives for this integration and show how simulation tools can be used to assess and compare their performance.
    Comment: This paper was accepted for presentation at the ninth EAI SIMUtools 2016 conference, August 22-23, 2016, Prague, Czech Republic
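
    A toy illustration of the comparison the paper sets up: under the same blockage trace, a hard-handover UE pays a full control-plane interruption on every LTE-mmWave transition, while a dual-connected UE only pays a fast path switch. The 50 ms and 1 ms interruption costs, the slot length, and the 20% blockage probability are illustrative assumptions, not the paper's measured values.

```python
# Sketch: total outage time under hard handover vs. dual connectivity for
# the same sequence of mmWave blockage events. All numbers are illustrative.
import random

random.seed(1)
PER_SWITCH_HARD_MS = 50  # assumed interruption per hard LTE<->mmWave handover
PER_SWITCH_DC_MS = 1     # assumed fast path switch under dual connectivity

# 1 = mmWave usable in this slot, 0 = blocked (20% blockage probability)
trace = [1 if random.random() > 0.2 else 0 for _ in range(1000)]

def outage_ms(trace, per_switch_ms):
    """Sum the interruption paid every time the serving link must change."""
    outage, on_mmwave = 0, True
    for usable in trace:
        if on_mmwave != bool(usable):   # link state changed: switch paths
            outage += per_switch_ms
            on_mmwave = bool(usable)
    return outage

print("hard handover:    ", outage_ms(trace, PER_SWITCH_HARD_MS), "ms")
print("dual connectivity:", outage_ms(trace, PER_SWITCH_DC_MS), "ms")
```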

    Optimized Performance Evaluation of LTE Hard Handover Algorithm with Average RSRP Constraint

    The hard handover mechanism is adopted in 3GPP Long Term Evolution (3GPP LTE) in order to reduce the complexity of the LTE network architecture. This mechanism, however, degrades system throughput and increases system delay. This paper proposes a new handover algorithm, known as the LTE Hard Handover Algorithm with Average Reference Signal Received Power (RSRP) Constraint (LHHAARC), in order to minimize the number of handovers and the system delay as well as to maximize the system throughput. The optimized system performance of the LHHAARC is evaluated and compared with three well-known handover algorithms via computer simulation. The simulation results show that the LHHAARC outperforms the three well-known handover algorithms by having fewer average handovers per UE per second and a shorter total system delay, whilst maintaining a higher total system throughput.
    Comment: 16 pages, 9 figures, International Journal of Wireless & Mobile Networks (IJWMN)
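
    As a rough sketch of the kind of decision rule the abstract describes, the condition below hands over only when the target cell beats the serving cell by a hysteresis margin and the target's time-averaged RSRP clears a floor, which suppresses ping-pong handovers. The exact LHHAARC condition is not given in the abstract, so the thresholds and window length here are illustrative assumptions.

```python
# Sketch: a hard-handover decision gated by an average-RSRP constraint.
# Thresholds and the averaging window are illustrative assumptions.
from collections import deque

class AverageRsrpHandover:
    def __init__(self, hysteresis_db=3.0, avg_floor_dbm=-105.0, window=8):
        self.hysteresis_db = hysteresis_db        # serving-vs-target margin
        self.avg_floor_dbm = avg_floor_dbm        # floor on averaged target RSRP
        self.target_history = deque(maxlen=window)  # sliding RSRP window

    def should_handover(self, serving_rsrp_dbm, target_rsrp_dbm):
        self.target_history.append(target_rsrp_dbm)
        avg_target = sum(self.target_history) / len(self.target_history)
        return (target_rsrp_dbm > serving_rsrp_dbm + self.hysteresis_db
                and avg_target > self.avg_floor_dbm)

ho = AverageRsrpHandover()
for serving, target in [(-100, -99), (-100, -96), (-101, -95), (-102, -94)]:
    print(f"serving {serving} dBm, target {target} dBm ->",
          ho.should_handover(serving, target))
```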

    Will TCP work in mmWave 5G Cellular Networks?

    The vast available spectrum in the millimeter wave (mmWave) bands offers the possibility of multi-Gbps data rates for fifth generation (5G) cellular networks. However, mmWave capacity can be highly intermittent due to the vulnerability of mmWave signals to blockages and delays in directional searching. Such highly variable links present unique challenges for adaptive control mechanisms in transport layer protocols and end-to-end applications. This paper considers the fundamental question of whether TCP - the most widely used transport protocol - will work in mmWave cellular systems. The paper provides a comprehensive simulation study of TCP considering various factors such as the congestion control algorithm, including the recently proposed TCP BBR, edge vs. remote servers, handover and multi-connectivity, TCP packet size and 3GPP-stack parameters. We show that the performance of TCP on mmWave links is highly dependent on different combinations of these parameters, and identify the open challenges in this area.
    Comment: 7 pages, 4 figures, 2 tables. To be published in the IEEE Communications Magazine
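
    One source of the difficulty is easy to quantify: the bandwidth-delay product (BDP) of a mmWave link swings by orders of magnitude between line-of-sight and blocked states, so any fixed TCP window either starves the link or bloats the buffer. A quick sketch with assumed (not measured) rates and RTTs:

```python
# Sketch: bandwidth-delay product across link states a mmWave TCP flow may
# traverse. Rates and RTTs are illustrative assumptions.
def bdp_bytes(rate_bps: float, rtt_s: float) -> float:
    """Bytes in flight needed to keep the pipe full: rate * RTT / 8."""
    return rate_bps * rtt_s / 8

scenarios = {
    "mmWave line-of-sight": (2e9, 0.010),    # 2 Gbps, 10 ms RTT
    "mmWave blocked (NLOS)": (50e6, 0.010),  # capacity collapses under blockage
    "LTE fallback": (50e6, 0.030),
}
for name, (rate, rtt) in scenarios.items():
    print(f"{name:22s}: BDP = {bdp_bytes(rate, rtt) / 1e6:7.4f} MB")
```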

    A Machine Learning based Framework for KPI Maximization in Emerging Networks using Mobility Parameters

    The current LTE network is faced with a plethora of Configuration and Optimization Parameters (COPs), both hard and soft, that are adjusted manually to manage the network and provide a better Quality of Experience (QoE). With 5G in view, the number of these COPs is expected to reach 2000 per site, making manual tuning to find the optimal combination of these parameters an impossible feat. Alongside these thousands of COPs is the anticipated network densification in emerging networks, which exacerbates the burden on network operators in managing and optimizing the network. Hence, we propose a machine learning-based framework combined with a heuristic technique to discover the optimal combination of two pertinent mobility COPs, Cell Individual Offset (CIO) and Handover Margin (HOM), that maximizes a specific Key Performance Indicator (KPI) such as the mean Signal to Interference and Noise Ratio (SINR) of all connected users. The first part of the framework leverages machine learning to predict the KPI of interest given several different combinations of CIO and HOM. The resulting predictions are then fed into a Genetic Algorithm (GA), which searches for the combination of the two parameters that yields the maximum mean SINR for all users. The performance of the framework is also evaluated using several machine learning techniques, with the CatBoost algorithm yielding the best prediction performance. Meanwhile, the GA is able to reveal the optimal parameter combination more efficiently, with a convergence time three orders of magnitude faster than that of a brute-force approach.
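
    A condensed sketch of the two-stage framework: fit a regressor that predicts mean SINR from (CIO, HOM), then run a small genetic-algorithm-style search over the learned surrogate. sklearn's GradientBoostingRegressor stands in for CatBoost, and the synthetic training data replaces real network measurements; both are assumptions made to keep the example self-contained and runnable.

```python
# Sketch: (1) surrogate model predicting mean SINR from (CIO, HOM);
# (2) small GA searching the surrogate for the best setting.
# Training data is synthetic; GradientBoostingRegressor replaces CatBoost.
import random
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
random.seed(0)

# Synthetic "measurements": mean SINR peaks near CIO=3 dB, HOM=4 dB.
X = rng.uniform([-10, 0], [10, 10], size=(500, 2))        # (CIO dB, HOM dB)
y = (15 - 0.1 * (X[:, 0] - 3) ** 2 - 0.2 * (X[:, 1] - 4) ** 2
     + rng.normal(0, 0.3, 500))

model = GradientBoostingRegressor().fit(X, y)             # stage 1: KPI predictor

# Stage 2: tiny GA over the surrogate (selection, crossover, mutation).
pop = [(random.uniform(-10, 10), random.uniform(0, 10)) for _ in range(40)]
for _ in range(30):
    scores = model.predict(np.array(pop))
    ranked = [p for _, p in sorted(zip(scores, pop), key=lambda t: -t[0])]
    parents = ranked[:10]                                 # elitist selection
    children = []
    for _ in range(30):
        a, b = random.sample(parents, 2)
        children.append((a[0] + random.gauss(0, 0.5),     # CIO gene from a, mutated
                         b[1] + random.gauss(0, 0.5)))    # HOM gene from b, mutated
    pop = parents + children

best = max(pop, key=lambda p: model.predict(np.array([p]))[0])
print(f"best CIO = {best[0]:.1f} dB, best HOM = {best[1]:.1f} dB")
```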