
    Performance Comparison of Dual Connectivity and Hard Handover for LTE-5G Tight Integration in mmWave Cellular Networks

    MmWave communications are expected to play a major role in the fifth generation (5G) of mobile networks. They offer potential multi-gigabit throughput and ultra-low radio latency, but at the same time suffer from high isotropic pathloss and a coverage area much smaller than that of LTE macrocells. To address these issues, highly directional beamforming and very dense deployments of mmWave base stations have been proposed. This thesis aims to improve the reliability and performance of the 5G network by studying its tight and seamless integration with the current LTE cellular network. In particular, LTE base stations can provide a coverage layer for 5G mobile terminals, because they operate at microwave frequencies, which are less sensitive to blockage and have lower pathloss. This document is a copy of the Master's Thesis carried out by Mr. Michele Polese under the supervision of Dr. Marco Mezzavilla and Prof. Michele Zorzi. It proposes an LTE-5G tight integration architecture, based on mobile terminals' dual connectivity to the LTE and 5G radio access networks, and evaluates the new network procedures needed to support it. Moreover, this architecture is implemented in the ns-3 simulator, and a thorough simulation campaign is conducted to evaluate its performance against the baseline of hard handover between LTE and 5G.
    Comment: Master's Thesis carried out by Mr. Michele Polese under the supervision of Dr. Marco Mezzavilla and Prof. Michele Zorzi
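The trade-off the thesis evaluates can be illustrated with a toy time-average throughput model (invented placeholder rates and timings, not the thesis's ns-3 results): with dual connectivity a blocked mmWave user falls back to the LTE leg immediately, while a hard handover spends part of each blockage on the switch itself.

```python
# Toy model: average throughput under mmWave blockage for dual connectivity
# vs. hard handover. All rates and fractions are hypothetical placeholders.

def avg_throughput(mmwave_rate_mbps, lte_rate_mbps, blockage_frac,
                   switch_outage_frac=0.0):
    """Time-average rate: mmWave when unblocked, LTE during blockage,
    zero during the switching outage (hard handover only)."""
    clear = 1.0 - blockage_frac
    on_lte = max(blockage_frac - switch_outage_frac, 0.0)
    return clear * mmwave_rate_mbps + on_lte * lte_rate_mbps

# Dual connectivity: instant fallback to the LTE leg (no outage).
dc = avg_throughput(1000, 50, blockage_frac=0.2)
# Hard handover: 5% of the time is lost to the handover procedure itself.
hho = avg_throughput(1000, 50, blockage_frac=0.2, switch_outage_frac=0.05)

print(f"dual connectivity: {dc:.1f} Mbps, hard handover: {hho:.1f} Mbps")
```

The gap widens as blockage events become more frequent, which is why the coverage layer matters most in dense mmWave deployments.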

    End-to-End Simulation of 5G mmWave Networks

    Due to its potential for multi-gigabit and low-latency wireless links, millimeter wave (mmWave) technology is expected to play a central role in 5th generation cellular systems. While there has been considerable progress in understanding the mmWave physical layer, innovations will be required at all layers of the protocol stack, in both the access and the core network. Discrete-event network simulation is essential for end-to-end, cross-layer research and development. This paper provides a tutorial on a recently developed full-stack mmWave module integrated into the widely used open-source ns-3 simulator. The module includes a number of detailed statistical channel models as well as the ability to incorporate real measurements or ray-tracing data. The Physical (PHY) and Medium Access Control (MAC) layers are modular and highly customizable, making it easy to integrate algorithms or compare Orthogonal Frequency Division Multiplexing (OFDM) numerologies, for example. The module is interfaced with the core network of the ns-3 Long Term Evolution (LTE) module for full-stack simulations of end-to-end connectivity, and advanced architectural features, such as dual connectivity, are also available. To facilitate understanding of the module, and to verify its correct functioning, we provide several examples that show the performance of the custom mmWave stack as well as custom congestion control algorithms designed specifically for efficient utilization of the mmWave channel.
    Comment: 25 pages, 16 figures, submitted to IEEE Communications Surveys and Tutorials (revised Jan. 2018)
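The numerology comparison mentioned above boils down to simple arithmetic that any simulator must get right. A sketch (plain Python, not the ns-3 mmWave API) of how symbol and slot durations scale with subcarrier spacing, using the usual 14-symbols-per-slot layout and ignoring the cyclic prefix:

```python
# Useful OFDM symbol time is the inverse of the subcarrier spacing; a slot is
# 14 symbols. Cyclic-prefix overhead is deliberately omitted in this sketch.

def numerology(subcarrier_spacing_khz, symbols_per_slot=14):
    symbol_us = 1e3 / subcarrier_spacing_khz   # microseconds
    slot_us = symbols_per_slot * symbol_us
    return symbol_us, slot_us

for scs in (15, 30, 60, 120):
    sym, slot = numerology(scs)
    print(f"{scs:>4} kHz SCS -> symbol {sym:6.2f} us, slot {slot:7.2f} us")
```

Doubling the subcarrier spacing halves the slot duration, which is the main lever wide mmWave carriers use to cut air-interface latency.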

    Achieving Large Multiplexing Gain in Distributed Antenna Systems via Cooperation with pCell Technology

    In this paper we present pCell™ technology, the first commercial-grade wireless system that employs cooperation between distributed transceiver stations to create concurrent data links to multiple users in the same spectrum. First we analyze the per-user signal-to-interference-plus-noise ratio (SINR), employing a geometrical spatial channel model to define volumes in space of coherent signal around user antennas (or personal cells, i.e., pCells). Then we describe the system architecture, consisting of a general-purpose-processor (GPP) based software-defined radio (SDR) wireless platform implementing a real-time LTE protocol stack to communicate with off-the-shelf LTE devices. Finally we present experimental results demonstrating up to 16 concurrent spatial channels for an aggregate average spectral efficiency of 59.3 bps/Hz in the downlink and 27.5 bps/Hz in the uplink, providing data rates of 200 Mbps downlink and 25 Mbps uplink in 5 MHz of TDD spectrum.
    Comment: IEEE Asilomar Conference on Signals, Systems, and Computers, Nov. 8-11, 2015, Pacific Grove, CA, USA
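The reported figures can be sanity-checked with back-of-the-envelope arithmetic: spectral efficiency in bps/Hz times bandwidth in MHz gives capacity in Mbps, and the 200/25 Mbps split versus the full-duty-cycle capacities reflects TDD time-sharing between downlink and uplink.

```python
# Back-of-the-envelope check of the abstract's numbers.

def throughput_mbps(spectral_eff_bps_hz, bandwidth_mhz):
    return spectral_eff_bps_hz * bandwidth_mhz  # bps/Hz x MHz = Mbps

dl_se, ul_se, bw_mhz, channels = 59.3, 27.5, 5, 16

print(f"DL capacity at full duty cycle: {throughput_mbps(dl_se, bw_mhz):.1f} Mbps")
print(f"UL capacity at full duty cycle: {throughput_mbps(ul_se, bw_mhz):.1f} Mbps")
print(f"Per-channel DL spectral efficiency: {dl_se / channels:.2f} bps/Hz")
```

The per-channel figure (about 3.7 bps/Hz) is what a single well-served LTE link achieves; the multiplexing gain comes from running 16 such links in the same 5 MHz.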

    Measurements campaign to generate datasets to train algorithms and AI/ML models for 5G mobile communication systems

    As traffic demand in mobile communication systems is constantly increasing, the new 5G NR standards need to adapt their Self-Organizing Network functions. To cope with this complexity, Artificial Intelligence (AI) and Machine Learning (ML) based techniques are being used. These techniques need appropriate datasets based on real-life data to be trained and evaluated, because wireless networks are hard to simulate through mathematical models. This project aims to provide datasets useful for testing AI and ML algorithms, based on data from the Campus Nord mobile network. A second purpose of the project is to determine whether connectivity-sharing techniques could be beneficial for improving network performance for all users. All data was gathered during several measurement campaigns using a terminal with the Android operating system. Two Mobile Network Tools were used: Qualipoc Android for the data measurement, and Romes Replay for the later visualization and export of the required parameters. All data processing, plotting, and tabulation were done in Matlab. By the end of the project, two datasets, one for LTE and another for 5G NR, had been created. These datasets, along with several coverage and base-station identification maps, provide the information needed to characterize the mobile network in Campus Nord. In addition, connectivity-sharing techniques were found to be viable: the sharing user does not lose service quality, and the service received by the second user is limited only by the capabilities of its connection with the first.
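The sharing result above is a bottleneck effect, sketched below with hypothetical rates (not the project's measurements): the tethered user's rate is capped by the weaker of the sharer's cellular link and the local sharing link.

```python
# Minimal illustration of the connectivity-sharing bottleneck. Rates in Mbps
# are invented placeholders.

def shared_rate_mbps(sharer_cell_rate, sharing_link_rate):
    # The second user traverses both hops, so the slower one limits it.
    return min(sharer_cell_rate, sharing_link_rate)

print(shared_rate_mbps(80, 150))  # cellular link is the bottleneck
print(shared_rate_mbps(80, 40))   # local sharing link is the bottleneck
```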

    Prediction-based techniques for the optimization of mobile networks

    International Mention in the doctoral degree. Mobile cellular networks are complex systems whose behavior is characterized by the superposition of several random phenomena, most of which are related to human activities such as mobility, communications, and network usage. However, when observed in their totality, the many individual components merge into more deterministic patterns, and trends start to become identifiable and predictable. In this thesis we analyze a recent branch of network optimization commonly referred to as anticipatory networking, which entails the combination of prediction solutions and network optimization schemes. The main intuition behind anticipatory networking is that knowing in advance what is going on in the network can help identify potentially severe problems and mitigate their impact by applying solutions while they are still in their initial states. Conversely, network forecasts might also indicate a future improvement in overall network conditions (i.e., load reduction or better signal quality reported by users). In such a case, resources can be assigned more sparingly, requiring users to rely on buffered information while waiting for the better conditions under which it will be more convenient to grant more resources. At the beginning of this thesis we survey the current anticipatory networking panorama and the many prediction and optimization solutions proposed so far. In the main body of the work, we propose our novel solutions to the problem, along with the tools and methodologies we designed to evaluate them and to perform a real-world evaluation of our schemes. By the end of this work it will be clear not only that anticipatory networking is a very promising theoretical framework, but also that it is feasible and can deliver substantial benefit to current and next-generation mobile networks.
    In fact, with both our theoretical and practical results we show evidence that more than one third of the resources can be saved, and that even larger gains can be achieved for data-rate enhancements.
    Official Doctoral Program in Telematics Engineering. Committee: President: Albert Banchs Roca; President: Pablo Serrano Yañez-Mingot; Secretary: Jorge Ortín Gracia; Member: Guevara Noubir
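The kind of saving the thesis reports can be illustrated with a toy comparison (synthetic demand trace, not the thesis's schemes): static provisioning must reserve for the peak in every slot, while an ideal predictor reserves exactly the forecast demand.

```python
# Toy anticipatory-allocation comparison with an invented per-slot demand
# trace, measured in resource blocks (RBs).

demand = [4, 6, 3, 9, 5, 2, 7, 4]

static_alloc = max(demand) * len(demand)    # always reserve for the peak
predictive_alloc = sum(demand)              # reserve exactly what is forecast

saving = 1 - predictive_alloc / static_alloc
print(f"static: {static_alloc} RBs, predictive: {predictive_alloc} RBs, "
      f"saving: {saving:.0%}")
```

Real predictors are imperfect, so practical savings sit between these extremes; the thesis's "more than one third" figure falls in that range.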

    Scheduling for Multi-Camera Surveillance in LTE Networks

    Wireless surveillance in cellular networks has become increasingly important, and commercial LTE surveillance cameras are now available. Nevertheless, most scheduling algorithms in the literature are throughput-, fairness-, or profit-based approaches, which are not suitable for wireless surveillance. In this paper, therefore, we explore the resource allocation problem for a multi-camera surveillance system in 3GPP Long Term Evolution (LTE) uplink (UL) networks. We minimize the number of allocated resource blocks (RBs) while guaranteeing the coverage requirement for surveillance systems in LTE UL networks. Specifically, we formulate the Camera Set Resource Allocation Problem (CSRAP) and prove that the problem is NP-hard. We then propose an Integer Linear Programming formulation for general cases to find the optimal solution. Moreover, we present a baseline algorithm and devise an approximation algorithm to solve the problem. Simulation results based on a real surveillance map and synthetic datasets show that the number of allocated RBs can be effectively reduced compared to the existing approach for LTE networks.
    Comment: 9 pages, 10 figures
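The flavor of the optimization can be shown with a tiny brute-force version (the paper's CSRAP/ILP is richer; the camera coverage sets and RB costs below are invented for illustration): choose a subset of cameras that covers all surveillance targets while minimizing total RBs consumed.

```python
# Toy minimum-RB coverage: exhaustive search over camera subsets. NP-hardness
# is why the paper resorts to ILP and approximation for realistic sizes.
from itertools import combinations

cameras = {                     # name -> (targets covered, RBs required)
    "c1": ({"t1", "t2"}, 3),
    "c2": ({"t2", "t3"}, 2),
    "c3": ({"t1", "t3", "t4"}, 4),
    "c4": ({"t4"}, 1),
}
targets = {"t1", "t2", "t3", "t4"}

def min_rb_cover(cameras, targets):
    best = None
    names = list(cameras)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            covered = set().union(*(cameras[c][0] for c in subset))
            if covered >= targets:
                cost = sum(cameras[c][1] for c in subset)
                if best is None or cost < best[0]:
                    best = (cost, subset)
    return best

print(min_rb_cover(cameras, targets))
```

Exhaustive search is exponential in the number of cameras, which mirrors the NP-hardness result and motivates the paper's approximation algorithm.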

    The quality of service of the deployed LTE technology by mobile network operators in Abuja-Nigeria

    In this study, the real-world performance of the long-term evolution (LTE) cellular networks of four Nigerian mobile network operators (MNOs), namely MTN, GLO, Airtel, and 9Mobile, was analyzed and compared. The Nigerian MNOs use 5 MHz, 10 MHz, and 20 MHz channel bandwidths, following the Third Generation Partnership Project (3GPP) recommendations. The analysis shows gaps between uplink (UL) and downlink (DL) throughput under mobility, and that other LTE quality-of-service (QoS) key performance indicators (KPIs), namely connection drop rate, connection failure rate, peak physical downlink throughput, minimum radio link control (RLC) downlink throughput threshold, and latency, are not strictly met, possibly due to a lack of regulatory oversight and enforcement. The comparative study showed that MTN provides the best QoS. The introduction of novel LTE QoS metrics, referred to here as national independent wireless broadband quality reporting (NIWBQR), is the main contribution of this study. The goal of this study is to show the quality of the network as it affects the user's experience. Notably, none of the MNOs adhere to the 3GPP-specified user-plane latency of 30 ms and control-plane latency of 100 ms, which makes video streaming and low-latency communication nearly impossible.
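The latency finding amounts to a threshold check against the two 3GPP targets quoted in the study. A minimal sketch with hypothetical measured values (not the paper's data):

```python
# Check measured latencies against the 3GPP targets cited in the study:
# 30 ms user plane, 100 ms control plane.

TARGETS_MS = {"user_plane": 30, "control_plane": 100}

def compliant(measured_ms):
    return {plane: measured_ms[plane] <= limit
            for plane, limit in TARGETS_MS.items()}

# Hypothetical operator exceeding both targets, consistent with the study's
# observation that none of the four MNOs met them.
print(compliant({"user_plane": 48, "control_plane": 140}))
```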