
    Simulating LTE/LTE-Advanced Networks with SimuLTE

    In this work we present SimuLTE, an OMNeT++-based simulator for LTE and LTE-Advanced networks. Following well-established OMNeT++ programming practices, SimuLTE exhibits a fully modular structure, which makes it easy to extend, verify, and integrate. Moreover, it inherits all the benefits of a widely used and versatile simulation framework such as OMNeT++, i.e., experiment support and seamless integration with other OMNeT++ network modules, such as INET. This allows SimuLTE users to build mixed scenarios where LTE is only part of a wider network. This paper describes the architecture of SimuLTE, with particular emphasis on the modeling choices at the MAC layer, where resource scheduling is located. Furthermore, we describe some of the verification and validation efforts and present an example of the performance analysis that can be carried out with SimuLTE.

    Scheduling in 5G networks: Developing a 5G cell capacity simulator

    The fifth generation of mobile communications (5G) is becoming a reality thanks to the new 3GPP (3rd Generation Partnership Project) technology designed to meet a wide range of requirements. On the one hand, it must support high bit rates and ultra-low-latency services; on the other, it must connect a massive number of devices with loose bandwidth and delay requirements. Such diversity of service requirements demands a high degree of flexibility in radio interface design. As LTE (Long Term Evolution) was originally designed with the evolution of Mobile Broadband (MBB) services in mind, it does not provide enough flexibility to optimally multiplex the different types of services envisioned for 5G: no single radio interface configuration can fit all the different service requirements. As a consequence, 5G networks are being designed to support different radio interface configurations and mechanisms to multiplex these differently configured services in the same available spectrum. This concept is known as Network Slicing, a key 5G feature that must be supported end to end in the network (Radio Access, Transport, and Core Network). In this way, 5G Radio Access Networks (RANs) add the problem of allocating resources to different services on top of the traditional problem of allocating resources to users. In this context, since the standard leaves the scheduling of both users and services to vendor implementation, an extensive field of research is open. Different simulation tools have been developed for research purposes over recent years; however, few of them are free and easy to use, and none of the available ones supports Network Slicing at the RAN level. This work therefore presents a new simulator as its main contribution. Py5cheSim is a simple, flexible, and open-source simulator based on Python, specially oriented to testing different scheduling algorithms for the different types of 5G services through a simple implementation of the RAN Slicing feature. Its architecture allows new scheduling algorithms to be developed and integrated in an easy and straightforward way. Furthermore, the use of Python provides enough versatility to apply even Machine Learning tools to the development of new scheduling algorithms. The present work introduces the main 5G RAN design concepts that were taken as a baseline to develop the simulation tool. It also describes design and implementation choices, followed by the executed validation tests and their main results. Additionally, it presents a few use-case examples to show the developed tool's potential, providing a primary analysis of traditional scheduling algorithms for the new types of services envisioned by the technology. Finally, it concludes on the developed tool's contribution and the example results, along with possible research lines and improvements for future versions.
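    As an illustration of the kind of pluggable scheduler such a simulator invites, here is a minimal Python sketch of two classic allocation policies, round robin and proportional fair. The class names, the PRB granularity, and the rate inputs are illustrative assumptions, not Py5cheSim's actual API.

    ```python
    # Hypothetical sketch of pluggable per-cell schedulers; names are illustrative.

    class RoundRobinScheduler:
        """Hand out PRBs (physical resource blocks) to users in turn."""
        def __init__(self):
            self.next = 0

        def allocate(self, users, n_prbs):
            alloc = {u: 0 for u in users}
            for _ in range(n_prbs):
                alloc[users[self.next % len(users)]] += 1
                self.next += 1
            return alloc

    class ProportionalFairScheduler:
        """Weight each user's share by instantaneous rate over average rate."""
        def allocate(self, users, n_prbs, inst_rate, avg_rate):
            weights = {u: inst_rate[u] / max(avg_rate[u], 1e-9) for u in users}
            total = sum(weights.values())
            return {u: round(n_prbs * weights[u] / total) for u in users}
    ```

    Swapping one policy for the other only changes which `allocate` implementation the cell calls, which is the kind of extension point a slicing-oriented simulator needs.
    
    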

    Cellular Planning and Optimization for 4G and 5G Mobile Networks

    Cellular planning and optimization of heterogeneous mobile networks has been a topic of study for several decades, with a diversity of resources, such as analytical formulations and simulation software, being employed to characterize different scenarios with the aim of improving system capacity. Furthermore, the world has now witnessed the birth of the first commercial 5G New Radio networks, with a technology developed to deliver much higher data rates with comparably lower latency. In the challenging scenarios of 4G and beyond, Carrier Aggregation has been proposed as a resource to enhance coverage and capacity. Another key element for the success of 4G and 5G networks is the deployment of Small Cells to offload Macrocells. In this context, this MSc dissertation explores Small Cell deployment via an analytical formulation, where metrics such as the Carrier-to-Interference-plus-Noise Ratio and the physical and supported throughput are computed to evaluate the system's capacity under different configurations of interferer positioning, in a scenario where Spectrum Sharing is explored as a solution to the scarcity of spectrum. The results of this analysis are also used to propose a cost/revenue optimization, where deployment costs are estimated and evaluated, as is the revenue, considering the supported throughput obtained for the three frequency bands studied, i.e., 2.6 GHz, 3.5 GHz, and 5.62 GHz. Results show that, for a project lifetime of 5 years and traffic prices on the order of €5 per GB, the system is profitable for all three frequency bands for distances up to 1335 m. Carrier Aggregation is also investigated in a scenario where the LTE-Sim packet-level simulator is used to evaluate aggregation over two frequency bands, i.e., 2.6 GHz and 800 MHz, with packet scheduling performed via an integrated common radio resource management entity used to compute Packet Loss Ratio, delay, and goodput for different numbers of users and cell radii. The results of this analysis have been compared to a scenario without Carrier Aggregation, and it has been demonstrated that CA is able to enhance capacity by reducing Packet Loss Ratio and delay, which in turn increases the achievable goodput.
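    The analytical evaluation described above can be sketched in a few lines: received power under a log-distance path-loss model, CINR against a set of interferers, and a Shannon-bound throughput. The path-loss exponent, transmit power, noise floor, and bandwidth below are invented assumptions, not the dissertation's parameters.

    ```python
    import math

    def rx_power_dbm(tx_dbm, d_m, f_ghz, n=3.5):
        """Received power with log-distance path loss (1 m reference).
        The exponent n = 3.5 is an assumed urban value."""
        pl0 = 20 * math.log10(f_ghz * 1e9) + 20 * math.log10(4 * math.pi / 3e8)
        return tx_dbm - (pl0 + 10 * n * math.log10(max(d_m, 1.0)))

    def cinr_db(sig_dbm, interf_dbms, noise_dbm=-95.0):
        """Carrier-to-Interference-plus-Noise Ratio in dB."""
        lin = lambda dbm: 10 ** (dbm / 10)
        denom = sum(lin(i) for i in interf_dbms) + lin(noise_dbm)
        return 10 * math.log10(lin(sig_dbm) / denom)

    def shannon_throughput_mbps(cinr, bw_mhz=20.0):
        """Upper bound on throughput from the Shannon capacity formula."""
        return bw_mhz * math.log2(1 + 10 ** (cinr / 10))
    ```

    Sweeping the interferer distances and the carrier frequency in such a model is enough to reproduce the shape, if not the exact numbers, of a capacity-versus-distance study.
    
    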

    4G/5G cellular networks metrology and management

    The proliferation of sophisticated applications and services comes with diverse performance requirements as well as exponential traffic growth on both the uplink and the downlink. Cellular networks such as 4G and 5G are evolving to support this diverse and huge amount of data. This thesis targets advanced cellular network supervision and management techniques, taking the traffic explosion and its diversity as two of the main challenges in these networks. The first contribution tackles the integration of intelligence into cellular networks through the estimation of users' instantaneous uplink throughput at small time granularities. A real-time 4G testbed is deployed for this purpose, providing an exhaustive benchmark of eNB metrics, and accurate estimations are achieved. The second contribution supports real-time 5G slicing of radio resources in a multi-cell system. Two exact optimization models are proposed for this; due to their long execution times, heuristics are also developed and evaluated against the optimal models. The results are promising, with both heuristics strongly supporting real-time RAN slicing.
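    The thesis's exact optimization models and heuristics are not reproduced here; purely as a sketch of the kind of fast heuristic that can stand in for a slow exact model, here is a largest-remainder proportional split of radio resources across slices. All names and numbers are hypothetical.

    ```python
    # Toy heuristic: split `capacity` PRBs across slices in proportion to demand,
    # then give the blocks lost to rounding to the slices with the largest deficit.

    def greedy_slice_allocation(demands, capacity):
        total = sum(demands.values())
        alloc = {s: int(capacity * d / total) for s, d in demands.items()}
        leftover = capacity - sum(alloc.values())
        # Largest-remainder rule for the leftover blocks.
        by_deficit = sorted(demands,
                            key=lambda s: capacity * demands[s] / total - alloc[s],
                            reverse=True)
        for s in by_deficit[:leftover]:
            alloc[s] += 1
        return alloc
    ```

    Such a heuristic runs in microseconds per cell, which is what makes per-TTI slicing decisions feasible where an exact solver is not.
    
    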

    Towards Scalable Design of Future Wireless Networks

    Wireless operators face an ever-growing challenge to meet the throughput and processing requirements of billions of devices that are getting connected. In current wireless networks, such as LTE and WiFi, these requirements are addressed by provisioning more resources: spectrum, transmitters, and baseband processors. However, this simple add-on approach to scale system performance is expensive and often results in resource underutilization. What are, then, the ways to efficiently scale the throughput and operational efficiency of these wireless networks? To answer this question, this thesis explores several potential designs: utilizing unlicensed spectrum to augment the bandwidth of a licensed network; coordinating transmitters to increase system throughput; and finally, centralizing wireless processing to reduce computing costs. First, we propose a solution that allows LTE, a licensed wireless standard, to co-exist with WiFi in the unlicensed spectrum. The proposed solution bridges the incompatibility between the fixed access of LTE, and the random access of WiFi, through channel reservation. It achieves a fair LTE-WiFi co-existence despite the transmission gaps and unequal frame durations. Second, we consider a system where different MIMO transmitters coordinate to transmit data of multiple users. We present an adaptive design of the channel feedback protocol that mitigates interference resulting from the imperfect channel information. Finally, we consider a Cloud-RAN architecture where a datacenter or a cloud resource processes wireless frames. We introduce a tree-based design for real-time transport of baseband samples and provide its end-to-end schedulability and capacity analysis. We also present a processing framework that combines real-time scheduling with fine-grained parallelism. The framework reduces processing times by migrating parallelizable tasks to idle compute resources, and thus, decreases the processing deadline-misses at no additional cost. 
    We implement and evaluate the above solutions using software-radio platforms and off-the-shelf radios, and confirm their applicability in real-world settings. PhD thesis, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/133358/1/gkchai_1.pd
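    The deadline-miss reduction described above can be illustrated with a toy model, under the assumption that tasks are triples of (deadline, duration, parallelizable): schedule earliest-deadline-first and spread parallelizable tasks across idle workers. This is a sketch of the concept, not the thesis's scheduler.

    ```python
    # Toy EDF scheduler with migration of parallelizable tasks to idle workers.

    def edf_with_migration(tasks, n_workers):
        """tasks: list of (deadline, duration, parallelizable) tuples.
        Returns how many tasks finish past their deadline."""
        workers = [0.0] * n_workers          # next-free time per worker
        misses = 0
        for deadline, duration, parallel in sorted(tasks):  # EDF order
            if parallel:
                # split the work across every worker that is idle right now
                earliest = min(workers)
                idle = [i for i, t in enumerate(workers) if t <= earliest]
                finish = earliest + duration / len(idle)
                for i in idle:
                    workers[i] = finish
            else:
                i = min(range(n_workers), key=lambda k: workers[k])
                workers[i] += duration
                finish = workers[i]
            if finish > deadline:
                misses += 1
        return misses
    ```

    In this toy model a parallelizable frame-processing task can meet a deadline that the same task, pinned to one worker, would miss, which is the effect the framework exploits.
    
    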

    Traffic Steering in Radio Level Integration of LTE and Wi-Fi Networks

    A smartphone generates approximately 1,614 MB of data per month, roughly 48 times the data generated by a typical basic-feature cell phone. Cisco forecasts that mobile data traffic will continue to grow, reaching 49 Exabytes per month by 2021. However, telecommunication service providers/operators face many challenges in improving cellular network capacity to match these ever-increasing data demands, owing to low, almost flat Average Revenue Per User (ARPU) and low Return on Investment (RoI). The spectrum resource crunch and the licensing requirements for operation in cellular bands further complicate supporting and managing the network. One of the most vital ways to deal with these challenges is to leverage the benefits of integrating cellular networks with the unlicensed operation of Wi-Fi networks. A closer level of cellular and Wi-Fi coupling/interworking improves Quality of Service (QoS) through unified connection management of user devices (UEs). It also offloads a significant portion of user traffic from the cellular Base Station (BS) to the Wi-Fi Access Point (AP). In this thesis, we consider the cellular network to be Long Term Evolution (LTE), popularly known as 4G-LTE, interworking with Wi-Fi. The Third Generation Partnership Project (3GPP) defined various LTE and Wi-Fi interworking architectures from Rel-8 to Rel-11. Because of the limitations of these legacy LTE Wi-Fi interworking solutions, 3GPP proposed Radio Level Integration (RLI) architectures to enhance flow mobility and react quickly to channel dynamics. The RLI node encompasses a link-level connection between LTE and Wi-Fi. The first problem addressed concerns (i) small-cell deployments, (ii) meeting the Guaranteed Bit Rate (GBR) requirements of users, including those experiencing poor Signal-to-Interference-plus-Noise Ratio (SINR), and (iii) dynamically steering flows across the LTE and Wi-Fi links to maximize system throughput. The second important problem addressed is uplink traffic steering.
    To enable efficient uplink traffic steering in the LWIP system, this thesis proposes a Network Coordination Function (NCF). The NCF is realized at the LWIP node and encompasses four different uplink traffic steering algorithms for efficient utilization of Wi-Fi resources in the LWIP system. It allows the network to take intelligent steering decisions rather than leaving individual UEs to decide whether to steer uplink traffic onto the LTE or the Wi-Fi link. The NCF algorithms leverage the availability of LTE as an anchor to improve the channel utilization of Wi-Fi. The third important problem is enabling packet-level steering in LWIP. When the data rates of the LTE and Wi-Fi links are incomparable, steering packets across the links creates problems for TCP traffic: packets received Out-of-Order (OOO) at the TCP receiver, due to the differing delays on each link, trigger the generation of DUPlicate ACKnowledgements (DUP-ACKs). These unnecessary DUP-ACKs adversely affect the growth of the TCP congestion window and thereby lead to poor TCP performance. This thesis addresses the problem with a virtual congestion control mechanism (VIrtual congeStion control wIth Boost acknowLedgEment, VISIBLE). The proposed mechanism not only improves the throughput of a flow by reducing the number of unnecessary DUP-ACKs delivered to the TCP sender, but also sends Boost ACKs to rapidly grow the congestion window and reap the aggregation benefits of heterogeneous links. The fourth problem considered is the placement of LWIP nodes, whose dense deployment can be realized in a colocated or non-colocated fashion. The placement of LWIP nodes is done with the following objectives: (i) minimizing the number of LWIP nodes deployed without creating coverage holes, (ii) maximizing SINR in every sub-region of a building, and (iii) minimizing the energy spent by UEs and LWIP nodes. Finally, prototypes of the RLI architectures are presented (i.e., LWIP and LWA testbeds). The prototypes are developed using the open-source LTE platform OpenAirInterface (OAI) and commercial off-the-shelf hardware components. The developed LWIP prototype works with a commercial UE (Nexus 5). The LWA prototype requires modifications to the UE protocol stack and is therefore realized using OAI-UE. The developed prototypes are coupled with a legacy multipath protocol, MPTCP, to investigate the coupling benefits.
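    The DUP-ACK problem described in this entry can be illustrated with a toy filter: suppress a bounded number of duplicate ACKs on the assumption that they stem from cross-link reordering rather than loss. This is not the actual VISIBLE mechanism, whose Boost-ACK logic is more involved; the reorder window is an invented parameter.

    ```python
    # Toy virtual-congestion-control idea: hide reordering-induced DUP-ACKs
    # from the TCP sender, but let them through once a real loss is likely.

    def filter_dupacks(acks, reorder_window=3):
        """acks: stream of cumulative ACK numbers seen at the steering node.
        Suppress duplicates until one repeats more than `reorder_window` times."""
        passed, last, dup_count = [], None, 0
        for ack in acks:
            if ack == last:
                dup_count += 1
                if dup_count <= reorder_window:
                    continue          # assume reordering, not loss: suppress
            else:
                dup_count = 0
            passed.append(ack)
            last = ack
        return passed
    ```

    Holding back the first few duplicates keeps the sender's congestion window from shrinking on harmless cross-link reordering, while a long run of duplicates (a likely loss) still reaches the sender and triggers fast retransmit.
    
    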

    Software-based LTE Base Station Evaluations

    Software Defined Radio (SDR) enables the execution of many hardware-based operations through programmable FPGA-based SDR systems. With the open-source OpenUMTS software and an ETTUS SDR system, we are able to run a UMTS base station on a portable and low-cost device. In the early 90s, the GSM standard was introduced in Europe with an important carrier-side investment in order to deliver a digital service to the home user. Back then, in contrast to today's terminals, there was no bundling of functionality: calling, camera, video, GPS, Internet access, etc. were distributed across different devices. Currently, the integration of these services into a single mobile device, along with the diffusion of the technology to the overall population, has led to greater requirements on quality of service, bandwidth, and coverage, forcing operators to optimize resource usage with a long-term vision. One object of study within this long-haul technological evolution is adapting the infrastructure to accommodate the next generation of standards, which promise improved performance and efficiency. However, current operator infrastructure may be neither designed nor prepared for this purpose, suggesting it must be replaced again. From this need arise the concepts of SDR (Software Defined Radio) and C-RAN (Cloud Radio Access Network): the former is a versatile solution on the hardware side, while the latter is a promising aid for network dimensioning and resource optimization. This project examines and evaluates the use of the OpenAirInterface platform, a software solution based on SDR, to implement an LTE base station, with the possibility of splitting the different entities of the architecture (C-RAN), using low-cost portable devices for this purpose.

    Towards a programmable and virtualized mobile radio access network architecture

    Emerging 5G mobile networks are envisioned to become multi-service environments, enabling the dynamic deployment of services with a diverse set of performance requirements, accommodating the needs of mobile network operators, verticals and over-the-top service providers. The Radio Access Network (RAN) part of mobile networks is expected to play a very significant role towards this evolution. Unfortunately, such a vision cannot be efficiently supported by the conventional RAN architecture, which adopts a fixed and rigid design. For the network to evolve, flexibility in the creation, management and control of the RAN components is of paramount importance. The key elements that can allow us to attain this flexibility are the programmability and the virtualization of the network functions. While in the case of the mobile core, these issues have been extensively studied due to the advent of technologies like Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) and the similarities that the core shares with other wired networks like data centers, research in the domain of the RAN is still in its infancy. The contributions made in this thesis significantly advance the state of the art in the domain of RAN programmability and virtualization in three dimensions. First, we design and implement a software-defined RAN (SD-RAN) platform called FlexRAN, that provides a flexible control plane designed with support for real-time RAN control applications, flexibility to realize various degrees of coordination among RAN infrastructure entities, and programmability to adapt control over time and easier evolution to the future following SDN/NFV principles. 
    Second, we leverage the capabilities of the FlexRAN platform to design and implement Orion, a novel RAN slicing system that enables the dynamic, on-the-fly virtualization of base stations and the flexible customization of slices to meet their respective service needs, and that can be used in an end-to-end network slicing setting. Third, we focus on the use case of multi-tenancy in a neutral-host indoor small-cell environment, where we design Iris, a system that builds on the capabilities of FlexRAN and Orion and introduces a dynamic pricing mechanism for the efficient and flexible allocation of shared spectrum to the tenants. A number of additional use cases that highlight the benefits of the developed systems are also presented. The lessons learned through this research are summarized, and interesting topics for future work in this domain are discussed. The prototype systems presented in this thesis have been made publicly available and are being used by various research groups worldwide in the context of 5G research.
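    Iris's actual pricing mechanism is described in the thesis; purely as an illustration of dynamic pricing for shared spectrum among tenants, here is a generic bid-proportional allocation with a uniform per-PRB price. The function name and all numbers are hypothetical.

    ```python
    # Toy bid-proportional spectrum split for a neutral-host small cell.

    def allocate_spectrum(bids, total_prbs):
        """Each tenant gets a share of PRBs proportional to its bid and pays a
        uniform price per PRB equal to the sum of bids over the pool size."""
        total_bid = sum(bids.values())
        shares = {t: round(total_prbs * b / total_bid) for t, b in bids.items()}
        price_per_prb = total_bid / total_prbs
        return shares, price_per_prb
    ```

    Because the per-PRB price rises with total demand, tenants are nudged to bid in proportion to their real needs, which is the basic property a dynamic pricing mechanism for shared spectrum is after.
    
    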