484 research outputs found

    On the Throughput of Large-but-Finite MIMO Networks using Schedulers

    This paper studies the sum throughput of multi-user multiple-input-single-output (MISO) networks with a large but finite number of transmit antennas and users. Considering continuous and bursty communication scenarios with different users' data request probabilities, we derive quasi-closed-form expressions for the maximum achievable throughput of the networks using optimal schedulers. The results are obtained in various cases with different levels of interference cancellation. Also, we develop an efficient scheduling scheme using genetic algorithms (GAs), and evaluate the effect of different parameters, such as channel/precoding models, number of antennas/users, scheduling costs and power amplifiers' efficiency, on the system performance. Finally, we use recent results on the achievable rates of finite-block-length codes to analyze the system performance with short packets. As demonstrated, the proposed GA-based scheduler reaches (almost) the same throughput as the exhaustive-search-based optimal scheduler, with substantially less implementation complexity. Moreover, the power amplifiers' inefficiency and the scheduling delay significantly affect the performance of scheduling-based systems.
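    The GA-based scheduling idea can be sketched as a search over binary user-selection vectors. The following is a minimal sketch, not the paper's method: the rates and the crude interference penalty in the fitness function are illustrative assumptions.

```python
import random

def sum_throughput(mask, rates, interference=0.1):
    """Toy fitness: sum of the selected users' rates, discounted by a
    crude penalty that grows with the number of co-scheduled users.
    (Illustrative stand-in for the paper's throughput model.)"""
    selected = [r for r, m in zip(rates, mask) if m]
    if not selected:
        return 0.0
    return sum(selected) / (1.0 + interference * (len(selected) - 1))

def ga_schedule(rates, pop_size=40, generations=60, mutate_p=0.05, seed=0):
    """Genetic-algorithm user selection over binary chromosomes
    (bit i = 1 means user i is scheduled in this slot)."""
    rng = random.Random(seed)
    n = len(rates)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: sum_throughput(c, rates), reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)         # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n):                # bit-flip mutation
                if rng.random() < mutate_p:
                    child[i] ^= 1
            children.append(child)
        pop = elite + children
    best = max(pop, key=lambda c: sum_throughput(c, rates))
    return best, sum_throughput(best, rates)
```

    The elitist selection keeps the best half of each generation, so the search never loses its best schedule; on small user counts this converges to the same selection an exhaustive search would find, at far lower cost.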

    Scheduling for next generation WLANs: filling the gap between offered and observed data rates

    In wireless networks, opportunistic scheduling is used to increase system throughput by exploiting multi-user diversity. Although recent advances have increased the physical-layer data rates supported in wireless local area networks (WLANs), the actual throughput realized is significantly lower due to overhead. Accordingly, the frame aggregation concept is used in next-generation WLANs to improve efficiency. However, with frame aggregation, traditional opportunistic schemes are no longer optimal. In this paper, we propose schedulers that take queue and channel conditions into account jointly to maximize the throughput observed at the users in next-generation WLANs. We also extend this work to design two schedulers that perform block scheduling to maximize network throughput over multiple transmission sequences. For these schedulers, which make decisions over long time durations, we model the system using queueing theory and determine users' temporal access proportions according to this model. Through detailed simulations, we show that all our proposed algorithms offer significant throughput improvement, better fairness, and much lower delay compared with traditional opportunistic schedulers, facilitating the practical use of the evolving standard for next-generation wireless networks.
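    The key observation, that channel rate alone is no longer the right metric under frame aggregation, can be illustrated with a minimal one-slot selection rule. This is a sketch, not the paper's schedulers; the field names and the single-TXOP model are assumptions.

```python
def schedule_next(users, txop_seconds):
    """Greedy queue-and-channel-aware pick: an aggregated frame can carry
    at most what is queued, so score each user by deliverable bits,
    min(queue backlog, PHY rate x transmission-opportunity duration)."""
    def deliverable(u):
        return min(u["queue_bits"], u["phy_rate_bps"] * txop_seconds)
    return max(users, key=deliverable)

# A rate-only opportunistic scheduler would pick user "A" (fastest channel),
# but "A" has almost nothing queued, so most of the airtime would be wasted.
users = [
    {"id": "A", "phy_rate_bps": 600e6, "queue_bits": 1e3},
    {"id": "B", "phy_rate_bps": 150e6, "queue_bits": 1e9},
]
picked = schedule_next(users, 1e-3)
```

    With a 1 ms opportunity, user A can deliver only its 1000 queued bits, while user B fills the whole slot, so the joint rule picks B.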

    Proportional Fair MU-MIMO in 802.11 WLANs

    We consider the proportional fair rate allocation in an 802.11 WLAN that supports multi-user MIMO (MU-MIMO) transmission by one or more stations. We characterise, for the first time, the proportional fair allocation of MU-MIMO spatial streams and station transmission opportunities. While a number of features carry over from the case without MU-MIMO, in general neither flows nor stations need to be allocated equal airtime when MU-MIMO is available.
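    For context, the classical single-stream proportional fair scheduling rule (a simpler setting than the MU-MIMO allocation analysed here, shown only as background) picks the flow with the best instantaneous-to-average rate ratio:

```python
def pf_select(inst_rates, avg_rates):
    """Classic PF metric: argmax of instantaneous rate over smoothed
    average throughput; in the long run this maximises sum(log(throughput))."""
    return max(range(len(inst_rates)),
               key=lambda i: inst_rates[i] / avg_rates[i])

def pf_update(avg_rates, served, inst_rates, alpha=0.01):
    """Exponentially smooth each flow's average throughput; only the
    served flow contributes its instantaneous rate this slot."""
    return [(1 - alpha) * a + alpha * (inst_rates[i] if i == served else 0.0)
            for i, a in enumerate(avg_rates)]
```

    Under this rule each flow ends up with equal airtime when transmissions are single-stream; the paper's point is that this equal-airtime property no longer holds once MU-MIMO lets a station serve several flows in one transmission.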

    An open source multi-slice cell capacity framework

    Special issue with the best papers of 2021. 5G is the new 3GPP technology designed to meet a wide range of requirements. On the one hand, it must be able to support high bit rates and ultra-low-latency services, and on the other hand, it should be able to connect a massive number of devices with loose bandwidth and delay requirements. Network Slicing is a key paradigm in 5G, and future 6G networks will inherit it for the concurrent provisioning of diverse quality of service. As scheduling is always a delicate vendor topic and there are few free and complete simulation tools supporting all 5G features, in this paper we present Py5cheSim, a flexible, open-source simulator based on Python and specially oriented to simulating cell capacity in 3GPP 5G networks and beyond. To the best of our knowledge, Py5cheSim is the first simulator that supports Network Slicing at the Radio Access Network level. It offers an environment that allows the development of new scheduling algorithms in a researcher-friendly way, without the need for detailed knowledge of the core of the tool. The present work describes its design and implementation choices, the validation process, the results and different use cases. Project FVF-2021-128 – DICYT, Fondo Carlos Vaz Ferreira, Convocatoria 2021, Dirección Nacional de Innovación, Ciencia y Tecnología, Ministerio de Educación y Cultura, Uruguay. Project FMV_1_2019_1_155700 "Inteligencia Artificial aplicada a redes 5G", Agencia Nacional de Investigación e Innovación, Uruguay.
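    A two-level, slice-aware allocation of the kind such a simulator targets can be sketched generically. This is not Py5cheSim's actual API; the slice names and the round-robin policies at both levels are illustrative assumptions.

```python
from collections import deque
from itertools import cycle

def two_level_rr(slices, n_rbs):
    """Generic RAN-slicing sketch: resource blocks are first rotated
    across slices, then round-robined among that slice's users.
    slices: {slice_name: deque of user ids}. Returns {rb: (slice, user)}."""
    allocation = {}
    slice_cycle = cycle([s for s in slices if slices[s]])  # skip empty slices
    for rb in range(n_rbs):
        name = next(slice_cycle)
        user = slices[name][0]
        slices[name].rotate(-1)          # round robin inside the slice
        allocation[rb] = (name, user)
    return allocation

# Hypothetical example: an eMBB slice with two users and a URLLC slice
# with one user sharing four resource blocks.
slices = {"embb": deque([1, 2]), "urllc": deque([3])}
alloc = two_level_rr(slices, 4)
```

    Real inter-slice policies would weight slices by their service-level agreements rather than alternating evenly; the two-level structure is the point here.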

    A Genetic Algorithm-based Beamforming Approach for Delay-constrained Networks

    In this paper, we study the performance of initial-access beamforming schemes with a large but finite number of transmit antennas and users. In particular, we develop an efficient beamforming scheme using genetic algorithms. Moreover, taking millimeter-wave communication characteristics and different metrics into account, we investigate the effect of various parameters, such as the number of antennas/receivers, beamforming resolution, and hardware impairments, on the system performance. As shown, our proposed algorithm is generic in the sense that it can be effectively applied with different channel models, metrics and beamforming methods. Also, our results indicate that the proposed scheme can reach (almost) the same end-to-end throughput as the exhaustive-search-based optimal approach with considerably less implementation complexity.
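    One of the investigated parameters, beamforming resolution, can be illustrated with quantized phase shifters on a uniform linear array. The half-wavelength ULA model and B-bit phase rounding below are standard textbook assumptions, not the paper's exact setup.

```python
import cmath
import math

def array_gain(weights, steering):
    """Beamforming gain |w^H a|^2 toward one user direction."""
    return abs(sum(w.conjugate() * a for w, a in zip(weights, steering))) ** 2

def quantized_beam(n_antennas, angle, bits):
    """Phase-only beam with B-bit phase shifters: each ideal per-antenna
    phase is rounded to the nearest of 2^B levels, modelling finite
    beamforming resolution."""
    step = 2 * math.pi / (2 ** bits)
    weights = []
    for k in range(n_antennas):
        ideal = math.pi * k * math.sin(angle)   # ULA, half-wavelength spacing
        q = round(ideal / step) * step
        weights.append(cmath.exp(1j * q) / math.sqrt(n_antennas))
    return weights

# With fine phase resolution the gain approaches the ideal value n;
# with 1-bit shifters the quantization error costs several dB.
steering = [cmath.exp(1j * math.pi * k * math.sin(0.5)) for k in range(8)]
g_fine = array_gain(quantized_beam(8, 0.5, 8), steering)
g_coarse = array_gain(quantized_beam(8, 0.5, 1), steering)
```

    A GA over such discrete phase codebooks searches the quantized space directly, which is where exhaustive search becomes expensive as the resolution and antenna count grow.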

    Scheduling in 5G networks: Developing a 5G cell capacity simulator.

    The fifth generation of mobile communications (5G) is becoming a reality through the new 3GPP (3rd Generation Partnership Project) technology designed to meet a wide range of requirements. On the one hand, it must be able to support high bit rates and ultra-low-latency services, and on the other hand, it should be able to connect a massive number of devices with loose bandwidth and delay requirements.
Such diversity in terms of service requirements demands a high degree of flexibility in radio interface design. As LTE (Long Term Evolution) technology was originally designed with the evolution of Mobile Broadband (MBB) services in mind, it does not provide enough flexibility to optimally multiplex the different types of services envisioned by 5G. This is because there is no single radio interface configuration able to fit all the different service requirements. As a consequence, 5G networks are being designed to support different radio interface configurations and mechanisms to multiplex these different services, with different configurations, in the same available spectrum. This concept is known as Network Slicing, and is a key 5G feature which needs to be supported end to end in the network (Radio Access, Transport and Core Network). In this way, 5G Radio Access Networks (RAN) will add the problem of allocating resources to different services to the traditional problem of allocating resources to users. In this context, as the standard leaves the scheduling of both users and services to vendor implementation, an extensive field of research is open. Different simulation tools have been developed for research purposes in recent years. However, as not many of them are free and easy to use, and in particular none of the available ones supports Network Slicing at the RAN level, this work presents a new simulator as its main contribution. Py5cheSim is a simple, flexible and open-source simulator based on Python and specially oriented to testing different scheduling algorithms for the different types of 5G services through a simple implementation of the RAN Slicing feature. Its architecture allows new scheduling algorithms to be developed and integrated in an easy and straightforward way. Furthermore, the use of Python provides enough versatility to even use Machine Learning tools for the development of new scheduling algorithms.
The present work introduces the main 5G RAN design concepts which were taken as a baseline to develop the simulation tool. It also describes its design and implementation choices, followed by the executed validation tests and their main results. Additionally, this work presents a few use-case examples to show the developed tool's potential, providing a primary analysis of traditional scheduling algorithms for the new types of services envisioned by the technology. Finally, it concludes on the contribution of the developed tool and the example results, along with possible research lines and improvements for future versions.