89 research outputs found

    QoE on media delivery in 5G environments

    Get PDF
    231 p. 5G will expand mobile networks with greater bandwidth, lower latency and the ability to provide massive, fault-free connectivity. Users of multimedia services expect a smooth playback experience that adapts dynamically to their interests and to their mobility context. The network, however, by adopting a neutral position, does not help strengthen the parameters that affect quality of experience. Consequently, solutions designed to deliver multimedia traffic dynamically and efficiently are of particular interest. To improve the quality of experience of multimedia services in 5G environments, the research carried out in this thesis designed a multi-part system based on four contributions. The first mechanism, SaW, creates an elastic farm of computing resources that execute multimedia analysis tasks. The results confirm the competitiveness of this approach with respect to server farms. The second mechanism, LAMB-DASH, selects the quality in the media player with a design that requires low processing complexity. The tests confirm its ability to improve the stability, consistency and uniformity of the quality of experience among clients sharing a network cell. The third mechanism, MEC4FAIR, exploits 5G capabilities for analyzing delivery metrics of the different flows. The results show how it enables the service to coordinate the different clients in the cell to improve the quality of service. The fourth mechanism, CogNet, provisions network resources and configures a topology able to accommodate an estimated demand and guarantee quality-of-service bounds. In this case, the results show higher accuracy when the demand for a service is greater.
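    Since the abstract does not spell out LAMB-DASH's selection logic, the following Python sketch is only a generic illustration of a low-complexity, throughput-driven per-segment quality choice of the kind a DASH player performs; the function name, representation bitrates, smoothing factor and safety margin are assumptions, not the thesis's algorithm.

    ```python
    # Illustrative sketch only: a low-complexity per-segment quality choice for a
    # DASH player, in the spirit of LAMB-DASH's low processing complexity.
    # Bitrates, smoothing factor and safety margin are hypothetical values.

    def select_quality(bitrates_bps, throughput_est_bps, last_throughput_bps,
                       alpha=0.8, margin=0.9):
        """Pick the highest representation whose bitrate fits the smoothed
        throughput estimate, scaled by a safety margin."""
        # Exponentially smooth the throughput estimate to avoid oscillations.
        smoothed = alpha * throughput_est_bps + (1 - alpha) * last_throughput_bps
        budget = margin * smoothed
        # Bitrates are assumed sorted ascending; fall back to the lowest one.
        chosen = bitrates_bps[0]
        for b in bitrates_bps:
            if b <= budget:
                chosen = b
        return chosen, smoothed

    # Example: 4 representations, measured throughput of about 3.2 Mbit/s.
    levels = [500_000, 1_000_000, 2_500_000, 5_000_000]
    quality, est = select_quality(levels, 3_200_000, 2_800_000)
    print(quality)  # -> 2500000
    ```

    Keeping the decision to a smoothed throughput comparison against the available bitrates keeps the per-segment processing cost low, which is the property the abstract attributes to LAMB-DASH.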

    Efficient Discovery and Utilization of Radio Information in Ultra-Dense Heterogeneous 3D Wireless Networks

    Get PDF
    Emergence of new applications, industrial automation and the explosive boost of smart concepts have led to an environment of rapidly increasing device densification and service diversification. This revolutionary upward trend requires the upcoming 6th-Generation (6G) and beyond communication systems to be globally available communication, computing and intelligent systems seamlessly connecting devices, services and infrastructure facilities. In such an environment, the scarcity of radio resources will rise to an unprecedented level, compelling them to be utilized very efficiently. Timely action is therefore taken to move away from approximate site-specific 2-Dimensional (2D) network concepts in radio resource utilization and network planning, replacing them with more accurate 3-Dimensional (3D) network concepts that exploit spatially distributed, location-specific radio characteristics. Empowering this initiative, a framework is first developed to accurately estimate location-specific path loss parameters under dynamic environmental conditions in 3D small cell (SC) heterogeneous networks (HetNets), facilitating efficient radio resource management schemes using a crowdsensing data collection principle together with Linear Algebra (LA) and machine learning (ML) techniques. According to the results, the gradient descent technique yields the highest path loss parameter estimation accuracy, over 98%. At a later stage, receive signal power is calculated at slightly extended 3D communication distances from the cluster boundaries, based on the already estimated propagation parameters, with an accuracy of over 74% for certain distances. Coordination in both device-network and network-network interactions is also a critical factor in efficient radio resource utilization while meeting Quality of Service (QoS) requirements in heavily congested future 3D SC HetNets. Overall communication performance enhancement through better utilization of spatially distributed opportunistic radio resources in a 3D SC is then addressed with device and network coordination, ML and Slotted-ALOHA principles, together with scheduling, power control and access prioritization schemes. Within this solution, several communication-related factors, such as the 3D spatial positions and QoS requirements of devices in two co-located networks operating in the licensed band (LB) and unlicensed band (UB), are considered. To overcome the challenge of maintaining QoS under ongoing network densification with limited radio resources, cellular network traffic is offloaded to the UB. Approximately 70% better overall coordination efficiency is achieved at initial network access with the device-network coordinated, weighting-factor-based prioritization scheme powered by the Q-learning (QL) principle, compared with conventional schemes. Subsequently, coverage information of nearby dense NR-Unlicensed (NR-U) base stations (BSs) is investigated for better allocation and utilization of common location-specific, spatially distributed radio resources in the UB. First, the problem of determining the receive signal power at a given location due to a transmission by a neighboring NR-U BS is addressed with a solution based on a deep regression neural network, enabling prediction of the receive signal or interference power of a neighboring BS at a given location of a 3D cell. Subsequently, the problem of efficient radio resource management while dynamically utilizing UB spectrum for NR-U transmissions is addressed through an algorithm based on the double Q-learning (DQL) principle and device collaboration. Over 200% faster algorithm convergence is achieved by the DQL-based method over conventional solutions with estimated path loss parameters.
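    The abstract credits gradient descent with the most accurate path loss parameter estimates (over 98%). As a minimal sketch, assuming the standard log-distance model PL(d) = PL0 + 10·n·log10(d/d0), the snippet below fits the intercept PL0 and exponent n to crowdsensed (distance, path loss) samples with plain gradient descent; the model form, d0, learning rate and synthetic data are assumptions for illustration only.

    ```python
    import numpy as np

    # Illustrative sketch: fit a log-distance path loss model
    #   PL(d) = PL0 + 10 * n * log10(d / d0)
    # to crowdsensed (distance, path loss) samples with plain gradient descent.
    # The model form, d0, learning rate and synthetic data are assumptions.

    def fit_path_loss(distances_m, pathloss_db, d0=1.0, lr=0.02, iters=2000):
        x = 10.0 * np.log10(distances_m / d0)   # regressor: 10*log10(d/d0)
        x_mean = x.mean()
        xc = x - x_mean                          # center for stable gradient steps
        a, n = pathloss_db.mean(), 2.0           # intercept at x_mean, initial exponent
        for _ in range(iters):
            err = a + n * xc - pathloss_db
            a -= lr * 2.0 * err.mean()           # d(MSE)/da
            n -= lr * 2.0 * (err * xc).mean()    # d(MSE)/dn
        pl0 = a - n * x_mean                     # undo the centering
        return pl0, n

    # Synthetic example: true PL0 = 40 dB, exponent n = 3.2, 2 dB shadowing.
    rng = np.random.default_rng(0)
    d = rng.uniform(5.0, 300.0, 500)
    pl = 40.0 + 32.0 * np.log10(d) + rng.normal(0.0, 2.0, d.size)
    print(fit_path_loss(d, pl))                  # roughly (40, 3.2)
    ```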

    The role of artificial intelligence driven 5G networks in COVID-19 outbreak: opportunities, challenges, and future outlook

    Get PDF
    There is no doubt that the world is currently experiencing a global pandemic that is reshaping our daily lives as well as the way business activities are conducted. With the emphasis on social distancing as an effective means of curbing the rapid spread of the infection, many individuals, institutions, and industries have had to rely on telecommunications to ensure service continuity and prevent a complete shutdown of their operations. This has put enormous pressure on both fixed and mobile networks. Though fifth generation mobile networks (5G) are in their infancy in terms of deployment, they offer a broad category of services, including enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC), that can help in tackling pandemic-related challenges. Therefore, in this paper, we identify the challenges facing existing networks due to the surge in traffic demand as a result of the COVID-19 pandemic and emphasize the role of 5G, empowered by artificial intelligence, in tackling these problems. In addition, we provide a brief insight into the use of artificial intelligence driven 5G networks in predicting future pandemic outbreaks and the development of a pandemic-resilient society in the event of future outbreaks.

    Addressing training data sparsity and interpretability challenges in AI based cellular networks

    Get PDF
    To meet the diverse and stringent communication requirements of emerging network use cases, zero-touch artificial intelligence (AI) based deep automation in cellular networks is envisioned. However, the full potential of AI in cellular networks remains hindered by two key challenges: (i) training data is not as freely available in cellular networks as in other fields where AI has made a profound impact, and (ii) current AI models tend to exhibit black-box behavior, making operators reluctant to entrust the operation of multi-billion-dollar mission-critical networks to a black-box AI engine that allows little insight into, and discovery of, the relationships between configuration and optimization parameters and key performance indicators. This dissertation systematically addresses and proposes solutions to these two key problems faced by emerging networks. A framework for addressing the training data sparsity challenge in cellular networks is developed that can assist network operators and researchers in choosing the optimal data enrichment technique for different network scenarios, based on the available information. The framework encompasses classical interpolation techniques, such as inverse distance weighting and kriging; more advanced ML-based methods, such as transfer learning and generative adversarial networks; several new techniques, such as matrix completion theory and leveraging different types of network geometries; and simulators and testbeds, among others. The proposed framework leads to more accurate ML models that rely on a sufficient amount of representative training data. Moreover, solutions are proposed to address the data sparsity challenge specifically in Minimization of Drive Tests (MDT) based automation approaches. MDT allows coverage to be estimated at the base station by exploiting measurement reports gathered by user equipment, without the need for drive tests. Thus, MDT is a key enabling feature for data and artificial intelligence driven autonomous operation and optimization in current and emerging cellular networks. However, to date, the utility of the MDT feature remains thwarted by issues such as sparsity of user reports and user positioning inaccuracy. For the first time, this dissertation reveals the existence of an optimal bin width for coverage estimation in the presence of inaccurate user positioning, scarcity of user reports and quantization error. The presented framework enables network operators to configure, for a given positioning accuracy and user density, the bin size that results in the most accurate MDT-based coverage estimation. The lack of interpretability in AI-enabled networks is addressed by proposing a first-of-its-kind neural network architecture that leverages analytical modeling, domain knowledge, big data and machine learning to turn black-box machine learning models into more interpretable ones. The proposed approach combines analytical modeling and domain knowledge to custom-design machine learning models, with the aim of moving towards interpretable models that not only require less training time but can also deal with issues such as sparsity of training data and determination of model hyperparameters. The approach is tested using both simulated and real data, and the results show that it outperforms existing mathematical models while remaining interpretable compared with black-box ML models. Thus, the proposed approach can be used to derive better mathematical models of complex systems.
    The findings of this dissertation can help solve the challenges in emerging AI-based cellular networks and thus aid in their design, operation and optimization.
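    As a minimal illustration of the bin-width trade-off discussed above (not the dissertation's actual framework), the sketch below bins MDT-style reports (x, y, RSRP) into a square grid of configurable width and averages RSRP per bin; the grid origin, bin width and report format are assumptions.

    ```python
    import numpy as np
    from collections import defaultdict

    # Illustrative sketch: grid-based coverage estimation from MDT-style user
    # reports (x, y, RSRP). Reports are assigned to square bins of a configurable
    # width and the RSRP values in each bin are averaged. The bin width, grid
    # origin and report format are assumptions, not the dissertation's framework.

    def estimate_coverage(reports, bin_width_m):
        """reports: iterable of (x_m, y_m, rsrp_dbm); returns {bin_index: mean RSRP}."""
        bins = defaultdict(list)
        for x, y, rsrp in reports:
            # Quantize the (possibly inaccurate) reported position to a bin index.
            idx = (int(np.floor(x / bin_width_m)), int(np.floor(y / bin_width_m)))
            bins[idx].append(rsrp)
        return {idx: float(np.mean(vals)) for idx, vals in bins.items()}

    # Example: a handful of sparse reports binned with a 50 m bin width.
    reports = [(12.0, 7.0, -85.0), (40.0, 30.0, -88.0), (120.0, 60.0, -97.0)]
    coverage = estimate_coverage(reports, bin_width_m=50.0)
    print(coverage)   # {(0, 0): -86.5, (2, 1): -97.0}
    ```

    A smaller bin suffers more from positioning inaccuracy and empty bins under report scarcity, while a larger bin increases quantization error, which is the trade-off behind the optimal bin width identified in the dissertation.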