    Adaptive Data-driven Optimization using Transfer Learning for Resilient, Energy-efficient, Resource-aware, and Secure Network Slicing in 5G-Advanced and 6G Wireless Systems

    Title from PDF of title page, viewed January 31, 2023
    Dissertation advisor: Cory Beard
    Vita
    Includes bibliographical references (pages 134-141)
    Dissertation (Ph.D.)--Department of Computer Science and Electrical Engineering. University of Missouri--Kansas City, 2022
    5G-Advanced is the next step in the evolution of fifth-generation (5G) technology. It will introduce expanded capabilities beyond connectivity and enable a broader range of advanced applications and use cases. 5G-Advanced will support modern applications with greater mobility and high dependability, and artificial intelligence and machine learning will enhance network performance through spectral-efficiency and energy-saving improvements. This research established a framework to optimally control and manage the selection of network slices for incoming requests from diverse applications and services in Beyond 5G (B5G) networks. The developed DeepSlice model optimizes network and per-slice load efficiency across isolated slices and manages the slice lifecycle in case of failure; it can also classify unknown connections using a trained deep neural network. The research further addresses threats to the performance, availability, and robustness of B5G networks by proactively preventing and resolving them. The study proposed a Secure5G framework for authentication, authorization, trust, and control in a network slicing architecture for 5G systems. The developed model protects the 5G infrastructure from Distributed Denial of Service attacks by analyzing incoming connections and learning from the trained model, and the research demonstrates preventive measures against volume, flooding, and masking (spoofing) attacks. This framework works toward the zero-trust objective (never trust, always verify, and verify continuously), which improves resilience. Another fundamental difficulty for wireless network systems is providing a desirable user experience under varying network conditions, such as fluctuating network loads and bandwidth. Mobile Network Operators (MNOs) have long battled unforeseen network traffic events. This research proposed ADAPTIVE6G to tackle the network load estimation problem using knowledge-inspired transfer learning over radio-network Key Performance Indicators (KPIs) from network slices. These algorithms enable MNOs to optimally coordinate their computational tasks in stochastic and time-varying network states. Energy efficiency is another significant KPI for tracking the sustainability of network slicing. Increasing traffic demands in 5G dramatically increase the energy consumption of mobile networks, an increase that is unsustainable in both dollar cost and environmental impact. This research proposed an innovative ECO6G model to attain sustainability and energy efficiency. The findings suggest that the developed model reduces network energy costs, relative to classical machine learning and statistical models, without degrading performance or the end-customer experience. The proposed model is validated against the industry-standardized energy efficiency definition, and the derived operational expenditure savings are significant for MNOs.
    Introduction -- A deep neural network framework towards a resilient, efficient, and secure network slicing in Beyond 5G Networks -- Adaptive resource management techniques for network slicing in Beyond 5G networks using transfer learning -- Energy and cost analysis for network slicing deployment in Beyond 5G networks -- Conclusion and future scope
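
    The transfer-learning idea behind ADAPTIVE6G can be conveyed with a minimal sketch: a small regressor is pretrained on plentiful KPI history from a source slice, its feature layers are frozen, and only its head is fine-tuned on scarce data from a target slice. The layer sizes, KPI count, window length, and random stand-in data below are assumptions for illustration, not the dissertation's model.

    # Minimal transfer-learning sketch for slice load estimation (PyTorch).
    # All shapes and constants are illustrative assumptions.
    import torch
    import torch.nn as nn

    N_KPIS = 8      # e.g. PRB utilisation, throughput, active users (assumed)
    WINDOW = 12     # past KPI samples per prediction (assumed)

    class LoadEstimator(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Flatten(),
                nn.Linear(N_KPIS * WINDOW, 64), nn.ReLU(),
                nn.Linear(64, 32), nn.ReLU(),
            )
            self.head = nn.Linear(32, 1)   # next-step load estimate

        def forward(self, x):
            return self.head(self.features(x))

    def fit(model, x, y, params, epochs=200, lr=1e-3):
        opt = torch.optim.Adam(params, lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()

    # Pretrain on plentiful source-slice history (random stand-in data).
    src_x, src_y = torch.randn(1024, WINDOW, N_KPIS), torch.randn(1024, 1)
    model = LoadEstimator()
    fit(model, src_x, src_y, model.parameters())

    # Transfer: freeze the shared feature extractor, fine-tune only the
    # head on the small sample available for the new slice.
    for p in model.features.parameters():
        p.requires_grad = False
    tgt_x, tgt_y = torch.randn(64, WINDOW, N_KPIS), torch.randn(64, 1)
    fit(model, tgt_x, tgt_y, model.head.parameters(), epochs=100)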

    Multi-objective resource optimization in space-aerial-ground-sea integrated networks

    Space-air-ground-sea integrated (SAGSI) networks are envisioned to connect satellite, aerial, ground, and sea networks to provide connectivity everywhere, all the time, in sixth-generation (6G) networks. However, the success of SAGSI networks is constrained by several challenges, including resource optimization when users have diverse requirements and applications. We present a comprehensive review of SAGSI networks from a resource optimization perspective. We discuss use case scenarios and possible applications of SAGSI networks, and our resource optimization discussion addresses the challenges specific to them. In our review, we categorize resource optimization techniques by objective: throughput and capacity maximization; delay minimization; energy consumption; task offloading; task scheduling; resource allocation or utilization; network operation cost; outage probability; average age of information; joint optimization (data rate difference, storage or caching, CPU cycle frequency); overall network performance and performance degradation; software-defined networking; and intelligent surveillance and relay communication. We then formulate a mathematical framework for maximizing energy efficiency, resource utilization, and user association. We optimize user association while satisfying constraints on transmit power, data rate, and priority-aware association. A binary decision variable associates users with system resources; since the decision variable is binary and the constraints are linear, the formulated problem is a binary linear program. Based on this framework, we simulate and compare the performance of three algorithms: branch and bound, the interior point method, and the barrier simplex algorithm. Simulation results show that branch and bound performs best, so we adopt it as the benchmark; however, its complexity grows exponentially with the number of users and stations in the SAGSI network. The interior point method and the barrier simplex algorithm achieve results comparable to the benchmark at much lower complexity. Finally, we discuss future research directions and challenges of resource optimization in SAGSI networks.
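
    The user-association formulation lends itself to a compact worked example. The sketch below builds a toy binary linear program of this general shape, maximizing total rate under one-station-per-user and per-station power-budget constraints, and solves it with SciPy's branch-and-bound-based MILP solver. The rates, power costs, and budget values are invented, and the paper's exact priority-aware constraint set is not reproduced.

    # Toy binary linear program for user association (SciPy MILP / HiGHS).
    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    rng = np.random.default_rng(0)
    U, S = 6, 3                              # users, stations (illustrative)
    rate = rng.uniform(1.0, 10.0, (U, S))    # rate of user u at station s
    power = rng.uniform(0.5, 2.0, (U, S))    # power spent serving user u at s
    p_budget = np.full(S, 5.0)               # per-station power budget

    c = -rate.ravel()                        # milp minimises, so negate rates

    # Each user is associated with exactly one station: sum_s x[u,s] = 1.
    A_assign = np.zeros((U, U * S))
    for u in range(U):
        A_assign[u, u * S:(u + 1) * S] = 1.0

    # Station power budgets: sum_u power[u,s] * x[u,s] <= p_budget[s].
    A_power = np.zeros((S, U * S))
    for s in range(S):
        for u in range(U):
            A_power[s, u * S + s] = power[u, s]

    res = milp(
        c=c,
        constraints=[
            LinearConstraint(A_assign, lb=1.0, ub=1.0),
            LinearConstraint(A_power, lb=-np.inf, ub=p_budget),
        ],
        integrality=np.ones(U * S),          # integer variables...
        bounds=Bounds(0, 1),                 # ...bounded to {0, 1}
    )
    # Feasible by construction here (budgets admit two users per station).
    x = res.x.reshape(U, S).round().astype(int)
    print("association matrix:\n", x, "\ntotal rate:", (rate * x).sum())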

    Investigating the Effects of Network Dynamics on Quality of Delivery Prediction and Monitoring for Video Delivery Networks

    Video streaming over the Internet requires an optimized delivery system given the advances in network architecture, for example, Software Defined Networks. Machine Learning (ML) models have been deployed to predict the quality of video streams. Some of these efforts predict Quality of Delivery (QoD) metrics of the video stream, measuring stream quality from the network perspective. In most cases, these models have either treated the ML algorithms as black boxes or failed to capture the network dynamics of the associated video streams. This PhD thesis investigates the effects of network dynamics on QoD prediction using ML techniques. The hypothesis investigated is that ML techniques that model the underlying network dynamics achieve accurate QoD and video quality predictions and measurements. The results demonstrate that the proposed techniques offer performance gains over approaches that fail to consider network dynamics, and they highlight that adopting the correct model, by modelling the dynamics of the network infrastructure, is crucial to the accuracy of the ML predictions. These results are significant because the improved performance is achieved at no additional computational or storage cost. The techniques can help network managers, data center operators, and video service providers take proactive and corrective actions for improved network efficiency and effectiveness.
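
    One way to see why modelling network dynamics matters for QoD prediction is a small synthetic experiment: when delivered quality depends on recent network history (for example, through buffering), a regressor fed only instantaneous measurements underfits, while one that also sees lagged samples recovers the relationship. Everything below, the traffic generator, the feature set, and the QoD metric, is an invented stand-in, not the thesis's data or model.

    # Synthetic contrast: instantaneous vs. lagged (dynamics-aware) features.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(1)
    T = 2000
    load = np.clip(np.cumsum(rng.normal(0, 0.05, T)) % 1.0, 0, 1)
    # QoD depends on a causal moving average of load (buffer-like memory).
    sload = np.convolve(load, np.ones(5) / 5)[:T]
    rtt = 20 + 80 * load + rng.normal(0, 2, T)                  # ms
    qod = 100 - 60 * sload - 0.1 * rtt + rng.normal(0, 1, T)

    def lagged(x, lags):
        # Stack x[t-1] ... x[t-lags] as columns, aligned at time t.
        n = len(x)
        return np.column_stack(
            [x[lags - k - 1:n - k - 1] for k in range(lags)])

    LAGS = 5
    X_static = np.column_stack([load, rtt])[LAGS:]              # "now" only
    X_dynamic = np.column_stack(
        [X_static, lagged(load, LAGS), lagged(rtt, LAGS)])
    y = qod[LAGS:]

    split = int(0.8 * len(y))
    for name, X in [("static features", X_static),
                    ("with dynamics", X_dynamic)]:
        model = RandomForestRegressor(n_estimators=100, random_state=0)
        model.fit(X[:split], y[:split])
        err = mean_absolute_error(y[split:], model.predict(X[split:]))
        print(f"{name}: MAE = {err:.2f}")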

    Horizontally distributed inference of deep neural networks for AI-enabled IoT

    Motivated by the pervasiveness of artificial intelligence (AI) and the Internet of Things (IoT) in the current “smart everything” scenario, this article provides a comprehensive overview of the most recent research at the intersection of both domains. It focuses on the design and development of mechanisms that enable collaborative inference across edge devices, towards the in situ execution of highly complex state-of-the-art deep neural networks (DNNs), despite the resource-constrained nature of such infrastructures. In particular, the review discusses the most salient approaches conceived along those lines, elaborating on the partitioning schemes and parallelism paradigms explored. It provides an organized, schematic discussion of the underlying workflows and associated communication patterns, as well as the architectural aspects of the DNNs that have driven the design of such techniques, and it highlights both the primary challenges encountered at the design and operational levels and the specific adjustments or enhancements explored in response to them.
    Agencia Estatal de Investigación | Ref. DPI2017-87494-R
    Ministerio de Ciencia e Innovación | Ref. PDC2021-121644-I00
    Xunta de Galicia | Ref. ED431C 2022/03-GR
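
    A bare-bones sketch of one horizontal partitioning scheme of the kind this review covers: the output neurons of each fully connected layer are sharded across edge devices, each device computes its slice of the activations from the full layer input, and the slices are concatenated (standing in for a per-layer all-gather) before the next layer. The device count, layer sizes, and weights below are illustrative assumptions.

    # Horizontal (output-neuron) partitioning of dense layers across devices.
    import numpy as np

    rng = np.random.default_rng(42)
    DEVICES = 3

    def relu(x):
        return np.maximum(x, 0.0)

    class ShardedDense:
        """One dense layer with its output neurons split across DEVICES."""
        def __init__(self, n_in, n_out):
            w = rng.normal(0, 0.1, (n_in, n_out))
            b = rng.normal(0, 0.1, n_out)
            # "Deploy" one shard of the weights to each device (done once).
            self.shards = list(zip(np.array_split(w, DEVICES, axis=1),
                                   np.array_split(b, DEVICES)))

        def forward(self, x):
            # Each device sees the full input and produces its slice of the
            # output; concatenation stands in for the per-layer all-gather.
            parts = [relu(x @ w_s + b_s) for w_s, b_s in self.shards]
            return np.concatenate(parts, axis=-1)

    layers = [ShardedDense(32, 64), ShardedDense(64, 64), ShardedDense(64, 10)]
    x = rng.normal(size=(1, 32))
    for layer in layers:
        x = layer.forward(x)
    print("distributed output:", x.round(3))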

    Evolution of High Throughput Satellite Systems: Vision, Requirements, and Key Technologies

    High throughput satellites (HTS), with their digital payload technology, are expected to play a key role as enablers of the upcoming 6G networks. HTS are mainly designed to provide higher data rates and capacities. Fueled by technological advancements including beamforming, advanced modulation techniques, reconfigurable phased arrays, and electronically steerable antennas, HTS have emerged as a fundamental component of future network generations. This paper offers a comprehensive state-of-the-art review of HTS systems, with a focus on standardization, patents, channel multiple access techniques, routing, load balancing, and the role of software-defined networking (SDN). In addition, we provide a vision for next-generation satellite systems, which we name extremely high throughput satellites (EHTS), oriented toward autonomous satellites and supported by the main requirements and key technologies expected for these systems. The EHTS system will be designed to maximize spectrum reuse and data rates and to flexibly steer capacity to satisfy user demand. We also introduce a novel architecture for future regenerative payloads and summarize the challenges imposed by this architecture.
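
    The "flexibly steer capacity" requirement can be made concrete with a toy allocation rule, not taken from the paper: a fixed capacity pool is shared across beams in proportion to offered demand, each beam is capped at the smaller of its hardware ceiling and its demand, and capacity freed by capped beams is redistributed among the rest. All figures below are invented.

    # Demand-proportional capacity steering across beams, with per-beam caps.
    def steer_capacity(demand, total, beam_cap):
        # A beam can use neither more than its hardware cap nor more than
        # it actually demands.
        ceiling = [min(beam_cap, d) for d in demand]
        alloc = [0.0] * len(demand)
        active = [i for i, d in enumerate(demand) if d > 0]
        remaining = total
        while active and remaining > 1e-9:
            pool = sum(demand[i] for i in active)
            capped = [i for i in active
                      if alloc[i] + remaining * demand[i] / pool >= ceiling[i]]
            if not capped:
                for i in active:
                    alloc[i] += remaining * demand[i] / pool
                break
            for i in capped:                 # saturate, then re-share the rest
                remaining -= ceiling[i] - alloc[i]
                alloc[i] = ceiling[i]
                active.remove(i)
        return alloc

    demand = [5.0, 1.0, 0.5, 8.0]            # Gbps per beam (invented)
    print(steer_capacity(demand, total=10.0, beam_cap=4.0))
    # -> [4.0, 1.0, 0.5, 4.0]: hot beams hit the cap, cool beams are
    #    satisfied exactly, and 0.5 Gbps of the pool remains unused.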

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    Reinforcement Learning Empowered Unmanned Aerial Vehicle Assisted Internet of Things Networks

    Get PDF
    This thesis aims at performance enhancement for unmanned aerial vehicle (UAV) assisted Internet of Things (IoT) networks. In this realm, novel reinforcement learning (RL) frameworks are proposed for solving intricate joint optimisation scenarios, covering uplink, downlink, and combined transmissions. The multi-access technique utilised is non-orthogonal multiple access (NOMA), a key enabler in this regime. The outcomes of this research are enhancements in key performance metrics, such as sum-rate and energy efficiency, with a consequent reduction in outage.

    For scenarios involving uplink transmissions by IoT devices, adaptive and tandem reinforcement learning frameworks are developed, with the aim of maximising capacity over a fixed UAV trajectory. The adaptive framework is utilised where channel suitability is ascertained for uplink transmissions under a fixed NOMA clustering regime. The tandem framework is utilised where multi-channel resource suitability is ascertained along with power allocation, dynamic clustering, and IoT node association to NOMA clusters and channels.

    For scenarios involving downlink transmissions to IoT devices, an ensemble RL (ERL) framework is proposed for sum-rate enhancement over a fixed UAV trajectory; for dynamic UAV trajectories, a hybrid decision framework (HDF) is proposed for energy efficiency optimisation. Downlink transmission power and bandwidth are managed for NOMA transmissions over both fixed and dynamic UAV trajectories, facilitating IoT networks.

    In a UAV-enabled relaying scenario, for control system plants and their remotely deployed sensors, a head-start reinforcement learning framework based on deep learning is developed and implemented. NOMA is invoked in both uplink and downlink transmissions for the IoT network, and dynamic NOMA clustering, power management, and node association are jointly managed along with UAV height control. The primary aim is the enhancement of net sum-rate and its subsequent benefit to the IoT-assisted use case.

    The simulation results for the aforesaid scenarios indicate enhanced sum-rate, improved energy efficiency, and reduced outage for UAV-assisted IoT networks, with the proposed RL frameworks outperforming existing benchmark frameworks for the same scenarios. MATLAB and Python are utilised for network modelling and for RL framework design and validation.
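
    The flavour of the decision problems these RL frameworks solve can be conveyed with a deliberately stripped-down sketch: a stateless, bandit-style Q-table learns which (channel, power-level) action maximises the sum-rate of a two-user power-domain NOMA cluster under ideal successive interference cancellation. The rate model and every constant below are simplifying assumptions, far from the thesis's joint optimisation scenarios.

    # Bandit-style Q-learning over (channel, power-level) NOMA actions.
    import numpy as np

    rng = np.random.default_rng(7)
    N_CH, N_PW = 4, 3                       # channels and power levels
    P = [0.5, 1.0, 2.0]                     # transmit powers in W (assumed)
    gain = rng.uniform(0.2, 1.0, N_CH)      # per-channel gain, static toy model
    NOISE = 0.1

    def sum_rate(ch, pw):
        # Two-user power-domain NOMA on one channel: the weak user gets the
        # larger power share; the strong user cancels it via ideal SIC.
        p, g = P[pw], gain[ch]
        p_weak, p_strong = 0.8 * p, 0.2 * p
        r_weak = np.log2(1 + p_weak * g / (p_strong * g + NOISE))
        r_strong = np.log2(1 + p_strong * g / NOISE)
        return r_weak + r_strong

    Q = np.zeros(N_CH * N_PW)               # one value per joint action
    ALPHA, EPS = 0.1, 0.1                   # learning rate, exploration rate
    for step in range(5000):
        a = rng.integers(len(Q)) if rng.random() < EPS else int(Q.argmax())
        r = sum_rate(a // N_PW, a % N_PW) + rng.normal(0, 0.05)  # noisy reward
        Q[a] += ALPHA * (r - Q[a])          # incremental value update

    best = int(Q.argmax())
    print(f"learned choice: channel {best // N_PW}, power {P[best % N_PW]} W")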