94 research outputs found

    Deep Reinforcement Learning for Scheduling and Power Allocation in a 5G Urban Mesh

    We study the problem of routing and scheduling of real-time flows over a multi-hop millimeter wave (mmWave) mesh. We develop a model-free deep reinforcement learning algorithm that determines which subset of the mmWave links should be activated during each time slot and at what power level. The proposed algorithm, called Adaptive Activator RL (AARL), can handle a variety of network topologies, network loads, and interference models, as well as adapt to different workloads. We demonstrate the operation of AARL on several topologies: a small topology with 10 links, a moderately sized mesh with 48 links, and a large topology with 96 links. For each topology, the results of AARL are compared to those of a greedy scheduling algorithm. AARL is shown to outperform the greedy algorithm in two respects. First, its schedule obtains higher goodput. Second, and more importantly, while the run time of the greedy algorithm renders it impractical for real-time scheduling, the run time of AARL is suitable for meeting the time constraints of typical 5G networks.
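The core idea of learning which link subset to activate can be illustrated with a toy stateless sketch. The abstract's AARL is a deep RL agent over realistic topologies; the version below is only a minimal epsilon-greedy bandit on an invented 3-link network with one interference conflict, with fixed rates and a deterministic reward, so every name and number here is illustrative rather than the paper's.

```python
import itertools
import random

# Toy setup (invented for illustration): 3 links with fixed rates; links 0 and 1
# interfere and must not be activated in the same slot.
RATES = [3.0, 2.0, 4.0]
CONFLICTS = {(0, 1)}

def goodput(subset):
    """Reward: sum of link rates, or 0 if the subset violates an interference constraint."""
    for a, b in CONFLICTS:
        if a in subset and b in subset:
            return 0.0
    return sum(RATES[i] for i in subset)

def learn_activation(rounds=200, eps=0.2, seed=0):
    """Epsilon-greedy bandit over all link subsets.

    AARL replaces this table with a deep network and handles stochastic
    loads; with the deterministic toy reward the table converges trivially.
    """
    rng = random.Random(seed)
    n = len(RATES)
    arms = [frozenset(c) for r in range(n + 1)
            for c in itertools.combinations(range(n), r)]
    value = {a: goodput(a) for a in arms}   # initialize with one pull per arm
    count = {a: 1 for a in arms}
    for _ in range(rounds):
        if rng.random() < eps:
            arm = rng.choice(arms)           # explore
        else:
            arm = max(arms, key=lambda a: value[a])  # exploit
        r = goodput(arm)
        count[arm] += 1
        value[arm] += (r - value[arm]) / count[arm]  # running-average update
    return max(arms, key=lambda a: value[a])
```

On this toy instance the learned activation set is {0, 2}: activating the two non-conflicting links with the highest combined rate.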

    End-to-End Simulation of 5G mmWave Networks

    Due to its potential for multi-gigabit and low-latency wireless links, millimeter wave (mmWave) technology is expected to play a central role in 5th generation cellular systems. While there has been considerable progress in understanding the mmWave physical layer, innovations will be required at all layers of the protocol stack, in both the access and the core network. Discrete-event network simulation is essential for end-to-end, cross-layer research and development. This paper provides a tutorial on a recently developed full-stack mmWave module integrated into the widely used open-source ns-3 simulator. The module includes a number of detailed statistical channel models as well as the ability to incorporate real measurements or ray-tracing data. The Physical (PHY) and Medium Access Control (MAC) layers are modular and highly customizable, making it easy to integrate algorithms or compare Orthogonal Frequency Division Multiplexing (OFDM) numerologies, for example. The module is interfaced with the core network of the ns-3 Long Term Evolution (LTE) module for full-stack simulations of end-to-end connectivity, and advanced architectural features, such as dual connectivity, are also available. To facilitate the understanding of the module, and to verify its correct functioning, we provide several examples that show the performance of the custom mmWave stack as well as custom congestion control algorithms designed specifically for efficient utilization of the mmWave channel.
    Comment: 25 pages, 16 figures; submitted to IEEE Communications Surveys and Tutorials (revised Jan. 2018).

    Counter Waves Link Activation Policy for Latency Control in In-Band IAB Systems

    3GPP’s Integrated Access and Backhaul (IAB) architecture is expected to deliver a cost-efficient option for deploying 5G New Radio (NR) systems. However, IAB relies on multi-hop wireless communications, and packet latency therefore becomes a critical metric in such systems. Latency minimization in the in-band backhauling regime involves dynamic scheduling of active transmission links to avoid half-duplex conflicts, which incurs significant control overhead. In this paper, using the formalism of Markov decision processes (MDP), we identify a general fixed link activation policy and the associated policy design algorithm for tree-shaped in-band IAB systems with half-duplex constraints. The proposed policy, named “counter waves”, does not require signalling between the IAB donor and nodes and provides stable low latency for low-to-medium traffic conditions spanning up to 60% of the capacity region of the system.
    Peer reviewed.
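A fixed alternating-wave activation policy can be illustrated on a chain topology, a special case of the tree-shaped IAB systems above: links at even and odd hop depths are activated in alternating slots, so no node ever has to transmit and receive in the same slot. This is only a sketch of the half-duplex-safe scheduling idea, not the paper's exact "counter waves" policy.

```python
def counter_wave_active_links(num_hops, slot):
    """Hop indices of links active in a given slot under a fixed alternating
    (even/odd) wave schedule on a chain of num_hops links.
    Link h connects the node at depth h to the node at depth h + 1."""
    return [h for h in range(num_hops) if h % 2 == slot % 2]

def violates_half_duplex(active):
    """A node at depth d terminates links d - 1 and d; activating both in the
    same slot would make it transmit and receive simultaneously."""
    s = set(active)
    return any(h + 1 in s for h in s)
```

Because the active set never contains two consecutive hop indices, the schedule is half-duplex-safe in every slot without any per-slot signalling, matching the policy's fixed, signalling-free character.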

    Optimization of Handover, Survivability, Multi-Connectivity and Secure Slicing in 5G Cellular Networks using Matrix Exponential Models and Machine Learning

    Title from PDF of title page, viewed January 31, 2023. Dissertation advisor: Cory Beard. Vita. Includes bibliographical references (pages 173-194). Dissertation (Ph.D.)--Department of Computer Science and Electrical Engineering, University of Missouri--Kansas City, 2022.
    This work proposes optimization of cellular handovers, cellular network survivability modeling, multi-connectivity and secure network slicing using matrix exponentials and machine learning techniques. We propose matrix exponential (ME) modeling of handover arrivals, with the potential to much more accurately characterize arrivals and prioritize resource allocation for handovers, especially handovers for emergency or public safety needs. With the use of a ‘B’ matrix for representing a handover arrival, we have a rich set of dimensions to model system handover behavior. We can study multiple parameters and the interactions between system events along with the user mobility that would trigger a handover in any given scenario. Additionally, unlike traditional handover improvement schemes, we develop a ‘Deep-Mobility’ model by implementing a deep learning neural network (DLNN) to manage network mobility, utilizing in-network deep learning and prediction. We use the radio and network key performance indicators (KPIs) to train our model to analyze network traffic and handover requirements. Cellular network design must incorporate disaster response, recovery and repair scenarios. Requirements for high reliability and low latency often fail to incorporate network survivability for mission-critical and emergency services. Our Matrix Exponential (ME) model shows how survivable networks can be designed by controlling the number of crews, the times taken for individual repair stages, and the balance between fast and slow repairs. Transient and steady-state representations of system repair models, namely fast and slow repairs for networks consisting of multiple repair crews, have been analyzed.
Failures are exponentially modeled as per common practice, but ME distributions describe the more complex recovery processes. In some mission-critical communications, the availability requirements may exceed five or even six nines (99.9999%). To meet such a critical requirement and minimize the impact of mobility during handover, a Fade Duration Outage Probability (FDOP) based multiple-radio-link connectivity handover method has been proposed. By applying such a method, a high degree of availability can be achieved by utilizing two or more uncorrelated links based on minimum FDOP values. Packet duplication (PD) via multi-connectivity is a method of compensating for lost packets on a wireless channel. Utilizing two or more uncorrelated links, a high degree of availability can be attained with this strategy. However, complete packet duplication is inefficient and frequently unnecessary. We provide a novel adaptive fractional packet duplication (A-FPD) mechanism for enabling and disabling packet duplication based on a variety of parameters. We have developed a ‘DeepSlice’ model by implementing a Deep Learning (DL) Neural Network to manage network load efficiency and network availability, utilizing in-network deep learning and prediction. Our Neural Network based ‘Secure5G’ Network Slicing model will proactively detect and eliminate threats based on incoming connections before they infiltrate the 5G core network elements.
These will enable network operators to sell network slicing as a service to serve diverse services efficiently over a single infrastructure with a higher level of security and reliability.
Contents: Introduction -- Matrix exponential and deep learning neural network modeling of cellular handovers -- Survivability modeling in cellular networks -- Multi connectivity based handover enhancement and adaptive fractional packet duplication in 5G cellular networks -- Deepslice and Secure5G: a deep learning framework towards an efficient, reliable and secure network slicing in 5G networks -- Conclusion and future scope.
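The matrix exponential machinery behind the modeling above can be sketched concretely. A matrix exponential (phase-type) distribution is defined by an initial vector α and a generator matrix B, with CDF F(t) = 1 − α·exp(Bt)·1. The example below is an assumption-laden illustration, not the dissertation's actual B matrices: it uses a 2-phase Erlang repair-time model with rate λ and a naive Taylor-series matrix exponential, and checks the ME form against the Erlang-2 closed form.

```python
import math

def matmul2(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm2(M, terms=60):
    """Naive Taylor-series exponential of a 2x2 matrix (fine for small ||M||)."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in matmul2(term, M)]  # M^k / k!
        result = [[result[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return result

def me_cdf(t, lam):
    """CDF of an Erlang-2 repair time written as a matrix exponential
    distribution: F(t) = 1 - alpha * exp(B t) * 1."""
    B = [[-lam, lam], [0.0, -lam]]   # generator over the two transient phases
    alpha = [1.0, 0.0]               # repair starts in phase 0
    E = expm2([[B[i][j] * t for j in range(2)] for i in range(2)])
    survival = sum(alpha[i] * E[i][j] for i in range(2) for j in range(2))
    return 1.0 - survival
```

For this B, exp(Bt) works out to e^{-λt}[[1, λt], [0, 1]], so the ME CDF reproduces the Erlang-2 closed form 1 − e^{-λt}(1 + λt); richer B matrices then capture multi-stage, multi-crew repair processes the same way.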

    Fast-RAT Scheduling in a 5G Multi-RAT Scenario

    The authors exploit a Fast RAT switch solution to improve QoS metrics of the system by means of efficient RAT scheduling. The analyses presented here provide a better understanding of which system measurements are most effective in a multiple-RAT scenario. More specifically, they present an analysis of the metrics that should be used as RAT scheduling criteria and how frequently these switching evaluations should be performed.
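Metric-driven RAT scheduling of this kind can be sketched as scoring each available RAT from its current KPIs and re-evaluating the choice at a fixed switching period. The weights, KPI choices, and RAT names below are invented for illustration, not the authors' criteria.

```python
def rat_score(throughput_mbps, latency_ms, w_tp=1.0, w_lat=0.5):
    """Higher throughput and lower latency yield a higher score (toy weights)."""
    return w_tp * throughput_mbps - w_lat * latency_ms

def schedule_rat(measurements):
    """Pick the RAT with the best score from a dict mapping RAT name to a
    (throughput_mbps, latency_ms) measurement tuple."""
    return max(measurements, key=lambda rat: rat_score(*measurements[rat]))
```

In a real system the interesting questions are exactly the ones the abstract raises: which measurements feed `rat_score`, and how often `schedule_rat` should run, since too-frequent evaluation triggers ping-pong switching.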

    Spectrum Sharing, Latency, and Security in 5G Networks with Application to IoT and Smart Grid

    The surge of mobile devices, such as smartphones and tablets, demands additional capacity. On the other hand, Internet-of-Things (IoT) and smart grid, which connect numerous sensors, devices, and machines, require ubiquitous connectivity and data security. Additionally, some use cases, such as automated manufacturing processes, automated transportation, and smart grid, require latency as low as 1 ms and reliability as high as 99.99%. To enhance throughput and support massive connectivity, sharing of the unlicensed spectrum (3.5 GHz, 5 GHz, and mmWave) is a potential solution. Meanwhile, to address latency, drastic changes in the network architecture are required. The fifth generation (5G) cellular networks will embrace spectrum sharing and network architecture modifications to address throughput enhancement, massive connectivity, and low latency. To utilize the unlicensed spectrum, we propose a fixed duty-cycle based coexistence of LTE and WiFi, in which the duty cycle of LTE transmission can be adjusted based on the amount of data. In the second approach, a multi-armed bandit learning based coexistence of LTE and WiFi has been developed. The duty cycle of transmission and the downlink power are adapted through exploration and exploitation. This approach improves the aggregated capacity by 33%, along with cell-edge and energy-efficiency enhancements. We also investigate the performance of LTE and ZigBee coexistence using smart grid as a scenario. Regarding low latency, we summarize the existing works into three domains in the context of 5G networks: core, radio and caching networks. Along with this, fundamental constraints for achieving low latency are identified, followed by a general overview of exemplary 5G networks. Besides that, a loop-free, low-latency and local-decision based routing protocol is derived in the context of smart grid. This approach ensures low-latency and reliable data communication for stationary devices.
To address data security in wireless communication, we introduce geo-location based data encryption, along with node authentication by the k-nearest neighbor algorithm. In the second approach, node authentication by a support vector machine, along with public-private key management, is proposed. Both approaches ensure data security without increasing the packet overhead compared to existing approaches.
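The multi-armed bandit duty-cycle adaptation described above can be sketched as an epsilon-greedy search over candidate LTE duty cycles. The reward model below, linear LTE/WiFi throughputs with a WiFi-starvation penalty, and all of its constants are invented for illustration; the thesis learns from live channel feedback rather than a closed-form reward.

```python
import random

DUTY_CYCLES = [0.2, 0.4, 0.6, 0.8]

def aggregate_reward(d):
    """Toy reward: LTE capacity grows with its airtime share, WiFi with the
    remainder, and a penalty models WiFi starvation at high LTE duty cycles."""
    lte = 10.0 * d
    wifi = 8.0 * (1.0 - d)
    penalty = 5.0 if d > 0.6 else 0.0
    return lte + wifi - penalty

def adapt_duty_cycle(rounds=100, eps=0.2, seed=1):
    """Epsilon-greedy bandit over the candidate duty cycles."""
    rng = random.Random(seed)
    value = {d: aggregate_reward(d) for d in DUTY_CYCLES}  # one initial pull each
    count = {d: 1 for d in DUTY_CYCLES}
    for _ in range(rounds):
        if rng.random() < eps:
            d = rng.choice(DUTY_CYCLES)              # explore
        else:
            d = max(value, key=value.get)            # exploit
        r = aggregate_reward(d)
        count[d] += 1
        value[d] += (r - value[d]) / count[d]        # running-average update
    return max(value, key=value.get)
```

Under this toy reward the bandit settles on a 0.6 duty cycle: pushing LTE airtime higher raises LTE capacity but triggers the starvation penalty.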

    A backhaul adaptation scheme for IAB networks using deep reinforcement learning with recursive discrete choice model

    Challenges such as backhaul availability and backhaul scalability have continued to outweigh the progress of integrated access and backhaul (IAB) networks that enable multi-hop backhauling in 5G networks. These challenges, which are predominant in poor wireless channel conditions such as foliage, may lead to high energy consumption and packet losses. It is essential that the IAB topology enable efficient traffic flow by minimizing congestion and increasing robustness to backhaul failure. This article proposes a backhaul adaptation scheme that is controlled by the load on the access side of the network. The routing problem is formulated as a constrained Markov decision process and solved using a dual decomposition approach due to the existence of explicit and implicit constraints. A deep reinforcement learning (DRL) strategy that takes advantage of a recursive discrete choice model (RDCM) is proposed and implemented in a knowledge-defined networking architecture of an IAB network. The incorporation of the RDCM is shown to improve robustness to backhaul failure in IAB networks. The performance of the proposed algorithm is compared to that of conventional DRL (i.e., without RDCM) and generative model-based learning (GMBL) algorithms. The simulation results reveal risk perception, introduced through biases on alternative choices, and show that the proposed algorithm provides better throughput and delay performance than the two baselines.
    The Sentech Chair in Broadband Wireless Multimedia Communications (BWMC) and the University of Pretoria. https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6287639
    Electrical, Electronic and Computer Engineering.
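The basic building block of a discrete choice model is the multinomial logit, which turns per-route utilities into selection probabilities; the recursive variant used in the paper composes such choices along the backhaul tree, and the bias terms it learns are what shift probability mass between alternative routes. The sketch below shows only the logit block, with invented utilities.

```python
import math

def logit_choice_probs(utilities):
    """Multinomial-logit route-choice probabilities: P(i) proportional to exp(u_i)."""
    m = max(utilities)                       # subtract the max for numerical stability
    weights = [math.exp(u - m) for u in utilities]
    total = sum(weights)
    return [w / total for w in weights]
```

Raising a route's utility (or adding a bias toward it) smoothly raises its selection probability without ever zeroing out the alternatives, which is what keeps backup routes alive and improves robustness to backhaul failure.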

    A review on green caching strategies for next generation communication networks

    © 2020 IEEE. In recent years, the ever-increasing demand for networking resources and energy, fueled by the unprecedented upsurge in Internet traffic, has been a cause for concern for many service providers. Content caching, which serves user requests locally, is deemed to be an enabling technology in addressing the challenges posed by the phenomenal growth in Internet traffic. Conventionally, content caching is considered a viable solution to alleviate backhaul pressure. However, recently, many studies have reported energy cost reductions contributed by content caching in cache-equipped networks. The hypothesis is that caching shortens the content delivery distance and eventually achieves a significant reduction in transmission energy consumption. This has motivated us to conduct this study, and in this article a comprehensive survey of state-of-the-art green caching techniques is provided. This review paper extensively discusses the contributions of existing studies on green caching. In addition, the study explores different cache-equipped network types, solution methods, and application scenarios. We show that the optimal selection of caching nodes, smart resource management, popular content selection, and renewable energy integration can substantially improve the energy efficiency of cache-equipped systems. In addition, based on this comprehensive analysis, we highlight some potential research ideas relevant to green content caching.
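The energy hypothesis above, that caching popular content at the edge shortens delivery distance and thus cuts transmission energy, can be sketched with a toy model. The per-delivery energies and the Zipf-like popularity profile below are invented for illustration, not taken from any surveyed study.

```python
def delivery_energy(popularity, cache_size, e_edge=1.0, e_core=5.0):
    """Expected per-request energy when the cache_size most popular items are
    cached at the edge: hits cost e_edge, misses traverse the backhaul (e_core).

    popularity: unnormalized request frequencies, one per content item.
    """
    ranked = sorted(popularity, reverse=True)
    hit_ratio = sum(ranked[:cache_size]) / sum(ranked)
    return hit_ratio * e_edge + (1.0 - hit_ratio) * e_core
```

Because request popularity is heavily skewed, caching even a small number of the most popular items captures a large hit ratio, which is the same reason the survey highlights popular content selection as a key lever for energy efficiency.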