    Joint Satellite Gateway Placement and Routing for Integrated Satellite-Terrestrial Networks

    With the increasing attention to integrated satellite-terrestrial networks (ISTNs), the satellite gateway placement problem has become of paramount importance. The resulting network performance may vary depending on the design strategy. In this paper, a joint satellite gateway placement and routing strategy for the terrestrial network is proposed to minimize the overall cost of gateway deployment and traffic routing, while adhering to the average delay requirement for traffic demands. Although traffic routing and gateway placement can be solved independently, the dependence between the routing decisions for different demands makes it more realistic to solve an aggregated model instead. We develop a mixed-integer linear program (MILP) formulation for the problem. We then relax the integrality constraints to obtain a linear program (LP), which reduces time complexity at the expense of a sub-optimal solution. We further propose a variant of the model to balance the load between the selected gateways.
    Comment: 6 pages. N. Torkzaban, A. Gholami, J. S. Baras and C. Papagianni, "Joint Satellite Gateway Placement and Routing for Integrated Satellite-Terrestrial Networks," ICC 2020 - 2020 IEEE International Conference on Communications (ICC), Dublin, Ireland, 2020, pp. 1-6. doi: 10.1109/ICC40277.2020.914917. https://ieeexplore.ieee.org/document/9149175
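    The coupling the abstract describes — a shared delay budget tying the routing decisions for different demands together — can be illustrated with a brute-force search on a made-up toy instance (this is a sketch of the joint problem, not the paper's MILP formulation; all costs, delays, and names below are invented for illustration):

```python
from itertools import combinations, product

# Toy instance (illustrative numbers, not from the paper):
open_cost = {"g1": 10.0, "g2": 6.0, "g3": 8.0}   # gateway deployment cost
route_cost = {                                    # routing cost: demand -> gateway
    "d1": {"g1": 1.0, "g2": 4.0, "g3": 2.0},
    "d2": {"g1": 3.0, "g2": 1.0, "g3": 5.0},
    "d3": {"g1": 2.0, "g2": 2.0, "g3": 1.0},
    "d4": {"g1": 5.0, "g2": 3.0, "g3": 2.0},
}
delay = {                                         # per-demand delay (ms) via each gateway
    "d1": {"g1": 5, "g2": 20, "g3": 12},
    "d2": {"g1": 15, "g2": 6, "g3": 25},
    "d3": {"g1": 10, "g2": 11, "g3": 7},
    "d4": {"g1": 22, "g2": 14, "g3": 9},
}
MAX_AVG_DELAY = 12.0

def best_placement():
    """Jointly choose which gateways to open and how to route every demand,
    minimising deployment + routing cost under the average-delay budget."""
    demands = sorted(route_cost)
    sites = sorted(open_cost)
    best_cost, best_sites = float("inf"), None
    for k in range(1, len(sites) + 1):
        for subset in combinations(sites, k):
            # Enumerate routing choices jointly: the shared delay budget couples
            # the demands, so greedy per-demand routing can be infeasible.
            for choice in product(subset, repeat=len(demands)):
                avg = sum(delay[d][g] for d, g in zip(demands, choice)) / len(demands)
                if avg > MAX_AVG_DELAY:
                    continue
                total = (sum(open_cost[g] for g in subset)
                         + sum(route_cost[d][g] for d, g in zip(demands, choice)))
                if total < best_cost:
                    best_cost, best_sites = total, subset
    return best_cost, best_sites

cost, sites = best_placement()
```

    On this instance no single gateway satisfies the delay budget, so the search opens two — exactly the kind of trade-off the MILP captures at scale, where enumeration is no longer an option.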

    Optimal placement of User Plane Functions in 5G networks

    Because of developments in society and technology, new services and use cases have emerged, such as vehicle-to-everything communication and smart manufacturing. Some of these services have stringent requirements in terms of reliability, bandwidth, and network response time; to meet them, network functions (NFs) must be deployed closer to users. Doing so increases costs and the number of NFs. Under such circumstances, optimization strategies for the placement of NFs are crucial to offer Quality of Service (QoS) in a cost-effective manner. In this vein, this paper addresses the User Plane Functions Placement (UPFP) problem in 5G networks. The UPFP is modeled as a Mixed-Integer Linear Programming (MILP) problem aimed at determining the optimal number and location of User Plane Functions (UPFs). Two optimization models are proposed that consider various parameters, such as latency, reliability, and user mobility. To evaluate their performance, two services under the Ultra-Reliable and Low-Latency Communication (URLLC) category were selected. The acquired results showcase the effectiveness of our solutions.
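    At its core, choosing the minimal number and location of UPFs under a latency bound is a covering problem. The following toy sketch (invented topology and numbers, not the paper's models) finds the smallest set of candidate sites such that every gNB has a UPF within budget:

```python
from itertools import combinations

# Toy topology (illustrative, not from the paper): one-way latency (ms)
# from each gNB to each candidate UPF site.
latency = {
    "gnb1": {"s1": 2, "s2": 9, "s3": 6},
    "gnb2": {"s1": 8, "s2": 3, "s3": 5},
    "gnb3": {"s1": 7, "s2": 4, "s3": 2},
    "gnb4": {"s1": 3, "s2": 8, "s3": 9},
}
LATENCY_BOUND = 4  # URLLC-style budget: every gNB needs a UPF within 4 ms

def min_upf_placement():
    """Return the smallest set of UPF sites so every gNB meets the budget."""
    sites = sorted({s for lats in latency.values() for s in lats})
    for k in range(1, len(sites) + 1):          # try smaller deployments first
        for subset in combinations(sites, k):
            if all(min(latency[g][s] for s in subset) <= LATENCY_BOUND
                   for g in latency):
                return subset
    return None                                  # no feasible placement
```

    A MILP formulation generalises this by adding costs, capacities, reliability, and mobility terms, which is where exact solvers earn their keep.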

    Learning Failure-Inducing Models for Testing Software-Defined Networks

    Software-defined networks (SDN) enable flexible and effective communication systems, e.g., data centers, that are managed by centralized software controllers. However, such a controller can undermine the underlying communication network of an SDN-based system and thus must be carefully tested. When an SDN-based system fails, engineers need to precisely understand the conditions under which the failure occurs in order to address it. In this paper, we introduce a machine learning-guided fuzzing method, named FuzzSDN, aiming at both (1) generating effective test data leading to failures in SDN-based systems and (2) learning accurate failure-inducing models that characterize conditions under which such a system fails. This is done in a synergistic manner: the learned models guide test generation, and the generated test data in turn improves the models. To our knowledge, FuzzSDN is the first attempt to simultaneously address these two objectives for SDNs. We evaluate FuzzSDN by applying it to systems controlled by two open-source SDN controllers. Further, we compare FuzzSDN with two state-of-the-art methods for fuzzing SDNs and two baselines (i.e., simple extensions of these two existing methods) for learning failure-inducing models. Our results show that (1) compared to the state-of-the-art methods, FuzzSDN generates at least 12 times more failures, within the same time budget, with a controller that is fairly robust to fuzzing, and (2) our failure-inducing models have, on average, a precision of 98% and a recall of 86%, significantly outperforming the baselines.
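    The synergy between fuzzing and model learning can be sketched as a simple loop (this is a heavily simplified illustration, not FuzzSDN itself: the system under test, the hidden failure rule, and the interval-based "model" are all invented):

```python
import random

random.seed(0)

def system_under_test(pkt_len):
    """Stand-in for an SDN controller: fails on a hidden range of malformed
    packet-length values (the condition the fuzzer tries to learn)."""
    return "fail" if 120 <= pkt_len <= 160 else "ok"

def learn_model(samples):
    """Toy failure-inducing model: the interval spanned by observed failures."""
    fails = [x for x, label in samples if label == "fail"]
    return (min(fails), max(fails)) if fails else None

def fuzz(rounds=5, per_round=200):
    """Alternate fuzzing and learning: the model focuses later rounds on the
    suspected failure region, which in turn sharpens the model."""
    samples, model = [], None
    for _ in range(rounds):
        for _ in range(per_round):
            if model:                                  # model-guided generation
                lo, hi = model
                x = random.randint(max(0, lo - 10), min(255, hi + 10))
            else:                                      # exploratory generation
                x = random.randint(0, 255)
            samples.append((x, system_under_test(x)))
        model = learn_model(samples)
    return model, sum(1 for _, label in samples if label == "fail")

model, n_failures = fuzz()
```

    Guided rounds concentrate inputs near the learned region, so the failure count grows far faster than under uniform fuzzing — the same feedback effect the paper exploits with a real rule-learning algorithm.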

    Detailed Review on The Denial of Service (DoS) and Distributed Denial of Service (DDoS) Attacks in Software Defined Networks (SDNs) and Defense Strategies

    The development of Software Defined Networking (SDN) has altered the landscape of computer networking in recent years. Its scalable architecture has become a blueprint for the design of several advanced future networks. To achieve improved and efficient monitoring, control, and management of the network, software-defined networks decouple the control logic from the data-forwarding plane. As a result, logical control is centralized solely in the controller. Due to this centralized nature, SDNs are exposed to several vulnerabilities such as spoofing, flooding, and primarily Denial of Service (DoS) and Distributed Denial of Service (DDoS), among other attacks. In effect, the performance of an SDN degrades under these attacks. This paper presents a comprehensive review of several DoS and DDoS defense/mitigation strategies and classifies them into distinct classes with regard to the methodologies employed. Furthermore, suggestions are made to enhance current mitigation strategies accordingly.
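    One widely studied class of DDoS detection in SDN surveys is statistical: a flood aimed at one victim collapses the entropy of the destination-IP distribution seen by the controller. A minimal sketch, assuming synthetic traffic windows and an arbitrary threshold (both invented here for illustration):

```python
import math
from collections import Counter

def dst_ip_entropy(window):
    """Shannon entropy (bits) of the destination-IP distribution in a window."""
    counts = Counter(window)
    n = len(window)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

ENTROPY_THRESHOLD = 1.0   # tuned per deployment; low entropy means traffic
                          # is concentrated on very few destinations

def looks_like_ddos(window):
    return dst_ip_entropy(window) < ENTROPY_THRESHOLD

# Normal traffic spread over 8 hosts vs. a flood aimed at one victim.
normal = ["10.0.0.%d" % (i % 8) for i in range(400)]
attack = ["10.0.0.1"] * 380 + ["10.0.0.%d" % (i % 8) for i in range(20)]
```

    The normal window scores 3 bits (uniform over 8 hosts), while the attack window falls well below the threshold; real mitigation schemes pair such a detector with flow-rule actions at the controller.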

    Decentralized learning based indoor interference mitigation for 5G-and-beyond systems

    Due to the explosive growth of data traffic and poor indoor coverage, the ultra-dense network (UDN) has been introduced as a fundamental architectural technology for 5G-and-beyond systems. As telecom operators shift to a plug-and-play paradigm in mobile networks, network planning and optimization become difficult and costly, especially for residential small-cell base station (SBS) deployments. Under this circumstance, severe inter-cell interference (ICI) becomes inevitable. Therefore, interference mitigation is of vital importance for indoor coverage in mobile communication systems. In this paper, we propose a fully distributed self-learning interference mitigation (SLIM) scheme for autonomous networks under a model-free multi-agent reinforcement learning (MARL) framework. In SLIM, individual SBSs autonomously perceive surrounding interference and determine downlink transmit power without the need for signaling interactions between SBSs. To tackle the curse of dimensionality of the joint action space in the MARL model, we employ the Mean Field Theory to approximate the action-value function, greatly decreasing the computational complexity. Simulation results based on the 3GPP dual-stripe urban model demonstrate that SLIM outperforms several known interference coordination schemes in mitigating interference and reducing power consumption while guaranteeing UEs' quality of service for autonomous UDNs.
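    The mean-field idea can be seen in a toy form: each agent conditions its value estimate on the *mean* action of the others rather than on the full joint action, so the table grows with the action set, not with the number of agents. The sketch below is a loose illustration with an invented reward and parameters, not the SLIM algorithm:

```python
import math
import random

random.seed(1)

POWERS = [0.1, 0.5, 1.0]         # discrete downlink transmit-power levels (W)
N_SBS = 4                        # small-cell base stations (agents)
ALPHA, EPS, ROUNDS = 0.2, 0.1, 1000

# Q[i][(a, m)]: value for SBS i of power a when the mean power of the other
# SBSs is (coarsely) m. The mean-field approximation replaces the joint
# action of all neighbours with this single mean action.
Q = [{} for _ in range(N_SBS)]

def reward(own, mean_others):
    """Toy utility: log-rate against interference, minus a power penalty."""
    sinr = own / (0.1 + mean_others)
    return math.log2(1 + sinr) - 0.5 * own

def step(actions):
    new_actions = list(actions)
    for i in range(N_SBS):
        mean_others = sum(actions[j] for j in range(N_SBS) if j != i) / (N_SBS - 1)
        m = round(mean_others, 1)                  # discretise the mean field
        if random.random() < EPS:                  # epsilon-greedy exploration
            a = random.choice(POWERS)
        else:
            a = max(POWERS, key=lambda p: Q[i].get((p, m), 0.0))
        r = reward(a, mean_others)                 # no signalling between SBSs
        Q[i][(a, m)] = Q[i].get((a, m), 0.0) + ALPHA * (r - Q[i].get((a, m), 0.0))
        new_actions[i] = a
    return new_actions

actions = [1.0] * N_SBS
for _ in range(ROUNDS):
    actions = step(actions)
```

    Each SBS only observes the aggregate interference it experiences, mirroring the fully distributed, signaling-free operation described in the abstract.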

    Radio Resource Management for New Application Scenarios in 5G: Optimization and Deep Learning

    The fifth-generation (5G) New Radio (NR) systems are expected to support a wide range of emerging applications with diverse Quality-of-Service (QoS) requirements. New application scenarios in 5G NR include enhanced mobile broadband (eMBB), massive machine-type communication (mMTC), and ultra-reliable low-latency communications (URLLC). New wireless architectures, such as full-dimension (FD) massive multiple-input multiple-output (MIMO) and mobile edge computing (MEC) systems, and new coding schemes, such as short block-length channel coding, are envisioned as enablers of the QoS requirements for 5G NR applications. Resource management in these new wireless architectures is crucial in guaranteeing the QoS requirements of 5G NR systems. The traditional optimization problems, such as subcarrier allocation and user association, are usually non-convex or Non-deterministic Polynomial-time (NP)-hard. Finding the optimal solution is time-consuming and computationally expensive, especially in a large-scale network. One approach is to design a low-complexity algorithm with near-optimal performance. In cases where low-complexity algorithms are hard to obtain, deep learning can be used as an accurate approximator that maps environment parameters, such as the channel state information and traffic state, to the optimal solutions. In this thesis, we design low-complexity optimization algorithms and deep learning frameworks in different architectures of 5G NR to resolve optimization problems subject to QoS requirements. First, we propose a low-complexity algorithm for a joint cooperative beamforming and user association problem for eMBB in 5G NR to maximize the network capacity. Next, we propose a deep learning (DL) framework to optimize user association, resource allocation, and offloading probabilities for delay-tolerant services and URLLC in 5G NR. Finally, we address the issue of time-varying traffic and network conditions on resource management in 5G NR.
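    The "learned approximator of a solver" idea can be shown end-to-end at toy scale: an exact water-filling power allocator provides labels, and a single linear layer trained by SGD learns to map channel gains directly to allocations. Everything here (the three-channel setup, the linear model, all constants) is an invented stand-in for the deep networks discussed in the text:

```python
import math
import random

random.seed(42)
P_TOTAL = 10.0       # total transmit power budget
N_CH = 3             # subchannels

def water_filling(gains):
    """Exact optimal allocation p_i = max(0, mu - 1/g_i) with sum p_i = P,
    found by bisection on the water level mu."""
    lo, hi = 0.0, P_TOTAL + max(1.0 / g for g in gains)
    for _ in range(60):
        mu = (lo + hi) / 2
        if sum(max(0.0, mu - 1.0 / g) for g in gains) > P_TOTAL:
            hi = mu
        else:
            lo = mu
    return [max(0.0, mu - 1.0 / g) for g in gains]

# Learned approximator: one linear layer trained by SGD to imitate the solver.
W = [[0.0] * N_CH for _ in range(N_CH)]
b = [P_TOTAL / N_CH] * N_CH           # start from an equal power split

def predict(g):
    return [sum(W[i][j] * g[j] for j in range(N_CH)) + b[i] for i in range(N_CH)]

def train(steps=2000, lr=0.01):
    for _ in range(steps):
        g = [random.uniform(0.2, 2.0) for _ in range(N_CH)]
        y = water_filling(g)                     # the solver provides the label
        p = predict(g)
        for i in range(N_CH):
            err = p[i] - y[i]
            for j in range(N_CH):
                W[i][j] -= lr * err * g[j]
            b[i] -= lr * err

def mse(n=200):
    total = 0.0
    for _ in range(n):
        g = [random.uniform(0.2, 2.0) for _ in range(N_CH)]
        total += sum((p - y) ** 2 for p, y in zip(predict(g), water_filling(g)))
    return total / n

before = mse()
train()
after = mse()
```

    After training, the model answers in one matrix-vector product what the solver answers by iteration — the speed-up that motivates DL-based resource management at network scale, where the solved problems are far harder than water-filling.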