10 research outputs found

    Traffic Prediction Based on Random Connectivity in Deep Learning with Long Short-Term Memory

    Traffic prediction plays an important role in evaluating the performance of telecommunication networks and attracts intense research interest. Many algorithms and models have been proposed to analyse traffic data and make predictions. In the recent big data era, deep learning has been exploited to mine the profound information hidden in the data. In particular, Long Short-Term Memory (LSTM), a kind of Recurrent Neural Network (RNN), has attracted much attention due to its capability of processing the long-range dependencies embedded in sequential traffic data. However, LSTM has considerable computational cost, which cannot be tolerated in tasks with stringent latency requirements. In this paper, we propose a deep learning model based on LSTM, called Random Connectivity LSTM (RCLSTM). Compared with the conventional LSTM, RCLSTM departs notably in how the neural network is formed: neurons are connected in a stochastic manner rather than fully connected. With this intrinsic sparsity, the RCLSTM omits many neural connections, which reduces both the number of parameters to be trained and the computational cost. We apply the RCLSTM to traffic prediction and validate that the RCLSTM with only 35% neural connectivity still achieves satisfactory performance. As training samples are gradually added, the performance of the RCLSTM approaches that of the baseline LSTM. Moreover, for input traffic sequences of sufficient length, the RCLSTM exhibits even higher prediction accuracy than the baseline LSTM. Comment: 6 pages, 9 figures
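The core idea of the abstract, stochastic rather than full connectivity, can be sketched as a fixed binary mask applied to an LSTM's gate weights. This is a minimal illustration only; the dimensions and the masking helper are hypothetical, not the paper's implementation.

```python
import random

def random_connectivity_mask(n_in, n_out, connectivity=0.35, seed=0):
    # Sample a fixed binary mask once, before training: each potential
    # connection is kept with probability `connectivity` (35% matches
    # the sparsest setting reported in the abstract).
    rng = random.Random(seed)
    return [[1.0 if rng.random() < connectivity else 0.0
             for _ in range(n_out)] for _ in range(n_in)]

# Hypothetical gate-weight dimensions: 8 inputs, 16 hidden units, 4 gates.
n_in, n_hidden = 8, 16
mask = random_connectivity_mask(n_in, 4 * n_hidden)
kept = sum(sum(row) for row in mask)
total = n_in * 4 * n_hidden
# Roughly 65% of the input-to-gate connections are absent, so the
# element-wise product W * mask has ~65% fewer trainable parameters
# than a fully connected LSTM of the same size.
```

In practice the mask would be multiplied element-wise into the weight matrix at every forward pass, so the absent connections never contribute and never receive gradient updates.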

    Traffic-Profile and Machine Learning Based Regional Data Center Design and Operation for 5G Network

    Data centers in the fifth-generation (5G) network will serve as facilitators to move the wireless communication industry from a proprietary-hardware-based approach to a more software-oriented environment. Techniques such as software-defined networking (SDN) and network function virtualization (NFV) make it possible to deploy network functionalities, such as service and packet gateways, as software. These virtual functionalities, however, require computational power from data centers. Therefore, these data centers need to be properly placed and carefully designed based on the volume of traffic they are meant to serve. In this work, we first divide the city of Milan, Italy into different zones using the K-means clustering algorithm. We then analyse the traffic profiles of these zones using a network operator's Open Big Data set. We formulate the optimal placement of data centers as a facility location problem and propose the use of Weiszfeld's algorithm to solve it. Furthermore, based on our analysis of the traffic profiles of the different zones, we heuristically determine the ideal dimension of the data center in each zone. Additionally, to aid operation and facilitate dynamic utilization of data center resources, we use state-of-the-art recurrent neural network models to predict future traffic demands from the past demand profile of each area.
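Weiszfeld's algorithm, which the abstract proposes for the facility location step, is an iteratively re-weighted averaging scheme for the geometric median. A minimal sketch (the point format and weighting are illustrative assumptions, not the paper's exact formulation):

```python
import math

def weiszfeld(points, weights=None, iters=200, eps=1e-9):
    # Geometric median by Weiszfeld's iteration: each step moves the
    # estimate to the weighted average of the points, with weights
    # inversely proportional to the current distances.
    if weights is None:
        weights = [1.0] * len(points)
    # Start from the centroid.
    x = sum(p[0] for p in points) / len(points)
    y = sum(p[1] for p in points) / len(points)
    for _ in range(iters):
        num_x = num_y = denom = 0.0
        for (px, py), w in zip(points, weights):
            d = math.hypot(x - px, y - py)
            if d < eps:            # estimate coincides with a data point
                return (px, py)
            num_x += w * px / d
            num_y += w * py / d
            denom += w / d
        x, y = num_x / denom, num_y / denom
    return (x, y)
```

For data center placement, `points` would be the zone centroids from K-means and `weights` their traffic volumes, so the returned location minimizes the traffic-weighted sum of distances.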

    Intelligent resource scheduling for 5G radio access network slicing

    It is widely acknowledged that network slicing can tackle the diverse use cases and connectivity services of the forthcoming next-generation mobile networks (5G). Resource scheduling is of vital importance for improving the resource-multiplexing gain among slices while meeting specific service requirements for radio access network (RAN) slicing. Unfortunately, due to performance isolation, diversified service requirements, and network dynamics (including user mobility and channel states), resource scheduling in RAN slicing is very challenging. In this paper, we propose an intelligent resource scheduling strategy (iRSS) for 5G RAN slicing. The main idea of iRSS is to exploit a collaborative learning framework that combines deep learning (DL) with reinforcement learning (RL). Specifically, DL is used to perform large time-scale resource allocation, whereas RL is used to perform online resource scheduling for tackling small time-scale network dynamics, including inaccurate prediction and unexpected network states. Depending on the amount of available historical traffic data, iRSS can flexibly adjust the relative significance of the prediction and online decision modules in making resource scheduling decisions for the RAN. Numerical results show that the convergence of iRSS satisfies online resource scheduling requirements and that it can significantly improve resource utilization while guaranteeing performance isolation between slices, compared with other benchmark algorithms.
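The two time-scale split described above can be illustrated as a weighted blend: the DL prediction sets a baseline allocation, an RL correction handles short-term dynamics, and the weight shifts toward the prediction as historical data accumulates. The weighting rule below is a hypothetical sketch of this idea, not the paper's actual mechanism.

```python
def blend_weight(n_samples, n_saturate=10_000):
    # Hypothetical rule: trust the DL prediction module more as
    # historical traffic data accumulates, saturating at full trust.
    return min(n_samples / n_saturate, 1.0)

def schedule(predicted_alloc, rl_correction, n_samples):
    # Large time-scale DL prediction sets the baseline; the online RL
    # decision handles small time-scale dynamics such as prediction
    # error and unexpected network states.
    a = blend_weight(n_samples)
    return a * predicted_alloc + (1.0 - a) * rl_correction
```

With no historical data the scheduler relies entirely on the online RL decision; with abundant data it follows the prediction.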

    Network slice reconfiguration by exploiting deep reinforcement learning with large action space

    It is widely acknowledged that network slicing can tackle the diverse usage scenarios and connectivity services that the 5G-and-beyond system needs to support. To guarantee performance isolation while maximizing network resource utilization under dynamic traffic load, network slices need to be reconfigured adaptively. However, it is commonly believed that the fine-grained resource reconfiguration problem is intractable due to the extremely high computational complexity caused by its numerous variables. In this paper, we investigate reconfiguration within a core network slice with the aim of minimizing long-term resource consumption by exploiting Deep Reinforcement Learning (DRL). This problem is also intractable for a conventional Deep Q-Network (DQN), as it has a multi-dimensional discrete action space that is difficult to explore efficiently. To address the curse of dimensionality, we propose a discrete Branching Dueling Q-network (discrete BDQ) that incorporates the action branching architecture into DQN, drastically decreasing the number of estimated actions. Based on the discrete BDQ network, we develop an intelligent network slice reconfiguration algorithm (INSRA). Extensive simulation experiments are conducted to evaluate the performance of INSRA, and the numerical results reveal that INSRA can minimize long-term resource consumption and achieve high resource efficiency compared with several benchmark algorithms.
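The dimensionality reduction that action branching buys can be made concrete with a small calculation: a flat DQN needs one output per joint action, which grows exponentially in the number of action dimensions, while a branched network needs one head per dimension. The dimensions below (10 resource variables, 5 discrete levels each) are illustrative, not taken from the paper.

```python
def joint_action_count(n_dims, n_bins):
    # A flat DQN must enumerate every combination of per-dimension
    # choices: n_bins ** n_dims output values.
    return n_bins ** n_dims

def branched_action_count(n_dims, n_bins):
    # Action branching gives each dimension its own output head,
    # so only n_dims * n_bins values are estimated.
    return n_dims * n_bins

# e.g. 10 resource variables, each quantized to 5 levels:
flat = joint_action_count(10, 5)        # 9,765,625 joint actions
branched = branched_action_count(10, 5)  # 50 per-branch actions
```

The exponential-to-linear reduction is what makes the multi-dimensional discrete action space tractable for the DQN-style learner.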

    AI-based resource management in future mobile networks

    The 5G-and-beyond networks, supported by Artificial Intelligence algorithms for solving network optimization problems, have recently been studied in order to meet quality-of-service requirements regarding coverage, user capacity, and deployment cost. One of the essential needs is the optimized deployment of network base stations. This work proposes a meta-heuristic method, the Genetic Algorithm, to solve the optimization problem under demand constraints. The main goal is to present this alternative solution, namely using the Genetic Algorithm to optimize the deployment of the network's base stations. The resulting deployment provides the same services as the existing one while minimizing the energy consumption of the network infrastructure, considering both homogeneous and heterogeneous base station scenarios. The simulations were performed in the Python programming language, and the best deployment plans of each generation were presented and saved. A deployment of macro base stations only was compared with a deployment of smaller (in coverage) base stations on top of the existing one. By applying the small base stations, the network deployment enables user coverage enhancements and reduces the deployment cost, energy consumption, and inter-cell interference. All the scenarios were studied in three areas of different user density (A, B, and C). In meeting the QoS and UE requirements, small base station deployment is beneficial, particularly in hotspot areas.
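A Genetic Algorithm for base station deployment of the kind the abstract describes can be sketched as selection, crossover, and mutation over bit-vectors of candidate sites. The objective function below (penalizing unmet demand plus a per-station cost) and all parameters are illustrative assumptions, not the thesis's actual model.

```python
import random

def fitness(plan, demand, coverage, cost_per_bs):
    # Hypothetical objective: `plan` is a bit-vector, 1 = deploy a base
    # station at that candidate site. Penalize unmet demand heavily and
    # add a cost (e.g. energy) per deployed station.
    covered = sum(c for bit, c in zip(plan, coverage) if bit)
    unmet = max(demand - covered, 0)
    return -(unmet * 10 + sum(plan) * cost_per_bs)

def genetic_algorithm(n_sites, demand, coverage, cost_per_bs,
                      pop_size=30, generations=60, p_mut=0.05, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_sites)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, demand, coverage, cost_per_bs),
                 reverse=True)
        elite = pop[: pop_size // 2]             # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_sites)      # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda p: fitness(p, demand, coverage, cost_per_bs))
```

For example, with six candidate sites each covering 5 demand units and a total demand of 12, the best plan deploys exactly three stations: fewer leaves demand unmet, more only adds cost.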

    The Learning and Prediction of Application-Level Traffic Data in Cellular Networks
