10 research outputs found

    Mobile internet activity estimation and analysis at high granularity: SVR model approach

    Understanding mobile internet traffic patterns and the capacity to estimate future traffic, particularly at high spatiotemporal granularity, are crucial for proactive decision making in emerging and future cognizant cellular networks enabled with self-organizing features. This becomes even more important in the world of the 'Internet of Things', with machines communicating locally. In this paper, internet activity data from a mobile network operator's Call Detail Records (CDRs) is analysed at high granularity to study spatiotemporal variance and traffic patterns. To estimate future traffic at high granularity, a Support Vector Regression (SVR) based traffic model is trained and evaluated to predict the maximum, minimum, and average internet traffic in the next hour from the actual traffic in the last hour. The model's performance is compared with that of the state-of-the-art (SOTA) deep learning models recently proposed in the literature for the same data, granularity, and predicates. It is concluded that the SVR model outperforms the SOTA deep and non-deep learning methods used in the literature.
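    A minimal sketch of the kind of SVR pipeline described above, assuming the CDR internet activity comes in 10-minute slots aggregated to hourly maximum/minimum/mean values, with one regressor per target; the data here is synthetic and the hyperparameters are illustrative, not the paper's:

    ```python
    # Sketch: next-hour traffic prediction with SVR from last-hour aggregates.
    # All data is synthetic; feature construction is an assumption.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(42)
    slots = rng.gamma(2.0, 50.0, size=(1000, 6))       # 1000 hours x 6 slots/hour

    # Hourly aggregates: max, min, mean internet activity per hour.
    hourly = np.stack([slots.max(1), slots.min(1), slots.mean(1)], axis=1)

    X, Y = hourly[:-1], hourly[1:]                     # last hour -> next hour
    split = int(0.8 * len(X))

    for j, name in enumerate(["max", "min", "mean"]):  # one SVR per target
        svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
        svr.fit(X[:split], Y[:split, j])
        mae = np.abs(svr.predict(X[split:]) - Y[split:, j]).mean()
        print(f"{name}: test MAE = {mae:.2f}")
    ```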

    Machine learning adaptive computational capacity prediction for dynamic resource management in C-RAN

    Efficient computational resource management in 5G Cloud Radio Access Network (C-RAN) environments is a challenging problem because it has to account simultaneously for throughput, latency, power efficiency, and optimization tradeoffs. The assumption of a fixed computational capacity at the baseband unit (BBU) pools may result in underutilized or oversubscribed resources, thus affecting the overall Quality of Service (QoS). As resources are virtualized at the BBU pools, they could be dynamically instantiated according to the required computational capacity (RCC). In this paper, a new strategy for Dynamic Resource Management with Adaptive Computational capacity (DRM-AC) using machine learning (ML) techniques is proposed. Three ML algorithms have been tested to select the best predicting approach: support vector machine (SVM), time-delay neural network (TDNN), and long short-term memory (LSTM). DRM-AC reduces the average of unused resources by 96%, but there is still QoS degradation when the RCC is higher than the predicted computational capacity (PCC). To further improve, two new strategies are proposed and tested in a realistic scenario: DRM-AC with pre-filtering (DRM-AC-PF) and DRM-AC with error shifting (DRM-AC-ES), reducing the average of unsatisfied resources by 98% and 99.9% compared to DRM-AC, respectively. This work was supported in part by the Spanish Ministry of Science through the project CRIN-5G (RTI2018-099880-B-C32) with ERDF (European Regional Development Fund) and in part by the UPC through the COST CA15104 IRACON EU Project and the FPI-UPC-2018 Grant.
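    The error-shifting idea (DRM-AC-ES) can be illustrated with a toy provisioning loop: the capacity actually instantiated is the ML prediction plus a margin derived from recent under-prediction errors. This is only a sketch of the concept under that assumption; the paper's exact shifting rule and its SVM/TDNN/LSTM predictors are not reproduced here:

    ```python
    # Illustrative error-shifting provisioner; the class name, window size, and
    # shifting rule are assumptions, not the paper's DRM-AC-ES specification.
    from collections import deque

    class ErrorShiftedProvisioner:
        def __init__(self, window=24):
            self.errors = deque(maxlen=window)  # recent (actual - predicted) gaps

        def provision(self, predicted_rcc):
            # Shift by the largest recent under-prediction to cut QoS violations.
            shift = max(self.errors, default=0.0)
            return predicted_rcc + max(shift, 0.0)

        def observe(self, predicted_rcc, actual_rcc):
            self.errors.append(actual_rcc - predicted_rcc)

    prov = ErrorShiftedProvisioner()
    history = [(100, 110), (120, 115), (90, 130)]   # (predicted, actual) pairs
    for pred, actual in history:
        capacity = prov.provision(pred)
        prov.observe(pred, actual)
        print(f"predicted={pred}, provisioned={capacity:.0f}, actual={actual}")
    ```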

    Traffic-Profile and Machine Learning Based Regional Data Center Design and Operation for 5G Network

    Data centers in the fifth generation (5G) network will serve as facilitators to move the wireless communication industry from a proprietary hardware-based approach to a more software-oriented environment. Techniques such as software-defined networking (SDN) and network function virtualization (NFV) make it possible to deploy network functionalities such as service and packet gateways as software. These virtual functionalities, however, require computational power from data centers. Therefore, these data centers need to be properly placed and carefully designed based on the volume of traffic they are meant to serve. In this work, we first divide the city of Milan, Italy into different zones using the K-means clustering algorithm. We then analyse the traffic profiles of these zones using a network operator's Open Big Data set. We formulate the optimal placement of data centers as a facility location problem and propose the use of Weiszfeld's algorithm to solve it. Furthermore, based on our analysis of the traffic profiles in different zones, we heuristically determine the ideal dimension of the data center in each zone. Additionally, to aid operation and facilitate dynamic utilization of data center resources, we use state-of-the-art recurrent neural network models to predict future traffic demands from the past demand profiles of each area.
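    Weiszfeld's algorithm itself is a fixed-point iteration toward the weighted geometric median, which is what the facility location step above requires. A self-contained sketch with illustrative zone centroids and traffic weights:

    ```python
    # Weiszfeld iteration for the weighted geometric median; coordinates and
    # weights below are illustrative, not the Milan zone data.
    import numpy as np

    def weiszfeld(points, weights, iters=200, eps=1e-9):
        """Geometric median of weighted 2-D points via Weiszfeld iteration."""
        x = np.average(points, axis=0, weights=weights)  # start at weighted mean
        for _ in range(iters):
            d = np.linalg.norm(points - x, axis=1)
            d = np.where(d < eps, eps, d)                # avoid division by zero
            w = weights / d
            x_new = (points * w[:, None]).sum(0) / w.sum()
            if np.linalg.norm(x_new - x) < eps:
                break
            x = x_new
        return x

    zones = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])  # zone centroids
    traffic = np.array([3.0, 1.0, 2.0])                      # traffic weights
    print("data center location:", weiszfeld(zones, traffic))
    ```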

    Physical Layer Security in the 5G Heterogeneous Wireless System with Imperfect CSI

    5G is expected to serve completely heterogeneous scenarios where devices with low or high software and hardware complexity will coexist. This entails a security challenge, because low-complexity devices such as IoT sensors must still have secrecy in their communications. This project proposes tools to maximize the secrecy rate in a scenario with legitimate users and eavesdroppers, considering: i) the limited computational power of low-complexity users, and ii) the eavesdroppers' unwillingness to provide their channel state information to the base station. The tools are designed within the field of physical layer security and solve the resource allocation problem from two different approaches suited to different use cases: i) convex optimization theory, or ii) classification neural networks. Results show that, while the convex approach provides the best secrecy performance, the learning approach is a good alternative for dynamic scenarios or when transmit power must be saved.
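    The objective behind these tools is the secrecy rate: the legitimate link's achievable rate minus the eavesdropper's, floored at zero. A minimal numeric sketch with illustrative SNR values (in the actual scenario, the eavesdropper's CSI would be imperfect or withheld):

    ```python
    # Secrecy rate in bits/s/Hz: max(0, log2(1+SNR_b) - log2(1+SNR_e)).
    # SNR scaling with power is illustrative only.
    import math

    def secrecy_rate(snr_legit, snr_eve):
        return max(0.0, math.log2(1 + snr_legit) - math.log2(1 + snr_eve))

    # Sweep transmit power: both SNRs grow with power, so the secrecy rate
    # saturates -- one reason smarter allocation can save transmit power.
    for p in [0.1, 1.0, 10.0, 100.0]:
        print(f"power={p:6.1f}  Cs={secrecy_rate(20 * p, 2 * p):.3f} bit/s/Hz")
    ```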

    Novel applications of Machine Learning to Network Traffic Analysis and Prediction

    It is now clear that machine learning will be widely used in future telecommunication networks, as it is increasingly used in today's networks. However, despite its growing application and enormous potential, there are still many areas in which the new techniques developed in machine learning are not yet fully exploited. The aim of this thesis is to present the application of innovative machine learning (ML) techniques in the field of telecommunications, and specifically to problems related to Network Traffic Analysis and Prediction (NTAP). The applications of NTAP are very broad, so this thesis focuses on the following five specific areas:

    - Prediction of connectivity of wireless devices.
    - Security intrusion detection, using network traffic information.
    - Classification of network traffic, using the headers of the transmitted network packets (see the sketch below).
    - Estimation of the quality of experience (QoE) perceived by the user when viewing multimedia streaming, using aggregate information from the network packets.
    - Generation of synthetic traffic associated with security attacks, and use of that synthetic traffic to improve security intrusion detection algorithms.

    The final intention is to create prediction and analysis models that produce improvements in the NTAP areas mentioned above. With this objective, this thesis provides advances in the application of machine learning techniques to NTAP. These advances consist of:

    - Development of new machine learning models and architectures for NTAP.
    - Definition of new ways to structure and transform training data so that existing machine learning models can be applied to specific NTAP problems.
    - Definition of algorithms for the creation of synthetic network traffic associated with specific events in the operation of the network (for example, specific types of intrusions), ensuring that the new synthetic data can be used as new training data.
    - Extension and application of classic machine learning models to NTAP, obtaining improvements in classification or regression metrics and/or in the performance of the algorithms (e.g. training time, prediction time, memory needs).

    Departamento de Teoría de la Señal y Comunicaciones e Ingeniería Telemática. Doctorado en Tecnologías de la Información y las Telecomunicaciones.
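    As a small illustration of one of the NTAP tasks above, the following sketch classifies traffic from packet-header features with a generic classifier on synthetic data; the feature set and labels are illustrative stand-ins, not the thesis's models or data:

    ```python
    # Toy packet-header traffic classification; features and labels are synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    X = np.column_stack([
        rng.normal(800, 300, n),           # packet size (bytes)
        rng.exponential(0.05, n),          # inter-arrival time (s)
        rng.choice([80, 443, 53, 22], n),  # destination port
        rng.integers(0, 2, n),             # e.g. a TCP flag bit
    ])
    y = (X[:, 2] == 443).astype(int)       # toy label: "web" vs "other"

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
    ```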