2,632 research outputs found

    Deep Predictive Coding Neural Network for RF Anomaly Detection in Wireless Networks

    Full text link
    Intrusion detection has become one of the most critical tasks in a wireless network to prevent service outages that can take a long time to fix. The sheer variety of anomalous events necessitates adopting cognitive anomaly detection methods instead of traditional signature-based detection techniques. This paper proposes an anomaly detection methodology for wireless systems that is based on monitoring and analyzing radio frequency (RF) spectrum activity. Our detection technique leverages an existing solution to the video prediction problem and applies it to image sequences generated from monitoring the wireless spectrum. The deep predictive coding network is trained with images corresponding to the normal behavior of the system, and whenever there is an anomaly, detection is triggered by the deviation between the actual and predicted behavior. For our analysis, we use images generated from the time-frequency spectrograms and spectral correlation functions of the received RF signal. We test our technique on a dataset containing anomalies such as jamming, chirping of transmitters, spectrum hijacking, and node failure, and evaluate its performance using standard classifier metrics: detection ratio and false alarm rate. Simulation results demonstrate that the proposed methodology effectively detects many unforeseen anomalous events in real time. We discuss the applications, which encompass industrial IoT, autonomous vehicle control, and mission-critical communications services. Comment: 7 pages, 7 figures, Communications Workshop ICC'1
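
    The detection rule described in this abstract (train a predictive model on spectrogram frames of normal RF activity, then raise an alarm when the prediction error on incoming frames deviates from what was seen during training) can be illustrated as follows. This is a minimal numpy sketch of the thresholding logic only, assuming the predicted frames come from some video-prediction model that is not shown; the helper names and the mean-plus-k-sigma threshold are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def frame_error(actual, predicted):
    """Mean-squared prediction error between an observed spectrogram
    frame and the frame predicted from past frames."""
    return float(np.mean((actual - predicted) ** 2))

def calibrate_threshold(normal_errors, k=3.0):
    """Set the alarm threshold from errors observed on normal traffic
    only (mean + k standard deviations is one common heuristic)."""
    e = np.asarray(normal_errors, dtype=float)
    return e.mean() + k * e.std()

def detect(actual_frames, predicted_frames, threshold):
    """Flag each frame whose prediction error exceeds the threshold."""
    return [bool(frame_error(a, p) > threshold)
            for a, p in zip(actual_frames, predicted_frames)]

# Hypothetical usage: `predicted_frames` would come from a video-prediction
# model trained on normal RF spectrograms; here random data stands in.
rng = np.random.default_rng(0)
normal = [rng.normal(size=(64, 64)) for _ in range(100)]
pred = [f + rng.normal(scale=0.1, size=f.shape) for f in normal]
thr = calibrate_threshold([frame_error(a, p) for a, p in zip(normal, pred)])
anomalous = normal[0] + 5.0  # synthetic deviation, e.g. a wideband jammer
print(detect([anomalous], [pred[0]], thr))  # -> [True]
```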

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Full text link
    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, in which users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers clarify the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services and scenarios of future wireless networks. Comment: 46 pages, 22 figures

    Machine learning based anomaly detection in release testing of 5G mobile networks

    Get PDF
    Abstract. The need for high-quality phone and internet connections, high-speed streaming, and reliable, uninterrupted traffic has grown with the advances the wireless communication world has witnessed since the start of 5G (fifth generation) networks. The amount of data generated not just every day but every second has made most of the traditional approaches or statistical methods previously used for data manipulation and modeling inefficient and unscalable. Machine learning (ML) models, and especially deep learning (DL)-based models, achieve state-of-the-art results because of their ability to recognize complex patterns that even human experts cannot. Machine learning-based anomaly detection is one of the current hot topics in both research and industry because of its practical applications in almost all domains. Anomaly detection is mainly used for two purposes. The first is to understand why anomalous behavior happens and, as a result, to prevent it by solving the root cause of the problem. The second is likewise to understand why the anomalous behavior happens and to be ready to deal with it, since it then becomes predictable behavior, such as increased traffic over the weekend or during specific hours of the day. In this work, we apply anomaly detection to a univariate time series target, the block error rate (BLER). We experiment with different statistical approaches, classic supervised machine learning models, unsupervised machine learning models, and deep learning models, and benchmark the final results. The main goal is to select the model that best balances performance against resource usage and to apply it in a multivariate time series context, where we can test the relationships between the different time series features and their influence on each other. In the final phase, the selected model is used, integrated, and deployed as part of an automatic system that detects and flags anomalies in real time. The simple proposed deep learning model outperforms the other models in terms of accuracy-related metrics. We also highlight the acceptable performance of the statistical approach, which remains a contender for the best model due to its low training time and modest computational requirements.
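
    To make univariate anomaly detection on a BLER series concrete, the sketch below implements only a simple rolling z-score baseline of the statistical kind mentioned in the abstract; the window length, the threshold and the rolling_zscore_anomalies helper are illustrative assumptions rather than any of the thesis's actual models.

```python
import numpy as np

def rolling_zscore_anomalies(bler, window=48, z_thresh=4.0):
    """Flag samples whose z-score against a trailing window of recent
    BLER values exceeds a threshold (a simple statistical baseline)."""
    bler = np.asarray(bler, dtype=float)
    flags = np.zeros(bler.shape, dtype=bool)
    for t in range(window, len(bler)):
        hist = bler[t - window:t]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(bler[t] - mu) > z_thresh * sigma:
            flags[t] = True
    return flags

# Illustrative series: low BLER with one sudden spike that should be flagged.
rng = np.random.default_rng(1)
series = rng.normal(loc=0.01, scale=0.002, size=500).clip(0, 1)
series[300] = 0.2
print(np.flatnonzero(rolling_zscore_anomalies(series)))  # -> [300]
```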

    Enhanced Machine Learning Techniques for Early HARQ Feedback Prediction in 5G

    Full text link
    We investigate Early Hybrid Automatic Repeat reQuest (E-HARQ) feedback schemes enhanced by machine learning techniques as a path towards ultra-reliable and low-latency communication (URLLC). To this end, we propose machine learning methods to predict the outcome of the decoding process ahead of the end of the transmission. We discuss different input features and classification algorithms ranging from traditional methods to newly developed supervised autoencoders. These methods are evaluated based on their prospects of complying with the URLLC requirements of effective block error rates below $10^{-5}$ at small latency overheads. We provide realistic performance estimates in a system model incorporating scheduling effects to demonstrate the feasibility of E-HARQ across different signal-to-noise ratios, subcode lengths, channel conditions and system loads, and show the benefit over regular HARQ and existing E-HARQ schemes without machine learning. Comment: 14 pages, 15 figures; accepted version
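
    The core idea of predicting the decoding outcome before the transmission ends, and sending early feedback based on that prediction, can be illustrated with a generic binary classifier. The sketch below uses synthetic features and a scikit-learn logistic regression purely as a placeholder; the features, the 0.9 decision threshold and the data are assumptions and do not reproduce the paper's supervised autoencoders or evaluation setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for per-transmission features computed from the first
# part of a codeword (e.g. log-likelihood-ratio statistics); the paper
# studies richer features and stronger classifiers than this baseline.
rng = np.random.default_rng(2)
n = 5000
snr = rng.uniform(-2, 6, size=n)                       # assumed SNR proxy
mean_abs_llr = 1.5 * snr + rng.normal(scale=1.0, size=n)
X = np.column_stack([snr, mean_abs_llr])
decode_ok = (mean_abs_llr + rng.normal(scale=0.5, size=n) > 3.0).astype(int)

# Train on part of the data, predict decoding failure on the rest.
clf = LogisticRegression(max_iter=1000).fit(X[:4000], decode_ok[:4000])
p_fail = clf.predict_proba(X[4000:])[:, 0]   # column 0 = P(decode failure)

# Early feedback rule: send a NACK as soon as the predicted failure
# probability is high, before full decoding finishes.
early_nack = p_fail > 0.9                    # threshold is an assumption
print(f"early NACK rate: {early_nack.mean():.3f}")
```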

    A survey of machine learning techniques applied to self-organizing cellular networks

    Get PDF
    This paper surveys fifteen years of literature on Machine Learning (ML) algorithms applied to self-organizing cellular networks. For future networks to overcome the limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self-Organizing Network (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison between the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also outlines future research directions and the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Engineering the application of machine learning in an IDS based on IoT traffic flow

    Get PDF
    Internet of Things (IoT) devices are now widely used, enabling intelligent services that, in association with new communication technologies like 5G and broadband internet, boost smart-city environments. Despite their limited resources, IoT devices collect and share large amounts of data and are connected to the internet, making them an attractive target for malicious actors. This work uses machine learning combined with an Intrusion Detection System (IDS) to detect possible attacks. Due to the limitations of IoT devices and low-latency services, the IDS must have a specialized architecture. Furthermore, although machine learning-based solutions have high potential, there are still challenges related to training and generalization, which may impose constraints on the architecture. Our proposal is an IDS with a distributed architecture that relies on Fog computing to run specialized modules and uses deep neural networks to identify malicious traffic inside IoT data flows. We compare our IoT-Flow IDS with three other architectures. We assess model generalization using test data from different datasets and evaluate performance in terms of Recall, Precision, and F1-Score. Results confirm the feasibility of flow-based anomaly detection and the importance of network traffic segmentation and specialized models in AI-based IDSs for IoT.
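
    To make flow-based classification and its evaluation concrete, the sketch below trains a generic classifier on placeholder flow features and reports Recall, Precision and F1-Score, the metrics used in the paper; the synthetic features and the random-forest model are assumptions standing in for the proposed deep neural networks and Fog-based deployment.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score, f1_score

# Placeholder flow features (e.g. packet count, mean packet size, duration,
# inter-arrival time); real IDS datasets would supply labeled flow records.
rng = np.random.default_rng(3)
n = 4000
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n) > 1.0).astype(int)

X_train, X_test = X[:3000], X[3000:]
y_train, y_test = y[:3000], y[3000:]

# Generic classifier used only for illustration of the evaluation pipeline.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
y_pred = clf.predict(X_test)

# Report the same metrics used to compare architectures in the paper.
print("precision:", precision_score(y_test, y_pred))
print("recall:   ", recall_score(y_test, y_pred))
print("f1:       ", f1_score(y_test, y_pred))
```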