1,956 research outputs found

    Deep Predictive Coding Neural Network for RF Anomaly Detection in Wireless Networks

    Full text link
    Intrusion detection has become one of the most critical tasks in a wireless network, as it prevents service outages that can take a long time to fix. The sheer variety of anomalous events necessitates adopting cognitive anomaly detection methods instead of traditional signature-based detection techniques. This paper proposes an anomaly detection methodology for wireless systems based on monitoring and analyzing radio frequency (RF) spectrum activities. Our detection technique leverages an existing solution to the video prediction problem and applies it to image sequences generated from monitoring the wireless spectrum. The deep predictive coding network is trained on images corresponding to the normal behavior of the system; whenever an anomaly occurs, its detection is triggered by the deviation between the actual and predicted behavior. For our analysis, we use images generated from the time-frequency spectrograms and spectral correlation functions of the received RF signal. We test our technique on a dataset containing anomalies such as jamming, chirping of transmitters, spectrum hijacking, and node failure, and evaluate its performance using standard classifier metrics: detection ratio and false alarm rate. Simulation results demonstrate that the proposed methodology effectively detects many unforeseen anomalous events in real time. We discuss the applications, which encompass industrial IoT, autonomous vehicle control, and mission-critical communications services.
    Comment: 7 pages, 7 figures, Communications Workshop ICC'1
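    The core of this approach is simple to express: a predictor trained only on normal spectrum activity will reconstruct anomalous frames poorly, so the prediction error itself becomes the detection statistic. Below is a minimal sketch of that detection step, assuming a trained frame predictor (e.g. the deep predictive coding network) already exists; `detect_anomaly`, the array shapes, and the threshold value are illustrative assumptions, not the paper's code.

```python
import numpy as np

def detect_anomaly(actual_frames, predicted_frames, threshold):
    """Flag frames whose prediction error exceeds a threshold.

    actual_frames, predicted_frames: arrays of shape (T, H, W),
    e.g. spectrogram images of the monitored RF band.
    threshold: error level calibrated on anomaly-free data.
    """
    # Per-frame mean squared error between observed and predicted spectrograms.
    errors = np.mean((actual_frames - predicted_frames) ** 2, axis=(1, 2))
    # An anomaly is declared whenever the predictor, trained only on
    # normal behavior, fails to anticipate the observed activity.
    return errors > threshold

# Usage (model is the trained predictor, elided here):
# flags = detect_anomaly(observed, model.predict(observed), threshold=0.05)
```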

    Prediction-Based Energy Saving Mechanism in 3GPP NB-IoT Networks

    Get PDF
    The current expansion of the Internet of Things (IoT) demands improved communication platforms that cover a wide area with low energy consumption. The 3rd Generation Partnership Project introduced narrowband IoT (NB-IoT) as an IoT communication solution. NB-IoT devices should remain available for over 10 years without requiring a battery replacement, so low energy consumption is essential for the successful deployment of this technology. Given that the power amplifier consumes a large amount of energy during radio transmission, reducing the uplink transmission time is key to ensuring a long device lifespan. In this paper, we propose a prediction-based energy saving mechanism (PBESM) focused on enhanced uplink transmission. The mechanism consists of two parts: first, a network architecture that predicts uplink packet occurrence through deep packet inspection; second, an algorithm that predicts the processing delay and pre-assigns radio resources to enhance the scheduling request procedure. In this way, our mechanism reduces the number of random accesses and the energy consumed by radio transmission. Simulation results showed that the energy consumption using the proposed PBESM is reduced by up to 34% compared with the conventional NB-IoT method.
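    The mechanism's two parts can be pictured as "predict when the next uplink packet will occur, then reserve resources just ahead of it so the device skips the random-access exchange." The sketch below illustrates that idea under stated assumptions: the mean-gap extrapolation is a stand-in for the paper's deep-packet-inspection-based prediction, and `scheduler.reserve` is a hypothetical API, not from 3GPP or the paper.

```python
import numpy as np

def predict_next_uplink(past_arrival_times):
    """Estimate the next uplink packet time from observed inter-arrival gaps.

    Simplified stand-in for the DPI-based prediction: extrapolate the
    mean inter-arrival interval (needs at least two past arrivals).
    """
    gaps = np.diff(past_arrival_times)
    return past_arrival_times[-1] + gaps.mean()

def pre_assign_grant(scheduler, device_id, predicted_time, processing_delay):
    # Reserve uplink resources just before the predicted packet occurrence,
    # so the device avoids the random-access / scheduling-request exchange
    # and the radio stays off longer. `scheduler.reserve` is hypothetical.
    scheduler.reserve(device_id, at=predicted_time - processing_delay)
```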

    Cell degradation detection based on an inter-cell approach

    Get PDF
    Fault management is a crucial part of cellular network management systems. The status of the base stations is usually monitored through well-defined key performance indicators (KPIs). Approaches for cell degradation detection are based on either intra-cell or inter-cell analysis of the KPIs. In intra-cell analysis, KPI profiles are built from each cell's local history data, whereas in inter-cell analysis, the KPIs of one cell are compared with the corresponding KPIs of other cells. In this work, we argue in favor of the inter-cell approach and apply a degradation detection method that can detect a sleeping cell that would be difficult to observe with traditional intra-cell methods. We demonstrate its use for detecting emulated degradations in performance data recorded from a live LTE network. The method can be integrated into current systems because it operates on existing KPIs without any major modification to the network infrastructure.
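    To make the inter-cell idea concrete, here is a minimal sketch (not the paper's actual method) of scoring each cell against its peers: a cell whose KPIs sit far from the population for the same time window is flagged, even if its own history looks unremarkable, which is exactly how a sleeping cell can evade intra-cell profiling.

```python
import numpy as np

def intercell_degradation_scores(kpi_matrix):
    """Score each cell's KPIs against the population of peer cells.

    kpi_matrix: shape (n_cells, n_kpis); one row per cell, e.g. drop
    rate, handover success rate, throughput for the same time window.
    Returns a per-cell anomaly score: the largest absolute z-score of
    any KPI relative to the other cells.
    """
    mean = kpi_matrix.mean(axis=0)
    std = kpi_matrix.std(axis=0) + 1e-9          # avoid division by zero
    z = np.abs((kpi_matrix - mean) / std)
    return z.max(axis=1)

# Cells whose score exceeds a chosen threshold (e.g. 3 sigma) become
# degradation candidates without needing any per-cell history profile.
```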

    PSUN: An OFDM-Pulsed Radar Coexistence Technique with Application to 3.5 GHz LTE

    Get PDF

    Traffic classification and prediction, and fast uplink grant allocation for machine type communications via support vector machines and long short-term memory

    Get PDF
    The current random access (RA) allocation techniques suffer from congestion and high signaling overhead when serving machine type communication (MTC) applications. Therefore, 3GPP has introduced the need for fast uplink grant (FUG) allocation. This thesis proposes a novel FUG allocation based on support vector machines (SVM) and long short-term memory (LSTM). First, MTC devices are prioritized using an SVM classifier. Second, an LSTM architecture is used to predict the activation time of each device. Both results are combined to build an efficient resource scheduler in terms of average latency and total throughput. Furthermore, a set of correction techniques is introduced to overcome classification and prediction errors. The Coupled Markov Modulated Poisson Process (CMMPP) traffic model is applied to compare the proposed FUG allocation with existing allocation techniques. In addition, an extended traffic model based on CMMPP is used to evaluate the proposed algorithm in a denser network. Our simulation results show that the proposed model outperforms the existing RA allocation schemes, achieving the highest throughput and the lowest access delay when serving the target massive and critical MTC applications.
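    The two learning stages compose naturally: the SVM assigns each device a priority class, the LSTM supplies an expected activation time, and grants are pre-allocated in (priority, activation-time) order. The sketch below shows that composition with scikit-learn; the feature layout, labels, and `schedule_fug` helper are illustrative assumptions, and the LSTM itself is elided (its output is passed in as `predicted_activation`).

```python
import numpy as np
from sklearn.svm import SVC

# Stage 1: prioritize MTC devices (e.g. critical vs. massive MTC) from
# traffic features such as packet size and inter-arrival statistics.
X_train = np.random.rand(200, 3)            # placeholder feature vectors
y_train = np.random.randint(0, 2, 200)      # placeholder priority labels
classifier = SVC(kernel="rbf").fit(X_train, y_train)

def schedule_fug(devices, features, predicted_activation):
    """Order fast-uplink-grant assignments by SVM priority, then by the
    LSTM-predicted activation time of each device."""
    priority = classifier.predict(features)
    # Critical devices first; within a class, the earliest predicted
    # activation gets its grant pre-allocated first.
    order = sorted(range(len(devices)),
                   key=lambda i: (-priority[i], predicted_activation[i]))
    return [devices[i] for i in order]
```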

    A case study: Failure prediction in a real LTE network

    Get PDF
    Mobile traffic and the number of connected devices have been increasing exponentially, and customer expectations of mobile operators in terms of quality and reliability are ever higher. This puts pressure on operators to invest in and operate their growing infrastructures, making telecom network management an essential problem. To reduce cost and maintain network performance, operators need to bring more automation and intelligence into their management systems. Self-Organizing Network (SON) functions are an automation technology aiming to maximize performance in mobile networks by bringing autonomous adaptability and reducing human intervention in network management and operations. The three main areas of SON are self-configuration (auto-configuration when a new element enters the network), self-optimization (optimization of network parameters during operation), and self-healing (maintenance). The main purpose of the thesis is to illustrate how anomaly detection methods can be applied to SON functions, in particular self-healing functions such as fault detection and cell outage management. The thesis is illustrated by a case study in which anomalies, in this case failure alarms, are predicted in advance using performance measurement (PM) data collected from a real LTE network within a certain timeframe. Failure prediction, or anomaly detection, can help reduce cost and maintenance time at mobile network base stations. The author aims to answer the research questions: which anomaly detection models can detect the anomalies in advance, and which types of anomalies are well detected by those models. Using cross-validation, the thesis shows that the random forest method is the best performing of the chosen models, with F1-scores of 0.58, 0.96, and 0.52 for the anomalies Failure in Optical Interface, Temperature alarm, and VSWR minor alarm, respectively; these are also the anomalies best detected by the model.
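    The evaluation pipeline described here (a random forest classifier scored by cross-validated F1) maps directly onto standard scikit-learn calls. A minimal sketch follows; the feature dimensions, placeholder data, and hyperparameters are assumptions for illustration, not the thesis's actual setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# X: PM counters aggregated per base station and time window;
# y: 1 if the alarm of interest (e.g. "Temperature alarm") occurs
# within the prediction horizon, else 0. Placeholder data below.
X = np.random.rand(500, 20)
y = np.random.randint(0, 2, 500)

model = RandomForestClassifier(n_estimators=100, random_state=0)
f1_scores = cross_val_score(model, X, y, cv=5, scoring="f1")
print(f"cross-validated F1: {f1_scores.mean():.2f}")
```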

    What Can Wireless Cellular Technologies Do about the Upcoming Smart Metering Traffic?

    Full text link
    The introduction of smart electricity meters with cellular radio interfaces puts an additional load on wireless cellular networks. Currently, these meters are designed for low-duty-cycle billing and occasional system checks, which generate low-rate, sporadic traffic. As the number of distributed energy resources increases, household power will become more variable and thus unpredictable from the viewpoint of the Distribution System Operator (DSO). It is therefore expected that, in the near future, an increased number of Wide Area Measurement System (WAMS) devices with Phasor Measurement Unit (PMU)-like capabilities will appear in the distribution grid, allowing the utilities to monitor low-voltage grid quality while providing the information required for tighter grid control. From a communication standpoint, the traffic profile will change drastically towards higher data volumes and higher rates per device. In this paper, we characterize the current traffic generated by smart electricity meters and supplement it with the potential traffic requirements brought by introducing enhanced smart meters, i.e., meters with PMU-like capabilities. Our study shows how GSM/GPRS and LTE cellular system performance behaves with current and next-generation smart meter traffic, where it is clearly seen that the PMU data will seriously challenge these wireless systems. We conclude by highlighting possible solutions for upgrading the cellular standards in order to cope with the upcoming smart metering traffic.
    Comment: Submitted; change: corrected location of eSM box in Fig. 1; May 22, 2015: Major revision after review; v4: revised, accepted for publicatio
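    The scale of the shift from billing traffic to PMU-like reporting is easy to see with back-of-the-envelope arithmetic. The sketch below computes aggregate uplink load for the two regimes; all numbers (meter count, report size, reporting intervals) are assumed for illustration and are not taken from the paper.

```python
def aggregate_uplink_load(n_meters, report_bytes, reports_per_hour):
    """Average uplink load (bit/s) offered by a metering population."""
    return n_meters * report_bytes * 8 * reports_per_hour / 3600.0

# Assumed, illustrative numbers: 10,000 billing meters sending 200 B
# every 6 h, versus the same meters acting as PMU-like devices
# sending 200 B every 10 s.
billing = aggregate_uplink_load(10_000, 200, 1 / 6)   # ~740 bit/s total
pmu_like = aggregate_uplink_load(10_000, 200, 360)    # ~1.6 Mbit/s total
print(f"{billing:.0f} bit/s vs {pmu_like:.0f} bit/s")
```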
