Use of the Karhunen-Loève Transform for interference detection and mitigation in GNSS
Improving the robustness of Global Navigation Satellite System (GNSS) receivers in a
radio-interfered environment has always been one of the main concerns of the GNSS
community. Owing to the weakness of the signal impinging on the GNSS receiver antenna,
receiver performance can be seriously threatened by the presence of stronger interfering
signals. In these scenarios, classical interference countermeasures may fail because the
interference detection and removal process also causes non-negligible degradation of the
received GNSS signal. This paper introduces an
innovative interference detection and mitigation technique against the well-known jamming threat. This technique is
based on the use of the Karhunen-Loève Transform (KLT), which represents the received
interfered signal in a transformed domain where interference components can be better
identified, isolated and removed, avoiding significant degradation of the useful GNSS
signal.
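The transform-domain idea behind this approach can be illustrated with a minimal numerical sketch: estimate the covariance of the received samples, eigendecompose it, and discard the dominant eigenvectors that capture the strong interference before reconstructing the signal. The signal model, frame length, and energy threshold below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated received samples: a weak BPSK-like "GNSS" component buried in
# noise, plus a strong narrowband jammer (all values are illustrative).
fs = 1000.0
t = np.arange(4096) / fs
gnss = 0.05 * np.sign(rng.standard_normal(t.size))
noise = rng.standard_normal(t.size)
jammer = 10.0 * np.sin(2 * np.pi * 123.0 * t)
received = gnss + noise + jammer

# KLT: estimate the sample covariance from short frames, then
# eigendecompose; the jammer concentrates in a few dominant eigenvectors.
N = 64
frames = received[: (received.size // N) * N].reshape(-1, N)
R = frames.T @ frames / frames.shape[0]
eigvals, eigvecs = np.linalg.eigh(R)       # eigenvalues in ascending order

# Flag interference components by a simple energy threshold (heuristic).
threshold = 10 * np.median(eigvals)
keep = eigvals < threshold

# Project each frame onto the retained (non-interference) subspace.
P = eigvecs[:, keep] @ eigvecs[:, keep].T
cleaned = (frames @ P).reshape(-1)

residual_power = np.var(cleaned) / np.var(received)
print(f"power after mitigation: {residual_power:.3f} of original")
```

After removing the two eigenvectors spanning the sinusoidal jammer, the output power drops to roughly the noise floor while the subspace containing the weak useful signal is left untouched.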
Deployment and Implementation Aspects of Radio Frequency Fingerprinting in Cybersecurity of Smart Grids
Smart grids incorporate diverse power equipment used for energy optimization in intelligent cities. This equipment may use Internet of Things (IoT) devices and services in the future. To ensure stable operation of smart grids, the cybersecurity of IoT is paramount. To this end, cryptographic security methods are prevalent in existing IoT. Non-cryptographic methods such as radio frequency fingerprinting (RFF) have been on the horizon for a few decades but have remained largely confined to academic research and military interest. RFF is a physical-layer security feature that leverages hardware impairments in the radios of IoT devices for classification and rogue-device detection. This article discusses the potential of RFF in the wireless communication of IoT devices to augment the cybersecurity of smart grids. The characteristics of a deep learning (DL)-aided RFF system are presented. Subsequently, a deployment framework of RFF for smart grids is presented, with implementation and regulatory aspects. The article culminates with a discussion of existing challenges and potential research directions for the maturation of RFF.
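The fingerprinting principle can be sketched without a deep network: below, a single hand-crafted impairment feature, the carrier frequency offset (CFO) of each transmitter's oscillator, stands in for the learned features of a DL-aided system, with a nearest-fingerprint classifier and a rejection threshold for rogue-device detection. The device names, CFO values, and threshold are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each transmitter's oscillator has a slightly different carrier frequency
# offset (CFO), a hardware impairment usable as a fingerprint.
# All device names and CFO values are made up for illustration.
fs = 1e6
n = 2048
t = np.arange(n) / fs
device_cfos = {"meter_A": 310.0, "meter_B": -240.0}

def transmit(cfo_hz):
    """Complex-baseband burst with a device-specific CFO plus receiver noise."""
    carrier = np.exp(2j * np.pi * cfo_hz * t)
    return carrier + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

def estimate_cfo(x, lag=100):
    """Autocorrelation-based CFO estimator (np.vdot conjugates its first arg)."""
    corr = np.vdot(x[:-lag], x[lag:])
    return np.angle(corr) * fs / (2 * np.pi * lag)

# Enrol known devices by their estimated fingerprints.
enrolled = {name: estimate_cfo(transmit(cfo)) for name, cfo in device_cfos.items()}

def classify(burst, max_dev_hz=100.0):
    """Nearest enrolled fingerprint, rejecting distant ones as rogue."""
    est = estimate_cfo(burst)
    name, ref = min(enrolled.items(), key=lambda kv: abs(kv[1] - est))
    return name if abs(ref - est) < max_dev_hz else "rogue"

print(classify(transmit(305.0)))   # oscillator close to meter_A's
print(classify(transmit(1250.0)))  # unknown oscillator, rejected as rogue
```

A DL-aided system replaces the single CFO feature and threshold rule with features learned from raw IQ samples, but the enrolment/classification/rejection pipeline is the same.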
An Assessment of Impact of Adaptive Notch Filters for Interference Removal on the Signal Processing Stages of a GNSS Receiver
With the rapidly growing diffusion of real-time, high-accuracy applications based on the Global Navigation Satellite System (GNSS), the robustness of GNSS receiver performance has become a compelling requirement. Disturbances from Radio-Frequency Interference (RFI) can disrupt the signal processing stages of GNSS receivers, even leading to a complete outage of the positioning and timing service. A typical RFI threat to GNSS signals comes from portable jammers, which transmit swept-frequency (chirp) signals in order to span the overall GNSS bandwidth. Implementing Adaptive Notch Filters (ANFs) in receivers for chirp cancellation has been extensively investigated and proved to be an efficient countermeasure. However, ANF performance depends strongly on the configuration setup: inappropriate parameter settings of the ANF for interference removal may induce severe distortion in the correlation process. In addition, even an effective mitigation still introduces a vestigial signal distortion, contributed by the residual unmitigated chirp and by the ANF operation itself, which is not negligible for high-accuracy solutions. This paper presents a detailed analysis of the effects of interference mitigation by notch filtering. A bias compensation strategy is proposed, wherein for each Pseudo Random Noise (PRN) code the biases due to the parameter settings of the notch filter are estimated and compensated. The impact of the ANF operation on chirp signals at the acquisition and tracking stages of GNSS receivers is analyzed. On the basis of three proposed metrics, the effects can be quantitatively estimated to depict a complete picture of the most influential parameters of the chirp and ANF configurations, as well as the optimal achievable performance at the acquisition and tracking stages.
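A minimal sketch of the chirp-cancellation idea, assuming a standard single-pole/single-zero complex adaptive notch filter whose zero is steered by a normalized LMS update to track the jammer's instantaneous frequency; the filter parameters, the LMS variant, and the signal model are illustrative, not the configurations analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Complex-baseband samples: receiver noise (the GNSS signal itself sits
# below the noise floor) plus a strong swept-frequency (chirp) jammer.
# All parameters are illustrative.
fs = 4e6
n = 40000
t = np.arange(n) / fs
inst_phase = 2 * np.pi * (1e5 * t + 0.5 * 1.5e8 * t**2)  # 1.5e8 Hz/s sweep
jammer = 8.0 * np.exp(1j * inst_phase)
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
x = noise + jammer

# Adaptive notch: the complex zero z0 tracks the chirp's instantaneous
# frequency, so the narrow notch follows the jammer across the band.
ka, mu = 0.9, 0.05           # pole contraction factor, adaptation step
z0 = 0j
xf_prev = 0j
y = np.empty(n, dtype=complex)
for i in range(n):
    xf = x[i] + ka * z0 * xf_prev       # autoregressive (pole) section
    y[i] = xf - z0 * xf_prev            # zero at z0 notches the jammer
    # Normalized LMS step minimizing the output power |y|^2.
    z0 += mu * y[i] * np.conj(xf_prev) / (abs(xf_prev) ** 2 + 1e-12)
    xf_prev = xf

suppression = np.var(y[n // 2 :]) / np.var(x)
print(f"steady-state output power: {suppression:.3f} of input")
```

The paper's point about vestigial distortion is visible even in this toy: the notch also carves a narrow, frequency-varying slice out of the useful band, so the output is not simply "input minus jammer".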
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks hold substantial potential for supporting a broad
range of complex, compelling applications in both military and civilian
fields, where users are able to enjoy high-rate, low-latency, low-cost and
reliable information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making, owing to the
complex, heterogeneous nature of the network structures and wireless services.
Machine learning (ML) algorithms have achieved great success in supporting big
data analytics, efficient parameter estimation and interactive decision making.
Hence, in this article, we review the thirty-year history of ML by elaborating
on supervised learning, unsupervised learning, reinforcement learning and deep
learning. Furthermore, we investigate their employment in the compelling
applications of wireless networks, including heterogeneous networks (HetNets),
cognitive radios (CR), Internet of things (IoT), machine to machine networks
(M2M), and so on. This article aims to assist readers in clarifying the
motivation and methodology of the various ML algorithms, so as to invoke them
for hitherto unexplored services and scenarios of future wireless networks.