Data Transmission with Reduced Delay for Distributed Acoustic Sensors
This paper proposes a channel access control scheme suited to dense acoustic
sensor nodes in a sensor network. In the considered scenario, multiple acoustic
sensor nodes within communication range of a cluster head are grouped into
clusters. Acoustic sensor nodes in a cluster detect acoustic signals and
convert them into electric signals (packets). Detection by acoustic sensors can
be executed periodically or randomly, where random detection is event driven.
As a result, each acoustic sensor generates its packets (50 bytes each)
periodically or at random over short time intervals (400 ms to 4 s) and
transmits them directly to a cluster head (coordinator node).
Our approach uses slotted carrier sense multiple access (CSMA). All acoustic
sensor nodes in a cluster are allocated to time slots, and the number of sensor
nodes allocated to each time slot is uniform. All sensor nodes
allocated to a time slot listen for packet transmission from the beginning of
the time slot for a duration proportional to their priority. The first node
that detects the channel to be free for its whole window is allowed to transmit.
The order of packet transmissions with the acoustic sensor nodes in the time
slot is autonomously adjusted according to the history of packet transmissions
in the time slot. In simulations, the performance of the proposed scheme is
demonstrated through comparisons with other low-rate wireless channel access
schemes.
Comment: Accepted to IJDSN, final preprint version
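The contention rule described above can be sketched in a few lines. This is a hypothetical toy model, not the paper's simulator: listen windows proportional to priority are modeled as deterministic delays, the shortest window wins the idle channel, and the history-based reordering is approximated by demoting each winner to the lowest priority.

```python
# Hypothetical sketch of the priority-windowed slotted CSMA idea: nodes
# assigned to a slot listen from the slot start for a window proportional to
# their priority; the node whose whole window elapses first with the channel
# idle transmits. Names and timings are illustrative assumptions.

def winner_of_slot(priorities, base_window_ms=2):
    """Return the index of the node that transmits in this slot.

    Each node i listens for priorities[i] * base_window_ms milliseconds;
    the node with the shortest listen window claims the idle channel first.
    Ties are broken by node index (a modeling assumption).
    """
    windows = [p * base_window_ms for p in priorities]
    return min(range(len(windows)), key=lambda i: (windows[i], i))

def rotate_priorities(priorities, winner):
    """Demote the winner to the lowest priority so the transmission order
    self-adjusts over successive slots (the paper's history-based
    adjustment, approximated here as simple rotation)."""
    updated = list(priorities)
    updated[winner] = max(priorities) + 1
    return updated

# Four nodes; a lower number means higher priority (shorter listen window).
prio = [1, 2, 3, 4]
order = []
for _ in range(4):
    w = winner_of_slot(prio)
    order.append(w)
    prio = rotate_priorities(prio, w)

print(order)  # → [0, 1, 2, 3]: each node gets a turn as priorities rotate
```

With this rotation, no single node monopolizes the slot: the highest-priority node transmits first, then drops to the back of the queue, which is one plausible reading of the "autonomously adjusted" transmission order.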
An Online Parallel and Distributed Algorithm for Recursive Estimation of Sparse Signals
In this paper, we consider a recursive estimation problem for linear
regression where the signal to be estimated admits a sparse representation and
measurement samples are only sequentially available. We propose a convergent
parallel estimation scheme that consists in solving a sequence of
ℓ1-regularized least-square problems approximately. The proposed scheme
is novel in three aspects: i) all elements of the unknown vector variable are
updated in parallel at each time instance, and convergence speed is much faster
than state-of-the-art schemes which update the elements sequentially; ii) both
the update direction and stepsize of each element have simple closed-form
expressions, so the algorithm is suitable for online (real-time)
implementation; and iii) the stepsize is designed to accelerate the convergence
but it does not suffer from the common trouble of parameter tuning in
literature. Both centralized and distributed implementation schemes are
discussed. The attractive features of the proposed algorithm are also
numerically corroborated.
Comment: Part of this work has been presented at The Asilomar Conference on Signals, Systems, and Computers, Nov. 201
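The parallel, closed-form updates described in point ii) can be illustrated with a simple stand-in. The sketch below is not the paper's algorithm: it uses a plain iterative soft-thresholding step with a fixed 1/L stepsize (an assumption), but it shows the structure the abstract describes, namely that every element of the estimate is refreshed simultaneously by a closed-form proximal update.

```python
# Minimal sketch (not the paper's exact scheme) of a parallel coordinate
# update for l1-regularized least squares: all coordinates are updated at
# once via a gradient step followed by soft-thresholding. The fixed
# stepsize 1/L is an illustrative assumption, not the paper's design.
import numpy as np

def soft_threshold(v, t):
    """Closed-form proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def parallel_l1_step(x, A, y, lam, step):
    """One parallel update: gradient of 0.5*||Ax - y||^2, then
    soft-thresholding, applied to every coordinate simultaneously."""
    grad = A.T @ (A @ x - y)
    return soft_threshold(x - step * grad, step * lam)

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[[2, 7]] = [1.5, -2.0]            # sparse ground truth
y = A @ x_true

x = np.zeros(20)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = ||A||_2^2
for _ in range(500):
    x = parallel_l1_step(x, A, y, lam=0.1, step=step)

print(np.flatnonzero(np.abs(x) > 0.5))  # recovers the sparse support {2, 7}
```

Because both the direction and the threshold are closed-form, each iteration is a handful of matrix-vector products, which is what makes this family of updates amenable to online, real-time implementation.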
Deep Predictive Coding Neural Network for RF Anomaly Detection in Wireless Networks
Intrusion detection has become one of the most critical tasks in a wireless
network to prevent service outages that can take a long time to fix. The sheer variety
of anomalous events necessitates adopting cognitive anomaly detection methods
instead of the traditional signature-based detection techniques. This paper
proposes an anomaly detection methodology for wireless systems that is based on
monitoring and analyzing radio frequency (RF) spectrum activities. Our
detection technique leverages an existing solution for the video prediction
problem, and uses it on image sequences generated from monitoring the wireless
spectrum. The deep predictive coding network is trained with images
corresponding to the normal behavior of the system, and whenever there is an
anomaly, its detection is triggered by the deviation between the actual and
predicted behavior. For our analysis, we use the images generated from the
time-frequency spectrograms and spectral correlation functions of the received
RF signal. We test our technique on a dataset which contains anomalies such as
jamming, chirping of transmitters, spectrum hijacking, and node failure, and
evaluate its performance using standard classifier metrics: detection ratio
and false alarm rate. Simulation results demonstrate that the proposed
methodology effectively detects many unforeseen anomalous events in real time.
We discuss the applications, which encompass industrial IoT, autonomous vehicle
control and mission-critical communications services.
Comment: 7 pages, 7 figures, Communications Workshop ICC'1
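The detection rule itself, anomaly flagged when the gap between predicted and observed spectrum frames exceeds a threshold calibrated on normal traffic, can be sketched independently of the network. In the sketch below the deep predictive coding network is replaced by a trivial last-frame persistence predictor, and the mean-plus-k-sigma threshold is an illustrative assumption.

```python
# Hedged sketch of the deviation-based detection rule only: flag an anomaly
# when the error between predicted and observed spectrogram frames exceeds
# a threshold calibrated on normal data. The predictor here is a stand-in
# (last-frame persistence), not the paper's deep predictive coding network.
import numpy as np

def frame_error(predicted, observed):
    """Mean squared error between predicted and observed spectrogram frames."""
    return float(np.mean((predicted - observed) ** 2))

def calibrate_threshold(errors, k=3.0):
    """Threshold = mean + k * std of errors on normal traffic
    (k = 3 is an illustrative choice, not the paper's setting)."""
    return float(np.mean(errors) + k * np.std(errors))

rng = np.random.default_rng(1)
# 50 synthetic "normal" spectrogram frames (8x8 time-frequency bins).
normal = [rng.normal(0.0, 0.1, (8, 8)) for _ in range(50)]

# Persistence predictor: predict each frame as the previous one.
errs = [frame_error(normal[i - 1], normal[i]) for i in range(1, len(normal))]
thresh = calibrate_threshold(errs)

jammed = normal[-1] + 5.0   # a jammer raises the energy across the frame
is_anomaly = frame_error(normal[-1], jammed) > thresh
print(is_anomaly)  # → True
```

The same thresholding applies unchanged whichever predictor produces the frames, which is why the approach can cover unforeseen anomaly types: anything the model of normal behavior fails to predict well is flagged.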
Self-Stabilizing TDMA Algorithms for Dynamic Wireless Ad-hoc Networks
In dynamic wireless ad-hoc networks (DynWANs), autonomous computing devices
set up a network for the communication needs of the moment. These networks
require the implementation of a medium access control (MAC) layer. We consider
MAC protocols for DynWANs that need to be autonomous and robust as well as have
high bandwidth utilization, high predictability degree of bandwidth allocation,
and low communication delay in the presence of frequent topological changes to
the communication network. Recent studies have shown that existing
implementations cannot guarantee the necessary satisfaction of these timing
requirements. We propose a self-stabilizing MAC algorithm for DynWANs that
guarantees a short convergence period and can thereby satisfy severe timing
requirements, such as the above. Besides its contribution on the algorithmic
front of research, we expect that our proposal can enable quicker adoption by
practitioners and faster deployment of DynWANs that are subject to changes in
the network topology.
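The self-stabilization property, convergence to a legal schedule from any initial state, can be illustrated with a deliberately simplified toy model (an assumption, far simpler than the paper's algorithm): starting from an arbitrary, possibly conflicting slot assignment, every node that detects a collision on its slot moves to a free one, and the repair repeats until the TDMA schedule is collision-free.

```python
# Illustrative toy model of self-stabilizing TDMA slot allocation (not the
# paper's algorithm): from any corrupted initial assignment, colliding
# nodes migrate to free slots until the schedule is collision-free.
# Assumes at least as many slots as nodes, so a free slot always exists.

def stabilize(slots, num_slots):
    """Repeatedly move colliding nodes to free slots until no slot is
    shared; returns the number of repair rounds taken."""
    rounds = 0
    while True:
        taken = {}
        for node, s in enumerate(slots):
            taken.setdefault(s, []).append(node)
        colliding = [nodes for nodes in taken.values() if len(nodes) > 1]
        if not colliding:
            return rounds
        free = [s for s in range(num_slots) if s not in taken]
        # All but one node in each collision set jump to a free slot.
        for nodes in colliding:
            for node in nodes[1:]:
                if free:
                    slots[node] = free.pop()
        rounds += 1

slots = [0, 0, 0, 1]   # corrupted start: three nodes share slot 0
rounds = stabilize(slots, num_slots=4)
print(slots, rounds)   # a collision-free assignment, reached in one round
```

The point of the toy model is the convergence guarantee, not efficiency: whatever transient faults or topology changes corrupt the schedule, the repair rule drives the system back to a legal configuration, which is the defining property of a self-stabilizing MAC layer.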