CSWA: Aggregation-Free Spatial-Temporal Community Sensing
In this paper, we present a novel community sensing paradigm -- Community
Sensing Without Aggregation (CSWA). CSWA is designed to obtain the environment
information (e.g., air pollution or temperature) in each subarea of the target
area, without aggregating sensor and location data collected by community
members. CSWA operates on top of a secured peer-to-peer network over the
community members and proposes a novel \emph{Decentralized Spatial-Temporal
Compressive Sensing} framework based on \emph{Parallelized Stochastic Gradient
Descent}. Through learning the \emph{low-rank structure} via distributed
optimization, CSWA approximates the value of the sensor data in each subarea
(both covered and uncovered) for each sensing cycle using the sensor data
locally stored in each member's mobile device. Simulation experiments based on
real-world datasets demonstrate that CSWA exhibits low approximation error
(in both a city-wide temperature sensing task and an urban air pollution
(PM2.5) sensing task) and performs comparably to
(sometimes better than) state-of-the-art algorithms based on the data
aggregation and centralized computation.
Comment: This paper has been accepted by AAAI 2018. The first two authors
contributed equally.
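The low-rank recovery idea can be sketched as a toy, serial analogue of the parallelized SGD update that each member would apply to its locally stored entries. The function name, parameters, and the rank-2 factorization below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def sgd_complete(obs, shape, rank=2, lr=0.05, reg=0.01, epochs=1000, seed=0):
    """Approximate a partially observed matrix by a low-rank factorization
    U @ V.T, updating the factors one observed entry at a time -- a serial
    stand-in for the parallelized stochastic gradient descent each member
    would run on its own readings."""
    rng = np.random.default_rng(seed)
    n_rows, n_cols = shape
    U = 0.1 * rng.standard_normal((n_rows, rank))
    V = 0.1 * rng.standard_normal((n_cols, rank))
    for _ in range(epochs):
        for (i, j), value in obs.items():
            err = value - U[i] @ V[j]          # residual on one entry
            ui = U[i].copy()                    # use pre-update row for V
            U[i] += lr * (err * V[j] - reg * U[i])
            V[j] += lr * (err * ui - reg * V[j])
    return U @ V.T                              # estimates for all subareas

# Usage: hide one entry of a rank-1 "sensing" matrix and approximate it.
M = np.outer([1.0, 2.0, 3.0], [1.0, 2.0, 3.0, 4.0])
obs = {(i, j): M[i, j] for i in range(3) for j in range(4) if (i, j) != (2, 3)}
R = sgd_complete(obs, (3, 4), rank=2)
```

The hidden entry plays the role of an uncovered subarea: its value is inferred purely from the learned low-rank structure.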
Monte Carlo optimization approach for decentralized estimation networks under communication constraints
We consider designing decentralized estimation schemes over bandwidth limited communication links with a particular interest in the tradeoff between the estimation accuracy and the cost of communications due to, e.g., energy
consumption. We take into account two classes of in-network processing strategies which yield graph representations, modeling the sensor platforms as vertices and the communication links as edges, as well as a tractable
Bayesian risk that comprises the cost of transmissions and a penalty for the estimation errors. This approach captures a broad range of possibilities for "online" processing of observations as well as the constraints imposed, and enables a rigorous design setting in the form of a constrained optimization problem. Similar schemes, as well as the structures exhibited by the solutions to the design problem, have been studied previously in the context of decentralized detection. Under reasonable assumptions, the optimization can be carried out in a message-passing fashion. We adopt this framework for estimation; however, the corresponding optimization schemes involve integral operators that cannot
be evaluated exactly in general. We develop an approximation framework using Monte Carlo methods and obtain particle representations and approximate computational schemes for both classes of in-network processing strategies
and their optimization. The proposed Monte Carlo optimization procedures operate in a scalable and efficient fashion and, owing to their non-parametric nature, can produce results for any distributions provided that samples can be
produced from the marginals. In addition, this approach exhibits graceful degradation of the estimation accuracy as communication becomes more costly, through a parameterized Bayesian risk.
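The Monte Carlo idea can be illustrated on a deliberately tiny problem: a sensor sends a single bit about its noisy observation, a receiver estimates the underlying quantity, and the quantizer threshold is chosen by minimizing a sample-based estimate of the squared-error risk. The function and grid below are hypothetical stand-ins for the paper's integral operators, not its actual scheme:

```python
import numpy as np

def mc_best_threshold(x, y, grid):
    """Pick the 1-bit quantizer threshold minimizing a Monte Carlo estimate
    of squared estimation error: the sensor sends b = 1[y > t], and the
    receiver's estimate of x is the empirical mean of x given the bit.
    x, y are samples from the joint distribution (particles)."""
    best_t, best_risk = None, np.inf
    for t in grid:
        b = y > t
        est = np.where(b,
                       x[b].mean() if b.any() else 0.0,
                       x[~b].mean() if (~b).any() else 0.0)
        risk = np.mean((x - est) ** 2)      # empirical Bayesian risk
        if risk < best_risk:
            best_t, best_risk = t, risk
    return best_t, best_risk
```

Because everything is estimated from samples, the same procedure works for any joint distribution from which particles can be drawn, which mirrors the non-parametric character of the framework.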
Doped Fountain Coding for Minimum Delay Data Collection in Circular Networks
This paper studies decentralized, Fountain and network-coding based
strategies for facilitating data collection in circular wireless sensor
networks, which rely on the stochastic diversity of data storage. The goal is
to allow for a reduced delay collection by a data collector who accesses the
network at a random position and random time. Data dissemination is performed
by a set of relays which form a circular route to exchange source packets. The
storage nodes within the transmission range of the route's relays linearly
combine and store overheard relay transmissions using random decentralized
strategies. An intelligent data collector first collects a minimum set of coded
packets, possibly sufficient for recovering the original packets, from a subset
of storage nodes in its proximity and, using a message-passing decoder,
attempts to recover all original source packets from this set.
Whenever the decoder stalls, the source packet which restarts decoding is
polled/doped from its original source node. The random-walk-based analysis of
the decoding/doping process furnishes the collection delay analysis with a
prediction on the number of required doped packets. The number of doped packets
can be surprisingly small when employed with an Ideal Soliton code degree
distribution and, hence, the doping strategy may have the least collection
delay when the density of source nodes is sufficiently large. Furthermore, we
demonstrate that network coding makes dissemination more efficient at the
expense of a larger collection delay. Not surprisingly, a circular network
allows for significantly more tractable (analytically and otherwise)
strategies than a network modeled as a random geometric graph.
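A minimal sketch of the decoding-with-doping loop, under simplifying assumptions (coded packets are represented as sets of source indices XOR-combined at storage nodes, degrees follow the Ideal Soliton distribution, and each stall is resolved by doping one undecoded source packet); all names here are illustrative:

```python
import random

def ideal_soliton(k, rng):
    """Sample a degree from the Ideal Soliton distribution:
    P(1) = 1/k and P(d) = 1/(d(d-1)) for d = 2..k."""
    u, cdf = rng.random(), 1.0 / k
    if u < cdf:
        return 1
    for d in range(2, k + 1):
        cdf += 1.0 / (d * (d - 1))
        if u < cdf:
            return d
    return k

def collect_with_doping(k, n_coded, seed=0):
    """Peel coded packets with a message-passing decoder; whenever it
    stalls, 'dope' one still-undecoded source packet (polled from its
    origin node) and resume. Returns the number of doped packets."""
    rng = random.Random(seed)
    packets = [set(rng.sample(range(k), ideal_soliton(k, rng)))
               for _ in range(n_coded)]
    decoded, doped = set(), 0
    while len(decoded) < k:
        progress = True
        while progress:                       # standard peeling rounds
            progress = False
            for p in packets:
                p -= decoded                  # strip already-known sources
                if len(p) == 1:               # degree-one packet: decode it
                    decoded.add(p.pop())
                    progress = True
        if len(decoded) < k:                  # decoder stalled: dope one
            decoded.add(next(i for i in range(k) if i not in decoded))
            doped += 1
    return doped
```

Each doped packet restarts the peeling process, so the count returned is the quantity whose distribution the paper's random-walk analysis predicts.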
Simultaneous Distributed Sensor Self-Localization and Target Tracking Using Belief Propagation and Likelihood Consensus
We introduce the framework of cooperative simultaneous localization and
tracking (CoSLAT), which provides a consistent combination of cooperative
self-localization (CSL) and distributed target tracking (DTT) in sensor
networks without a fusion center. CoSLAT extends simultaneous localization and
tracking (SLAT) in that it uses also intersensor measurements. Starting from a
factor graph formulation of the CoSLAT problem, we develop a particle-based,
distributed message passing algorithm for CoSLAT that combines nonparametric
belief propagation with the likelihood consensus scheme. The proposed CoSLAT
algorithm improves on state-of-the-art CSL and DTT algorithms by exchanging
probabilistic information between CSL and DTT. Simulation results demonstrate
substantial improvements in both self-localization and tracking performance.
Comment: 10 pages, 5 figures.
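The consensus building block behind likelihood consensus can be sketched as plain Metropolis-weight averaging over the communication graph; in the actual scheme the nodes average coefficients of a local likelihood expansion, which is omitted in this simplified illustration:

```python
import numpy as np

def consensus_average(values, edges, iters=100):
    """Metropolis-weight linear consensus: each node repeatedly replaces
    its value by a weighted average over its neighbours, and all values
    converge to the network-wide mean without any fusion center."""
    n = len(values)
    deg = np.zeros(n, int)
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    W = np.zeros((n, n))
    for i, j in edges:                     # Metropolis weights
        w = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, j] = W[j, i] = w
    np.fill_diagonal(W, 1.0 - W.sum(axis=1))
    x = np.array(values, float)
    for _ in range(iters):                 # one local exchange per step
        x = W @ x
    return x
```

Only neighbour-to-neighbour exchanges occur, which is what lets the particle-based message passing run fully distributed.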
Networked Computing in Wireless Sensor Networks for Structural Health Monitoring
This paper studies the problem of distributed computation over a network of
wireless sensors. While this problem applies to many emerging applications, to
keep our discussion concrete we will focus on sensor networks used for
structural health monitoring. Within this context, the heaviest computation is
to determine the singular value decomposition (SVD) to extract mode shapes
(eigenvectors) of a structure. Compared to collecting raw vibration data and
performing SVD at a central location, computing SVD within the network can
result in significantly lower energy consumption and delay. Using recent
results on decomposing SVD, a well-known centralized operation, into
components, we seek to determine a near-optimal communication structure that
enables the distribution of this computation and the reassembly of the final
results, with the objective of minimizing energy consumption subject to a
computational delay constraint. We show that this reduces to a generalized
clustering problem; a cluster forms a unit on which a component of the overall
computation is performed. We establish that this problem is NP-hard. By
relaxing the delay constraint, we derive a lower bound to this problem. We then
propose an integer linear program (ILP) to solve the constrained problem
exactly as well as an approximate algorithm with a proven approximation ratio.
We further present a distributed version of the approximate algorithm. We
present both simulation and experimentation results to demonstrate the
effectiveness of these algorithms.
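The decomposition that makes in-network SVD attractive can be sketched as follows: if X is the measurement matrix (samples by channels) and each cluster holds a block of its rows, then X^T X is the sum of small per-cluster Gram matrices, so only those m-by-m summaries need to travel through the communication structure. This is a simplified illustration of the principle, not the paper's algorithm:

```python
import numpy as np

def distributed_right_singular_vectors(blocks):
    """Each cluster holds a block of rows of X. Since X.T @ X equals the
    sum of the per-block Gram matrices, each cluster computes a small
    m x m Gram locally; eigenvectors of the aggregated sum are the right
    singular vectors of X (the mode shapes), without shipping raw data."""
    m = blocks[0].shape[1]
    G = np.zeros((m, m))
    for Xc in blocks:           # in the network, this sum forms en route
        G += Xc.T @ Xc
    _, V = np.linalg.eigh(G)    # ascending eigenvalues
    return V[:, ::-1]           # columns ordered by decreasing eigenvalue
```

Compared with sending all raw vibration samples to a central node, each cluster transmits only an m-by-m matrix, which is the source of the energy and delay savings the abstract describes.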
Homology-based Distributed Coverage Hole Detection in Wireless Sensor Networks
Homology theory provides new and powerful solutions to address the coverage
problems in wireless sensor networks (WSNs). They are based on algebraic
objects such as the Čech complex and the Rips complex. The Čech complex gives
accurate information about coverage quality but requires precise knowledge of
the relative locations of nodes. This assumption is rather strong and hard to
satisfy in practical deployments. The Rips complex provides an approximation of
the Čech complex. It is easier to build and does not require any knowledge of
node locations. This simplicity comes at the expense of accuracy: the Rips
complex cannot always detect all coverage holes, so it is necessary to evaluate
its accuracy. This work proposes to use the proportion of the area of
undiscovered coverage holes as the performance criterion. Investigations show
that it depends on
the ratio between communication and sensing radii of a sensor. Closed-form
expressions for lower and upper bounds of the accuracy are also derived. For
those coverage holes which can be discovered by Rips complex, a homology-based
distributed algorithm is proposed to detect them. Simulation results are
consistent with the proposed analytical lower bound, with a maximum difference
of 0.5%. Upper bound performance depends on the ratio of communication and
sensing radii. Simulations also show that the algorithm can localize about
99% of the coverage holes in about 99% of cases.
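For concreteness, a Rips complex up to dimension 2 can be built from pairwise proximity alone. Coordinates appear below only to generate the adjacency that real sensors would obtain directly from radio connectivity; the construction itself never uses node locations:

```python
import numpy as np
from itertools import combinations

def rips_complex(points, r_c):
    """Vietoris-Rips complex up to dimension 2: an edge joins every pair
    of nodes within communication range r_c, and a triangle fills every
    pairwise-connected triple. Only the connectivity graph matters, so in
    a deployment the distance test is replaced by 'can the nodes hear
    each other'."""
    pts = np.asarray(points, float)
    n = len(pts)
    edges = [(i, j) for i, j in combinations(range(n), 2)
             if np.linalg.norm(pts[i] - pts[j]) <= r_c]
    eset = set(edges)
    triangles = [(i, j, k) for i, j, k in combinations(range(n), 3)
                 if (i, j) in eset and (i, k) in eset and (j, k) in eset]
    return edges, triangles

# Usage: the unit square. With r_c = 1.0 only the four sides are edges and
# no triangle forms, so the interior hole is visible in the complex; with
# r_c = 1.5 the diagonals appear and four triangles fill the square.
corners = [(0, 0), (1, 0), (0, 1), (1, 1)]
```

This dependence of which holes get "filled" on the communication radius is exactly the communication-to-sensing-radius ratio effect the abstract analyzes.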
Submodularity and Optimality of Fusion Rules in Balanced Binary Relay Trees
We study the distributed detection problem in a balanced binary relay tree,
where the leaves of the tree are sensors generating binary messages. The root
of the tree is a fusion center that makes the overall decision. Every other
node in the tree is a fusion node that fuses two binary messages from its child
nodes into a new binary message and sends it to the parent node at the next
level. We assume that the fusion nodes at the same level use the same fusion
rule. We call a string of fusion rules used at different levels a fusion
strategy. We consider the problem of finding a fusion strategy that maximizes
the reduction in the total error probability between the sensors and the fusion
center. We formulate this problem as a deterministic dynamic program and
express the solution in terms of Bellman's equations. We introduce the notion
of string submodularity and show that the reduction in the total error
probability is a string-submodular function. Consequently, we show that the
greedy strategy, which only maximizes the level-wise reduction in the total
error probability, is within a constant factor of the optimal strategy in
terms of reduction in the total error probability.
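A hedged sketch of the greedy level-by-level choice, restricted for illustration to the AND and OR fusion rules, conditionally independent child messages, and equal priors (the paper's setting and rule set are more general):

```python
def fuse(alpha, beta, rule):
    """Propagate false-alarm (alpha) and miss (beta) probabilities through
    one level of a balanced binary relay tree, where each fusion node
    combines two conditionally independent child bits."""
    if rule == "AND":   # declare 1 only if both children say 1
        return alpha ** 2, 1 - (1 - beta) ** 2
    if rule == "OR":    # declare 1 if either child says 1
        return 1 - (1 - alpha) ** 2, beta ** 2
    raise ValueError(rule)

def greedy_strategy(alpha, beta, levels):
    """Greedily choose, level by level, the fusion rule giving the largest
    one-step reduction in total error probability (alpha + beta) / 2,
    returning the fusion strategy and the error at the root."""
    strategy = []
    for _ in range(levels):
        best = min(("AND", "OR"),
                   key=lambda r: sum(fuse(alpha, beta, r)))
        alpha, beta = fuse(alpha, beta, best)
        strategy.append(best)
    return strategy, (alpha + beta) / 2
```

Running this from moderately unreliable sensors shows the characteristic behavior: the greedy string alternates between rules to keep false alarms and misses balanced, and the total error probability shrinks rapidly with tree depth.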