5,469 research outputs found
Adaptive Hierarchical Data Aggregation using Compressive Sensing (A-HDACS) for Non-smooth Data Field
Compressive Sensing (CS) has been applied successfully in a wide variety of
applications in recent years, including photography, shortwave infrared
cameras, optical system research, facial recognition, MRI, etc. In wireless
sensor networks (WSNs), significant research work has been pursued to
investigate the use of CS to reduce the amount of data communicated,
particularly in data aggregation applications, thereby improving energy
efficiency. However, most previous work in WSNs has used CS under the
assumption that the data field is smooth with negligible white Gaussian noise.
In these schemes, signal sparsity is estimated globally from the entire data
field and then used to determine the CS parameters. In more realistic
scenarios, where the data field has regional fluctuations or is only piecewise
smooth, existing CS-based data aggregation schemes yield poor compression
efficiency. In order to take full advantage of CS in WSNs, we propose an
Adaptive Hierarchical Data Aggregation using Compressive Sensing (A-HDACS)
scheme. The proposed scheme dynamically chooses sparsity values based on
signal variations in local regions. We prove that A-HDACS enables more sensor
nodes to employ CS compared to the schemes that do not adapt to the changing
field. The simulation results also demonstrate the improvement in energy
efficiency as well as accurate signal recovery.
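The CS pipeline such aggregation schemes build on can be sketched in a few lines: sample a sparse signal with a random measurement matrix, then recover it greedily. The sketch below uses orthogonal matching pursuit as the recovery step; it is not the A-HDACS scheme itself, and all dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 80, 5                 # signal length, measurements, sparsity
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)      # a k-sparse "data field" snapshot

Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # Gaussian sensing matrix
y = Phi @ x                                   # m << n compressed measurements

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily select k atoms, refitting
    the coefficients by least squares after each selection."""
    r, S = y.copy(), []
    for _ in range(k):
        S.append(int(np.argmax(np.abs(Phi.T @ r))))          # best new atom
        coef, *_ = np.linalg.lstsq(Phi[:, S], y, rcond=None) # refit on S
        r = y - Phi[:, S] @ coef                              # update residual
    x_hat = np.zeros(Phi.shape[1])
    x_hat[S] = coef
    return x_hat

x_hat = omp(Phi, y, k)   # with these dimensions, recovery is typically exact
```

The sparsity parameter k is exactly the quantity that A-HDACS estimates per region instead of globally: a region-local k lets nodes in quiet regions compress more aggressively.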
Reliable recovery of hierarchically sparse signals for Gaussian and Kronecker product measurements
We propose and analyze a solution to the problem of recovering a block sparse
signal with sparse blocks from linear measurements. Such problems naturally
emerge inter alia in the context of mobile communication, in order to meet the
scalability and low complexity requirements of massive antenna systems and
massive machine-type communication. We introduce a new variant of the Hard
Thresholding Pursuit (HTP) algorithm referred to as HiHTP. We provide both a
proof of convergence and a recovery guarantee for noisy Gaussian measurements
that exhibits improved asymptotic scaling of the sampling complexity in
comparison with the usual HTP algorithm. Furthermore, hierarchically sparse
signals and Kronecker product structured measurements naturally arise together
in a variety of applications. We establish the efficient reconstruction of
hierarchically sparse signals from Kronecker product measurements using the
HiHTP algorithm. Additionally, we provide analytical results that connect our
recovery conditions to generalized coherence measures. Again, our recovery
results exhibit substantial improvement in the asymptotic sampling complexity
scaling over the standard setting. Finally, we validate in numerical
experiments that for hierarchically sparse signals, HiHTP performs
significantly better than HTP.

Comment: 11+4 pages, 5 figures. V3: Incomplete funding information corrected
and minor typos corrected. V4: Change of title and additional author Axel
Flinth. Included new results on Kronecker product measurements and relations
of HiRIP to hierarchical coherence measures. Improved presentation of general
hierarchically sparse signals and correction of minor typos.
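The key difference between HTP and HiHTP is the thresholding step: instead of keeping the k largest entries globally, HiHTP projects onto (s, sigma)-hierarchically sparse vectors by keeping the sigma largest entries within each block and then the s strongest blocks. A minimal numpy sketch of that operator and the resulting iteration follows; dimensions, step size, and iteration count are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def hi_threshold(z, N, n, s, sigma):
    """Project z onto (s, sigma)-hierarchically sparse vectors:
    keep the sigma largest-magnitude entries within each of the N
    length-n blocks, then keep only the s blocks of largest l2 norm."""
    Z = z.reshape(N, n).copy()
    small = np.argsort(np.abs(Z), axis=1)[:, :n - sigma]
    np.put_along_axis(Z, small, 0.0, axis=1)      # within-block thresholding
    weak = np.argsort(np.linalg.norm(Z, axis=1))[:N - s]
    Z[weak] = 0.0                                  # across-block thresholding
    return Z.reshape(-1)

def hihtp(A, y, N, n, s, sigma, iters=30):
    """HiHTP-style iteration: gradient step, hierarchical thresholding
    to detect the support, then least squares on that support."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        u = hi_threshold(x + A.T @ (y - A @ x), N, n, s, sigma)
        S = np.flatnonzero(u)
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        x = np.zeros(A.shape[1])
        x[S] = coef
    return x

# illustrative demo: a (2, 3)-hierarchically sparse signal in 16 blocks of 16
rng = np.random.default_rng(1)
N, n, s, sigma, m = 16, 16, 2, 3, 100
x0 = np.zeros(N * n)
for b in rng.choice(N, size=s, replace=False):
    pos = b * n + rng.choice(n, size=sigma, replace=False)
    x0[pos] = rng.choice([-1.0, 1.0], size=sigma)
A = rng.normal(size=(m, N * n)) / np.sqrt(m)
x_rec = hihtp(A, A @ x0, N, n, s, sigma)
```

By construction the output is always (s, sigma)-hierarchically sparse, which is what gives HiHTP its improved sampling-complexity scaling over plain HTP on such signals.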
Structured Sparsity: Discrete and Convex approaches
Compressive sensing (CS) exploits sparsity to recover sparse or compressible
signals from dimensionality-reducing, non-adaptive sensing mechanisms. Sparsity
is also used to enhance interpretability in machine learning and statistics
applications: While the ambient dimension is vast in modern data analysis
problems, the relevant information therein typically resides in a much lower
dimensional space. However, many solutions proposed nowadays do not leverage
the true underlying structure. Recent results in CS extend the simple sparsity
idea to more sophisticated {\em structured} sparsity models, which describe the
interdependency between the nonzero components of a signal, increasing the
interpretability of the results and leading to better recovery
performance. In order to better understand the impact of structured sparsity,
in this chapter we analyze the connections between the discrete models and
their convex relaxations, highlighting their relative advantages. We start with
the general group sparse model and then elaborate on two important special
cases: the dispersive and the hierarchical models. For each, we present the
models in their discrete nature, discuss how to solve the ensuing discrete
problems and then describe convex relaxations. We also consider more general
structures as defined by set functions and present their convex proxies.
Further, we discuss efficient optimization solutions for structured sparsity
problems and illustrate structured sparsity in action via three applications.

Comment: 30 pages, 18 figures.
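The discrete-versus-convex contrast the chapter analyzes is easy to see in code for the group-sparse model: the discrete model hard-selects whole groups, while the convex l1/l2 (group-lasso) relaxation acts through block soft-thresholding, its proximal operator. A minimal numpy sketch, where the group layout and the threshold lam are illustrative assumptions:

```python
import numpy as np

def block_soft_threshold(z, groups, lam):
    """Proximal operator of lam * sum_g ||x_g||_2 (the group-lasso
    penalty): shrink each group's l2 norm by lam and zero out weak
    groups, so entire groups enter or leave the support together."""
    out = np.zeros_like(z, dtype=float)
    for g in groups:
        nrm = np.linalg.norm(z[g])
        if nrm > lam:
            out[g] = (1.0 - lam / nrm) * z[g]
    return out

z = np.array([3.0, 4.0, 0.3, -0.4, 0.0, 0.1])
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
x = block_soft_threshold(z, groups, lam=1.0)
# group 0 (norm 5) survives, shrunk to norm 4; the two weak groups vanish
```

Replacing the shrinkage with hard selection of the s largest-norm groups recovers the discrete group-sparse projection, which is exactly the pair of models whose relative advantages the chapter compares.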