1,552 research outputs found
An Overview of Multi-Processor Approximate Message Passing
Approximate message passing (AMP) is an algorithmic framework for solving
linear inverse problems from noisy measurements, with exciting applications
such as reconstructing images, audio, hyperspectral images, and various other
signals, including those acquired in compressive signal acquisition systems. The
growing prevalence of big data systems has increased interest in large-scale
problems, which may involve huge measurement matrices that are unsuitable for
conventional computing systems. To address the challenge of large-scale
processing, multiprocessor (MP) versions of AMP have been developed. We provide
an overview of two such MP-AMP variants. In row-MP-AMP, each computing node
stores a subset of the rows of the matrix and processes corresponding
measurements. In column-MP-AMP, each node stores a subset of columns and is
solely responsible for reconstructing a portion of the signal. We will discuss
pros and cons of both approaches, summarize recent research results for each,
and explain when each one may be a viable approach. Highlighted aspects include
recent results on state evolution for both MP-AMP algorithms and the use of
data compression to reduce communication in the MP network.
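The two partitioning schemes described above can be sketched as follows; the node count, matrix dimensions, and variable names are illustrative assumptions, not details taken from the overview itself.

```python
import numpy as np

# Hypothetical sizes: M measurements, N signal entries, P computing nodes.
M, N, P = 12, 8, 4
rng = np.random.default_rng(0)
A = rng.standard_normal((M, N))   # full measurement matrix
x = rng.standard_normal(N)        # signal to recover
y = A @ x                         # noiseless measurements, for illustration

# Row-MP-AMP: node p stores a block of rows of A together with the
# corresponding measurements.
row_blocks = np.array_split(np.arange(M), P)
row_nodes = [(A[idx, :], y[idx]) for idx in row_blocks]

# Column-MP-AMP: node p stores a block of columns of A and is responsible
# for reconstructing only the matching portion of the signal.
col_blocks = np.array_split(np.arange(N), P)
col_nodes = [A[:, idx] for idx in col_blocks]

# Sanity check: each partition reassembles the original operator.
assert np.allclose(np.vstack([Ap for Ap, _ in row_nodes]), A)
assert np.allclose(np.hstack(col_nodes), A)
```

In a full MP-AMP iteration each node would apply its stored block locally and exchange partial results over the network, which is where the communication cost (and the compression results mentioned above) enter.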
Efficient LDPC Codes over GF(q) for Lossy Data Compression
In this paper we consider the lossy compression of a binary symmetric source.
We present a scheme that provides a low complexity lossy compressor with near
optimal empirical performance. The proposed scheme is based on b-reduced
ultra-sparse LDPC codes over GF(q). Encoding is performed by the Reinforced
Belief Propagation algorithm, a variant of Belief Propagation. The
computational complexity at the encoder is O(d·n·q·log q), where d is the
average degree of the check nodes. For our code ensemble, decoding can be
performed iteratively following the inverse steps of the leaf removal
algorithm. For a sparse parity-check matrix the number of needed operations is
O(n).
Comment: 5 pages, 3 figures
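The iterative decoding by inverse leaf removal can be sketched as a peeling solver; for concreteness this toy sketch works over GF(2) rather than GF(q), and the matrix H is a small hypothetical example, not a code from the b-reduced ultra-sparse ensemble.

```python
import numpy as np

def peel_solve(H, s):
    """Solve H x = s (mod 2) by peeling: repeatedly find a check with a
    single unknown variable and fix it. Succeeds only if the system peels
    completely; each edge of a sparse H is touched O(1) times."""
    m, n = H.shape
    x = np.full(n, -1)                           # -1 marks an unknown entry
    unknown = [set(np.flatnonzero(H[i])) for i in range(m)]
    resid = s.copy()                             # syndrome minus known variables
    progress = True
    while progress:
        progress = False
        for i in range(m):
            if len(unknown[i]) == 1:             # degree-1 check: solve its variable
                j = unknown[i].pop()
                x[j] = resid[i]
                progress = True
                for k in range(m):               # update every check touching j
                    if j in unknown[k]:
                        unknown[k].discard(j)
                        resid[k] ^= x[j]
    return x if (x >= 0).all() else None

# Tiny sparse example: a system that peels completely.
H = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]])
x_true = np.array([1, 0, 1, 1])
s = H @ x_true % 2
print(peel_solve(H, s))   # recovers x_true: [1 0 1 1]
```

If no degree-1 check remains before all variables are fixed (a non-empty 2-core), this simple sketch reports failure by returning None; the paper's ensemble is designed so that decoding proceeds in O(n) operations.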