Binary Adaptive Embeddings from Order Statistics of Random Projections
We use some of the largest order statistics of the random projections of a
reference signal to construct a binary embedding that is adapted to signals
correlated with that signal. The embedding is characterized analytically and
shown to provide improved performance on tasks such as classification in a
reduced-dimensionality space.
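The construction described above can be sketched in a few lines: project a reference signal with a random matrix, keep the indices of the k largest projections, and binarize those same projections for any incoming signal. All dimensions, the sign-based binarization, and the Hamming-distance comparison below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, k = 128, 64, 16  # signal dim, num projections, top-k (illustrative values)

A = rng.standard_normal((m, d))  # random projection matrix
x_ref = rng.standard_normal(d)   # reference signal

# Keep the indices of the k largest projections of the reference signal:
# these rows of A define the adapted embedding.
proj_ref = A @ x_ref
top_k = np.argsort(proj_ref)[-k:]

def embed(x):
    """Binary code: sign of the selected projections (assumed binarization)."""
    return (A[top_k] @ x > 0).astype(np.uint8)

# A signal correlated with the reference should get a similar binary code,
# while an independent signal should not.
x_corr = 0.9 * x_ref + 0.1 * rng.standard_normal(d)
x_rand = rng.standard_normal(d)

ham_corr = int(np.sum(embed(x_corr) != embed(x_ref)))
ham_rand = int(np.sum(embed(x_rand) != embed(x_ref)))
```

Because the selected projections are, by construction, large and positive for the reference signal, small perturbations rarely flip their signs, so `ham_corr` stays small while `ham_rand` is near k/2 in expectation.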
Analysis of SparseHash: an efficient embedding of set-similarity via sparse projections
Embeddings provide compact representations of signals in order to perform
efficient inference in a wide variety of tasks. In particular, random
projections are common tools to construct Euclidean distance-preserving
embeddings, while hashing techniques are extensively used to embed
set-similarity metrics, such as the Jaccard coefficient. In this letter, we
theoretically prove that a class of random projections based on sparse
matrices, called SparseHash, can preserve the Jaccard coefficient between the
supports of sparse signals, which can be used to estimate set similarities.
supports of sparse signals, which can be used to estimate set similarities.
Beyond the analysis, we provide an efficient implementation and evaluate its
performance in several numerical experiments on both synthetic and real
datasets. (25 pages, 6 figures)
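To make concrete the quantity being preserved, the sketch below computes the exact Jaccard coefficient between the supports of two sparse signals and estimates it with standard MinHash. This is not the paper's SparseHash construction; MinHash is swapped in here purely to illustrate set-similarity embedding, and all sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 1000  # ambient dimension (illustrative)

# Supports of two sparse signals, built to overlap on 50 coordinates.
idx1 = rng.choice(d, 100, replace=False)
idx2 = np.concatenate([idx1[:50], rng.choice(d, 60, replace=False)])
s1, s2 = set(idx1.tolist()), set(idx2.tolist())

def jaccard(a, b):
    """Exact Jaccard coefficient |a ∩ b| / |a ∪ b| between two sets."""
    return len(a & b) / len(a | b)

# MinHash: for each random permutation of {0..d-1}, record the minimum
# permuted label over the support; collisions estimate the Jaccard coefficient.
n_hashes = 512
perms = [rng.permutation(d) for _ in range(n_hashes)]

def minhash(s):
    return np.array([min(int(p[i]) for i in s) for p in perms])

true_j = jaccard(s1, s2)
est_j = float(np.mean(minhash(s1) == minhash(s2)))
```

With 512 hashes the estimator's standard deviation is roughly sqrt(J(1-J)/512), so the estimate lands within a few percent of the exact coefficient.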
Deep Graph-Convolutional Image Denoising
Non-local self-similarity is well-known to be an effective prior for the
image denoising problem. However, little work has been done to incorporate it
in convolutional neural networks, which surpass non-local model-based methods
despite only exploiting local information. In this paper, we propose a novel
end-to-end trainable neural network architecture employing layers based on
graph convolution operations, thereby creating neurons with non-local receptive
fields. The graph convolution operation generalizes the classic convolution to
arbitrary graphs. In this work, the graph is dynamically computed from
similarities among the hidden features of the network, so that the powerful
representation learning capabilities of the network are exploited to uncover
self-similar patterns. We introduce a lightweight Edge-Conditioned Convolution
which addresses vanishing gradient and over-parameterization issues of this
particular graph convolution. Extensive experiments show state-of-the-art
performance, with improved qualitative and quantitative results on both
synthetic Gaussian noise and real noise.
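The dynamic-graph step described above can be sketched as follows: build a k-nearest-neighbor graph over hidden feature vectors, then aggregate each node's neighbors. The aggregation here is a plain mean, not the paper's lightweight Edge-Conditioned Convolution, and all sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, f, k = 64, 8, 4  # nodes (pixels), feature channels, neighbors (illustrative)

H = rng.standard_normal((n, f))  # hidden feature map flattened to n node vectors

# Pairwise squared Euclidean distances between feature vectors;
# the graph is recomputed from these similarities at each layer.
d2 = np.sum((H[:, None, :] - H[None, :, :]) ** 2, axis=-1)
np.fill_diagonal(d2, np.inf)  # exclude self-loops

# Each node connects to its k most similar nodes, giving neurons
# a non-local receptive field over the whole image.
neighbors = np.argsort(d2, axis=1)[:, :k]

# Minimal aggregation: average the neighbor features
# (stand-in for the edge-conditioned convolution).
agg = H[neighbors].mean(axis=1)
```

Because the graph depends on the features rather than on pixel coordinates, two distant but self-similar patches can end up adjacent in the graph, which is what gives the layer its non-local behavior.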