Per-channel regularization for regression-based spectral reconstruction
Spectral reconstruction algorithms seek to recover spectra from RGB images. This estimation problem is often formulated as least-squares regression, and Tikhonov regularization is typically incorporated, both to support stable estimation in the presence of noise and to prevent over-fitting. The degree of regularization is controlled by a single penalty parameter, which is usually selected by cross-validation. In this paper, we generalize this simple regularization approach to a per-spectral-channel optimization setting and develop a modified cross-validation procedure. Experiments validate our method: compared to conventional regularization, the per-channel approach significantly improves reconstruction accuracy at multiple spectral channels, by up to 17% for all the considered models.
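The idea above can be sketched numerically. The following toy example contrasts a single shared Tikhonov penalty with a penalty chosen separately for each spectral channel on a held-out split; all dimensions, variable names, and the simple train/validation split are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Toy setup: recover S spectral channels from RGB triplets (all sizes illustrative).
rng = np.random.default_rng(0)
n, S = 200, 31                                   # samples, spectral channels
W_true = rng.normal(size=(3, S))                 # unknown RGB-to-spectrum map
X = rng.normal(size=(n, 3))                      # RGB inputs
Y = X @ W_true + 0.1 * rng.normal(size=(n, S))   # noisy spectra

def ridge(X, y, lam):
    """Tikhonov-regularized least squares: (X^T X + lam I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Conventional approach: one penalty shared by all spectral channels.
lam_global = 1e-2
W_global = ridge(X, Y, lam_global)

# Per-channel approach: pick the best penalty for each channel separately on a
# held-out split (a simple stand-in for the paper's modified cross-validation).
X_tr, Y_tr, X_va, Y_va = X[:150], Y[:150], X[150:], Y[150:]
grid = 10.0 ** np.arange(-6, 2)
def val_mse(j, lam):
    w = ridge(X_tr, Y_tr[:, j], lam)
    return np.mean((X_va @ w - Y_va[:, j]) ** 2)
W_perch = np.column_stack([
    ridge(X_tr, Y_tr[:, j], min(grid, key=lambda lam: val_mse(j, lam)))
    for j in range(S)
])
```

Because each column of `W_perch` gets its own penalty, channels with different noise sensitivities are no longer forced to share one compromise value.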
Learning Sets with Separating Kernels
We consider the problem of learning a set from random samples. We show how
relevant geometric and topological properties of a set can be studied
analytically using concepts from the theory of reproducing kernel Hilbert
spaces. A new kind of reproducing kernel, that we call separating kernel, plays
a crucial role in our study and is analyzed in detail. We prove a new analytic
characterization of the support of a distribution, that naturally leads to a
family of provably consistent regularized learning algorithms and we discuss
the stability of these methods with respect to random sampling. Numerical
experiments show that the approach is competitive with, and often better than, other state-of-the-art techniques. Comment: final version
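The set-learning idea can be illustrated with a minimal sketch: using a Gaussian kernel, declare a point "in the set" when the function k(x, ·) is well approximated by the (regularized) span of the sample functions k(x_i, ·). The kernel width, regularization value, and decision quantity below are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

# Samples drawn from the set to be learned (here, the square [-1, 1]^2).
rng = np.random.default_rng(1)
samples = rng.uniform(-1.0, 1.0, size=(300, 2))

sigma, lam = 0.3, 1e-3   # illustrative kernel width and regularization

def gram(A, B):
    """Gaussian kernel matrix k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

K = gram(samples, samples)
Kinv = np.linalg.inv(K + lam * np.eye(len(samples)))

def residual(x):
    """Squared distance of k(x, .) to the regularized sample span;
    small residual -> x is judged to lie in the support."""
    kx = gram(x[None, :], samples).ravel()
    return 1.0 - kx @ Kinv @ kx   # k(x, x) = 1 for the Gaussian kernel

inside = residual(np.array([0.0, 0.0]))    # point inside the square
outside = residual(np.array([3.0, 3.0]))   # point far outside the square
```

Points inside the sampled set give a residual near zero, while points far away give a residual near one, so thresholding this quantity yields a set estimate.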
Atomic norm denoising with applications to line spectral estimation
Motivated by recent work on atomic norms in inverse problems, we propose a
new approach to line spectral estimation that provides theoretical guarantees
for the mean-squared-error (MSE) performance in the presence of noise and
without knowledge of the model order. We propose an abstract theory of
denoising with atomic norms and specialize this theory to provide a convex
optimization problem for estimating the frequencies and phases of a mixture of
complex exponentials. We show that the associated convex optimization problem
can be solved in polynomial time via semidefinite programming (SDP). We also
show that the SDP can be approximated by an l1-regularized least-squares
problem that achieves nearly the same error rate as the SDP but can scale to
much larger problems. We compare both SDP and l1-based approaches with
classical line spectral analysis methods and demonstrate that the SDP
outperforms the l1 optimization, which in turn outperforms the MUSIC, Cadzow's, and Matrix Pencil approaches in terms of MSE over a wide range of signal-to-noise ratios. Comment: 27 pages, 10 figures. A preliminary version of this work appeared in the Proceedings of the 49th Annual Allerton Conference in September 2011. Numerous numerical experiments were added to this version in accordance with suggestions by anonymous reviewers.
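The l1-regularized route mentioned above can be sketched as a lasso over a grid of sampled complex exponentials, solved here with plain ISTA (proximal gradient). The grid size, penalty, step size, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

# Noisy mixture of two on-grid complex sinusoids (illustrative setup).
rng = np.random.default_rng(0)
n, G = 64, 512                              # signal length, frequency grid size
t = np.arange(n)
freqs_true = (0.125, 0.3125)                # normalized frequencies on the grid
y = sum(np.exp(2j * np.pi * f * t) for f in freqs_true)
y = y + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Dictionary of unit-norm sampled complex exponentials.
grid = np.arange(G) / G
A = np.exp(2j * np.pi * np.outer(t, grid)) / np.sqrt(n)

# ISTA for min_x 0.5 ||A x - y||^2 + lam ||x||_1 (complex soft thresholding).
lam = 0.5
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant
x = np.zeros(G, dtype=complex)
for _ in range(500):
    z = x - step * (A.conj().T @ (A @ x - y))        # gradient step
    mag = np.abs(z)
    x = z * np.maximum(1 - step * lam / np.maximum(mag, 1e-12), 0)

# Large coefficients cluster at the true frequencies.
peak_freqs = grid[np.abs(x) > 0.3 * np.abs(x).max()]
```

Off-grid frequencies introduce basis mismatch, which is one motivation for the gridless SDP formulation the abstract compares against.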
Non-convex regularization in remote sensing
In this paper, we study the effect of different regularizers and their
implications in high dimensional image classification and sparse linear
unmixing. Although kernelization and sparse methods are widely accepted solutions for processing high-dimensional data, we present here a study of the impact of the form of regularization used and of its parametrization. We consider regularization via the traditional squared ℓ2 norm and the sparsity-promoting ℓ1 norm, as well as the more unconventional nonconvex ℓp and Log Sum Penalty regularizers. We compare their properties and advantages on several classification and linear unmixing tasks and provide advice on the choice of the best
regularizer for the problem at hand. Finally, we also provide a fully
functional toolbox for the community. Comment: 11 pages, 11 figures
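The qualitative difference between these regularizer families shows up in their shrinkage (proximal) maps, sketched below. The Log Sum Penalty prox is derived by solving the scalar stationarity condition; the `eps` parameter and test values are illustrative assumptions.

```python
import numpy as np

def prox_sq(z, lam):
    """prox of lam * x^2: multiplicative shrinkage (ridge); never exactly zero."""
    return z / (1 + 2 * lam)

def prox_l1(z, lam):
    """prox of lam * |x|: soft thresholding; zeroes small entries,
    biases large ones downward by lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def prox_logsum(z, lam, eps):
    """prox of lam * log(1 + |x|/eps): solve x - |z| + lam/(eps + x) = 0
    for x > 0 and keep the root only when it beats x = 0."""
    a = np.abs(z)
    disc = (a - eps) ** 2 - 4.0 * (lam - a * eps)
    root = np.where(disc >= 0, (a - eps + np.sqrt(np.maximum(disc, 0))) / 2, 0.0)
    root = np.maximum(root, 0.0)
    better = 0.5 * (root - a) ** 2 + lam * np.log1p(root / eps) < 0.5 * a ** 2
    return np.sign(z) * np.where(better, root, 0.0)

z = np.array([0.3, 1.2, 3.0])
shrunk_sq = prox_sq(z, 1.0)            # shrinks everything, no exact zeros
shrunk_l1 = prox_l1(z, 1.0)            # sparse, but biases large entries
shrunk_ls = prox_logsum(z, 1.0, 0.1)   # sparse, barely shrinks large entries
```

This illustrates the usual trade-off: the nonconvex penalty keeps the sparsity of ℓ1 while shrinking large coefficients far less, at the price of a nonconvex objective.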