Outlier robust corner-preserving methods for reconstructing noisy images
The ability to remove a large amount of noise and the ability to preserve
most structure are desirable properties of an image smoother. Unfortunately,
they usually seem to be at odds with each other; one can only improve one
property at the cost of the other. By combining M-smoothing and
least-squares-trimming, the TM-smoother is introduced as a means to unify
corner-preserving properties and outlier robustness. To identify edge- and
corner-preserving properties, a new theory based on differential geometry is
developed. Further, robustness concepts are transferred to image processing. In
two examples, the TM-smoother outperforms other corner-preserving smoothers. A
software package containing both the TM- and the M-smoother can be downloaded
from the Internet.
Comment: Published at http://dx.doi.org/10.1214/009053606000001109 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
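The trimming idea behind the TM-smoother can be made concrete with a few lines of code. Below is a minimal sketch of a least-trimmed-squares window smoother in Python; the function names `lts_location` and `trimmed_smoother` and the parameter defaults are illustrative assumptions, not the authors' TM-smoother, which additionally incorporates M-smoothing weights:

```python
import numpy as np

def lts_location(values, h):
    """Least-trimmed-squares location estimate of a 1-D sample: the mean
    of the h contiguous sorted values with the smallest sum of squares."""
    v = np.sort(values)
    best_mean, best_ss = v[:h].mean(), np.inf
    for i in range(len(v) - h + 1):
        chunk = v[i:i + h]
        ss = ((chunk - chunk.mean()) ** 2).sum()
        if ss < best_ss:
            best_ss, best_mean = ss, chunk.mean()
    return best_mean

def trimmed_smoother(img, radius=1, trim=0.75):
    """Replace each pixel by the LTS location of its local window,
    keeping only the best-fitting fraction `trim` of neighbours."""
    pad = np.pad(img, radius, mode='reflect')
    k = 2 * radius + 1
    h = max(2, int(round(trim * k * k)))
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = lts_location(pad[i:i + k, j:j + k].ravel(), h)
    return out
```

Because each window estimate ignores the most deviant neighbours, isolated outliers are discarded while sharp level changes inside the retained subset can survive.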
A review of domain adaptation without target labels
Domain adaptation has become a prominent problem setting in machine learning
and related fields. This review asks the question: how can a classifier learn
from a source domain and generalize to a target domain? We present a
categorization of approaches into what we refer to as sample-based,
feature-based, and inference-based methods. Sample-based methods focus on
weighting individual observations during training based on their importance to
the target domain. Feature-based methods revolve around mapping, projecting,
and representing features such that a source classifier performs well on the
target domain. Inference-based methods incorporate adaptation into the
parameter estimation procedure, for instance through constraints on the
optimization procedure. Additionally, we review a number of conditions that
allow for formulating bounds on the cross-domain generalization error. Our
categorization highlights recurring ideas and raises questions important to
further research.
Comment: 20 pages, 5 figures
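As an illustration of the sample-based family, here is a minimal sketch of importance weighting with a probabilistic domain discriminator; the helper name `importance_weights` and the use of scikit-learn's LogisticRegression are illustrative assumptions, not a specific method from the review:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def importance_weights(X_source, X_target):
    """Estimate w(x) = p_target(x) / p_source(x) on the source sample via a
    probabilistic domain discriminator: w(x) is proportional to
    P(target | x) / P(source | x)."""
    X = np.vstack([X_source, X_target])
    domain = np.r_[np.zeros(len(X_source)), np.ones(len(X_target))]
    disc = LogisticRegression(max_iter=1000).fit(X, domain)
    p = disc.predict_proba(X_source)[:, 1]          # P(target | x)
    odds = p / (1.0 - p)
    return odds * len(X_source) / len(X_target)     # correct for domain priors

# A source classifier can then be trained with these weights, e.g.:
#   w = importance_weights(X_s, X_t)
#   clf = LogisticRegression().fit(X_s, y_s, sample_weight=w)
```

The weights can be passed to any estimator that accepts `sample_weight`, so the source classifier is trained as if it had seen target-distributed data.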
Direct Ensemble Estimation of Density Functionals
Estimating density functionals of analog sources is an important problem in
statistical signal processing and information theory. Traditionally, estimating
these quantities requires either making parametric assumptions about the
underlying distributions or using non-parametric density estimation followed by
integration. In this paper we introduce a direct nonparametric approach which
bypasses the need for density estimation by using the error rates of k-NN
classifiers as data-driven basis functions that can be combined to estimate a
range of density functionals. However, this method is subject to a non-trivial
bias that dramatically slows the rate of convergence in higher dimensions. To
overcome this limitation, we develop an ensemble method for estimating the
value of the basis function which, under some minor constraints on the
smoothness of the underlying distributions, achieves the parametric rate of
convergence regardless of data dimension.
Comment: 5 pages
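To make the "error rates as basis functions" idea concrete, here is a toy sketch that computes cross-validated k-NN error rates between two samples for several k; the function name, the 5-fold cross-validation, and the plain combination hinted at in the comments are illustrative assumptions, whereas the paper's ensemble solves for weights that cancel the leading bias terms:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def knn_error_rates(X_p, X_q, ks=(1, 3, 5, 7)):
    """Cross-validated k-NN error rates for distinguishing a sample from p
    against a sample from q; these serve as data-driven basis values."""
    X = np.vstack([X_p, X_q])
    y = np.r_[np.zeros(len(X_p)), np.ones(len(X_q))]
    errs = []
    for k in ks:
        acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                              X, y, cv=5).mean()
        errs.append(1.0 - acc)
    return np.array(errs)

# A crude functional estimate would combine these values, e.g. averaging
# 1 - 2*err over k for balanced samples; the paper's ensemble instead
# chooses combination weights to achieve the parametric convergence rate.
```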
Learning how to be robust: Deep polynomial regression
Polynomial regression is a recurrent problem with a large number of
applications. In computer vision it often appears in motion analysis. Whatever
the application, standard methods for regression of polynomial models tend to
deliver biased results when the input data is heavily contaminated by outliers.
Moreover, the problem is even harder when outliers have strong structure.
Departing from problem-tailored heuristics for robust estimation of parametric
models, we explore deep convolutional neural networks. Our work aims to find a
generic approach for training deep regression models without the need for
explicit supervised annotation. We bypass the need for a tailored loss function on
the regression parameters by attaching to our model a differentiable hard-wired
decoder corresponding to the polynomial operation at hand. We demonstrate the
value of our findings by comparing with standard robust regression methods.
Furthermore, we demonstrate how to use such models for a real computer vision
problem, i.e., video stabilization. Qualitative and quantitative experiments
show that neural networks can learn robustness for general polynomial
regression, with results that clearly surpass those of traditional robust
estimation methods.
Comment: 18 pages, conference
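A minimal PyTorch sketch of the hard-wired decoder idea follows; the architecture, sizes, and training loop are illustrative assumptions, not the paper's network. The network predicts coefficients, the fixed decoder evaluates the polynomial, and the loss compares reconstructions to observations, so no coefficient labels are needed:

```python
import torch
import torch.nn as nn

class CoefficientNet(nn.Module):
    """Maps n noisy samples (x_i, y_i) to degree+1 polynomial coefficients."""
    def __init__(self, n_points=64, degree=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * n_points, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, degree + 1))

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1))

def decode(coeffs, x):
    """Hard-wired differentiable decoder: evaluate the polynomial at x."""
    powers = torch.stack([x ** k for k in range(coeffs.shape[-1])], dim=-1)
    return (powers * coeffs.unsqueeze(-2)).sum(dim=-1)

# One self-supervised training step: the loss lives in observation space,
# so no ground-truth coefficients are ever required.
model = CoefficientNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 64) * 2 - 1                                 # positions
y = decode(torch.randn(32, 3), x) + 0.1 * torch.randn(32, 64)  # noisy data
opt.zero_grad()
loss = (decode(model(x, y), x) - y).abs().mean()               # L1 data term
loss.backward()
opt.step()
```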
Nonlocal Myriad Filters for Cauchy Noise Removal
The contribution of this paper is two-fold. First, we introduce a generalized
myriad filter, which is a method to compute the joint maximum likelihood
estimator of the location and the scale parameter of the Cauchy distribution.
Estimating only the location parameter is known as the myriad filter. We
propose an efficient algorithm to compute the generalized myriad filter and
prove its convergence. Special cases of this algorithm recover the classical
myriad filter and, respectively, an algorithm for estimating only the scale
parameter.
Based on an asymptotic analysis, we develop a second, even faster generalized
myriad filtering technique.
Second, we use our new approaches within a nonlocal, fully unsupervised
method to denoise images corrupted by Cauchy noise. Special attention is paid
to the determination of similar patches in noisy images. Numerical examples
demonstrate the excellent performance of our algorithms, which moreover have
the advantage of being robust with respect to the parameter choice.
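For intuition, here is a sketch of a standard fixed-point iteration for the joint Cauchy maximum-likelihood equations (theta = sum of w_i x_i / sum of w_i and gamma^2 = n / (2 sum of w_i) with w_i = 1/(gamma^2 + (x_i - theta)^2)); the initialization and stopping rule are illustrative, and the paper's generalized myriad filter and its convergence proof are developed in the text itself:

```python
import numpy as np

def cauchy_ml(x, iters=200, tol=1e-10):
    """Joint maximum-likelihood estimate of the Cauchy location theta and
    scale gamma by fixed-point iteration on the likelihood equations."""
    theta = np.median(x)
    gamma = np.median(np.abs(x - theta)) + 1e-12    # MAD initialization
    for _ in range(iters):
        w = 1.0 / (gamma ** 2 + (x - theta) ** 2)   # Cauchy weights
        theta_new = (w * x).sum() / w.sum()          # weighted-mean update
        gamma_new = np.sqrt(len(x) / (2.0 * w.sum()))
        if abs(theta_new - theta) + abs(gamma_new - gamma) < tol:
            return theta_new, gamma_new
        theta, gamma = theta_new, gamma_new
    return theta, gamma
```

Freezing gamma and iterating only the theta update gives the classical myriad filter as a special case.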
Robust Orthogonal Complement Principal Component Analysis
Recently, the robustification of principal component analysis has attracted
considerable attention from statisticians, engineers, and computer scientists.
In
this work we study the type of outliers that are not necessarily apparent in
the original observation space but can seriously affect the principal subspace
estimation. Based on a mathematical formulation of such transformed outliers, a
novel robust orthogonal complement principal component analysis (ROC-PCA) is
proposed. The framework combines the popular sparsity-enforcing and low-rank
regularization techniques to deal with row-wise outliers as well as
element-wise outliers. A non-asymptotic oracle inequality guarantees the
accuracy and high breakdown performance of ROC-PCA in finite samples. To tackle
the computational challenges, an efficient algorithm is developed on the basis
of Stiefel manifold optimization and iterative thresholding. Furthermore, a
batch variant is proposed to significantly reduce the cost in ultra-high
dimensions. The paper also points out a pitfall in the common practice of SVD
reduction in robust PCA. Experiments show the effectiveness and efficiency of
ROC-PCA on both synthetic and real data.
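A heavily simplified sketch of the sparsity-plus-low-rank alternation appears below; the function names and the fixed threshold `tau` are illustrative assumptions, and the actual ROC-PCA optimizes over the Stiefel manifold of orthogonal complements rather than this plain decomposition:

```python
import numpy as np

def soft_threshold(A, tau):
    """Element-wise soft-thresholding, the sparsity-enforcing step."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def robust_pca_alternation(X, rank, tau=0.1, iters=100):
    """Alternate a rank-constrained fit L (truncated SVD) with a sparse
    outlier matrix S (soft-thresholding of the residual X - L)."""
    S = np.zeros_like(X)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]    # low-rank update
        S = soft_threshold(X - L, tau)              # sparse outlier update
    return L, S
```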
- …