Exploiting Image Local And Nonlocal Consistency For Mixed Gaussian-Impulse Noise Removal
Most existing image denoising algorithms can only deal with a single type of
noise, which is at odds with the fact that, in practice, noisy observed images
often suffer from more than one type of noise during acquisition and
transmission. In this paper, we propose a new variational
algorithm for mixed Gaussian-impulse noise removal by exploiting image local
consistency and nonlocal consistency simultaneously. Specifically, the local
consistency is measured by a hyper-Laplacian prior, enforcing the local
smoothness of images, while the nonlocal consistency is measured by
three-dimensional sparsity of similar blocks, enforcing the nonlocal
self-similarity of natural images. Moreover, a Split-Bregman based technique is
developed to solve the above optimization problem efficiently. Extensive
experiments on mixed Gaussian plus impulse noise show significant performance
improvements over current state-of-the-art schemes, substantiating the
effectiveness of the proposed algorithm.

Comment: 6 pages, 4 figures, 3 tables, to be published at IEEE Int. Conf. on
Multimedia & Expo (ICME) 201
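The abstract names a Split-Bregman solver but does not give the full mixed-noise model. As a hedged illustration of the splitting idea only, here is a minimal sketch of Split-Bregman iteration on a simpler 1D total-variation denoising problem, min_u (mu/2)||u - f||^2 + ||Du||_1; the objective, the parameters `mu` and `lam`, and all function names are assumptions for this sketch, not the paper's algorithm.

```python
import numpy as np

def shrink(x, t):
    # Soft-thresholding: closed-form proximal step for the l1 term.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def split_bregman_tv_1d(f, mu=10.0, lam=1.0, iters=100):
    """Denoise a 1D signal by min_u (mu/2)||u - f||^2 + ||Du||_1.

    Split-Bregman introduces an auxiliary variable d = Du and a Bregman
    variable b, alternating a linear solve, a shrinkage, and a Bregman
    update. (Illustrative sketch, not the mixed-noise model above.)"""
    n = len(f)
    D = np.eye(n, k=1) - np.eye(n)   # forward-difference operator
    D[-1, :] = 0.0                   # zero gradient at the boundary
    A = mu * np.eye(n) + lam * D.T @ D
    u, d, b = f.copy(), np.zeros(n), np.zeros(n)
    for _ in range(iters):
        u = np.linalg.solve(A, mu * f + lam * D.T @ (d - b))  # quadratic subproblem
        d = shrink(D @ u + b, 1.0 / lam)                      # l1 subproblem
        b = b + D @ u - d                                     # Bregman update
    return u

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, 0.0], 50)          # piecewise-constant signal
noisy = clean + 0.2 * rng.standard_normal(150)
denoised = split_bregman_tv_1d(noisy)
err_noisy = np.abs(noisy - clean).mean()
err_denoised = np.abs(denoised - clean).mean()
print(err_denoised < err_noisy)
```

Because the signal is piecewise constant, the TV penalty suppresses the Gaussian noise while preserving the jumps, so the denoised error should fall below the noisy error.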
Scalable k-Means Clustering via Lightweight Coresets
Coresets are compact representations of data sets such that models trained on
a coreset are provably competitive with models trained on the full data set. As
such, they have been successfully used to scale up clustering models to massive
data sets. While existing approaches generally only allow for multiplicative
approximation errors, we propose a novel notion of lightweight coresets that
allows for both multiplicative and additive errors. We provide a single
algorithm to construct lightweight coresets for k-means clustering as well as
soft and hard Bregman clustering. The algorithm is substantially faster than
existing constructions, embarrassingly parallel, and the resulting coresets are
smaller. We further show that the proposed approach naturally generalizes to
statistical k-means clustering and that, compared to existing results, it can
be used to compute smaller summaries for empirical risk minimization. In
extensive experiments, we demonstrate that the proposed algorithm outperforms
existing data summarization strategies in practice.Comment: To appear in the 24th ACM SIGKDD International Conference on
Knowledge Discovery & Data Mining (KDD
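The abstract does not spell out the construction, but an importance-sampling coreset in this spirit, mixing uniform sampling with sampling proportional to squared distance from the data mean and reweighting for unbiasedness, can be sketched as follows; the exact probabilities, constants, and names are assumptions for illustration.

```python
import numpy as np

def lightweight_coreset(X, m, rng=None):
    """Sample a weighted coreset of m points from X (n x d).

    Sketch of the lightweight-coreset idea: sampling probabilities mix a
    uniform term with a term proportional to squared distance from the
    data mean; weights are inverse sampling probabilities, so weighted
    sums over the coreset are unbiased estimates of sums over X.
    (Constants here are assumptions, not taken from the abstract.)"""
    rng = rng or np.random.default_rng()
    n = len(X)
    dist2 = ((X - X.mean(axis=0)) ** 2).sum(axis=1)
    q = 0.5 / n + 0.5 * dist2 / dist2.sum()   # sampling distribution
    idx = rng.choice(n, size=m, p=q)          # i.i.d. draws from q
    weights = 1.0 / (m * q[idx])              # inverse-probability weights
    return X[idx], weights

rng = np.random.default_rng(1)
X = rng.standard_normal((10_000, 2))
C, w = lightweight_coreset(X, 500, rng)
# Weighted statistics of the coreset approximate those of the full data,
# e.g. the weighted mean of C should be close to X.mean(axis=0).
print((w[:, None] * C).sum(axis=0) / w.sum())
```

A weighted k-means solver run on `(C, w)` then stands in for clustering the full data set, which is where the scalability claim comes from.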
Curse of dimensionality reduction in max-plus based approximation methods: theoretical estimates and improved pruning algorithms
Max-plus based methods have been recently developed to approximate the value
function of possibly high dimensional optimal control problems. A critical step
of these methods consists in approximating a function by a supremum of a small
number of functions (max-plus "basis functions") taken from a prescribed
dictionary. We study several variants of this approximation problem, which we
show to be continuous versions of the facility location and k-center
combinatorial optimization problems, in which the connection costs arise from a
Bregman distance. We give theoretical error estimates, quantifying the number
of basis functions needed to reach a prescribed accuracy. We derive from our
approach a refinement of the curse-of-dimensionality-free method introduced
previously by McEneaney, with higher accuracy for a comparable computational
cost.

Comment: 8 pages, 5 figures
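To make the "supremum of a small number of basis functions" idea concrete, here is a hedged sketch in which a convex function is approximated by the max of tangent affine functions, a standard max-plus-style approximation; the target function, its gradient, and the dictionary of tangency points are illustrative assumptions, not the paper's setting.

```python
import numpy as np

def maxplus_affine_approx(f, grad, centers):
    """Approximate a convex function f by the supremum of its tangent
    affine functions at the given centers:

        f(x) ~ max_i [ f(c_i) + f'(c_i) * (x - c_i) ]

    Each tangent plays the role of a max-plus 'basis function'; the
    number of centers controls the approximation error."""
    a = np.array([grad(c) for c in centers])              # slopes
    b = np.array([f(c) - grad(c) * c for c in centers])   # intercepts
    return lambda x: np.max(a * np.asarray(x)[..., None] + b, axis=-1)

f = lambda x: x ** 2
grad = lambda x: 2 * x
centers = np.linspace(-1.0, 1.0, 9)   # dictionary of 9 tangency points
approx = maxplus_affine_approx(f, grad, centers)

xs = np.linspace(-1.0, 1.0, 201)
err = np.max(f(xs) - approx(xs))   # one-sided: tangents lie below f
print(f"max approximation error: {err:.4f}")
```

For x^2 with tangency points spaced h apart, the worst-case gap is h^2/4, attained midway between neighboring centers, which is the kind of error-versus-dictionary-size trade-off the abstract's estimates quantify.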