227 research outputs found
Crossings of smooth shot noise processes
In this paper, we consider smooth shot noise processes and their expected
number of level crossings. When the kernel response function is sufficiently
smooth, the mean number of crossings function is obtained through an integral
formula. Moreover, as the intensity increases, or equivalently, as the number
of shots becomes larger, normal convergence to the classical Rice formula
for Gaussian processes is obtained. The Gaussian kernel function, which
corresponds to many applications in physics, is studied in detail and two
different regimes are exhibited.
Comment: Published at http://dx.doi.org/10.1214/11-AAP807 in the Annals of
Applied Probability (http://www.imstat.org/aap/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
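For reference, the classical Rice formula that the abstract's normal convergence result targets can be stated as follows (a standard textbook formulation for a centered stationary Gaussian process, not quoted from the paper):

```latex
% Mean number of crossings of level u on [0,1] by a centered, a.s. C^1,
% stationary Gaussian process X with
% lambda_0 = Var X(0) and lambda_2 = Var X'(0):
\mathbb{E}\bigl[N_u([0,1])\bigr]
  = \frac{1}{\pi}\sqrt{\frac{\lambda_2}{\lambda_0}}
    \exp\!\left(-\frac{u^2}{2\lambda_0}\right)
```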
Image denoising by statistical area thresholding
Area openings and closings are morphological filters which efficiently
suppress impulse noise from an image, by removing small connected components of
level sets. The problem of an objective choice of threshold for the area
remains open. Here, a mathematical model for random images will be considered.
Under this model, a Poisson approximation for the probability of appearance of
any local pattern can be computed. In particular, the probability of observing
a component of size larger than a given value in pure impulse noise has an explicit
form. This permits the definition of a statistical test on the significance of
connected components, thus providing an explicit formula for the area threshold
of the denoising filter, as a function of the impulse noise probability
parameter. Finally, using threshold decomposition, a denoising algorithm for
grey-level images is proposed.
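The filter this abstract builds on can be sketched directly: remove connected components of each upper level set whose area falls below the threshold, then reassemble the grey levels by threshold decomposition. Below is a minimal sketch with scipy.ndimage; the function names are ours, and the statistical choice of the threshold is the paper's contribution and is not reproduced here (area_threshold is a plain parameter). scikit-image also ships a ready-made morphology.area_opening.

```python
import numpy as np
from scipy import ndimage

def area_open(binary, area_threshold):
    """Remove connected components of a binary image whose
    area is below area_threshold (4-connectivity)."""
    labels, _ = ndimage.label(binary)
    sizes = np.bincount(labels.ravel())   # index 0 is the background
    keep = sizes >= area_threshold
    keep[0] = False                       # never keep the background label
    return keep[labels]

def denoise_grey(img, area_threshold):
    """Grey-level area opening by threshold decomposition:
    stack the area openings of all upper level sets {img >= t}.
    Works because the opened level sets remain nested in t."""
    out = np.full(img.shape, img.min(), dtype=img.dtype)
    for t in np.unique(img)[1:]:
        mask = area_open(img >= t, area_threshold)
        out[mask] = t
    return out
```

This is the naive O(#levels) decomposition; efficient implementations use max-tree algorithms instead.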
A survey of exemplar-based texture synthesis
Exemplar-based texture synthesis is the process of generating, from an input
sample, new texture images of arbitrary size that are perceptually
equivalent to the sample. The two main approaches are statistics-based methods
and patch re-arrangement methods. In the first class, a texture is
characterized by a statistical signature; then, a random sampling conditioned
to this signature produces genuinely different texture images. The second class
boils down to a clever "copy-paste" procedure, which stitches together large
regions of the sample. Hybrid methods try to combine ideas from both approaches
to avoid their respective shortcomings. The recent approaches using convolutional neural
networks fit into this classification, some being statistical and others
performing patch re-arrangement in the feature space. They produce impressive
synthesis on various kinds of textures. Nevertheless, we found that most real
textures are organized at multiple scales, with global structures revealed at
coarse scales and highly varying details at finer ones. Thus, when confronted
with large natural images of textures, the results of state-of-the-art methods
degrade rapidly, and the problem of modeling them remains wide open.
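The crudest form of patch re-arrangement can be written in a few lines, just to fix ideas: tile the output with randomly located patches of the sample, with no seam handling at all. Real methods such as quilting optimize the patch choice and stitch the seams; this caricature and its function name are ours.

```python
import numpy as np

def random_patch_tiling(sample, out_shape, patch=16, seed=0):
    """Naive patch re-arrangement: fill the output grid by copying
    randomly located square patches from the 2D sample."""
    rng = np.random.default_rng(seed)
    H, W = out_shape
    h, w = sample.shape[:2]
    out = np.zeros((H, W) + sample.shape[2:], dtype=sample.dtype)
    for i in range(0, H, patch):
        for j in range(0, W, patch):
            pi = rng.integers(0, h - patch + 1)   # random patch position
            pj = rng.integers(0, w - patch + 1)
            di = min(patch, H - i)                # clip at output border
            dj = min(patch, W - j)
            out[i:i + di, j:j + dj] = sample[pi:pi + di, pj:pj + dj]
    return out
```

Every output pixel is literally copied from the sample, which is the defining property of the patch re-arrangement class; the visible block seams are exactly what the stitching step of real methods removes.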
Significant edges in the case of a non-stationary Gaussian noise
In this paper, we propose an edge detection technique based on some local
smoothing of the image followed by a statistical hypothesis testing on the
gradient. An edge point is defined as a zero-crossing of the Laplacian; it
is said to be a significant edge point if the gradient at this point is larger
than a threshold s(ε) defined by: if the image is pure noise, then
P(‖∇I‖ ≥ s(ε) | ΔI = 0) ≤ ε. In other words,
a significant edge is an edge which has a very low probability to be there
because of noise. We will show that the threshold s(ε) can be explicitly
computed in the case of a stationary Gaussian noise. In images we are
interested in, which are obtained by tomographic reconstruction from a
radiograph, this method fails since the Gaussian noise is not stationary
anymore. But in this case again, we will be able to give the law of the
gradient conditionally on the zero-crossing of the Laplacian, and thus compute
the threshold s(ε). We will end this paper with some experiments and
compare the results with those obtained with other methods of edge
detection.
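The test described above can be caricatured in a few lines: flag Laplacian zero-crossings whose gradient norm exceeds a threshold. In the sketch below the function name is ours, and the paper's actual s(ε) is derived from the conditional law of the gradient given ΔI = 0, whereas here it is passed in as a plain parameter.

```python
import numpy as np
from scipy import ndimage

def significant_edges(img, s):
    """Return a boolean map of significant edge points: zero-crossings
    of the Laplacian where the gradient norm is at least s."""
    img = img.astype(float)
    lap = ndimage.laplace(img)
    gy, gx = np.gradient(img)
    grad_norm = np.hypot(gx, gy)
    # zero-crossing: the sign of the Laplacian changes with a
    # horizontal or vertical neighbour
    sign = lap > 0
    zc = np.zeros_like(sign)
    zc[:, 1:] |= sign[:, 1:] != sign[:, :-1]
    zc[1:, :] |= sign[1:, :] != sign[:-1, :]
    return zc & (grad_norm >= s)
```

On a noisy image, raising s toward the paper's s(ε) is what removes the spurious zero-crossings produced by noise alone.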
Mean Geometry for 2D random fields: level perimeter and level total curvature integrals
We introduce the level perimeter integral and the total curvature integral associated with a real-valued function f defined on the plane R^2, as integrals allowing one to compute the perimeter of the excursion set of f above level t and the total (signed) curvature of its boundary, for almost every level t. Thanks to the Gauss-Bonnet theorem, the total curvature is directly related to the Euler characteristic of the excursion set. We show that the level perimeter and the total curvature integrals can be explicitly computed in two different frameworks: smooth (at least C^2) functions and piecewise constant functions (also called here elementary functions). Considering 2D random fields (in particular shot noise random fields), we compute their mean perimeter and total curvature integrals, and this provides new explicit computations of the mean perimeter and Euler characteristic densities of excursion sets, beyond the Gaussian framework.
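For orientation, the identities behind these two integrals are standard facts of integral geometry; stated here under smoothness assumptions, not quoted from the paper:

```latex
\begin{aligned}
E_t(f) &= \{x \in \mathbb{R}^2 : f(x) \ge t\},\\[2pt]
\int_{\mathbb{R}} \operatorname{Per}\!\bigl(E_t(f)\bigr)\,dt
  &= \int_{\mathbb{R}^2} |\nabla f(x)|\,dx
  \quad\text{(coarea formula, } f \text{ smooth)},\\[2pt]
\int_{\partial E_t(f)} \kappa \, ds
  &= 2\pi\,\chi\!\bigl(E_t(f)\bigr)
  \quad\text{(Gauss--Bonnet, for a regular level } t\text{)}.
\end{aligned}
```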
A Wasserstein-type distance in the space of Gaussian Mixture Models
In this paper, we introduce a Wasserstein-type distance on the set of Gaussian mixture models. This distance is defined by restricting the set of possible coupling measures in the optimal transport problem to Gaussian mixture models. We derive a very simple discrete formulation for this distance, which makes it suitable for high-dimensional problems. We also study the corresponding multi-marginal and barycenter formulations. We show some properties of this Wasserstein-type distance, and we illustrate its practical use with some examples in image processing.
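The discrete formulation mentioned above reduces to a small linear program: transport mass between the mixture components, with pairwise costs given by the closed-form squared W2 distance between Gaussians. A sketch under that reading (function names are ours):

```python
import numpy as np
from scipy.linalg import sqrtm
from scipy.optimize import linprog

def gaussian_w2_sq(m1, S1, m2, S2):
    """Closed-form squared 2-Wasserstein distance between N(m1,S1), N(m2,S2)."""
    r1 = sqrtm(S1)
    cross = sqrtm(r1 @ S2 @ r1)
    return float(np.sum((m1 - m2) ** 2)
                 + np.trace(S1 + S2 - 2 * cross).real)

def mw2_sq(w1, means1, covs1, w2, means2, covs2):
    """Discrete formulation: optimal transport between the K and L
    mixture components, cost = pairwise Gaussian W2^2."""
    K, L = len(w1), len(w2)
    cost = np.array([[gaussian_w2_sq(means1[k], covs1[k], means2[l], covs2[l])
                      for l in range(L)] for k in range(K)])
    # transport LP: min <cost, w>, row sums = w1, column sums = w2, w >= 0
    A_eq = []
    for k in range(K):
        row = np.zeros((K, L)); row[k, :] = 1; A_eq.append(row.ravel())
    for l in range(L):
        col = np.zeros((K, L)); col[:, l] = 1; A_eq.append(col.ravel())
    b_eq = np.concatenate([w1, w2])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
    return float(res.fun)
```

Note the LP has only K x L variables, independent of the ambient dimension, which is what makes the distance tractable for high-dimensional problems.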