Robust Adaptive Median Binary Pattern for noisy texture classification and retrieval
Texture is an important cue for different computer vision tasks and
applications. The Local Binary Pattern (LBP) is considered one of the most
effective and efficient texture descriptors. However, LBP has notable
limitations, most importantly its sensitivity to noise. In this paper, we
address this limitation by introducing a novel texture descriptor, the Robust
Adaptive Median Binary Pattern (RAMBP). RAMBP is based on a classification
process for noisy pixels, an adaptive analysis window, scale analysis, and
median comparisons between image regions. The proposed method handles images
with highly noisy textures and increases discriminative power by capturing
both microstructure and macrostructure texture information.
texture information. The proposed method has been evaluated on popular texture
datasets for classification and retrieval tasks, and under different high noise
conditions. Without any training or prior knowledge of the noise type, RAMBP
achieved the best classification performance compared to state-of-the-art
techniques across a range of impulse noise densities, Gaussian noise standard
deviations, and Gaussian blur standard deviations.
The proposed method also yielded competitive results in noise-free texture
classification, ranking among the best descriptors. Furthermore, RAMBP
performed well on the problem of noisy texture retrieval, providing high
recall and precision for textures with high levels of noise.
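The abstract does not spell out the RAMBP algorithm, but the core idea of comparing neighbours against a local median rather than the centre pixel can be sketched as follows. This is a minimal illustration of a median-thresholded binary pattern, not the full RAMBP pipeline; the window size and bit ordering are assumptions.

```python
import numpy as np

def median_binary_pattern(img, radius=1):
    """Toy median-thresholded binary pattern (illustrative only; not the
    full RAMBP pipeline from the paper).

    Each pixel's 8 neighbours are compared against the median of the local
    3x3 window instead of the centre pixel, which makes the code less
    sensitive to impulse noise corrupting the centre value.
    """
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    # Offsets of the 8 neighbours, clockwise from top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            window = img[y - 1:y + 2, x - 1:x + 2]
            med = np.median(window)  # robust local reference value
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy, x + dx] >= med:
                    code |= 1 << bit
            out[y, x] = code
    return out

img = np.arange(25, dtype=float).reshape(5, 5)
codes = median_binary_pattern(img)
```

Because the median of a 3x3 window is unchanged by a single corrupted pixel in most configurations, the resulting codes are more stable under impulse noise than standard centre-thresholded LBP codes.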
Provably scale-covariant networks from oriented quasi quadrature measures in cascade
This article presents a continuous model for hierarchical networks based on a
combination of mathematically derived models of receptive fields and
biologically inspired computations. Based on a functional model of complex
cells in terms of an oriented quasi quadrature combination of first- and
second-order directional Gaussian derivatives, we couple such primitive
computations in cascade, with combinatorial expansions over image orientations.
Scale-space properties of the computational primitives are analysed and it is
shown that the resulting representation allows for provable scale and rotation
covariance. A prototype application to texture analysis is developed and it is
demonstrated that a simplified mean-reduced representation of the resulting
QuasiQuadNet leads to promising experimental results on three texture datasets.
Comment: 12 pages, 3 figures, 1 table
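An oriented quasi-quadrature measure of the kind described above combines first- and second-order directional Gaussian derivatives into a phase-insensitive energy response. The sketch below illustrates the general construction; the relative weight `C` and the scale handling are assumptions for illustration, not the exact choices in the article.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def quasi_quadrature(img, phi, sigma=2.0, C=0.5):
    """Illustrative oriented quasi-quadrature measure (a sketch; the
    weighting constant C and the scale parameter here are assumptions).

    Combines the first- and second-order directional Gaussian derivatives
    in direction phi into an energy measure, loosely modelling a complex
    cell that responds to both edge-like and ridge-like structure.
    """
    # Axis-aligned Gaussian derivative responses (axis 0 = y, axis 1 = x).
    Lx = gaussian_filter(img, sigma, order=(0, 1))
    Ly = gaussian_filter(img, sigma, order=(1, 0))
    Lxx = gaussian_filter(img, sigma, order=(0, 2))
    Lxy = gaussian_filter(img, sigma, order=(1, 1))
    Lyy = gaussian_filter(img, sigma, order=(2, 0))
    c, s = np.cos(phi), np.sin(phi)
    # Directional derivatives by steering the axis-aligned responses.
    Lphi = c * Lx + s * Ly
    Lphiphi = c * c * Lxx + 2 * c * s * Lxy + s * s * Lyy
    # Quasi-quadrature: odd and even responses combined into an energy.
    return np.sqrt(Lphi ** 2 + C * Lphiphi ** 2)

# A vertical step edge responds strongly at phi = 0 (derivative along x)
# and weakly at phi = pi/2.
img = np.zeros((32, 32))
img[:, 16:] = 1.0
q0 = quasi_quadrature(img, 0.0)
q90 = quasi_quadrature(img, np.pi / 2)
```

Cascading such measures over several orientations, and expanding combinatorially over orientation pairs, is what yields the hierarchical QuasiQuadNet representation described in the abstract.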
Dynamic texture recognition using time-causal and time-recursive spatio-temporal receptive fields
This work presents a first evaluation of using spatio-temporal receptive
fields from a recently proposed time-causal spatio-temporal scale-space
framework as primitives for video analysis. We propose a new family of video
descriptors based on regional statistics of spatio-temporal receptive field
responses and evaluate this approach on the problem of dynamic texture
recognition. Our approach generalises a previously used method, based on joint
histograms of receptive field responses, from the spatial to the
spatio-temporal domain and from object recognition to dynamic texture
recognition. The time-recursive formulation enables computationally efficient
time-causal recognition. The experimental evaluation demonstrates competitive
performance compared to the state of the art. In particular, it is shown that binary
versions of our dynamic texture descriptors achieve improved performance
compared to a large range of similar methods using different primitives either
handcrafted or learned from data. Further, our qualitative and quantitative
investigation into parameter choices and the use of different sets of receptive
fields highlights the robustness and flexibility of our approach. Together,
these results support the descriptive power of this family of time-causal
spatio-temporal receptive fields, validate our approach for dynamic texture
recognition and point towards the possibility of designing a range of video
analysis methods based on these new time-causal spatio-temporal primitives.
Comment: 29 pages, 16 figures
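The binary joint-histogram idea underlying descriptors of this kind can be sketched compactly: binarize each receptive field response, pack the bits at every pixel into a joint code, and histogram the codes. This is an illustration of the general construction, not the exact descriptor family or receptive fields used in the paper.

```python
import numpy as np

def binary_joint_histogram(responses):
    """Sketch of a binary joint-histogram descriptor over a stack of
    receptive field response maps (illustrative; the binarization
    threshold and code layout are assumptions).

    Each response map is binarized at zero, the bits at every pixel are
    packed into an integer code, and the normalized histogram of codes
    over the image is the descriptor.
    """
    responses = np.asarray(responses, dtype=float)  # (n_filters, H, W)
    n = responses.shape[0]
    bits = (responses > 0).astype(np.int64)
    # Pack one bit per filter into a joint code at each pixel.
    weights = (1 << np.arange(n)).reshape(n, 1, 1)
    codes = (bits * weights).sum(axis=0)
    hist = np.bincount(codes.ravel(), minlength=1 << n).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(0)
resp = rng.standard_normal((3, 8, 8))  # e.g. three filter responses
desc = binary_joint_histogram(resp)
```

For the spatio-temporal case, the response maps would be 3D volumes of time-causal receptive field responses and the histogram would be taken over the whole video volume; the packing-and-counting structure stays the same.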