269 research outputs found
A Kind of Affine Weighted Moment Invariants
This paper proposes a new kind of geometric invariant, called the affine weighted moment invariant (AWMI). By combining local affine differential invariants with a global integral framework, AWMIs extract image features more effectively, increase the number of available low-order invariants, and reduce the computational cost. Experimental results show that AWMIs are stable and discriminative, and that they achieve better image-retrieval results than traditional moment invariants. An extension to 3D is straightforward.
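As context for the comparison above, the "traditional moment invariants" baseline can be sketched with Hu's first two rotation invariants, built from scale-normalized central moments. This is the standard classical construction, not the paper's AWMI method; the image and function names are illustrative.

```python
import numpy as np

def hu_first_invariants(img):
    """First two Hu moment invariants (classical baseline, not AWMI).

    phi1 = eta20 + eta02
    phi2 = (eta20 - eta02)^2 + 4*eta11^2
    Both are invariant to translation, scale, and rotation.
    """
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xb, yb = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):  # central moment (translation-invariant)
        return ((x - xb) ** p * (y - yb) ** q * img).sum()

    def eta(p, q):  # scale-normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2

img = np.zeros((16, 16))
img[3:9, 5:14] = 1.0  # asymmetric test blob

a = hu_first_invariants(img)
b = hu_first_invariants(np.rot90(img))  # exact 90-degree rotation
print(np.allclose(a, b))  # prints True: invariants unchanged by rotation
```

A 90-degree rotation is used because it is exact on the pixel grid (the moments permute as eta20 ↔ eta02, eta11 → -eta11), so the invariance holds to machine precision without interpolation error.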
Rotation of 2D orthogonal polynomials
© 2017 Elsevier. This manuscript version is made available under the CC-BY-NC-ND 4.0 license: http://creativecommons.org/licenses/by-nc-nd/4.0/
This author accepted manuscript is made available following a 24-month embargo from the date of publication (Dec 2017), in accordance with the publisher's archiving policy.

Orientation-independent object recognition mostly relies on rotation invariants. Invariants derived from moments orthogonal on a square have favorable numerical properties, but they are difficult to construct. This paper presents necessary and sufficient conditions that 2D separable orthogonal polynomials must fulfill in order to transform under rotation in the same way as the monomials. When these conditions are met, the rotation property propagates from the polynomials to the moments and allows a straightforward derivation of rotation invariants. We show that only orthogonal polynomials belonging to a specific class exhibit this property; we call them Hermite-like polynomials.
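The rotation property that monomials (and, per the abstract, Hermite-like polynomials) satisfy can be illustrated with ordinary complex moments: under a rotation by theta, c_pq picks up a pure phase e^{i(p-q)theta}, so magnitudes |c_pq| are rotation invariants. This is a standard construction shown as a sketch, not the paper's polynomial derivation; the test image is illustrative.

```python
import numpy as np

def complex_moment(img, p, q):
    """Centroid-centered complex moment c_pq = sum z^p conj(z)^q f(x, y).

    Rotating the image by theta multiplies c_pq by e^{i(p-q)theta},
    so |c_pq| is invariant to rotation (and, via centering, translation).
    """
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xb, yb = (x * img).sum() / m00, (y * img).sum() / m00
    z = (x - xb) + 1j * (y - yb)
    return (z ** p * np.conj(z) ** q * img).sum()

img = np.zeros((16, 16))
img[2:8, 3:12] = 1.0   # L-shaped test blob (no rotational symmetry)
img[8:12, 3:6] = 1.0

# |c_pq| is unchanged by an exact 90-degree rotation:
print(np.isclose(abs(complex_moment(img, 2, 0)),
                 abs(complex_moment(np.rot90(img), 2, 0))))  # prints True

# The phase factor is e^{i(p-q)theta}; for p-q = 2 and a 90-degree
# turn this is e^{i*pi} = -1, so c_20 simply flips sign:
print(np.isclose(complex_moment(np.rot90(img), 2, 0),
                 -complex_moment(img, 2, 0)))  # prints True
```

Rotation invariants are then obtained by canceling the phase, e.g. |c_pq| or products such as c_pq * c_rs with (p - q) + (r - s) = 0; the paper's contribution is characterizing which orthogonal polynomial moments inherit this same transformation law.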
Wavelet Analysis on the Sphere
The goal of this monograph is to develop the theory of wavelet harmonic analysis on the sphere. Starting from orthogonal polynomials and functional Hilbert spaces on the sphere, it lays the foundations for the study of spherical harmonics such as zonal functions. The book also discusses the construction of wavelet bases using special functions, especially Bessel, Hermite, Tchebychev, and Gegenbauer polynomials.
Learning Theory and Approximation
Learning theory studies structure in sampled data and aims at understanding the unknown functional relations behind it. This leads to interesting theoretical problems that can often be attacked with methods from Approximation Theory. This workshop, the second of its type at the MFO, concentrated on the following recent topics: learning of manifolds and the geometry of data; sparsity and dimension reduction; error analysis and algorithmic aspects, including kernel-based methods for regression and classification; and the application of multiscale techniques and refinement algorithms to learning.
- …