
    Construction of Hilbert Transform Pairs of Wavelet Bases and Gabor-like Transforms

    We propose a novel method for constructing Hilbert transform (HT) pairs of wavelet bases based on a fundamental approximation-theoretic characterization of scaling functions--the B-spline factorization theorem. In particular, starting from well-localized scaling functions, we construct HT pairs of biorthogonal wavelet bases of L^2(R) by relating the corresponding wavelet filters via a discrete form of the continuous HT filter. As a concrete application of this methodology, we identify HT pairs of spline wavelets of a specific flavor, which are then combined to realize a family of complex wavelets that resemble the optimally-localized Gabor function for sufficiently large orders. Analytic wavelets, derived from the complexification of HT wavelet pairs, exhibit a one-sided spectrum. Based on the tensor-product of such analytic wavelets, and, in effect, by appropriately combining four separable biorthogonal wavelet bases of L^2(R^2), we then discuss a methodology for constructing 2D direction-selective complex wavelets. In particular, analogous to the HT correspondence between the components of the 1D counterpart, we relate the real and imaginary components of these complex wavelets using a multi-dimensional extension of the HT--the directional HT. Next, we construct a family of complex spline wavelets that resemble the directional Gabor functions proposed by Daugman. Finally, we present an efficient FFT-based filterbank algorithm for implementing the associated complex wavelet transform. Comment: 36 pages, 8 figures
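    To make the one-sided-spectrum property concrete, the sketch below numerically pairs a real wavelet with its Hilbert transform and checks that the resulting complex (analytic) wavelet has negligible negative-frequency content. It is only an illustration using scipy.signal.hilbert with a Gaussian-windowed cosine as a stand-in wavelet, not the spline construction or the FFT-based filterbank described in the abstract.

```python
# Minimal sketch: complexify a real wavelet via its Hilbert transform and
# verify the one-sided spectrum. The window below is a placeholder, not the
# paper's spline wavelet.
import numpy as np
from scipy.signal import hilbert

t = np.linspace(-8, 8, 1024)
psi_real = np.exp(-t**2 / 2) * np.cos(5 * t)   # placeholder real wavelet

psi_analytic = hilbert(psi_real)               # psi_real + i * HT(psi_real)
spectrum = np.fft.fft(psi_analytic)
freqs = np.fft.fftfreq(t.size, d=t[1] - t[0])

# Energy on the negative-frequency axis should be (numerically) negligible,
# which is the defining property of an analytic wavelet.
neg_energy = np.sum(np.abs(spectrum[freqs < 0])**2)
tot_energy = np.sum(np.abs(spectrum)**2)
print(f"negative-frequency energy fraction: {neg_energy / tot_energy:.2e}")
```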

    The Kontorovich-Lebedev transform as a map between d-orthogonal polynomials

    A slight modification of the Kontorovich-Lebedev transform is an automorphism on the vector space of polynomials. The action of this KL_α-transform on certain polynomial sequences is discussed, with special attention given to the d-orthogonal ones. For instance, the Continuous Dual Hahn polynomials appear as the KL_α-transform of a 2-orthogonal sequence of Laguerre type. Finally, all the orthogonal polynomial sequences whose KL_α-transform is a d-orthogonal sequence are characterized: they are essentially semiclassical polynomials fulfilling particular conditions, and d is even. The Hermite and Laguerre polynomials are the classical solutions to this problem. Comment: 27 pages
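    For background, the block below recalls the classical Kontorovich-Lebedev transform and the standard definition of d-orthogonality (Van Iseghem/Maroni). The exact KL_α modification used by the authors is not stated in the abstract, so the normalization shown (with the dx/x measure) is only one common convention.

```latex
% Classical Kontorovich-Lebedev transform, with K_{i\tau} the Macdonald
% (modified Bessel) function; other normalizations exist.
\[
  (KLf)(\tau) \;=\; \int_0^{\infty} K_{i\tau}(x)\, f(x)\, \frac{dx}{x},
  \qquad \tau > 0 .
\]
% d-orthogonality: a polynomial sequence \{P_n\}_{n\ge 0} is d-orthogonal with
% respect to the vector of linear functionals (u_0,\dots,u_{d-1}) if
\[
  \langle u_k, P_m P_n \rangle = 0 \ \ \text{for } m > nd + k,
  \qquad
  \langle u_k, P_n P_{nd+k} \rangle \neq 0 \ \ \text{for } n \ge 0,
  \qquad 0 \le k \le d-1 .
\]
% The case d = 1 recovers ordinary orthogonality; the 2-orthogonal sequence of
% Laguerre type mentioned in the abstract corresponds to d = 2.
```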

    Gibbs Sampling, Exponential Families and Orthogonal Polynomials

    We give families of examples where sharp rates of convergence to stationarity of the widely used Gibbs sampler are available. The examples involve standard exponential families and their conjugate priors. In each case, the transition operator is explicitly diagonalizable with classical orthogonal polynomials as eigenfunctions. Comment: This paper is commented on in [arXiv:0808.3855], [arXiv:0808.3856], [arXiv:0808.3859], [arXiv:0808.3861]; rejoinder in [arXiv:0808.3864]. Published at http://dx.doi.org/10.1214/07-STS252 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)
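    The kind of two-component chain analyzed in the paper can be sketched for one conjugate pair, a Binomial likelihood with a Beta prior. The snippet below is only an illustrative instance with arbitrary hyperparameters; it does not reproduce the paper's examples or its convergence-rate analysis.

```python
# Minimal sketch of a two-component Gibbs sampler on a conjugate
# exponential-family pair (Binomial likelihood, Beta prior).
import numpy as np

rng = np.random.default_rng(0)
n, alpha, beta = 20, 2.0, 3.0          # illustrative hyperparameters

def gibbs(num_iters, x0=0):
    x, samples = x0, []
    for _ in range(num_iters):
        theta = rng.beta(alpha + x, beta + n - x)   # theta | x ~ Beta(alpha+x, beta+n-x)
        x = rng.binomial(n, theta)                  # x | theta ~ Binomial(n, theta)
        samples.append((x, theta))
    return samples

draws = gibbs(5000)
# Marginally, x converges to the Beta-Binomial(n, alpha, beta) distribution.
print("empirical mean of x:", np.mean([x for x, _ in draws]),
      "  theoretical mean:", n * alpha / (alpha + beta))
```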

    Robust face recognition using convolutional neural networks combined with Krawtchouk moments

    Face recognition is a challenging task due to pose variations, occlusion, and the variety of facial expressions performed by distinct subjects. Many features have been proposed; however, each has its own drawbacks. In this paper, we therefore propose a robust model called Krawtchouk moments convolutional neural networks (KMCNN) for face recognition. Our model consists of two main steps. First, we use 2D discrete orthogonal Krawtchouk moments to represent features. Then, we feed them into a convolutional neural network (CNN) for classification. The main goal of the proposed approach is to improve the classification accuracy of noisy grayscale face images. Indeed, Krawtchouk moments are less sensitive to noise, and they can extract pertinent features from an image using only low orders. To investigate the robustness of the proposed approach, two types of noise (salt-and-pepper and speckle) are added to three datasets (Extended Yale B, the ORL Database of Faces, and a subset of Labeled Faces in the Wild (LFW)). Experimental results show that KMCNN is flexible and performs significantly better than a plain CNN, or than CNNs combined with other discrete moments such as Tchebichef, Hahn, or Racah moments, at most noise densities.
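    The feature-extraction step of KMCNN can be sketched in plain NumPy. The snippet below computes low-order 2D weighted Krawtchouk moments of an image via the standard three-term recurrence; the paper's exact orders, normalization, and the CNN classifier that follows are not reproduced, and the recurrence-based evaluation is just one common way to compute the polynomials.

```python
# Hedged sketch of 2D weighted Krawtchouk moments (the feature step of KMCNN),
# using the standard recurrence and normalization from the Krawtchouk-moment
# literature; not the authors' implementation.
import numpy as np
from scipy.special import comb

def weighted_krawtchouk(N, p=0.5, max_order=None):
    """Rows: orthonormal weighted Krawtchouk polynomials Kbar[n, x], x = 0..N."""
    max_order = N if max_order is None else max_order
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((max_order + 1, N + 1))
    K[0] = 1.0
    if max_order >= 1:
        K[1] = 1.0 - x / (p * N)
    for n in range(1, max_order):
        # p(N-n) K_{n+1} = [p(N-n) + n(1-p) - x] K_n - n(1-p) K_{n-1}
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    w = comb(N, x) * p**x * (1 - p)**(N - x)          # binomial weight w(x)
    n = np.arange(max_order + 1, dtype=float)
    rho = ((1 - p) / p)**n / comb(N, n)               # squared norms rho(n)
    return K * np.sqrt(w[None, :] / rho[:, None])     # orthonormal rows

def krawtchouk_moments(img, order=16, p=0.5):
    """Low-order 2D Krawtchouk moment matrix Q[m, n] of a grayscale image."""
    H, W = img.shape
    Kx = weighted_krawtchouk(H - 1, p, order)
    Ky = weighted_krawtchouk(W - 1, p, order)
    return Kx @ img @ Ky.T                            # Q = Kx f Ky^T

img = np.random.rand(64, 64)        # stand-in for a (possibly noisy) face image
Q = krawtchouk_moments(img, order=16)
print(Q.shape)                      # (17, 17) feature block to feed a classifier
```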