
    Schroedinger Eigenmaps for Manifold Alignment of Multimodal Hyperspectral Images

    Multimodal remote sensing is an emerging field, as it allows for many views of the same region of interest. Domain adaptation attempts to fuse these multimodal remotely sensed images by using transfer learning to combine data from different sources into a fused outcome. Semisupervised Manifold Alignment (SSMA) maps multiple hyperspectral images (HSIs) from high-dimensional source spaces to a low-dimensional latent space where similar elements reside close together. SSMA preserves the original geometric structure of the respective HSIs whilst pulling similar data points together and pushing dissimilar data points apart. The SSMA algorithm comprises a geometric component, a similarity component, and a dissimilarity component. The geometric component of the SSMA method has roots in the original Laplacian Eigenmaps (LE) dimension reduction algorithm, and the projection functions have roots in the original Locality Preserving Projections (LPP) dimensionality reduction framework. The similarity and dissimilarity components are semisupervised components that allow expert-labeled information to improve the image fusion process. Spatial-Spectral Schroedinger Eigenmaps (SSSE) was designed as a semisupervised enhancement of the LE algorithm that augments the Laplacian matrix with a user-defined potential function. However, this enhancement has yet to be explored in the LPP framework. The first part of this thesis proposes using the Spatial-Spectral potential within the LPP algorithm, creating a new algorithm we call Schroedinger Eigenmap Projections (SEP). Through experiments on publicly available data with expert-labeled ground truth, we compare the performance of the SEP algorithm with that of the LPP algorithm. The second part of this thesis proposes incorporating the Spatial-Spectral potential from SSSE into the SSMA framework. Using two multi-angled HSIs, we explore the impact of incorporating this potential into SSMA.
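    To make the geometric component concrete, below is a minimal sketch of a Schroedinger Eigenmaps-style embedding: the graph Laplacian used in Laplacian Eigenmaps is augmented with a diagonal potential matrix before solving the generalized eigenvalue problem. The heat-kernel affinity, the neighborhood size k, the kernel width sigma, the weight alpha, and the generic diagonal `potential` argument are illustrative assumptions; the thesis uses the specific spatial-spectral potential of SSSE, not this placeholder, and this is not the author's implementation.

```python
# Sketch of a Schroedinger Eigenmaps-style embedding (assumed, generic potential).
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def schroedinger_eigenmaps(X, potential, n_components=2, k=10, sigma=1.0, alpha=10.0):
    """X: (n_samples, n_features) pixel spectra; potential: (n_samples,) nonnegative barrier values."""
    # k-nearest-neighbor graph with heat-kernel weights (the usual LE affinity)
    W = kneighbors_graph(X, k, mode='distance', include_self=False).toarray()
    W[W > 0] = np.exp(-W[W > 0] ** 2 / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                 # symmetrize the affinity matrix
    D = np.diag(W.sum(axis=1))             # degree matrix
    L = D - W                              # unnormalized graph Laplacian
    V = np.diag(potential)                 # diagonal Schroedinger potential term
    # Generalized eigenproblem (L + alpha V) y = lambda D y; drop the trivial eigenvector
    _, vecs = eigh(L + alpha * V, D)
    return vecs[:, 1:n_components + 1]

# Toy usage on synthetic data (not HSI): barrier potential on the first 20 samples
rng = np.random.default_rng(0)
X_toy = rng.standard_normal((200, 5))
pot = np.zeros(200)
pot[:20] = 1.0
embedding = schroedinger_eigenmaps(X_toy, pot, sigma=2.0)   # (200, 2) latent coordinates
```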

    Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality

    The aim of this paper is to derive convergence results for projected line-search methods on the real-algebraic variety $\mathcal{M}_{\le k}$ of real $m \times n$ matrices of rank at most $k$. Such methods extend Riemannian optimization methods, which are successfully used on the smooth manifold $\mathcal{M}_k$ of rank-$k$ matrices, to its closure by taking steps along gradient-related directions in the tangent cone, and afterwards projecting back to $\mathcal{M}_{\le k}$. Considering such a method circumvents the difficulties which arise from the nonclosedness and the unbounded curvature of $\mathcal{M}_k$. The pointwise convergence is obtained for real-analytic functions on the basis of a Łojasiewicz inequality for the projection of the antigradient to the tangent cone. If the derived limit point lies on the smooth part of $\mathcal{M}_{\le k}$, i.e. in $\mathcal{M}_k$, this boils down to more or less known results, but with the benefit that asymptotic convergence rate estimates (for specific step-sizes) can be obtained without an a priori curvature bound, simply from the fact that the limit lies on a smooth manifold. At the same time, one can give a convincing justification for assuming critical points to lie in $\mathcal{M}_k$: if $X$ is a critical point of $f$ on $\mathcal{M}_{\le k}$, then either $X$ has rank $k$, or $\nabla f(X) = 0$.
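    As an illustration of the kind of method the abstract describes, the following is a simplified sketch of a projected line-search on $\mathcal{M}_{\le k}$: step along the negative Euclidean gradient, project back onto $\mathcal{M}_{\le k}$ with a truncated SVD, and backtrack with an Armijo rule. This is an assumed, cruder variant; the paper works with the projection of the antigradient onto the tangent cone of $\mathcal{M}_{\le k}$, and the constants, stopping rule, and toy objective below are illustrative choices only.

```python
# Sketch (assumed simplification) of a projected line-search on rank-<=k matrices.
import numpy as np

def project_rank_k(Y, k):
    """Metric projection onto M_{<=k}: best rank-<=k approximation via truncated SVD."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

def projected_line_search(f, grad_f, X0, k, beta=0.5, c=1e-4, max_iter=500, tol=1e-8):
    """Minimize f over M_{<=k}: step along -grad f, project back, Armijo backtracking."""
    X = project_rank_k(X0, k)
    for _ in range(max_iter):
        G = grad_f(X)
        if np.linalg.norm(G) < tol:          # crude (Euclidean) stationarity check for the sketch
            break
        alpha, fX = 1.0, f(X)
        X_new = project_rank_k(X - alpha * G, k)
        while f(X_new) > fX - c * alpha * np.linalg.norm(G) ** 2:
            alpha *= beta                    # Armijo backtracking on the projected step
            if alpha < 1e-12:                # no sufficient decrease found: stop
                return X
            X_new = project_rank_k(X - alpha * G, k)
        X = X_new
    return X

# Toy usage: rank-5 approximation of A, i.e. f(X) = 0.5 ||X - A||_F^2, grad f(X) = X - A
A = np.random.default_rng(1).standard_normal((30, 20))
X_hat = projected_line_search(f=lambda X: 0.5 * np.linalg.norm(X - A) ** 2,
                              grad_f=lambda X: X - A,
                              X0=np.zeros_like(A), k=5)
```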