202 research outputs found

    Sketch-based subspace clustering of hyperspectral images

    Sparse subspace clustering (SSC) techniques provide the state of the art in clustering of hyperspectral images (HSIs). However, their computational complexity hinders their applicability to large-scale HSIs. In this paper, we propose a large-scale SSC-based method that can effectively process large HSIs while also achieving improved clustering accuracy compared to current SSC methods. We build our approach on the emerging concept of sketched subspace clustering, which, to our knowledge, has not yet been explored in hyperspectral imaging. Moreover, results on large-scale SSC approaches for HSI in general are scarce. We show that a direct application of sketched SSC does not perform satisfactorily on HSIs, but it provides an excellent basis for an effective and elegant method, which we build by extending this approach with a spatial prior and deriving the corresponding solver. In particular, a random matrix constructed by the Johnson-Lindenstrauss transform is first used to sketch the self-representation dictionary into a compact dictionary, which significantly reduces the number of sparse coefficients to be solved and thereby the overall complexity. To alleviate the effect of noise and within-class spectral variations in HSIs, we employ a total variation constraint on the coefficient matrix, which accounts for the spatial dependencies among neighbouring pixels. We derive an efficient solver for the resulting optimization problem and theoretically prove its convergence under mild conditions. Experimental results on real HSIs show a notable improvement over traditional SSC-based methods and the state-of-the-art methods for clustering of large-scale images.
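    The dictionary-sketching step described above can be illustrated with a minimal numpy sketch. The Gaussian JL matrix, the toy dimensions, and the ridge surrogate used in place of the paper's l1 solver are all assumptions for illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy HSI data matrix: d spectral bands x N pixels (columns are pixels).
# All sizes here are illustrative, not from the paper.
d, N, k = 50, 1000, 40          # sketch size k << N
X = rng.standard_normal((d, N))

# Johnson-Lindenstrauss random matrix: scaled Gaussian entries.
R = rng.standard_normal((N, k)) / np.sqrt(k)

# Sketched self-representation dictionary: d x k instead of d x N,
# so each pixel is coded against k atoms instead of N and the
# coefficient matrix shrinks from N x N to k x N.
D = X @ R

# Ridge-regularized least squares stands in for the l1-minimization
# step of SSC, purely to keep the illustration short.
lam = 1e-2
B = np.linalg.solve(D.T @ D + lam * np.eye(k), D.T @ X)   # k x N coefficients
```

    The complexity saving comes entirely from the shape change of the coefficient matrix: k x N unknowns instead of N x N.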

    Landmark-based large-scale sparse subspace clustering method for hyperspectral images

    Sparse subspace clustering (SSC) achieves state-of-the-art performance in the clustering of hyperspectral images (HSIs). However, its high computational complexity and sensitivity to noise limit its clustering performance. In this paper, we propose a scalable SSC method for large-scale HSIs, which significantly accelerates the clustering speed of SSC without sacrificing clustering accuracy. A small landmark dictionary is first generated by applying k-means to the original data, which significantly reduces the number of optimization variables in the sparse coefficient matrix. In addition, we incorporate spatial regularization based on total variation (TV), which strongly improves robustness to noise. A landmark-based spectral clustering method is applied to the obtained sparse matrix, which further improves the clustering speed. Experimental results on two real HSIs demonstrate the effectiveness of the proposed method and its superior performance compared to both traditional SSC-based methods and related large-scale clustering methods.
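    The landmark-dictionary construction can be sketched in a few lines of numpy; the toy sizes, the number of Lloyd iterations, and the plain Lloyd-style k-means below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy pixel spectra: N pixels with d bands each (rows are pixels).
N, d, m = 600, 30, 20           # m landmarks, m << N
X = rng.standard_normal((N, d))

# Plain k-means (a few Lloyd iterations) picks m landmark spectra.
centers = X[rng.choice(N, size=m, replace=False)]
for _ in range(10):
    # Assign each pixel to its nearest landmark.
    dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    labels = dist.argmin(1)
    # Update each landmark as its cluster mean (skip empty clusters).
    for j in range(m):
        if (labels == j).any():
            centers[j] = X[labels == j].mean(0)

# The landmark dictionary has m atoms, so the sparse coefficient
# matrix becomes m x N instead of N x N.
D = centers.T                    # d x m dictionary
```

    As with sketching, the speed-up follows from the coefficient matrix shrinking to m x N, but here the compact dictionary is data-adaptive rather than random.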

    Sketched sparse subspace clustering for large-scale hyperspectral images

    Sparse subspace clustering (SSC) achieves state-of-the-art performance in clustering of hyperspectral images (HSIs). However, the computational complexity of SSC-based methods is prohibitive for large-scale problems. We propose a large-scale SSC-based method that efficiently processes large-scale HSIs without sacrificing clustering accuracy. The proposed approach incorporates sketching of the self-representation dictionary, thereby largely reducing the number of optimization variables. In addition, we employ a total variation (TV) regularization of the sparse matrix, resulting in a robust sparse representation. We derive a solver for the resulting optimization problem based on the alternating direction method of multipliers (ADMM). Experimental results on real data show improvements over traditional SSC-based methods in terms of accuracy and running time.
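    The ADMM solver structure is illustrated below on the simpler l1-regularized least-squares (lasso) subproblem for a single pixel's code; the paper's TV term would add one more splitting variable but follows the same update pattern. Problem sizes and parameters are illustrative assumptions:

```python
import numpy as np

def admm_lasso(D, x, lam=0.1, rho=1.0, iters=100):
    """ADMM for min_b 0.5*||D b - x||^2 + lam*||b||_1 (one pixel's code)."""
    k = D.shape[1]
    DtD, Dtx = D.T @ D, D.T @ x
    L = np.linalg.cholesky(DtD + rho * np.eye(k))   # factor once, reuse
    b, z, u = np.zeros(k), np.zeros(k), np.zeros(k)
    for _ in range(iters):
        # b-update: quadratic subproblem, solved via the cached factor.
        b = np.linalg.solve(L.T, np.linalg.solve(L, Dtx + rho * (z - u)))
        # z-update: soft-thresholding enforces sparsity.
        v = b + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Dual update on the scaled multiplier.
        u = u + b - z
    return z

# Synthetic sparse coding problem with two active atoms.
rng = np.random.default_rng(2)
D = rng.standard_normal((40, 15))
b_true = np.zeros(15)
b_true[2], b_true[7] = 1.5, -2.0
x = D @ b_true + 0.01 * rng.standard_normal(40)
b_hat = admm_lasso(D, x)
```

    Factoring the quadratic system once and reusing it across iterations is what keeps each ADMM sweep cheap.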

    RMSC: Robust Modeling of Subspace Clustering for high dimensional data

    Subspace clustering is an active research problem associated with high-dimensional data. Here, some standard techniques are reviewed to investigate existing methodologies. Although various research techniques have evolved recently, they do not completely mitigate the problems of noise susceptibility and clustering-accuracy optimization. Hence, a novel technique called Robust Modeling of Subspace Clustering (RMSC) is presented to solve the above problems. An analytical research methodology is used to formulate two algorithms: one for computing outliers and one for extracting the elite subspace from high-dimensional data afflicted by different forms of noise. RMSC was found to offer higher accuracy and a lower error rate, both in the presence and absence of noise, over high-dimensional data. © 2017 IEEE

    Simplified Energy Landscape for Modularity Using Total Variation

    Networks capture pairwise interactions between entities and are frequently used in applications such as social networks, food networks, and protein interaction networks, to name a few. Communities, cohesive groups of nodes, often form in these applications, and identifying them gives insight into the overall organization of the network. One common quality function used to identify community structure is modularity. In Hu et al. [SIAM J. App. Math., 73(6), 2013], it was shown that modularity optimization is equivalent to minimizing a particular nonconvex total variation (TV) based functional over a discrete domain. They solve this problem, assuming the number of communities is known, using a Merriman-Bence-Osher (MBO) scheme. We show that modularity optimization is equivalent to minimizing a convex TV-based functional over a discrete domain, again assuming the number of communities is known. Furthermore, we show that modularity has no convex relaxation satisfying certain natural conditions. We therefore find a manageable non-convex approximation using a Ginzburg-Landau functional, which provably converges to the correct energy in the limit of a certain parameter. We then derive an MBO algorithm with fewer hand-tuned parameters than in Hu et al., which is 7 times faster at solving the associated diffusion equation because the underlying discretization is unconditionally stable. Our numerical tests include a hyperspectral video whose associated graph has 2.9x10^7 edges, roughly 37 times larger than was handled in the paper of Hu et al.
    Comment: 25 pages, 3 figures, 3 tables, submitted to SIAM J. App. Math.
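    The MBO threshold-dynamics idea underlying the scheme can be demonstrated on a deterministic toy graph; this is a generic graph MBO sketch with assumed parameters, not the paper's modularity-specific algorithm or discretization:

```python
import numpy as np

# Deterministic toy graph: two 10-cliques joined by a single edge,
# i.e. an obvious 2-community structure.
n = 20
A = np.zeros((n, n))
A[:10, :10] = 1.0
A[10:, 10:] = 1.0
np.fill_diagonal(A, 0.0)
A[0, 10] = A[10, 0] = 1.0

L = np.diag(A.sum(1)) - A                  # graph Laplacian

# MBO scheme: alternate a short diffusion step with pointwise
# thresholding of the community indicator.
u = np.array([1.0] * 10 + [0.0] * 10)
u[1], u[11] = 0.0, 1.0                     # corrupt two labels
dt, steps = 1.0, 10
P = np.linalg.inv(np.eye(n) + dt * L)      # one implicit Euler step
for _ in range(steps):
    u = P @ u                               # diffuse along graph edges
    u = (u > 0.5).astype(float)             # threshold back to {0, 1}
```

    Diffusion pulls each node toward the average of its neighbours, and thresholding snaps the result back to a hard partition, so the two corrupted labels are corrected by their cliques.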