
    Hierarchical Metric Learning for Optical Remote Sensing Scene Categorization

    We address the problem of scene classification from optical remote sensing (RS) images based on the paradigm of hierarchical metric learning. Ideally, supervised metric learning strategies learn a projection from a set of training data points to the class label space so as to minimize intra-class variance while maximizing inter-class separability. However, standard metric learning techniques do not incorporate class interaction information when learning the transformation matrix, which is often considered a bottleneck when dealing with fine-grained visual categories. As a remedy, we propose to organize the classes in a hierarchical fashion by exploring their visual similarities, and subsequently learn separate distance metric transformations for the classes present at the non-leaf nodes of the tree. We employ an iterative max-margin clustering strategy to obtain the hierarchical organization of the classes. Experimental results obtained on the large-scale NWPU-RESISC45 and the popular UC-Merced datasets demonstrate the efficacy of the proposed hierarchical metric learning based RS scene recognition strategy in comparison to the standard approaches.
    Comment: Undergoing revision in GRS
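    A minimal two-level sketch of the route-then-classify idea is given below. The paper builds the class tree with iterative max-margin clustering and learns a distance metric per non-leaf node; in this sketch, KMeans over class means stands in for the tree construction, LDA for the per-node supervised projection, and k-NN for classification in the projected space, purely for illustration.

```python
# Two-level hierarchical classification sketch (stand-ins, not the paper's method):
# KMeans groups visually similar classes, LDA plays the role of the per-node
# metric/projection, and k-NN classifies within each group.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def build_hierarchy(X, y, n_groups=2):
    """Group classes by the similarity of their mean feature vectors."""
    classes = np.unique(y)
    class_means = np.vstack([X[y == c].mean(axis=0) for c in classes])
    groups = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit_predict(class_means)
    return dict(zip(classes, groups))

def fit_tree(X, y, class_to_group):
    """Root model routes a sample to a group of visually similar classes;
    each group node keeps its own projection and nearest-neighbour classifier.
    (Each group is assumed to contain at least two classes.)"""
    g = np.array([class_to_group[c] for c in y])
    root = LinearDiscriminantAnalysis().fit(X, g)
    nodes = {}
    for grp in np.unique(g):
        m = g == grp
        proj = LinearDiscriminantAnalysis().fit(X[m], y[m])
        nodes[grp] = (proj, KNeighborsClassifier(n_neighbors=5).fit(proj.transform(X[m]), y[m]))
    return root, nodes

def predict(x, root, nodes):
    grp = root.predict(x.reshape(1, -1))[0]
    proj, knn = nodes[grp]
    return knn.predict(proj.transform(x.reshape(1, -1)))[0]
```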

    Locality and Structure Regularized Low Rank Representation for Hyperspectral Image Classification

    Hyperspectral image (HSI) classification, which aims to assign an accurate label to hyperspectral pixels, has drawn great interest in recent years. Although low rank representation (LRR) has been used to classify HSI, its ability to segment each class from the whole HSI data has not yet been fully exploited. LRR has a good capacity to capture the underlying low-dimensional subspaces embedded in the original data. However, there are still two drawbacks to LRR. First, LRR does not consider the local geometric structure within the data, so the local correlation among neighboring data points is easily ignored. Second, the representation obtained by solving LRR is not discriminative enough to separate different data. In this paper, a novel locality and structure regularized low rank representation (LSLRR) model is proposed for HSI classification. To overcome the above limitations, we present a locality constraint criterion (LCC) and a structure preserving strategy (SPS) to improve the classical LRR. Specifically, we introduce a new distance metric, which combines both spatial and spectral features, to explore the local similarity of pixels. Thus, the global and local structures of HSI data can be exploited sufficiently. Besides, we propose a structure constraint to make the representation have a near block-diagonal structure, which helps to determine the final classification labels directly. Extensive experiments conducted on three popular HSI datasets demonstrate that the proposed LSLRR outperforms other state-of-the-art methods.
    Comment: 14 pages, 7 figures, TGRS201
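    For context, the classical LRR problem that LSLRR builds upon can be written as below. This is the standard formulation (X the data matrix, Z the representation, E the error term), not notation taken from the paper; the abstract's locality constraint (LCC) and block-diagonal structure term (SPS) are added on top of it.

```latex
% Classical low rank representation (LRR), the baseline that LSLRR augments
% with a locality constraint and a block-diagonal structure term:
%   X - data matrix (pixels as columns), Z - representation, E - error,
%   \lambda - trade-off parameter.
\min_{Z,\,E} \; \|Z\|_{*} + \lambda \|E\|_{2,1}
\quad \text{s.t.} \quad X = XZ + E
```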

    State-Of-The-Art In Image Clustering Based On Affinity Propagation

    Affinity propagation (AP) is an efficient unsupervised clustering technique that exhibits a fast execution speed and finds clusters with a low error rate. The AP algorithm takes as input a similarity matrix consisting of real-valued similarities between data points. The method iteratively exchanges real-valued messages between pairs of data points until a good set of exemplars emerges. The construction of the similarity matrix based on the Euclidean distance is an important step in the AP process. However, the conventional Euclidean distance, which is the summation of pixel-wise intensity differences, performs below average when applied to image clustering, as it suffers from being sensitive to outliers and even to small deformations in images. Approaches from existing studies, particularly in the field of image clustering with various datasets, therefore need to be examined. Accordingly, a suitable image similarity metric should be investigated to match the datasets used in image clustering. In conclusion, changing the similarity matrix will lead to better clustering results
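    As a concrete illustration of this point, the sketch below runs affinity propagation on a precomputed similarity matrix built from the negative Euclidean distance; any image-specific similarity could be dropped in instead, which is the direction the abstract argues for. This is standard scikit-learn usage, not code from the cited work.

```python
# Minimal sketch: affinity propagation on a precomputed similarity matrix.
# The negative Euclidean distance used here is the conventional choice the
# abstract criticizes; replacing S with a more robust image similarity is
# the suggested remedy.
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.metrics import pairwise_distances

def cluster_with_similarity(X, metric="euclidean"):
    S = -pairwise_distances(X, metric=metric)   # real-valued similarities
    ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
    return ap.labels_, ap.cluster_centers_indices_
```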

    Fuzzy Distance Measure Based Affinity Propagation Clustering

    Affinity Propagation (AP) is an effective algorithm that finds exemplars by repeatedly exchanging real-valued messages between pairs of data points. AP uses the similarity between data points to calculate the messages; hence, the construction of the similarity is essential in the AP algorithm. A common choice for the similarity is the negative Euclidean distance. However, due to its simplicity, the Euclidean distance cannot capture the real structure of the data. Furthermore, the Euclidean distance is sensitive to noise and outliers, so the performance of AP might be degraded. Therefore, researchers have sought to utilize different similarity measures to analyse the performance of AP. Nonetheless, there is still room to enhance the performance of AP clustering. A clustering method called fuzzy-based Affinity Propagation (F-AP) is proposed, which is based on a fuzzy similarity measure. Experiments performed on UCI datasets show the efficiency of the proposed F-AP, with results showing a promising improvement over AP
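    The abstract does not spell out the fuzzy similarity used by F-AP, so the sketch below substitutes a Gaussian (fuzzy) membership of the pairwise distance purely as a stand-in; it damps the influence of outliers relative to the raw negative Euclidean similarity fed to standard AP.

```python
# Hypothetical fuzzy-style similarity for AP: a Gaussian membership of the
# pairwise Euclidean distance (NOT the paper's measure, which the abstract
# does not specify).
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.metrics import pairwise_distances

def fuzzy_similarity(X, sigma=None):
    D = pairwise_distances(X)                       # Euclidean distances
    if sigma is None:
        sigma = np.median(D[D > 0])                 # data-driven bandwidth (assumption)
    return np.exp(-(D ** 2) / (2.0 * sigma ** 2))   # memberships in [0, 1]

# Usage on a UCI-style feature matrix X of shape (n_samples, n_features):
# labels = AffinityPropagation(affinity="precomputed",
#                              random_state=0).fit_predict(fuzzy_similarity(X))
```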

    Kernel Feature Extraction for Hyperspectral Image Classification Using Chunklet Constraints

    A novel semi-supervised kernel feature extraction algorithm that combines an efficient metric learning method, i.e. relevant component analysis (RCA), with the kernel trick is presented for hyperspectral imagery land-cover classification. This method obtains a projection of the input data by learning an optimal nonlinear transformation via a chunklet constraints-based FDA criterion, and is called chunklet-based kernel relevant component analysis (CKRCA). The proposed method is appealing as it constructs the kernel very intuitively for the RCA method and does not require any labeled information. The effectiveness of the proposed CKRCA is successfully illustrated in hyperspectral remote sensing image classification. Experimental results demonstrate that the proposed method can greatly improve the classification accuracy compared with traditional linear and conventional kernel-based methods
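    For orientation, the sketch below shows the linear RCA step that CKRCA kernelizes: whitening the data by the inverse square root of the average within-chunklet covariance. Only the standard RCA formulation is reproduced; the paper's chunklet-based kernel construction is not.

```python
# Linear relevant component analysis (RCA) from chunklet constraints.
# CKRCA applies this idea in a kernel-induced feature space; that extension
# is omitted here.
import numpy as np
from scipy.linalg import fractional_matrix_power

def rca_transform(X, chunklets):
    """X: (n, d) feature matrix; chunklets: list of index arrays, each a small
    group of points known to share a (possibly unknown) label."""
    d = X.shape[1]
    C = np.zeros((d, d))
    n = 0
    for idx in chunklets:
        Xc = X[idx] - X[idx].mean(axis=0)         # center each chunklet
        C += Xc.T @ Xc
        n += len(idx)
    C /= n                                         # average within-chunklet covariance
    W = fractional_matrix_power(C + 1e-6 * np.eye(d), -0.5)  # whitening matrix
    return X @ W.real
```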

    Optimized kernel minimum noise fraction transformation for hyperspectral image classification

    This paper presents an optimized kernel minimum noise fraction transformation (OKMNF) for feature extraction of hyperspectral imagery. The proposed approach is based on the kernel minimum noise fraction (KMNF) transformation, which is a nonlinear dimensionality reduction method. KMNF can map the original data into a higher-dimensional feature space and provide a small number of quality features for classification and other post-processing. Noise estimation is an important component of KMNF. The noise is often estimated based on a strong relationship between adjacent pixels. However, hyperspectral images have limited spatial resolution and usually contain a large number of mixed pixels, which makes the spatial information less reliable for noise estimation. This is the main reason that KMNF generally shows unstable performance in feature extraction for classification. To overcome this problem, this paper improves KMNF through more accurate noise estimation: we propose two new methods to estimate the noise more accurately, as well as a framework to improve noise estimation in which both spectral and spatial de-correlation are exploited. Experimental results, conducted using a variety of hyperspectral images, indicate that the proposed OKMNF is superior to other related dimensionality reduction methods in most cases. Compared to the conventional KMNF, the proposed OKMNF achieves significant improvements in overall classification accuracy
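    As background, the sketch below implements a plain linear MNF transform with the classical shift-difference noise estimate (adjacent-pixel differences), i.e. the very step whose unreliability on mixed pixels motivates OKMNF. The kernelization and the paper's improved noise estimators are not reproduced.

```python
# Linear minimum noise fraction (MNF) transform with a shift-difference noise
# estimate. KMNF/OKMNF kernelize this and, per the paper, estimate the noise
# more accurately; neither refinement is shown here.
import numpy as np
from scipy.linalg import eigh

def mnf(cube):
    """cube: hyperspectral image of shape (rows, cols, bands)."""
    r, c, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    X -= X.mean(axis=0)
    # Noise estimated from horizontal neighbour differences: assumes the signal
    # is spatially correlated, which breaks down for mixed pixels (the issue
    # the abstract highlights).
    N = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, b).astype(float) / np.sqrt(2.0)
    Sn = np.cov(N, rowvar=False)
    Sx = np.cov(X, rowvar=False)
    # Generalized eigenproblem Sn v = lambda Sx v; small lambda = high SNR,
    # so the leading projected bands carry the cleanest signal.
    vals, vecs = eigh(Sn, Sx + 1e-9 * np.eye(b))
    return (X @ vecs).reshape(r, c, b), vals
```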