
    A novel coarse-to-fine remote sensing image retrieval system in JPEG-2000 compressed domain

    Copyright 2018 Society of Photo‑Optical Instrumentation Engineers (SPIE). This paper presents a novel content-based image retrieval (CBIR) system that achieves coarse-to-fine remote sensing (RS) image description and retrieval in the JPEG 2000 compressed domain. The proposed system initially: i) decodes the code-streams associated with the coarsest (i.e., the lowest) wavelet resolution, and ii) discards the images most irrelevant to the query, selected according to the similarities estimated between the coarse-resolution features of the query image and those of the archive images. Then, the code-streams associated with the subsequent resolution of the remaining archive images are decoded, and the most irrelevant images are again discarded, this time considering the features of both resolutions. This is achieved by estimating the similarities between the query image and the remaining images while giving higher weights to the features from the finer resolution and lower weights to those from the coarser resolution. To this end, the pyramid match kernel similarity measure is exploited. These steps are iterated until the code-streams associated with the highest wavelet resolution are decoded for only a very small set of images. In this way, the proposed system exploits a multiresolution, hierarchical feature space and accomplishes adaptive RS CBIR with significantly reduced retrieval time. Experimental results obtained on an archive of aerial images confirm the effectiveness of the proposed system in terms of retrieval accuracy and time when compared to standard CBIR systems.
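The coarse-to-fine pruning loop described above can be sketched as follows. This is an illustrative toy, not the paper's implementation: the per-level feature vectors, the histogram-intersection similarity, the doubling level weights, and the keep fraction are all assumptions made here for demonstration.

```python
def level_similarity(f1, f2):
    """Histogram intersection between two same-length feature vectors."""
    return sum(min(a, b) for a, b in zip(f1, f2))

def coarse_to_fine_retrieval(query, archive, keep_frac=0.5):
    """Prune an archive level by level, coarse to fine.

    `query` is a list of per-level feature vectors (coarsest first);
    `archive` maps image id -> the same structure. At each level the
    surviving images are scored with pyramid-match-style weights (finer
    levels count more; here w_l = 2**l) and only the top fraction is kept,
    so fine-resolution features are only ever computed for a small set.
    """
    levels = len(query)
    candidates = list(archive)
    scores = {img: 0.0 for img in candidates}
    for l in range(levels):
        w = 2.0 ** l  # finer resolution -> higher weight
        for img in candidates:
            scores[img] += w * level_similarity(query[l], archive[img][l])
        candidates.sort(key=lambda img: scores[img], reverse=True)
        candidates = candidates[:max(1, int(len(candidates) * keep_frac))]
    return candidates
```

With `keep_frac=0.5` each level roughly halves the candidate set, which is the source of the reduced retrieval time the abstract reports.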

    Wavelet based similarity measurement algorithm for seafloor morphology

    Thesis (S.M. in Naval Architecture and Marine Engineering and S.M. in Mechanical Engineering)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2006. Includes bibliographical references (leaves 71-73). The recent expansion of systematic seafloor exploration programs, such as geophysical research, seafloor mapping, search and survey, resource assessment, and other scientific, commercial, and military applications, has created a need for rapid and robust methods of processing seafloor imagery. Given the existence of a large library of seafloor images, a fast automated image classifier algorithm is needed to determine changes in seabed morphology over time. The focus of this work is the development of a robust Similarity Measurement (SM) algorithm to address this problem. Our work uses a side-scan sonar image library for experimentation and testing. Variations in an underwater vehicle's height above the seafloor and in its pitch and roll angles distort the acquired data, so transformations to align the data must include rotation, translation, anisotropic scaling, and skew. To deal with these problems, we propose to use the wavelet transform for similarity detection. Wavelets have been widely used in image processing over the last three decades. Since the wavelet transform provides a multiresolution decomposition, the similarities between two images are easier to identify by examining the energy distribution at each decomposition level. The energy distribution in the frequency domain at the output of the high-pass and low-pass filter banks characterizes the texture. Our approach uses a statistical framework in which the wavelet coefficients are fitted to a generalized Gaussian density distribution. The Kullback-Leibler entropy metric is then used to measure the distance between wavelet-coefficient distributions. To select the top N most likely matching images, the database images are ranked by minimum Kullback-Leibler distance. The statistical approach is effective in eliminating rotation, mis-registration, and skew problems by working in the wavelet domain. It is recommended that further work focus on choosing the best wavelet packet to increase the robustness of the algorithm developed in this thesis. by Ilkay Darilmaz. S.M. in Naval Architecture and Marine Engineering and S.M. in Mechanical Engineering.
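The generalized-Gaussian/Kullback-Leibler machinery this thesis builds on has a convenient closed form in the univariate case (Do and Vetterli, 2002). A minimal sketch, assuming the common scale-shape parameterization p(x) = b/(2a·Γ(1/b))·exp(-(|x|/a)^b); the ranking helper and its database layout are our own illustration:

```python
import math

def ggd_kld(a1, b1, a2, b2):
    """Closed-form KL divergence KL(p1 || p2) between two univariate
    generalized Gaussian densities with scales a_i and shapes b_i."""
    g = math.gamma
    return (math.log((b1 * a2 * g(1.0 / b2)) / (b2 * a1 * g(1.0 / b1)))
            + (a1 / a2) ** b2 * g((b2 + 1.0) / b1) / g(1.0 / b1)
            - 1.0 / b1)

def rank_by_kld(query, database, top_n):
    """Rank database entries (id -> (scale, shape)) by minimum KLD to the
    query's fitted (scale, shape) pair, as in the retrieval step above."""
    qa, qb = query
    order = sorted(database, key=lambda k: ggd_kld(qa, qb, *database[k]))
    return order[:top_n]
```

Note that the divergence is zero when the two parameter pairs coincide and grows as the fitted distributions separate, which is what makes the minimum-KLD ranking meaningful.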

    Fast unsupervised multiresolution color image segmentation using adaptive gradient thresholding and progressive region growing

    In this thesis, we propose a fast unsupervised multiresolution color image segmentation algorithm which takes advantage of gradient information in an adaptive and progressive framework. This gradient-based segmentation method is initialized by a vector gradient calculation on the full-resolution input image in the CIE L*a*b* color space. The resulting edge map is used to adaptively generate thresholds for classifying regions of varying gradient densities at different levels of the input image pyramid, obtained through a dyadic wavelet decomposition scheme. At each level, the classification obtained by a progressively thresholded growth procedure is combined with an entropy-based texture model in a statistical merging procedure to obtain an interim segmentation. Using a gradient-quantized confidence map together with non-linear spatial filtering techniques, regions of high confidence are passed from one level to the next until the full-resolution segmentation is achieved. Evaluation of our results on several hundred images using the Normalized Probabilistic Rand (NPR) Index shows that our algorithm outperforms state-of-the-art segmentation techniques and is much more computationally efficient than its single-scale counterpart, with comparable segmentation quality.
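The adaptive gradient-thresholding idea can be illustrated with a small sketch. The forward-difference gradient and the quantile rule below are our assumptions for demonstration; the thesis derives its thresholds from the edge map at each pyramid level:

```python
import math

def gradient_magnitude(img):
    """Forward-difference gradient magnitude of a 2-D grayscale image
    (list of equal-length rows of floats)."""
    h, w = len(img), len(img[0])
    g = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            gx = img[i][j + 1] - img[i][j] if j + 1 < w else 0.0
            gy = img[i + 1][j] - img[i][j] if i + 1 < h else 0.0
            g[i][j] = math.hypot(gx, gy)
    return g

def adaptive_threshold(grad, frac=0.8):
    """Pick a threshold as the `frac` quantile of the gradient values, so a
    flat image yields a low threshold and a busy image a higher one."""
    vals = sorted(v for row in grad for v in row)
    return vals[min(int(frac * len(vals)), len(vals) - 1)]

def low_gradient_mask(img, frac=0.8):
    """Seed mask of smooth pixels: gradient at or below the adaptive cut."""
    g = gradient_magnitude(img)
    t = adaptive_threshold(g, frac)
    return [[v <= t for v in row] for row in g]
```

Pixels marked True are the kind of low-gradient, high-confidence regions that a progressive region-growing scheme would seed from.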

    Geodesics on the manifold of multivariate generalized Gaussian distributions with an application to multicomponent texture discrimination

    We consider the Rao geodesic distance (GD) based on the Fisher information as a similarity measure on the manifold of zero-mean multivariate generalized Gaussian distributions (MGGD). The MGGD is shown to be an adequate model for the heavy-tailed wavelet statistics in multicomponent images, such as color or multispectral images. We discuss the estimation of MGGD parameters using various methods. We apply the GD between MGGDs to color texture discrimination in several classification experiments, taking into account the correlation structure between the spectral bands in the wavelet domain. We compare the performance, both in terms of texture discrimination capability and computational load, of the GD and the Kullback-Leibler divergence (KLD). Likewise, both uni- and multivariate generalized Gaussian models are evaluated, characterized by either a fixed or a variable shape parameter. The modeling of the interband correlation significantly improves classification efficiency, while the GD is shown to consistently outperform the KLD as a similarity measure.
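For intuition, in the univariate zero-mean case with a fixed shape parameter the Fisher information and the resulting Rao geodesic distance take a simple closed form. This is a textbook special case written out here for orientation, not the multivariate expressions derived in the paper:

```latex
p(x;\alpha) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}
              \exp\!\left(-\left(\frac{|x|}{\alpha}\right)^{\beta}\right),
\qquad
I(\alpha) = \frac{\beta}{\alpha^{2}},
\qquad
\mathrm{GD}(\alpha_1,\alpha_2) = \sqrt{\beta}\,\left|\ln\frac{\alpha_2}{\alpha_1}\right|.
```

The logarithmic dependence on the scale ratio is what distinguishes the GD from the KLD, which is not symmetric in the two scales.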

    Multi Voxel Descriptor for 3D Texture Retrieval

    In this paper, we present new feature descriptors that exploit voxels for retrieval of 3D textured models that vary in geometric shape, in texture, or in both. First, we perform pose normalisation so that arbitrary 3D models share the same orientation. We then map the structure of each 3D model into voxels, so that all models have the same dimensions. From these voxels we can capture information in a number of ways. First, we build a binary voxel histogram and a color voxel histogram. Second, we compute the distance from the centre voxel to the other voxels and generate a histogram. We also compute the Fourier transform in spectral space. To capture texture features, we apply a voxel tetra pattern. Finally, we merge all features by linear combination. For the experiments, we use standard evaluation measures such as Nearest Neighbor (NN), First Tier (FT), Second Tier (ST), and Average Dynamic Recall (ADR). The SHREC 2014 dataset and its evaluation program are used to verify the proposed method. Experimental results show that the proposed method is more accurate than several state-of-the-art methods.
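The voxel-mapping and centre-distance-histogram steps can be sketched as follows. The grid size, the bin count, and the assumption that model coordinates are already normalized to [0, 1) are ours, for illustration only:

```python
import math

def voxelize(points, n=8):
    """Map 3-D points with coordinates in [0, 1) onto an n x n x n binary
    occupancy grid, returned as the set of occupied voxel indices."""
    grid = set()
    for x, y, z in points:
        grid.add(tuple(min(int(c * n), n - 1) for c in (x, y, z)))
    return grid

def center_distance_histogram(grid, n=8, bins=4):
    """Histogram of occupied-voxel distances from the grid centre, binned
    uniformly up to the maximum possible distance (a corner voxel)."""
    c = (n - 1) / 2.0
    dmax = math.sqrt(3.0) * c if n > 1 else 1.0
    hist = [0] * bins
    for i, j, k in grid:
        d = math.sqrt((i - c) ** 2 + (j - c) ** 2 + (k - c) ** 2)
        hist[min(int(d / dmax * bins), bins - 1)] += 1
    return hist
```

Because every model is resampled onto the same fixed grid, histograms from different models are directly comparable, which is the point of the voxelization step in the abstract.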

    Content-based image retrieval of museum images

    Content-based image retrieval (CBIR) is becoming more and more important with the advance of multimedia and imaging technology. Among the many retrieval features associated with CBIR, texture retrieval is one of the most difficult, mainly because no satisfactory quantitative definition of texture exists at this time, and because of the complex nature of texture itself. Another difficult problem in CBIR is query by low-quality images, that is, attempting to retrieve images using a poor-quality image as the query. Few content-based retrieval systems have addressed this problem. Wavelet analysis is a relatively new and promising tool for signal and image analysis. Its time-scale representation provides both spatial and frequency information, giving extra information compared to other image representation schemes. This research aims to address some of the problems of query by texture and query by low-quality images by exploiting the advantages that wavelet analysis has to offer, particularly in the context of museum image collections. A novel query-by-low-quality-images algorithm is presented as a solution to the poor retrieval performance of conventional methods. For the query-by-texture problem, this thesis provides a comprehensive evaluation of wavelet-based texture methods as well as a comparison with other techniques. A novel automatic texture segmentation algorithm and an improved block-oriented decomposition are proposed for use in query by texture. Finally, all the proposed techniques are integrated in a content-based image retrieval application for museum image collections.

    An Integrated Content and Metadata based Retrieval System for Art

    In this paper we describe aspects of the Artiste project to develop a distributed content- and metadata-based analysis, retrieval, and navigation system for a number of major European museums. In particular, after a brief overview of the complete system, we describe the design and evaluation of some of the image analysis algorithms developed to meet the specific requirements of the users from the museums. These include a method for retrieval based on sub-images, retrieval based on very low-quality images, and retrieval using craquelure type.

    Wavelets and Imaging Informatics: A Review of the Literature

    Modern medicine is a field that has been revolutionized by the emergence of computer and imaging technology. It is increasingly difficult, however, to manage the ever-growing amount of medical imaging information available in digital formats. Numerous techniques have been developed to make imaging information more easily accessible and to perform analysis automatically. Among these techniques, wavelet transforms have proven prominently useful, not only for biomedical imaging but also for signal and image processing in general. Wavelet transforms decompose a signal into frequency bands, the widths of which are determined by a dyadic scheme. This particular way of dividing frequency bands matches the statistical properties of most images very well. During the past decade, there has been active research in applying wavelets to various aspects of imaging informatics, including compression, enhancement, analysis, classification, and retrieval. This review surveys the most significant practical and theoretical advances in the field of wavelet-based imaging informatics.
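The dyadic band-splitting the review refers to can be illustrated with the simplest wavelet, the Haar transform. This sketch is ours, not from the review, and assumes an input length that is a power of two:

```python
import math

def haar_dwt(signal):
    """One level of the orthonormal 1-D Haar transform: returns the
    half-length approximation (low-pass) and detail (high-pass) bands."""
    s = math.sqrt(2.0)
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s
              for i in range(len(signal) // 2)]
    return approx, detail

def dyadic_decomposition(signal, levels):
    """Repeatedly split the low-pass band; each level halves its bandwidth,
    giving the dyadic frequency-band structure described above."""
    bands = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        bands.append(detail)
    bands.append(approx)  # final coarse approximation
    return bands
```

A constant signal, for example, produces all-zero detail bands, with the signal's energy concentrated entirely in the final coarse approximation, which is the behaviour that makes wavelet coefficients compressible.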