3 research outputs found
Fractional Local Neighborhood Intensity Pattern for Image Retrieval using Genetic Algorithm
In this paper, a new texture descriptor named "Fractional Local Neighborhood
Intensity Pattern" (FLNIP) has been proposed for content-based image retrieval
(CBIR). It is an extension of the Local Neighborhood Intensity Pattern
(LNIP) [1]. FLNIP calculates the relative intensity difference between a
particular pixel and the center pixel of a 3x3 window by considering the
relationship with adjacent neighbors. In this work, the fractional change in
the local neighborhood involving the adjacent neighbors has been calculated
first with respect to one of the eight neighbors of the center pixel of a 3x3
window. Next, the fractional change has been calculated with respect to the
center itself. These two fractional-change values are then compared to
generate a binary bit pattern. Because the descriptor compares fractional
changes in magnitude within the adjacent neighborhood, it encodes both sign
and magnitude information in a single pattern. The descriptor is applied to
four multi-resolution images: the raw image and three Gaussian-filtered
versions of it, obtained with Gaussian filters of different standard
deviations, which highlights the importance of exploring texture information
at different resolutions in an image. The four sets of distances obtained
between the query and the target images are then combined with a
genetic-algorithm-based approach that improves retrieval performance by
minimizing the distance between images of the same class.
The performance of the method has been tested for image retrieval on four
popular databases. The precision and recall values observed on these databases
have been compared with recent state-of-the-art local patterns. The proposed
method has shown a significant improvement over many other existing methods. Comment: MTAP, Springer (Minor Revision)
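The fractional-change comparison described in the abstract can be sketched as follows. The paper's exact formula is not given here, so the fractional-change definition below (absolute intensity differences of the adjacent pair divided by a reference intensity) is an assumption for illustration only:

```python
import numpy as np

def flnip_bits(window, eps=1e-6):
    """Hedged sketch of one FLNIP encoding step for a 3x3 window.

    For each of the eight ring neighbors, a fractional change involving its
    two adjacent ring neighbors is computed twice -- once relative to that
    neighbor, once relative to the center pixel -- and the two values are
    compared to produce one bit. The fractional-change formula is assumed.
    """
    c = float(window[1, 1])
    # Eight neighbors of the center, in clockwise order around the ring.
    ring = [window[0, 0], window[0, 1], window[0, 2], window[1, 2],
            window[2, 2], window[2, 1], window[2, 0], window[1, 0]]
    bits = []
    for i, p in enumerate(ring):
        left, right = ring[i - 1], ring[(i + 1) % 8]  # adjacent neighbors
        # Assumed fractional change of the adjacent pair w.r.t. neighbor p ...
        fc_neigh = (abs(p - left) + abs(p - right)) / (p + eps)
        # ... and the same quantity w.r.t. the center pixel.
        fc_center = (abs(c - left) + abs(c - right)) / (c + eps)
        bits.append(1 if fc_neigh >= fc_center else 0)
    return bits
```

Sliding this over every 3x3 window yields an 8-bit pattern per pixel, which can then be histogrammed into a feature vector in the usual local-pattern manner.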
Local Neighborhood Intensity Pattern: A new texture feature descriptor for image retrieval
In this paper, a new texture descriptor based on the local neighborhood
intensity difference is proposed for content-based image retrieval (CBIR). To
compute texture features such as the Local Binary Pattern (LBP), the center
pixel in a 3x3 window of an image is compared with each of the remaining
neighbors, one pixel at a time, to generate a binary bit pattern. LBP thus
ignores the effect of the adjacent neighbors of a particular pixel both for
its binary encoding and
for texture description. The proposed method is based on the concept that
neighbors of a particular pixel hold a significant amount of texture
information that can be considered for efficient texture representation for
CBIR. Taking this into account, we develop a new texture descriptor, named
Local Neighborhood Intensity Pattern (LNIP), which considers the relative
intensity difference between a particular pixel and the center pixel by
taking its adjacent neighbors into account, and generates a sign pattern and
a magnitude pattern.
Since sign and magnitude patterns hold complementary information to each other,
these two patterns are concatenated into a single, more complete and useful
feature descriptor. The proposed descriptor
has been tested for image retrieval on four databases, including three
texture image databases (the Brodatz texture database, the MIT VisTex
database, and the Salzburg texture database) and one face database (the AT&T
face database). The
precision and recall values observed on these databases are compared with some
state-of-the-art local patterns. The proposed method showed a significant
improvement over many other existing methods. Comment: Expert Systems with Applications (Elsevier)
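The baseline LBP computation that LNIP extends is a standard operation and can be sketched directly: each of the eight ring neighbors in a 3x3 window is thresholded against the center pixel, and the resulting bits are packed into one 8-bit code.

```python
import numpy as np

def lbp_code(window):
    """Standard LBP code for a 3x3 window.

    Each neighbor is thresholded against the center (bit = 1 if
    neighbor >= center) and the eight bits are packed, clockwise from the
    top-left neighbor, into a single integer in [0, 255].
    """
    c = window[1, 1]
    # Eight neighbors of the center, in clockwise order around the ring.
    ring = [window[0, 0], window[0, 1], window[0, 2], window[1, 2],
            window[2, 2], window[2, 1], window[2, 0], window[1, 0]]
    code = 0
    for bit, p in enumerate(ring):
        if p >= c:
            code |= 1 << bit
    return code
```

LNIP's departure from this scheme, as the abstract describes, is that each neighbor's bit is decided using its adjacent ring neighbors rather than by a lone comparison with the center.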
A Novel Feature Descriptor for Image Retrieval by Combining Modified Color Histogram and Diagonally Symmetric Co-occurrence Texture Pattern
In this paper, we have proposed a novel feature descriptor that combines
color and texture information. In the proposed color descriptor component,
the inter-channel relationship between the Hue (H) and Saturation (S)
channels of the HSV color space, which has not been explored earlier, is
exploited.
We have quantized the H channel into a number of bins and performed the voting
with saturation values and vice versa by following a principle similar to that
of the HOG descriptor, where orientation of the gradient is quantized into a
certain number of bins and voting is done with the gradient magnitude. This
lets us study how saturation varies with hue, and how hue varies with
saturation. The texture
component of our descriptor considers the co-occurrence relationship between
the pixels symmetric about both the diagonals of a 3x3 window. Our work is
inspired by the work of Dubey et al. [1]. The two components, color and
texture, individually perform better than existing color and texture
descriptors. Moreover, when concatenated, the proposed descriptor provides a
significant improvement over existing descriptors for content-based color
image retrieval. The proposed descriptor has been tested for image retrieval
on five databases, including two texture image databases (the MIT VisTex
database and the Salzburg texture database) and three natural scene databases
(Corel 1K, Corel 5K, and Corel 10K). The precision and recall values observed
on these databases are compared with some state-of-the-art local patterns.
The proposed method produced satisfactory experimental results. Comment: Preprint Submitted
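The HOG-style inter-channel voting described above can be sketched as follows. The bin count, the assumed [0, 1] value ranges, and the use of the raw channel value as the vote weight are illustrative assumptions, not the authors' exact parameters:

```python
import numpy as np

def hs_cross_vote(h, s, n_bins=8):
    """Hedged sketch of HOG-style inter-channel voting between Hue and
    Saturation: quantize H into n_bins and accumulate S as the vote weight,
    then do the converse. Both channels are assumed normalized to [0, 1]."""
    h = np.asarray(h, dtype=float).ravel()  # Hue, assumed in [0, 1]
    s = np.asarray(s, dtype=float).ravel()  # Saturation, assumed in [0, 1]
    # Saturation-weighted histogram over quantized Hue bins.
    h_bins = np.minimum((h * n_bins).astype(int), n_bins - 1)
    hist_h = np.bincount(h_bins, weights=s, minlength=n_bins)
    # Hue-weighted histogram over quantized Saturation bins.
    s_bins = np.minimum((s * n_bins).astype(int), n_bins - 1)
    hist_s = np.bincount(s_bins, weights=h, minlength=n_bins)
    return np.concatenate([hist_h, hist_s])
```

Concatenating the two weighted histograms yields a 2 * n_bins color feature that captures how each channel varies as a function of the other, mirroring how HOG votes gradient magnitude into orientation bins.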