62,901 research outputs found

    Classification of ordered texture images using regression modelling and granulometric features

    Structural information available from the granulometry of an image has been used widely in image texture analysis and classification. In this paper we present a method for classifying texture images which follow an intrinsic ordering of textures, using polynomial regression to express granulometric moments as a function of class label. Separate models are built for each individual moment and combined for back-prediction of the class label of a new image. The methodology was developed on synthetic images of evolving textures and tested using real images of 8 different grades of cut-tear-curl black tea leaves. For comparison, grey level co-occurrence matrix (GLCM) based features were also computed, and both feature types were used in a range of classifiers including the regression approach. Experimental results demonstrate the superiority of the granulometric moments over GLCM-based features for classifying these tea images.
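    The back-prediction idea in this abstract can be illustrated roughly as follows. This is a minimal sketch, not the authors' implementation: the helper names (pattern_spectrum, granulometric_moments, back_predict_class), the disk structuring elements, and the use of NumPy and scikit-image are all assumptions.

```python
# Sketch: granulometric moments + per-moment polynomial regression,
# combined for back-prediction of an ordered class label.
import numpy as np
from skimage.morphology import opening, disk

def pattern_spectrum(image, max_radius=15):
    """Granulometry: volume lost after openings with disks of growing radius."""
    volumes = [opening(image, disk(r)).sum() for r in range(max_radius + 1)]
    losses = -np.diff(volumes).astype(float)   # volume removed at each size
    return losses / losses.sum()               # normalised size distribution

def granulometric_moments(image, n_moments=3):
    ps = pattern_spectrum(image)
    sizes = np.arange(1, len(ps) + 1)
    mean = (ps * sizes).sum()
    central = [(ps * (sizes - mean) ** k).sum() for k in range(2, n_moments + 1)]
    return np.array([mean] + central)

def fit_moment_models(train_images, train_labels, degree=2):
    """One polynomial per moment: moment_j = p_j(class label)."""
    M = np.array([granulometric_moments(im) for im in train_images])
    return [np.polyfit(train_labels, M[:, j], degree) for j in range(M.shape[1])]

def back_predict_class(image, models, candidate_labels):
    """Combine the per-moment models: choose the label whose predicted
    moments are closest (least squares) to those of the new image."""
    m_obs = granulometric_moments(image)
    errs = [sum((np.polyval(p, c) - m) ** 2 for p, m in zip(models, m_obs))
            for c in candidate_labels]
    return candidate_labels[int(np.argmin(errs))]
```

    For the tea-leaf application described above, candidate_labels would be the eight ordered grades encoded as numeric values, so the fitted polynomials are well defined.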

    Modeling of evolving textures using granulometries

    This chapter describes a statistical approach to classification of dynamic texture images, called parallel evolution functions (PEFs). Traditional classification methods predict texture class membership using comparisons with a finite set of predefined texture classes and identify the closest class. However, where texture images arise from a dynamic texture evolving over time, estimation of a time state in a continuous evolutionary process is required instead. The PEF approach does this using regression modeling techniques to predict time state. It is a flexible approach which may be based on any suitable image features. Many textures are well suited to a morphological analysis and the PEF approach uses image texture features derived from a granulometric analysis of the image. The method is illustrated using both simulated images of Boolean processes and real images of corrosion. The PEF approach has particular advantages for training sets containing limited numbers of observations, which is the case in many real world industrial inspection scenarios and for which other methods can fail or perform badly.
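    A rough sketch of the PEF idea, under stated assumptions (NumPy/SciPy, polynomial evolution functions, and a feature matrix such as granulometric moments computed for time-stamped training images; function names are illustrative), is that a continuous time state is recovered by jointly inverting the fitted per-feature curves:

```python
# Sketch: fit one "evolution function" per feature, then estimate the
# continuous time state of a new image by least-squares inversion.
import numpy as np
from scipy.optimize import minimize_scalar

def fit_evolution_functions(feature_matrix, times, degree=2):
    """feature_matrix: (n_images, n_features); times: observation time of each image."""
    return [np.polyfit(times, feature_matrix[:, j], degree)
            for j in range(feature_matrix.shape[1])]

def estimate_time_state(features, pefs, t_range):
    """Pick the time t whose predicted features best match the observed ones."""
    def residual(t):
        return sum((np.polyval(p, t) - f) ** 2 for p, f in zip(pefs, features))
    return minimize_scalar(residual, bounds=t_range, method="bounded").x
```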

    Hierarchical aesthetic quality assessment using deep convolutional neural networks

    Aesthetic image analysis has attracted much attention in recent years. However, assessing the aesthetic quality and assigning an aesthetic score are challenging problems. In this paper, we propose a novel framework for assessing the aesthetic quality of images. Firstly, we divide the images into three categories: “scene”, “object” and “texture”. Each category has an associated convolutional neural network (CNN) which learns the aesthetic features for the category in question. The object CNN is trained using the whole images and a salient region in each image. The texture CNN is trained using small regions in the original images. Furthermore, an A&C CNN is developed to simultaneously assess the aesthetic quality and identify the category for images overall. For each CNN, classification and regression models are developed separately to predict the aesthetic class (high or low) and to assign an aesthetic score. Experimental results on a recently published large-scale dataset show that the proposed method outperforms state-of-the-art methods for each category.
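    A per-category network with both a classification output (high or low aesthetic class) and a regression output (aesthetic score) could look roughly like the PyTorch sketch below. This is not the authors' architecture: layer sizes are placeholders, and the two heads share one trunk here for brevity, whereas the paper trains the classification and regression models separately.

```python
# Sketch: one category-specific CNN with a class head and a score head.
import torch
import torch.nn as nn

class AestheticCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(              # shared convolutional trunk
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)          # high vs. low aesthetic class
        self.regressor = nn.Linear(64, 1)           # continuous aesthetic score

    def forward(self, x):
        h = self.features(x).flatten(1)
        return self.classifier(h), self.regressor(h)

# Training would combine a cross-entropy loss on the class logits with an
# MSE loss on the predicted score, e.g.
#   loss = F.cross_entropy(logits, labels) + F.mse_loss(score.squeeze(1), targets)
```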

    Perception Driven Texture Generation

    This paper investigates a novel task of generating texture images from perceptual descriptions. Previous work on texture generation focused on either synthesis from examples or generation from procedural models. Generating textures from perceptual attributes has not been well studied yet. Meanwhile, perceptual attributes such as directionality, regularity and roughness are important factors for human observers when describing a texture. In this paper, we propose a joint deep network model that combines adversarial training and perceptual feature regression for texture generation, while only random noise and user-defined perceptual attributes are required as input. In this model, a preliminarily trained convolutional neural network is integrated with the adversarial framework, which can drive the generated textures to possess the given perceptual attributes. An important aspect of the proposed model is that, if we change one of the input perceptual features, the corresponding appearance of the generated textures will also change. We design several experiments to validate the effectiveness of the proposed method. The results show that the proposed method can produce high quality texture images with the desired perceptual properties.
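    A schematic sketch of the generator side of such a model is given below, assuming PyTorch; the layer sizes, attribute count and names are illustrative rather than taken from the paper. The generator consumes random noise concatenated with user-defined perceptual attributes, and during training its output would be scored both by a discriminator (adversarial loss) and by a pre-trained CNN that regresses the perceptual attributes back from the generated texture.

```python
# Sketch: a conditional generator driven by noise + perceptual attributes.
import torch
import torch.nn as nn

class PerceptualTextureGenerator(nn.Module):
    def __init__(self, noise_dim=100, n_attributes=3):   # e.g. directionality,
        super().__init__()                                # regularity, roughness
        self.net = nn.Sequential(
            nn.Linear(noise_dim + n_attributes, 128 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, noise, attributes):
        # Concatenate random noise with the desired perceptual attributes.
        return self.net(torch.cat([noise, attributes], dim=1))

# Generator loss, schematically:
#   loss_G = adversarial_loss + lambda * mse(attr_regressor(fake), attributes)
```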