
    Colour image processing and texture analysis on images of porterhouse steak meat

    This paper outlines two colour image processing and texture analysis techniques applied to meat images, together with an assessment of the error introduced by using JPEG compression at image capture. JPEG error analysis was performed by capturing TIFF and JPEG images, calculating the RMS difference between them, and applying a calibration between block boundary features and subjective visual JPEG scores. Both scores indicated high JPEG quality. Correction of JPEG blocking error was trialled and found to produce minimal improvement in the RMS difference. The texture analysis methods used were singular value decomposition over pixel blocks and complex cell analysis. The block singular values were classified as meat or non-meat by Fisher linear discriminant analysis, with the colour image processing result used as ‘truth’. Using receiver operator characteristic (ROC) analysis, an area under the ROC curve of 0.996 was obtained, demonstrating good correspondence between the colour image processing and the singular values. The complex cell analysis indicated a ‘texture angle’ consistent with that expected from human inspection.
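    A minimal sketch of the block-based texture step described above, assuming 8x8 pixel blocks, a synthetic greyscale image, and scikit-learn's LDA and ROC utilities in place of the paper's own implementation:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a greyscale meat image and a colour-derived mask
# (the paper uses the colour image processing result as 'truth').
rng = np.random.default_rng(0)
image = rng.random((256, 256))
truth_mask = np.zeros((256, 256), dtype=bool)
truth_mask[:, 128:] = True  # hypothetical meat / non-meat split

def block_singular_values(img, mask, block=8):
    feats, labels = [], []
    for r in range(0, img.shape[0], block):
        for c in range(0, img.shape[1], block):
            patch = img[r:r + block, c:c + block]
            s = np.linalg.svd(patch, compute_uv=False)   # singular values of the block
            feats.append(s)
            labels.append(mask[r:r + block, c:c + block].mean() > 0.5)
    return np.array(feats), np.array(labels)

X, y = block_singular_values(image, truth_mask)
lda = LinearDiscriminantAnalysis().fit(X, y)             # Fisher linear discriminant
scores = lda.decision_function(X)
print("ROC AUC:", roc_auc_score(y, scores))              # the paper reports 0.996 on real data
```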

    Graph Spectral Image Processing

    The recent advent of graph signal processing (GSP) has spurred intensive study of signals that live naturally on irregular data kernels described by graphs (e.g., social networks, wireless sensor networks). Though a digital image contains pixels that reside on a regularly sampled 2D grid, if one can design an appropriate underlying graph connecting pixels with weights that reflect the image structure, then one can interpret the image (or image patch) as a signal on a graph and apply GSP tools to process and analyse the signal in the graph spectral domain. In this article, we overview recent graph spectral techniques in GSP specifically for image/video processing. The topics covered include image compression, image restoration, image filtering, and image segmentation.
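    A minimal NumPy sketch of the idea of treating an image patch as a graph signal, assuming a 4-connected grid graph with Gaussian intensity weights (one common choice in the GSP literature, not a method prescribed by the article):

```python
import numpy as np

def patch_gft(patch, sigma=0.1):
    """Graph Fourier transform of an image patch on a 4-connected grid graph.

    Edge weights w_ij = exp(-(I_i - I_j)^2 / sigma^2) reflect image structure;
    this is one common weighting, chosen here purely for illustration."""
    h, w = patch.shape
    n = h * w
    W = np.zeros((n, n))
    idx = lambda r, c: r * w + c
    for r in range(h):
        for c in range(w):
            for dr, dc in ((0, 1), (1, 0)):              # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < h and cc < w:
                    wgt = np.exp(-((patch[r, c] - patch[rr, cc]) ** 2) / sigma ** 2)
                    W[idx(r, c), idx(rr, cc)] = W[idx(rr, cc), idx(r, c)] = wgt
    L = np.diag(W.sum(axis=1)) - W                        # combinatorial graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)                  # graph frequencies and basis vectors
    coeffs = eigvecs.T @ patch.ravel()                    # spectral coefficients of the patch signal
    return eigvals, coeffs

patch = np.random.default_rng(1).random((8, 8))
freqs, spectrum = patch_gft(patch)
print(freqs[:5], spectrum[:5])
```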

    CRBLASTER: A Parallel-Processing Computational Framework for Embarrassingly-Parallel Image-Analysis Algorithms

    The development of parallel-processing image-analysis codes is generally a challenging task that requires complicated choreography of interprocessor communications. If, however, the image-analysis algorithm is embarrassingly parallel, then developing a parallel-processing implementation of that algorithm can be a much easier task because, by definition, there is little need for communication between the compute processes. I describe the design, implementation, and performance of a parallel-processing image-analysis application, called CRBLASTER, which does cosmic-ray rejection of CCD (charge-coupled device) images using the embarrassingly-parallel L.A.COSMIC algorithm. CRBLASTER is written in C using the high-performance-computing industry standard Message Passing Interface (MPI) library. The code has been designed to be used by research scientists who are familiar with C as a parallel-processing computational framework that enables the easy development of parallel-processing image-analysis programs based on embarrassingly-parallel algorithms. The CRBLASTER source code is freely available at the official application website at the National Optical Astronomy Observatory. Removing cosmic rays from a single 800x800-pixel Hubble Space Telescope WFPC2 image takes 44 seconds with the IRAF script lacos_im.cl running on a single core of an Apple Mac Pro computer with two 2.8-GHz quad-core Intel Xeon processors. CRBLASTER is 7.4 times faster processing the same image on a single core of the same machine. Processing the same image with CRBLASTER simultaneously on all 8 cores of the same machine takes 0.875 seconds, a speedup of 50.3 over the IRAF script. A detailed analysis is presented of the performance of CRBLASTER using between 1 and 57 processors on a low-power Tilera 700-MHz 64-core TILE64 processor.
    Comment: 8 pages, 2 figures, 1 table, accepted for publication in PAS
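    CRBLASTER itself is written in C with MPI; purely as a rough illustration of the embarrassingly-parallel pattern it exploits, the following Python sketch splits an image into independent tiles and processes them in parallel with the standard-library multiprocessing module. The per-tile "clean" step is a placeholder, not the L.A.COSMIC algorithm, and the tile split ignores the edge handling a real cosmic-ray rejector would need.

```python
import numpy as np
from multiprocessing import Pool

def clean_tile(tile):
    """Placeholder per-tile operation; stands in for cosmic-ray rejection."""
    return np.clip(tile, 0, np.median(tile) + 5 * tile.std())

def split_tiles(image, n):
    return np.array_split(image, n, axis=0)            # row bands; no inter-tile communication needed

def parallel_clean(image, workers=8):
    tiles = split_tiles(image, workers)
    with Pool(workers) as pool:
        cleaned = pool.map(clean_tile, tiles)           # each tile is processed independently
    return np.vstack(cleaned)

if __name__ == "__main__":
    img = np.random.default_rng(2).random((800, 800))   # same size as the WFPC2 example above
    out = parallel_clean(img)
    print(out.shape)
```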

    Quantitative Assessment of Flame Stability Through Image Processing and Spectral Analysis

    This paper experimentally investigates two generalized methods, i.e., a simple universal index and an oscillation frequency, for the quantitative assessment of flame stability in fossil-fuel-fired furnaces. The index is proposed to assess the stability of a flame in terms of its color, geometry, and luminance; it is designed by combining up to seven characteristic parameters extracted from flame images. The oscillation frequency is derived from the spectral analysis of flame radiation signals. The measurements involved in these two methods do not require prior knowledge of fuel properties, burner type, or other operating conditions, so they can be applied to flame stability assessment without costly and complex adaptation. Experiments were carried out on a 9-MW heavy-oil-fired combustion test rig over a wide range of combustion conditions, including variations in the swirl vane position of the tertiary air, the swirl vane position of the secondary air, and the ratio of the primary air to the total air. The impact of these burner parameters on the stability of heavy-oil flames is investigated using the proposed index and oscillation frequency. The experimental results obtained demonstrate the effectiveness of the methods and the importance of maintaining a stable flame for reduced NOx emissions. It is envisaged that such methods can be readily transferred to existing flame closed-circuit television systems and flame failure detectors in power stations for flame stability monitoring.
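    The oscillation-frequency side of the assessment lends itself to a short sketch: treat the flame radiation signal as a time series, take its power spectrum, and report the dominant non-DC frequency. The sampling rate and the synthetic 20 Hz test signal below are assumptions for illustration, not values from the paper.

```python
import numpy as np

def oscillation_frequency(signal, fs):
    """Dominant non-DC frequency of a flame radiation signal, via the FFT power spectrum."""
    signal = signal - signal.mean()                      # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]            # skip the zero-frequency bin

# Synthetic radiation signal: 20 Hz flicker plus noise, sampled at 1 kHz (assumed values).
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 20 * t) + 0.3 * np.random.default_rng(3).standard_normal(t.size)
print(oscillation_frequency(sig, fs))                    # ~20 Hz
```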