    Streaming Maximum-Minimum Filter Using No More than Three Comparisons per Element

    The running maximum-minimum (max-min) filter computes the maxima and minima over running windows of size w. This filter has numerous applications in signal processing and time series analysis. We present an easy-to-implement online algorithm requiring no more than 3 comparisons per element in the worst case; by comparison, no algorithm is known to compute the running maximum (or minimum) filter in 1.5 comparisons per element in the worst case. Our algorithm has reduced latency and memory usage. Comment: to appear in Nordic Journal of Computing.
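
    For illustration, here is a minimal Python sketch of the streaming idea using the standard monotonic-deque formulation; it computes the same running max/min but does not reproduce the paper's exact three-comparison bound, and the function name is ours.

```python
from collections import deque

def running_max_min(values, w):
    """Yield (max, min) over each length-w window of `values`.

    Monotonic-deque sketch: the max deque holds indices of decreasing
    values, the min deque indices of increasing values, so the window
    extremum is always at the front of each deque.
    """
    maxq, minq = deque(), deque()
    for i, x in enumerate(values):
        while maxq and values[maxq[-1]] <= x:   # drop dominated candidates
            maxq.pop()
        maxq.append(i)
        while minq and values[minq[-1]] >= x:
            minq.pop()
        minq.append(i)
        if maxq[0] <= i - w:                    # index left the window
            maxq.popleft()
        if minq[0] <= i - w:
            minq.popleft()
        if i >= w - 1:
            yield values[maxq[0]], values[minq[0]]
```

    For example, `list(running_max_min([3, 1, 4, 1, 5], 3))` yields `[(4, 1), (4, 1), (5, 1)]`.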

    Comparison of Algorithms for Baseline Correction of LIBS Spectra for Quantifying Total Carbon in Brazilian Soils

    LIBS is a promising and versatile technique for multi-element analysis that usually takes less than a minute and requires minimal sample preparation and no reagents. Despite recent advances in elemental quantification, LIBS still faces issues with the baseline produced by background radiation, which adds non-linear interference to the emission lines. In order to create a calibration model to quantify elements using LIBS spectra, the baseline has to be properly corrected. In this paper, we compared the performance of three filters for removing random noise and five methods for correcting the baseline of LIBS spectra for the quantification of total carbon in soil samples. All combinations of filters and methods were tested, and their parameters were optimized for the best correlation between the corrected spectra and the carbon content in a training sample set. All combinations with the optimized parameters were then evaluated on a separate test sample set. The combination of the Savitzky-Golay filter and the 4S Peak Filling method gave the best correction: a Pearson correlation coefficient of 0.93 with a root mean square error of 0.21. This result was better than a linear regression model on the carbon emission line at 193.04 nm (correlation of 0.91 with an error of 0.26). The procedure proposed here opens a new possibility for correcting the baseline of LIBS spectra and for creating multivariate methods based on a given spectral range. Comment: 13 pages, 5 figures.
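
    A rough sketch of the denoise-then-flatten idea, assuming NumPy and SciPy: scipy.signal.savgol_filter supplies the Savitzky-Golay step, while the baseline estimate below is a simplified SNIP-style placeholder rather than the 4S Peak Filling method the paper selects.

```python
import numpy as np
from scipy.signal import savgol_filter

def denoise_and_flatten(spectrum, window=11, polyorder=3, max_half_width=40):
    """Savitzky-Golay smoothing followed by subtraction of a crude,
    SNIP-style baseline (a stand-in for 4S Peak Filling)."""
    y = savgol_filter(np.asarray(spectrum, dtype=float),
                      window_length=window, polyorder=polyorder)
    baseline = y.copy()
    idx = np.arange(len(baseline))
    for m in range(1, max_half_width + 1):
        left = baseline[np.maximum(idx - m, 0)]     # clamped neighbours
        right = baseline[np.minimum(idx + m, len(baseline) - 1)]
        # clip each point down toward the mean of its neighbours, which
        # erodes emission peaks while following the slow baseline
        baseline = np.minimum(baseline, 0.5 * (left + right))
    return y - baseline
```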

    Resource Efficient Hardware Architecture for Fast Computation of Running Max/Min Filters

    Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k×k kernel requires k²−1 comparisons per sample in a direct implementation, so the cost scales expensively with the kernel size k. Faster computation can be achieved by kernel decomposition and constant-time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture uses fewer computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture can compute max/min filters on 1024×1024 images with kernels up to 255×255 in around 8.4 milliseconds (about 120 frames per second) at a clock frequency of 250 MHz. The implementation is highly scalable in the kernel size, with a good performance/area tradeoff suitable for embedded applications. The applicability of the architecture is demonstrated with local adaptive image thresholding.
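
    The underlying HGW algorithm is easy to sketch in software: each length-k block stores running prefix and suffix maxima, so any window maximum costs one extra comparison, and the 2-D filter separates into row and column passes. A hedged Python sketch (our own function names, not the paper's hardware design):

```python
import numpy as np

def hgw_running_max(a, k):
    """1-D van Herk/Gil-Werman running max: out[i] = max(a[i:i+k])."""
    n = len(a)
    a = np.concatenate([np.asarray(a, dtype=float),
                        np.full((-n) % k, -np.inf)])  # pad to block multiple
    m = len(a)
    s, r = a.copy(), a.copy()
    for start in range(0, m, k):
        for i in range(start + 1, start + k):          # prefix max per block
            s[i] = max(s[i], s[i - 1])
        for i in range(start + k - 2, start - 1, -1):  # suffix max per block
            r[i] = max(r[i], r[i + 1])
    # a window [i, i+k-1] spans at most two blocks: the suffix max of the
    # first block combined with the prefix max of the second
    return np.array([max(r[i], s[i + k - 1]) if i + k - 1 < m else r[i]
                     for i in range(n)])

def max_filter_2d(img, k):
    """Separable k x k max filter (top-left anchored, not centred)."""
    rows = np.apply_along_axis(hgw_running_max, 1, img, k)
    return np.apply_along_axis(hgw_running_max, 0, rows, k)
```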

    Recover Degraded Document Images Using Binarization Technique

    Nowadays the whole world is connected through the internet, and many types of data can be saved, copied, and backed up in digital form. Older records, however, often exist only on traditional paper, and such documents still play an important role in many tasks. Much paper data has degraded for various reasons, and front and rear text can bleed together, so segmenting text from badly degraded documents is a very challenging task. To address this problem we propose a four-stage binarization technique for recovering degraded document images. We first apply a contrast-inversion mechanism to the degraded document image. The contrast map is then converted to a grayscale image so that text strokes can be clearly separated from background and foreground pixels. Detected text is further segmented using a local threshold estimated from the intensities of the detected text-stroke edge pixels. Finally, post-processing is applied to improve the quality of the document image. This binarization technique is simple, robust, and efficient for recovering degraded document images. DOI: 10.17762/ijritcc2321-8169.150612
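
    As a simplified stand-in for the four-stage pipeline above, the sketch below binarizes with a Sauvola local threshold from scikit-image; it illustrates local thresholding only and is not the paper's contrast-map and stroke-edge method.

```python
import numpy as np
from skimage import io, filters

def binarize_document(path, window_size=25, k=0.2):
    """Local (Sauvola) binarization of a degraded document image.
    window_size and k are illustrative defaults, not tuned values."""
    gray = io.imread(path, as_gray=True)
    threshold = filters.threshold_sauvola(gray, window_size=window_size, k=k)
    return (gray > threshold).astype(np.uint8) * 255  # white background
```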

    Computational strategies for understanding underwater optical image datasets

    Thesis: Ph.D. in Mechanical and Oceanographic Engineering, Joint Program in Oceanography/Applied Ocean Science and Engineering (Massachusetts Institute of Technology, Department of Mechanical Engineering; and the Woods Hole Oceanographic Institution), 2013. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 117-135). A fundamental problem in autonomous underwater robotics is the high latency between the capture of image data and the time at which operators are able to gain a visual understanding of the survey environment. Typical missions can generate imagery at rates hundreds of times greater than highly compressed images can be transmitted acoustically, delaying that understanding until after the vehicle has been recovered and the data analyzed. While automated classification algorithms can lessen the burden on human annotators after a mission, most are too computationally expensive or lack the robustness to run in situ on a vehicle. Fast algorithms designed for mission-time performance could lessen the latency of understanding by producing low-bandwidth semantic maps of the survey area that can then be telemetered back to operators during a mission. This thesis presents a lightweight framework for processing imagery in real time aboard a robotic vehicle. We begin with a review of pre-processing techniques for correcting illumination and attenuation artifacts in underwater images, presenting our own approach based on multi-sensor fusion and a strong physical model. Next, we construct a novel image pyramid structure that can reduce the complexity necessary to compute features across multiple scales by an order of magnitude, and recommend features which are fast to compute and invariant to underwater artifacts. Finally, we implement our framework on real underwater datasets and demonstrate how it can be used to select summary images for the purpose of creating low-bandwidth semantic maps capable of being transmitted acoustically. By Jeffrey W. Kaeli. Ph.D. in Mechanical and Oceanographic Engineering.
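
    As a hedged illustration of the illumination-correction step only (the thesis uses multi-sensor fusion and a physical model, which this does not attempt), the classic single-image approximation divides by a smooth estimate of the lighting field:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def flatten_illumination(img, sigma=50.0):
    """Divide a grayscale image by a heavily blurred copy of itself to
    remove slowly varying lighting; sigma is an illustrative default."""
    img = np.asarray(img, dtype=float)
    lighting = gaussian_filter(img, sigma) + 1e-6  # avoid division by zero
    flat = img / lighting
    return flat / flat.max()                       # renormalize to [0, 1]
```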

    Optimal morphological filter design for fabric defect detection

    This paper investigates the problem of automated defect detection for textile fabrics and proposes a new optimal morphological filter design method for solving this problem. A Gabor Wavelet Network (GWN) is adopted as the main technique for extracting the texture features of textile fabrics, and an optimal morphological filter is constructed from the extracted features. Using this optimal filter, a new semi-supervised segmentation algorithm is then proposed. The performance of the scheme is evaluated on a variety of homogeneous textile images with different types of common defects. The test results exhibit accurate defect detection with a low false alarm rate, confirming the robustness and effectiveness of the proposed scheme. In addition, the algorithm proposed in this paper is suitable for on-line applications; indeed, it is a low-cost PC-based solution to the problem of defect detection for textile fabrics. © 2005 IEEE.
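
    A generic sketch of Gabor-based texture feature extraction, assuming scikit-image: a plain Gabor filter bank rather than the paper's Gabor Wavelet Network or its optimal-filter construction.

```python
import numpy as np
from skimage.filters import gabor

def gabor_energy_features(img, frequency=0.25, n_orientations=4):
    """Stack of local Gabor energy maps over evenly spaced orientations;
    frequency and orientation count are illustrative choices."""
    feats = []
    for i in range(n_orientations):
        theta = i * np.pi / n_orientations
        real, imag = gabor(img, frequency=frequency, theta=theta)
        feats.append(np.hypot(real, imag))  # magnitude = local texture energy
    return np.stack(feats, axis=-1)
```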

    Computationally efficient, one-pass algorithm for morphological filters

    Many useful morphological filters are built as long concatenations of erosions and dilations: openings, closings, size distributions, sequential filters, etc. This paper proposes a new algorithm implementing morphological dilation and erosion of functions. It supports rectangular structuring elements, runs in linear time with respect to the image size and constant time with respect to the structuring element size, and has minimal memory usage. It has zero algorithmic latency and processes data in a stream. These properties are inherited by operators composed by concatenation, allowing their efficient implementation. We show how to compute an Alternate Sequential Filter (ASF(n)) in one pass, regardless of the number of stages n. This algorithm opens the way to time-critical applications where the complexity and memory requirements of serial morphological operators represented a bottleneck limiting their usability. © 2011 Elsevier Inc. All rights reserved.
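
    For reference, ASF(n) composed from off-the-shelf operators looks like the sketch below (scipy.ndimage, computed offline); the paper's contribution is producing this same cascade in a single streaming pass with constant memory.

```python
from scipy.ndimage import grey_closing, grey_opening

def asf(img, n):
    """Alternate Sequential Filter ASF(n): opening then closing with
    square structuring elements of growing size (sizes are illustrative)."""
    out = img
    for i in range(1, n + 1):
        size = (2 * i + 1, 2 * i + 1)
        out = grey_opening(out, size=size)
        out = grey_closing(out, size=size)
    return out
```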

    Brain extraction using the watershed transform from markers

    Isolation of the brain from other tissue types in magnetic resonance (MR) images is an important step in many types of neuro-imaging research using both human and animal subjects. The importance of brain extraction is well appreciated: numerous approaches have been published, and the benefits of good extraction methods to subsequent processing are well known. We describe a tool, the marker-based watershed scalper (MBWSS), for isolating the brain in T1-weighted MR images, built using filtering and segmentation components from the Insight Toolkit (ITK) framework. The key elements of MBWSS, the watershed transform from markers and aggressive filtering with large kernels, are techniques that have rarely been used in neuroimaging segmentation applications. MBWSS is able to reliably isolate the brain without expensive preprocessing steps, such as registration to an atlas, and is therefore useful as the first stage of processing pipelines. It is an informative example of the level of accuracy achievable without using priors in the form of atlases, shape models, or libraries of examples. We validate MBWSS using a publicly available dataset, a paediatric cohort, an adolescent cohort, and intra-surgical scans, and demonstrate the flexibility of the approach by modifying the method to extract macaque brains.
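
    A minimal sketch of the core operation, the watershed transform from markers, using scikit-image for illustration (MBWSS itself is built on ITK); the seed masks are caller-supplied assumptions.

```python
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

def marker_watershed(gray, fg_mask, bg_mask):
    """Flood the gradient surface from foreground/background seed masks;
    basins grown from the foreground seeds form the object mask."""
    gradient = sobel(gray)
    markers = np.zeros(gray.shape, dtype=np.int32)
    markers[bg_mask] = 1
    markers[fg_mask] = 2
    labels = watershed(gradient, markers)
    return labels == 2
```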

    Efficient 2-D Gray-Scale Dilations and Erosions with Arbitrary Flat Structuring Elements

    A personal identification biometric system based on back-of-hand vein patterns

    This report describes research on the use of back-of-hand vein patterns as a means of uniquely identifying people. In particular, it describes a prototype biometric system developed by the Australian Institute of Security and Applied Technology (AISAT). This system comprises an infrared cold source, a monochrome CCD camera, a monochrome frame-grabber, a personal computer, and custom image acquisition, processing, registration, and matching software. The image processing algorithms are based on mathematical morphology. Registration is performed using rotation and translation with respect to the centroid of the two-dimensional domain of a hand. Vein patterns are stored as medial axis representations. Matching involves comparing a given medial axis pattern against a library of patterns using constrained sequential correlation. The matching is two-fold: a newly acquired signature is matched against a dilated library signature, and then the library signature is matched against the dilated acquired signature; this is necessary because of the positional noise exhibited by the back-of-hand veins. The results of a cross-matching experiment on a sample of 20 adults and more than 100 hand images are detailed. In addition, preliminary estimates of the false acceptance rate (FAR) and false rejection rate (FRR) for the prototype system are given. Fuzzy relaxation on an association graph is discussed as an alternative to sequential correlation for the matching of vein signatures. An example is provided (including a C program) illustrating the matching process for a pair of signatures obtained from the same hand. The example demonstrates the ability of the fuzzy relaxation method to deal with segmentation errors.
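
    The two-fold dilated matching described above can be sketched as follows, assuming the signatures are binary medial-axis images held as NumPy boolean arrays; the dilation radius and score definition are illustrative, not the system's calibrated constrained sequential correlation.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def twofold_match(sig_a, sig_b, dilate_iter=2):
    """Match two binary medial-axis patterns symmetrically: each is
    compared against a dilated copy of the other so that small
    positional noise in the veins does not break the overlap."""
    a_in_b = (sig_a & binary_dilation(sig_b, iterations=dilate_iter)).sum()
    b_in_a = (sig_b & binary_dilation(sig_a, iterations=dilate_iter)).sum()
    score_a = a_in_b / max(int(sig_a.sum()), 1)  # fraction of A explained by B
    score_b = b_in_a / max(int(sig_b.sum()), 1)
    return min(score_a, score_b)
```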