
    Region-Based Watermarking of Biometric Images: Case Study in Fingerprint Images

    In this paper, a novel scheme for watermarking biometric images is proposed. It exploits the fact that biometric images normally contain a single region of interest, which carries the relevant information processable by most biometric-based identification/authentication systems. The proposed scheme embeds the watermark into the region of interest only, thus preserving the hidden data through the segmentation process that removes the useless background and keeps the region of interest unaltered; an attacker can exploit this segmentation as a cropping attack. Embedding in the region of interest also yields greater robustness and better imperceptibility of the watermark. The scheme is incorporated into the optimum watermark detector to improve its performance, and is applied to fingerprint images, one of the most widely used and studied biometric modalities. The watermarking is assessed in two well-known transform domains: the discrete wavelet transform (DWT) and the discrete Fourier transform (DFT). The results are very attractive and clearly show significant improvements over the standard technique, which operates on the whole image. They also reveal that the segmentation (cropping) attack does not affect the performance of the proposed technique, which is also more robust against other common attacks.
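The abstract does not specify the embedding rule, so the following is only a minimal sketch of the idea: an additive watermark placed in a DWT detail subband, masked so that only coefficients whose support lies inside the region of interest are touched. The one-level Haar transform, the choice of the HL subband, and the strength parameter `alpha` are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT; returns (LL, LH, HL, HH) subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Exact inverse of haar_dwt2."""
    a = np.empty((LL.shape[0], 2 * LL.shape[1]))
    a[:, 0::2] = LL + LH
    a[:, 1::2] = LL - LH
    d = np.empty_like(a)
    d[:, 0::2] = HL + HH
    d[:, 1::2] = HL - HH
    img = np.empty((2 * a.shape[0], a.shape[1]))
    img[0::2, :] = a + d
    img[1::2, :] = a - d
    return img

def embed_roi_watermark(img, roi_mask, watermark, alpha=2.0):
    """Additively embed a watermark in the HL subband, but only at
    coefficients whose 2x2 pixel support lies inside the ROI, so that
    cropping the background cannot remove the mark (hypothetical rule)."""
    LL, LH, HL, HH = haar_dwt2(img)
    m = roi_mask[0::2, 0::2].astype(bool)  # mask at subband resolution
    HL = HL + alpha * watermark * m
    return haar_idwt2(LL, LH, HL, HH)
```

Because the mask zeroes the watermark outside the region of interest, pixels in the background are provably untouched, which is exactly why a segmentation (cropping) attack leaves the mark intact.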

    Statistical Modeling of SAR Images: A Survey

    Statistical modeling is essential to SAR (Synthetic Aperture Radar) image interpretation. It aims to describe SAR images through statistical methods and to reveal the characteristics of these images. Moreover, statistical modeling can provide technical support for a comprehensive understanding of terrain scattering mechanisms, which helps in developing algorithms for effective image interpretation and credible image simulation. Numerous statistical models have been developed to describe SAR image data, and the purpose of this paper is to categorize and evaluate these models. We first summarize the development history and the current state of research in statistical modeling, and then discuss in detail the different SAR image models derived from the product model. Relevant issues are also discussed. Finally, several promising directions for future research are outlined.
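The product model mentioned above writes the observed intensity as the product of a slowly varying texture and a fast-varying speckle component. As a minimal illustration (not from the survey itself), the sketch below simulates this with Gamma-distributed texture and unit-mean Gamma-distributed multilook speckle, the combination that yields the classic K distribution; the parameter names `looks`, `nu`, and `mean` are my own.

```python
import numpy as np

def simulate_product_model(shape, looks=4, nu=2.0, mean=1.0, seed=None):
    """Simulate SAR intensity under the product model I = T * S:
    T ~ Gamma texture with order nu and mean `mean`,
    S ~ unit-mean Gamma speckle for an L-look intensity image.
    Gamma texture times Gamma speckle gives K-distributed intensity."""
    rng = np.random.default_rng(seed)
    texture = rng.gamma(nu, mean / nu, size=shape)       # E[T] = mean
    speckle = rng.gamma(looks, 1.0 / looks, size=shape)  # E[S] = 1
    return texture * speckle
```

Because texture and speckle are independent, the mean of the simulated intensity equals the texture mean, while the extra variance contributed by the texture term is what distinguishes heterogeneous clutter from fully developed speckle.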

    Anisotropic Diffusion Filter with Memory based on Speckle Statistics for Ultrasound Images

    Ultrasound imaging exhibits considerable difficulties for medical visual inspection and for the development of automatic analysis methods due to speckle, which negatively affects the perception of tissue boundaries and the performance of automatic segmentation methods. With the aim of alleviating the effect of speckle, many filtering techniques are usually considered as a preprocessing step prior to automatic analysis methods or visual inspection. Most state-of-the-art filters try to reduce the speckle effect without considering its relevance for the characterization of tissue nature. However, the speckle phenomenon is the inherent response of echo signals in tissues and can provide important features for clinical purposes. This loss of information is even magnified by the iterative nature of some speckle filters, e.g., diffusion filters, which tend to over-filter because of the progressive loss of diagnostically relevant information during the diffusion process. In this work, we propose an anisotropic diffusion filter with a probabilistic-driven memory mechanism to overcome the over-filtering problem by following a tissue-selective philosophy. Specifically, we formulate the memory mechanism as a delay differential equation for the diffusion tensor whose behavior depends on the statistics of the tissues, accelerating the diffusion process in meaningless regions and including the memory effect in regions where relevant details should be preserved. Results on both synthetic and real US images support the inclusion of the probabilistic memory mechanism for preserving clinically relevant structures, which are removed by state-of-the-art filters.
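To make the memory idea concrete, the sketch below adds a first-order relaxation (an exponential memory, a much simplified stand-in for the paper's delay differential equation on the diffusion tensor) to a scalar Perona-Malik diffusion: the diffusivity at each step is a blend of its previous value and the instantaneous edge-stopping value. The functions, the parameter `gamma`, and the scalar (rather than tensor) diffusivity are all illustrative assumptions.

```python
import numpy as np

def pm_diffusivity(grad_mag, kappa):
    """Perona-Malik edge-stopping function g(|grad u|)."""
    return 1.0 / (1.0 + (grad_mag / kappa) ** 2)

def diffuse_with_memory(img, n_iter=20, kappa=0.1, dt=0.2, gamma=0.3):
    """Diffusion where the diffusivity field carries memory:
    c_t = (1 - gamma) * c_{t-1} + gamma * g(|grad u|).
    gamma = 1 recovers memoryless Perona-Malik diffusion."""
    u = img.astype(float).copy()
    c_mem = np.ones_like(u)              # diffusivity state across iterations
    for _ in range(n_iter):
        # Differences to the four neighbours; zero flux at the borders.
        dN = np.roll(u, 1, axis=0) - u
        dS = np.roll(u, -1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u, 1, axis=1) - u
        dN[0, :] = dS[-1, :] = dE[:, -1] = dW[:, 0] = 0.0
        grad = np.sqrt(dN**2 + dS**2 + dE**2 + dW**2)
        # Memory: relax the diffusivity toward its instantaneous value
        # instead of replacing it, slowing diffusion near stable detail.
        c_mem = (1.0 - gamma) * c_mem + gamma * pm_diffusivity(grad, kappa)
        u += dt * c_mem * (dN + dS + dE + dW)
    return u
```

A small `gamma` makes the diffusivity sluggish, so structures that have consistently produced low diffusivity keep being protected even when a single noisy iteration would suggest otherwise; this is the over-filtering safeguard the abstract describes, in its simplest scalar form.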

    A Tutorial on Speckle Reduction in Synthetic Aperture Radar Images

    Speckle is a granular disturbance, usually modeled as multiplicative noise, that affects synthetic aperture radar (SAR) images, as well as all coherent images. Over the last three decades, several methods have been proposed for the reduction of speckle, or despeckling, in SAR images. The goal of this paper is to provide a comprehensive review of despeckling methods since their birth, over thirty years ago, highlighting trends and changing approaches over the years. The concept of fully developed speckle is explained. Drawbacks of homomorphic filtering are pointed out. Assets of multiresolution despeckling, as opposed to spatial-domain despeckling, are highlighted, and the advantages of undecimated, or stationary, wavelet transforms over decimated ones are discussed. Bayesian estimators and probability density function (pdf) models in both the spatial and multiresolution domains are reviewed. Scale-space-varying pdf models, as opposed to scale-varying models, are promoted. Promising methods following non-Bayesian approaches, like nonlocal (NL) filtering and total variation (TV) regularization, are reviewed and compared to spatial- and wavelet-domain Bayesian filters. Both established and new trends for the assessment of despeckling are presented. A few experiments on simulated data and real COSMO-SkyMed SAR images highlight, on the one hand, the cost-performance tradeoff of the different methods and, on the other hand, the effectiveness of solutions purposely designed for SAR heterogeneity and not-fully-developed speckle. Eventually, upcoming methods based on new concepts of signal processing, like compressive sensing, are foreseen as a new generation of despeckling, after spatial-domain and multiresolution-domain methods.
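As a concrete example of the spatial-domain despeckling family the tutorial surveys, the sketch below implements the classic Lee MMSE filter for multiplicative speckle on an intensity image: in homogeneous areas the local signal variance estimate collapses to zero and the filter outputs the local mean, while near edges the weight approaches one and the pixel is kept. Window size and the `looks` parameterization of the noise variance are the usual conventions, not anything specific to this paper.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def lee_filter(z, looks=1, win=5):
    """Lee MMSE despeckling of an intensity image under the
    multiplicative model z = x * n, with Var(n)/E[n]^2 = 1/looks."""
    pad = win // 2
    p = np.pad(z, pad, mode='reflect')
    wins = sliding_window_view(p, (win, win))
    m = wins.mean(axis=(-2, -1))                 # local mean
    v = wins.var(axis=(-2, -1))                  # local variance
    cn2 = 1.0 / looks                            # speckle variation coeff.^2
    # Estimated variance of the underlying reflectivity (clipped at 0).
    var_x = np.maximum((v - m**2 * cn2) / (1.0 + cn2), 0.0)
    w = np.where(v > 0, var_x / np.maximum(v, 1e-12), 0.0)
    return m + w * (z - m)                       # MMSE blend
```

The same structure (local mean plus a data-adaptive weight on the residual) underlies most of the spatial-domain Bayesian filters the paper compares against wavelet-domain and nonlocal methods.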

    The Development of Hybrid Process Control Systems For Fluidized Bed Pellet Coating Processes

    The conventional basic control of pharmaceutical batch processes has several drawbacks. Basic control often uses constant process settings discovered by trial and error. This rigid process operation provides limited process understanding and forgoes opportunities for process optimization. Product quality attributes are measured by inefficient off-line tests, which therefore cannot be used to monitor the process and inform appropriate adjustments. Frequent reprocessing and batch failures are possible consequences if the process is not under effective control. These issues raise serious concerns about the process capability of a pharmaceutical manufacturing process. An alternative process control strategy is perceived as a logical way to improve process capability. To demonstrate the strategy, a hybrid control system is proposed in this work. A challenging aqueous drug layering process, which had a batch failure rate of 30% when operated under basic control, was investigated as a model system to develop and demonstrate the hybrid control system. The hybrid control consisted of process manipulation, monitoring and optimization. First-principles control was developed to manipulate the process; it used a theory of environmental equivalency to regulate a consistent drying rate for the drug layering process. This process manipulation method eliminated the batch failures previously encountered under the basic control approach. Process monitoring was achieved by building an empirical analytical model using in-line Near-Infrared spectroscopy. The model allowed real-time quantitative analysis of the drug layered content and was able to determine the endpoint of the process, achieving quality assurance without relying on end-product tests. Process optimization was accomplished by finding optimum process settings within an operation space constructed using edge-of-failure analysis on a design space.
    It provided setpoints with higher confidence of meeting the specifications. The integration of these control elements formed a complete hybrid control system. The results showed that the process capability of the drug layering process was significantly improved by the hybrid control, an effectiveness substantiated by statistical evidence from the process capability indices.
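The capability indices mentioned at the end are standard statistics; for readers unfamiliar with them, a minimal sketch of the usual short-term definitions (Cp for spread against the specification width, Cpk penalizing an off-center mean) is given below. The thesis does not state which indices it used, so this is generic background, not its method.

```python
import statistics

def capability_indices(samples, lsl, usl):
    """Short-term process capability against spec limits [lsl, usl]:
    Cp  = (USL - LSL) / (6 * sigma)          -- ignores centering
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma) -- penalizes drift."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk
```

A perfectly centered process has Cpk equal to Cp; values above roughly 1.33 are conventionally read as a capable process, which is the kind of statistical evidence the abstract refers to.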

    Advances in Sonar Technology

    The demand to explore the largest and also one of the richest parts of our planet, the advances in signal processing promoted by an exponential growth in computing power, and a thorough study of sound propagation in the underwater realm have led to remarkable advances in sonar technology in recent years. The work at hand is a sum of the knowledge of several authors who contributed to various aspects of sonar technology. This book intends to give a broad overview of the advances in sonar technology in recent years that resulted from the authors' research effort on both sonar systems and their applications. It is intended for scientists and engineers from a variety of backgrounds, and even those who have never had contact with sonar technology before will find an accessible introduction to the topics and principles exposed here.