33 research outputs found

    Improved ECG watermarking technique using curvelet transform

    Hiding data in electrocardiogram (ECG) signals is a big challenge because the embedded information can hamper the accuracy of disease detection. On the other hand, hiding data in ECG signals provides more security for, and authenticity of, the patient's data. Some recent studies used non-blind watermarking techniques to embed patient information into ECG signals. However, these techniques are not robust against noise attacks and show low performance on metrics such as peak signal-to-noise ratio (PSNR), normalized correlation (NC), mean square error (MSE), percentage residual difference (PRD), bit error rate (BER), and structural similarity index measure (SSIM). In this study, an improved blind ECG watermarking technique is proposed to embed patient data into ECG signals using the curvelet transform. The Euclidean distance between every two curvelet coefficients is computed to cluster the coefficients, after which data are embedded into the selected clusters. The proposed scheme improves not only the extraction of hidden messages from watermarked ECG signals but also robustness against image-processing attacks. SSIM, NC, PSNR and BER were used to measure the performance of the presented work; KL divergence and PRD were also used to show that data can be hidden in the curvelet coefficients of an ECG without disturbing the original signal. The simulation results demonstrate that the clustering method in the curvelet domain provides the best performance, even when the hidden messages are large.
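    The clustering and embedding steps described above can be sketched as follows. Since curvelet transforms are not part of the standard scientific Python stack, the snippet operates on a generic 1-D coefficient array, and both the greedy distance-threshold clustering and the quantisation-based bit embedding are illustrative assumptions, not the paper's exact rules.

```python
import numpy as np

def cluster_coefficients(coeffs, threshold):
    """Greedy clustering sketch: a coefficient joins the current cluster
    when its Euclidean distance to the cluster's first member is below
    `threshold` (assumed rule, stand-in for the paper's curvelet-domain
    clustering)."""
    clusters, centre, members = [], coeffs[0], [0]
    for i, c in enumerate(coeffs[1:], start=1):
        if abs(c - centre) < threshold:
            members.append(i)
        else:
            clusters.append(members)
            centre, members = c, [i]
    clusters.append(members)
    return clusters

def embed_bit(coeff, bit, step=0.5):
    """Quantisation-index-modulation style embedding of one bit into a
    single coefficient (illustrative, not the paper's embedding rule)."""
    q = np.round(coeff / step)
    if int(q) % 2 != bit:          # force the quantiser parity to the bit
        q += 1
    return q * step

def extract_bit(coeff, step=0.5):
    """Blind extraction: the bit is the parity of the quantised value."""
    return int(np.round(coeff / step)) % 2
```

    Because extraction needs only the step size, not the original host signal, the scheme is blind in the sense used by the abstract.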

    Blind Image Watermark Detection Algorithm based on Discrete Shearlet Transform Using Statistical Decision Theory

    Blind watermarking targets the challenging recovery of the watermark when the host is not available during the detection stage. This paper proposes the Discrete Shearlet Transform (DST) as a new embedding domain for blind image watermarking. Our novel DST blind watermark detection system uses a non-additive scheme based on statistical decision theory. It first computes the probability density function (PDF) of the DST coefficients, modelled as a Laplacian distribution. The resulting likelihood ratio is compared with a decision threshold calculated using the Neyman-Pearson criterion to minimise the missed-detection probability subject to a fixed false-alarm probability. Our method is evaluated in terms of imperceptibility, robustness and payload against different attacks (Gaussian noise, blurring, cropping, compression and rotation) using 30 standard grayscale images covering different characteristics (smooth, more complex with many edges, and highly detailed textured regions). The proposed method shows greater windowing flexibility, with more sensitivity to directional and anisotropic features, than the Discrete Wavelet and Contourlet transforms.
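    The detection pipeline above (Laplacian model, likelihood ratio, Neyman-Pearson threshold) can be sketched generically. The multiplicative embedding rule y = x(1 + γw) and the Monte-Carlo threshold below are assumptions for illustration; the paper models actual DST coefficients and derives its threshold analytically.

```python
import numpy as np

def laplacian_loglik(y, b):
    """Log-likelihood of samples y under a zero-mean Laplacian, scale b."""
    return -np.sum(np.abs(y)) / b - len(y) * np.log(2.0 * b)

def llr(coeffs, watermark, gamma):
    """Log-likelihood ratio for the (assumed) multiplicative embedding
    y = x * (1 + gamma * w) with Laplacian host coefficients."""
    b = np.mean(np.abs(coeffs))                 # ML estimate of the scale
    x_hat = coeffs / (1.0 + gamma * watermark)  # de-watermarked candidate
    # the log(1 + gamma*w) sum is the Jacobian of the change of variable
    return (laplacian_loglik(x_hat, b)
            - np.sum(np.log(1.0 + gamma * watermark))
            - laplacian_loglik(coeffs, b))

def np_threshold(coeffs, gamma, p_fa=0.01, trials=200, rng=None):
    """Monte-Carlo stand-in for the Neyman-Pearson threshold: the
    (1 - p_fa) quantile of the LLR over random wrong watermarks."""
    if rng is None:
        rng = np.random.default_rng(0)
    samples = [llr(coeffs, rng.choice([-1.0, 1.0], size=len(coeffs)), gamma)
               for _ in range(trials)]
    return np.quantile(samples, 1.0 - p_fa)

def detect(coeffs, watermark, gamma, threshold):
    """Declare the watermark present when the LLR exceeds the threshold."""
    return llr(coeffs, watermark, gamma) > threshold
```

    The non-additive character shows up in the division by (1 + γw): the detector tests whether undoing the candidate watermark restores a better Laplacian fit, rather than correlating with an additive pattern.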

    Directional edge and texture representations for image processing

    An efficient representation for natural images is of fundamental importance in image processing and analysis. The commonly used separable transforms such as wavelets are not best suited for images because of their inability to exploit directional regularities such as edges and oriented textural patterns, while most of the recently proposed directional schemes cannot represent these two types of features in a unified transform. This thesis focuses on the development of directional representations for images which can capture both edges and textures in a multiresolution manner. The thesis first considers the problem of extracting linear features with the multiresolution Fourier transform (MFT). Based on a previous MFT-based linear feature model, the work extends the extraction method to the situation in which the image is corrupted by noise. The problem is tackled by combining a "signal + noise" frequency model, a refinement stage and a robust classification scheme. As a result, the MFT is able to perform linear feature analysis on noisy images on which previous methods failed. A new set of transforms called the multiscale polar cosine transforms (MPCT) is also proposed in order to represent textures. The MPCT can be regarded as a real-valued MFT with similar basis functions of oriented sinusoids. It is shown that the transform can represent textural patches more efficiently than the conventional Fourier basis. With a directional best cosine basis, the MPCT packet (MPCPT) is shown to be an efficient representation for edges and textures, despite its high computational burden. The problem of representing edges and textures in a fixed transform with less complexity is then considered. This is achieved by applying a Gaussian frequency filter, which matches the dispersion of the magnitude spectrum, to the local MFT coefficients. This is particularly effective in denoising natural images, owing to its ability to preserve both types of feature. Further improvements can be made by employing the information given by the linear feature extraction process in the filter's configuration. The denoising results compare favourably against other state-of-the-art directional representations.
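    The spectrum-matched Gaussian filtering idea can be sketched in a drastically simplified form. The snippet below uses a global 2-D FFT instead of local MFT coefficients, an isotropic Gaussian instead of an anisotropic one, and a second-moment estimate of the spectral dispersion; all three are simplifying assumptions, not the thesis's method.

```python
import numpy as np

def gaussian_spectral_filter(patch):
    """Denoising sketch: estimate the dispersion of the patch's magnitude
    spectrum from its second moments, then attenuate the spectrum with a
    Gaussian of matched spread (low frequencies pass, the flat noise
    floor at high frequencies is suppressed)."""
    F = np.fft.fftshift(np.fft.fft2(patch))
    mag = np.abs(F)
    n = patch.shape[0]
    u, v = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2)
    w = mag / mag.sum()
    var = (w * (u**2 + v**2)).sum()      # dispersion of the spectrum
    mask = np.exp(-(u**2 + v**2) / (2.0 * var))
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```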

    ON SOME COMMON COMPRESSIVE SENSING RECOVERY ALGORITHMS AND APPLICATIONS

    Compressive Sensing, an emerging technique in signal processing, is reviewed in this paper together with its common applications. As an alternative to traditional signal sampling, Compressive Sensing allows a new acquisition strategy with a significantly reduced number of samples needed for accurate signal reconstruction. The basic ideas and motivation behind this approach are provided in the theoretical part of the paper. The commonly used algorithms for missing-data reconstruction are presented. Compressive Sensing applications have gained significant attention, leading to intensive growth in signal processing possibilities. Hence, some of the existing practical applications involving different types of signals in real-world scenarios are described and analyzed as well.
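    One of the commonly used recovery algorithms the abstract refers to is Orthogonal Matching Pursuit (OMP). A minimal NumPy sketch, intended only to illustrate the greedy recovery idea:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from the
    measurements y = A @ x by repeatedly (1) picking the dictionary
    column most correlated with the residual and (2) re-fitting the
    coefficients by least squares on the selected support."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x
```

    With a random Gaussian measurement matrix of 30 rows, a 3-sparse signal of length 100 is typically recovered exactly, which is the "significantly reduced number of samples" the abstract describes.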

    Discrete Wavelet Transforms

    Discrete wavelet transform (DWT) algorithms have a firm position in signal processing across several areas of research and industry. As the DWT provides both octave-scale frequency and spatial timing of the analyzed signal, it is constantly used to solve increasingly advanced problems. The present book, Discrete Wavelet Transforms: Algorithms and Applications, reviews recent progress in discrete wavelet transform algorithms and applications. The book covers a wide range of methods (e.g. lifting, shift invariance, multi-scale analysis) for constructing DWTs. The book chapters are organized into four major parts. Part I describes progress in hardware implementations of DWT algorithms; applications include multitone modulation for ADSL and equalization techniques, a scalable architecture for FPGA implementation, a lifting-based algorithm for VLSI implementation, a comparison between DWT- and FFT-based OFDM, and a modified SPIHT codec. Part II addresses image processing algorithms such as a multiresolution approach to edge detection, low-bit-rate image compression, a low-complexity implementation of CQF wavelets, and compression of multi-component images. Part III focuses on DWT-based watermarking algorithms. Finally, Part IV describes shift-invariant DWTs, the DC lossless property, DWT-based analysis and estimation of colored noise, and an application of the wavelet Galerkin method. The chapters of the present book consist of both tutorial and highly advanced material. The book is therefore intended as a reference text for graduate students and researchers seeking state-of-the-art knowledge on specific applications.
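    The octave-scale frequency and spatial-timing property mentioned above is easiest to see in the simplest DWT, the orthonormal Haar transform. A one-level sketch with perfect reconstruction (the Haar choice is ours for illustration; the book covers far more general constructions):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar DWT: the approximation band
    holds scaled pairwise averages (coarse, octave-down frequency),
    the detail band scaled pairwise differences (local timing)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level; reconstruction is exact."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x
```

    Recursing `haar_dwt` on the approximation band yields the multi-level, octave-by-octave decomposition that the lifting and multi-scale methods in the book generalize.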

    Recent Advances in Signal Processing

    Signal processing is a critical task in the majority of new technological inventions and challenges, across a variety of applications in both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and have always favored closed-form tractability over real-world accuracy. These constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want exposure to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be grouped into five areas depending on the application at hand: image processing, speech processing, communication systems, time-series analysis, and educational packages, in that order. The book has the advantage of providing a collection of applications that are completely independent and self-contained; the interested reader can therefore choose any chapter and skip to another without losing continuity.

    Development of Some Efficient Lossless and Lossy Hybrid Image Compression Schemes

    Digital imaging generates a large amount of data which needs to be compressed, without loss of relevant information, to economize storage space and allow speedy data transfer. Though both storage and transmission-medium capacities have been continuously increasing over the last two decades, they don't match present requirements. Many lossless and lossy image compression schemes exist for compressing images in the spatial domain and the transform domain. Employing more than one traditional image compression algorithm yields a hybrid image compression technique. Building on existing schemes, novel hybrid image compression schemes are developed in this doctoral research work to compress images effectively while maintaining quality.
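    The "hybrid" idea (chaining a lossy stage with a lossless one) can be illustrated minimally by pairing uniform quantisation with DEFLATE. The pipeline and parameters below are our assumptions for illustration, not the thesis's schemes:

```python
import zlib
import numpy as np

def hybrid_compress(img, step=8):
    """Lossy stage: uniform quantisation of pixel values (irreversible,
    error bounded by step/2). Lossless stage: zlib/DEFLATE on the
    quantised bytes (fully reversible)."""
    q = (np.asarray(img, dtype=np.int32) // step).astype(np.uint8)
    return zlib.compress(q.tobytes()), img.shape

def hybrid_decompress(payload, shape, step=8):
    """Invert the lossless stage exactly, then de-quantise to the
    mid-point of each quantisation bin."""
    q = np.frombuffer(zlib.decompress(payload), dtype=np.uint8)
    return q.reshape(shape).astype(np.int32) * step + step // 2
```

    The division of labour is the point: the lossy stage discards precision to create redundancy, which the lossless entropy coder then exploits; the thesis composes stronger spatial-domain and transform-domain coders in the same spirit.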