7 research outputs found

    Image data hiding

    Image data hiding represents a class of processes used to embed data into cover images. Robustness is one of the basic requirements of image data hiding. In the first part of this dissertation, 2-D and 3-D interleaving techniques combined with error correction coding (ECC) are proposed to significantly improve the robustness of hidden data against burst errors. In most cases, the cover image cannot be inverted back to the original image after the hidden data are retrieved. This dissertation then introduces a novel reversible (lossless) data hiding technique based on histogram modification, which can embed a large amount of data while keeping very high visual quality for all images; its performance is therefore better than that of most existing reversible data hiding algorithms. However, most existing lossless data hiding algorithms are fragile in the sense that the hidden data cannot be extracted correctly after compression or other small alterations. In the last part of this dissertation, we propose a novel robust lossless data hiding technique based on the patchwork idea and spatial-domain pixel modification. This technique does not generate the annoying salt-and-pepper noise that is unavoidable in other existing robust lossless data hiding algorithms. It has been successfully applied to many commonly used images, demonstrating its generality.
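    The histogram-modification idea can be pictured with a short sketch. The fragment below is a minimal, simplified histogram-shifting embedder for 8-bit grayscale images; peak/zero-point bookkeeping, overflow handling, and side-information transmission are omitted, so it illustrates the general technique rather than the dissertation's exact algorithm.

```python
import numpy as np

def embed_histogram_shift(img, bits):
    """Toy histogram-shifting reversible embedding for uint8 grayscale.

    Assumes the peak bin lies below an (ideally empty) zero bin; real
    implementations record the peak/zero pair and handle non-empty zero
    bins as side information so the cover can be restored exactly.
    """
    x = img.astype(np.int32)
    hist = np.bincount(x.ravel(), minlength=256)
    peak = int(hist.argmax())                        # most populated bin
    zero = peak + 1 + int(hist[peak + 1:].argmin())  # emptiest bin above it
    out = x.copy()
    # Shift bins strictly between peak and zero up by one,
    # freeing the bin at peak + 1 for embedding.
    out[(x > peak) & (x < zero)] += 1
    # Embed one bit per peak-valued pixel: 0 -> stay at peak, 1 -> move to peak + 1.
    flat = out.ravel()
    peak_idx = np.flatnonzero(x.ravel() == peak)
    payload = np.asarray(bits[: len(peak_idx)], dtype=np.int32)
    flat[peak_idx[: len(payload)]] = peak + payload
    return flat.reshape(x.shape).astype(np.uint8), (peak, zero)
```

    Extraction reverses the process: pixels at peak and peak + 1 yield the embedded bits, and shifting the moved bins back down restores the original image bit for bit, which is what makes the scheme reversible.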

    The Emerging JPEG 2000 Security (JPSEC) Standard

    A digital signature and watermarking based authentication system for JPEG2000 images

    In this thesis, a digital signature based authentication system is introduced that can protect JPEG2000 images in different flavors, including fragile authentication and semi-fragile authentication. Fragile authentication protects the image at the code-stream level, while semi-fragile authentication protects it at the content level. Semi-fragile authentication can be further classified into lossy and lossless authentication. With lossless authentication, the original image can be recovered after verification. This thesis mainly discusses lossless authentication and the new image compression standard, JPEG2000.
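    As a rough illustration of the fragile, code-stream-level mode described above: if any byte of the JPEG2000 code-stream changes, verification must fail. In the sketch below an HMAC stands in for the thesis's public-key digital signature, and the file name and key are placeholders.

```python
import hashlib
import hmac

def sign_codestream(codestream: bytes, key: bytes) -> str:
    """Fragile authentication tag over the raw JPEG2000 code-stream.

    An HMAC stands in for the digital signature used in the thesis;
    flipping a single code-stream bit invalidates the tag.
    """
    return hmac.new(key, codestream, hashlib.sha256).hexdigest()

def verify_codestream(codestream: bytes, key: bytes, tag: str) -> bool:
    return hmac.compare_digest(sign_codestream(codestream, key), tag)

# Example (placeholder file name and key):
# tag = sign_codestream(open("image.jp2", "rb").read(), b"shared-key")
```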

    Image statistical frameworks for digital image forensics

    The advances of digital cameras, scanners, printers, image editing tools, smartphones, tablet personal computers, as well as high-speed networks have made the digital image a conventional medium for visual information. Creation, duplication, distribution, or tampering of such a medium can be done easily, which calls for the ability to trace back the authenticity or history of the medium. Digital image forensics is an emerging research area that aims to resolve this problem and has grown in popularity over the past decade. On the other hand, anti-forensics has emerged over the past few years as a relatively new branch of research, aiming at revealing the weaknesses of forensic technology. These two sides of research push digital image forensic technologies to the next level. Three major contributions are presented in this dissertation as follows. First, an effective multi-resolution image statistical framework for passive-blind digital image forensics is presented in the frequency domain. The image statistical framework is generated by applying the Markovian rake transform to the image luminance component. The Markovian rake transform is the application of a Markov process to difference arrays derived from quantized block discrete cosine transform (DCT) 2-D arrays with multiple block sizes. The efficacy and universality of the framework are then evaluated in two major applications of digital image forensics: 1) digital image tampering detection; 2) classification of computer graphics and photographic images. Second, a simple yet effective anti-forensic scheme is proposed, capable of obfuscating double JPEG compression artifacts, which may carry vital information for image forensics, for instance, digital image tampering detection. The proposed scheme, the shrink-and-zoom (SAZ) attack, is simply based on image resizing and bilinear interpolation. The effectiveness of SAZ has been evaluated on two promising double JPEG compression detection schemes, and the outcome reveals that the proposed scheme is effective, especially when the first quality factor is lower than the second quality factor. Third, an advanced textural image statistical framework in the spatial domain is proposed, utilizing local binary pattern (LBP) schemes to model local image statistics on various kinds of residual images, including higher-order ones. The proposed framework can be implemented in either a single- or multi-resolution setting, depending on the application of interest. The efficacy of the proposed framework is evaluated on two forensic applications: 1) steganalysis, with emphasis on HUGO (Highly Undetectable Steganography), an advanced steganographic scheme that embeds hidden data in a content-adaptive manner, locally, into image regions whose statistics are difficult to model; 2) image recapture detection (IRD). The outcomes of the evaluations suggest that the proposed framework is effective not only for detecting local changes, which is in line with the nature of HUGO, but also for detecting global differences (the nature of IRD).
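    The Markovian-rake idea builds on transition probabilities of thresholded difference arrays. The sketch below computes one such feature set for a single horizontal difference array; the full rake over directions, block sizes, and quantized block-DCT inputs described above is omitted, and the threshold T = 4 is only illustrative.

```python
import numpy as np

def markov_features(coeffs, T=4):
    """Transition-probability features of a horizontal difference array.

    `coeffs` is a 2-D array of (rounded) block-DCT magnitudes; the
    dissertation's Markovian rake transform repeats this over several
    directions and block sizes.
    """
    x = np.abs(coeffs).astype(np.int64)
    d = np.clip(x[:, :-1] - x[:, 1:], -T, T)   # difference array, thresholded to [-T, T]
    M = np.zeros((2 * T + 1, 2 * T + 1))       # M[m, n] ~ P(next = n | prev = m)
    prev = d[:, :-1].ravel() + T
    nxt = d[:, 1:].ravel() + T
    np.add.at(M, (prev, nxt), 1)               # accumulate transition counts
    M /= np.maximum(M.sum(axis=1, keepdims=True), 1)
    return M.ravel()                           # (2T + 1)^2 features
```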

    Digitale Wasserzeichenverfahren zur Überprüfung der Echtheit von Bildern (Digital watermarking methods for verifying the authenticity of images)

    This dissertation contributes to the development of watermarking systems for tamper-proof verification of the authenticity of images. A watermark adapted to the image content is imperceptibly embedded, integrated into the process of JPEG2000 image compression. It is robust against a broad range of permitted image operations, such as compression of the image, brightness and contrast changes, filtering, image sharpening, and scaling of the image size. The work also includes extensive investigations, extensions, and comparisons with methods of other authors.
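    Since JPEG2000 is wavelet based, the embedding step can be pictured, in very simplified form, as modulating wavelet coefficients of the image. The sketch below adds a pseudo-random, strength-scaled carrier to one detail subband using the pywt package; the content adaptation and the integration into the actual JPEG2000 coder described in the dissertation are not reproduced here, and the `strength` value, the Haar wavelet, and the single-bit payload are assumptions made only for illustration.

```python
import numpy as np
import pywt

def embed_bit(img, bit, strength=2.0, seed=0):
    """Additive spread-spectrum embedding of one bit into the horizontal
    detail subband of a single-level Haar DWT (illustrative only)."""
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(np.float64), "haar")
    carrier = np.random.default_rng(seed).standard_normal(cH.shape)
    cH_marked = cH + strength * (1.0 if bit else -1.0) * carrier
    marked = pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")
    return np.clip(marked, 0, 255).astype(np.uint8)

def detect_bit(marked_img, seed=0):
    """Blind detection by correlating the subband with the same carrier."""
    _, (cH, _, _) = pywt.dwt2(marked_img.astype(np.float64), "haar")
    carrier = np.random.default_rng(seed).standard_normal(cH.shape)
    return bool(np.sum(cH * carrier) > 0)
```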

    A unified authentication framework for JPEG2000

    This paper proposes a unified authentication framework for JPEG2000 images, which consists of fragile, lossy, and lossless authentication for different applications. The authentication strength can be specified using only one parameter, called the Lowest Authentication Bit-Rate (LABR), which brings much convenience to users. Lossy and lossless authentication can survive various incidental distortions while being able to locate malicious attacks. In addition, with lossless authentication, the original image can be recovered after verification if no incidental distortion is introduced.
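    A toy way to picture the LABR parameter, assuming a quality-progressive (embedded) code-stream: only the prefix corresponding to the chosen bit-rate is authenticated, so transcoding to any bit-rate at or above the LABR leaves the tag verifiable. The real framework operates on JPEG2000 packets and cryptographic signatures; the byte-prefix arithmetic and the HMAC below are simplifications for illustration only.

```python
import hashlib
import hmac

def labr_tag(codestream: bytes, labr_bpp: float,
             width: int, height: int, key: bytes) -> str:
    """Authenticate only the code-stream prefix up to the LABR.

    labr_bpp * width * height / 8 approximates how many bytes of a
    quality-progressive code-stream correspond to the chosen bit-rate.
    """
    n = min(len(codestream), int(labr_bpp * width * height / 8))
    return hmac.new(key, codestream[:n], hashlib.sha256).hexdigest()
```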