
    Image Forgery Localization via Block-Grained Analysis of JPEG Artifacts

    In this paper, we propose a forensic algorithm to discriminate between original and forged regions in JPEG images, under the hypothesis that the tampered image presents a double JPEG compression, either aligned (A-DJPG) or non-aligned (NA-DJPG). Unlike previous approaches, the proposed algorithm does not need a suspect region to be manually selected in order to test for the presence or absence of double compression artifacts. Based on an improved and unified statistical model characterizing the artifacts that appear in the presence of either A-DJPG or NA-DJPG, the proposed algorithm automatically computes a likelihood map indicating the probability of each 8×8 discrete cosine transform (DCT) block being doubly compressed. The validity of the proposed approach has been assessed by evaluating the performance of a detector based on thresholding the likelihood map, considering different forensic scenarios. The effectiveness of the proposed method is also confirmed by tests carried out on realistic tampered images. An interesting property of the proposed Bayesian approach is that it can be easily extended to work with traces left by other kinds of processing.
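
    As a rough illustration of the block-grained idea, the sketch below computes a per-block log-likelihood-ratio map from blockwise DCT coefficients. The histogram models `p_single` and `p_double` are hypothetical inputs standing in for the paper's unified A-DJPG/NA-DJPG statistical model (which would be estimated from the image itself); only the mechanics of building and thresholding the map are shown.

```python
import numpy as np
from scipy.fft import dctn

def likelihood_map(gray, p_single, p_double, bins):
    """Per-block log-likelihood ratio of being doubly compressed.

    gray      -- grayscale image, height and width multiples of 8
    p_single  -- hypothetical per-bin probabilities of DCT coefficients
                 under single compression (estimated elsewhere)
    p_double  -- same, under double compression
    bins      -- bin edges shared by both histogram models
    """
    h, w = gray.shape
    llr = np.zeros((h // 8, w // 8))
    for i in range(0, h, 8):
        for j in range(0, w, 8):
            block = dctn(gray[i:i+8, j:j+8] - 128.0, norm='ortho')
            idx = np.clip(np.digitize(block.ravel(), bins) - 1,
                          0, len(p_single) - 1)
            llr[i // 8, j // 8] = np.sum(np.log(p_double[idx] + 1e-12)
                                         - np.log(p_single[idx] + 1e-12))
    return llr  # thresholding llr (e.g., at 0) yields the tamper map
```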

    Double-Compressed JPEG Detection in a Steganalysis System

    The detection of hidden messages in JPEG images is a growing concern. Current detection of JPEG stego images must include detection of double compression: a JPEG image is double compressed if it has been compressed with one quality factor, uncompressed, and then re-compressed with a different quality factor. When detection of double compression is not included, erroneous detection rates are very high. The main contribution of this paper is an efficient double-compression detection algorithm with lower feature dimensionality and lower computational time for the detection stage than current comparable classifiers. We use a model-based approach for creating features, using a subclass of Markov random fields called partially ordered Markov models (POMMs) to model the bit changes that occur in an image after an application of steganography. We model the embedding process as noise and create features to capture this noise characteristic. We show that the nonparametric conditional probabilities modeled by a POMM can distinguish very well between an image that has been double compressed and one that has not, at lower overall computational cost. After double-compression detection, we analyze histogram patterns that identify the primary compression quality factor to classify the image as stego or cover. The latter is an analytic approach that requires no classifier training. We compare our results with another state-of-the-art double-compression detector.
    Keywords: steganalysis; steganography; JPEG; double compression; digital image forensics
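
    The following sketch gives a flavor of such conditional-probability features, assuming a drastically simplified causal neighborhood (a single left predecessor rather than a true partial order); it is not the paper's POMM construction, only the general shape of nonparametric conditional-probability features over DCT coefficients.

```python
import numpy as np

def conditional_prob_features(dct_coeffs, T=3):
    """Empirical conditional probabilities P(current | left neighbor)
    over DCT coefficients clipped to [-T, T] -- a simplified stand-in
    for POMM features, with the causal neighborhood reduced to a
    single predecessor for brevity."""
    c = np.clip(dct_coeffs.astype(int), -T, T) + T
    counts = np.zeros((2 * T + 1, 2 * T + 1))
    # count co-occurrences of horizontally adjacent coefficients
    np.add.at(counts, (c[:, :-1].ravel(), c[:, 1:].ravel()), 1)
    # normalize each row into a conditional distribution
    cond = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
    return cond.ravel()  # (2T+1)^2-dimensional feature vector
```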

    Detection of Nonaligned Double JPEG Compression Based on Integer Periodicity Maps

    In this paper, a simple yet reliable algorithm to detect the presence of nonaligned double JPEG compression (NA-JPEG) in compressed images is proposed. The method evaluates a single feature based on the integer periodicity of the blockwise discrete cosine transform (DCT) coefficients when the DCT is computed according to the grid of the previous JPEG compression. Even though the proposed feature relies only on DC coefficient statistics, a simple threshold detector can classify NA-JPEG images more accurately than existing methods, and on smaller image sizes, without resorting to a properly trained classifier. Moreover, the proposed scheme is able to accurately estimate the grid shift and the quantization step of the DC coefficient of the primary JPEG compression, allowing one to perform a more detailed analysis of possibly forged images.
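
    A minimal sketch of the integer-periodicity idea, assuming grayscale pixel data is available: for each of the 64 candidate grid shifts, the blockwise DC coefficients are tested for periodicity with respect to candidate quantization steps. The function names and the characteristic-function score are illustrative choices, not the paper's exact statistic.

```python
import numpy as np

def periodicity(dc, q):
    """Magnitude of the empirical characteristic function at frequency
    1/q: close to 1 when dc values are (near) integer multiples of q."""
    return np.abs(np.mean(np.exp(2j * np.pi * dc / q)))

def na_jpeg_scan(gray, q_candidates=range(2, 31)):
    """Score every candidate 8x8 grid shift by the strongest integer
    periodicity of its blockwise DC coefficients. A pronounced maximum
    at a non-zero shift suggests NA-JPEG; the winning shift and step
    estimate the primary grid and the DC quantization step."""
    g = gray.astype(float) - 128.0
    scores = {}
    for dy in range(8):
        for dx in range(8):
            crop = g[dy:, dx:]
            h = (crop.shape[0] // 8) * 8
            w = (crop.shape[1] // 8) * 8
            blocks = crop[:h, :w].reshape(h // 8, 8, w // 8, 8)
            dc = blocks.sum(axis=(1, 3)).ravel() / 8.0  # DC of orthonormal 8x8 DCT
            best_q = max(q_candidates, key=lambda q: periodicity(dc, q))
            scores[(dy, dx)] = (periodicity(dc, best_q), best_q)
    return scores
```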

    First Quantization Estimation by a Robust Data Exploitation Strategy of DCT Coefficients

    It is well known that the JPEG compression pipeline leaves residual traces in compressed images that are useful for forensic investigations. Through the analysis of such traces, the history of a digital image can be reconstructed by means of First Quantization Estimation (FQE), often employed for the camera model identification (CMI) task. In this paper, a novel FQE technique for JPEG double compressed images is proposed which employs a mixed approach based on machine learning and statistical analysis. The proposed method was designed to work in the aligned case (i.e., the 8×8 JPEG grid is not misaligned between the compressions) and was demonstrated to work effectively in different challenging scenarios (small input patches, custom quantization tables) without strong a priori assumptions, surpassing state-of-the-art solutions. Finally, an in-depth analysis of the impact of input patch size, dataset image resolution, custom quantization tables, and different Discrete Cosine Transform (DCT) implementations was carried out.
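
    The statistical half of such an approach can be sketched as follows, assuming the dequantized coefficients of one DCT frequency and the known second quantization step `q2` are available; the machine-learning component of the paper is omitted, and the candidate scoring shown here is a common heuristic rather than the proposed method.

```python
import numpy as np

def estimate_q1(dequant, q2, q1_candidates=range(2, 32)):
    """Estimate the first quantization step of one DCT frequency.

    dequant -- dequantized coefficients after the second compression
               (quantized values multiplied back by q2)
    q2      -- known second quantization step (read from the JPEG file)

    After aligned double quantization, the dequantized values cluster
    near multiples of q1, so each candidate is scored by the magnitude
    of the empirical characteristic function at frequency 1/q1.
    """
    c = dequant[dequant != 0].astype(float)
    # divisors of q2 trivially score 1 (every value is a multiple of q2)
    # and are undetectable in principle, so skip them
    cands = [q for q in q1_candidates if q2 % q != 0]
    scores = {q: np.abs(np.mean(np.exp(2j * np.pi * c / q))) for q in cands}
    return max(scores, key=scores.get)
```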

    Digital image forensics

    Digital image forensics is a relatively new research field that aims to expose the origin and composition of, and the history of processing applied to, digital images. Digital image forensics is therefore expected to be of significant importance to modern society, in which digital media are becoming more and more popular. In this thesis, image tampering detection and classification of double JPEG compression are the two major subjects studied. Since any manipulation applied to a digital image changes its statistics, identifying statistical artifacts is critically important in image forensics. In this thesis, a few typical forensic techniques are studied. Finally, it is foreseen that investigation of the ongoing conflict between forensics and anti-forensics will deepen our understanding of image statistics and benefit our society.

    Reverse engineering of double compressed images in the presence of contrast enhancement

    A comparison between two forensic techniques for the reverse engineering of a processing chain composed of a double JPEG compression interleaved with a linear contrast enhancement is presented here. The first approach is based on the well-known peak-to-valley behavior of the histogram of double-quantized DCT coefficients, while the second approach is based on the distribution of the first digits of DCT coefficients. These methods have been extended to the study of the considered processing chain, for both detecting the chain and estimating its parameters. More specifically, the proposed approaches provide an estimate of the quality factor of the first JPEG compression and of the amount of linear contrast enhancement.
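
    The first-digit cue can be made concrete: singly compressed JPEG images are known to follow a generalized Benford law in the first digits of their DCT coefficients, and further processing perturbs the fit. The sketch below computes the empirical first-digit distribution and the generalized Benford model it would be compared against; the parameter names follow the usual formulation of that law and are not taken from this paper.

```python
import numpy as np

def first_digit_hist(dct_coeffs):
    """Empirical distribution of the first significant digit (1..9)
    of the non-zero DCT coefficients."""
    m = np.abs(dct_coeffs[dct_coeffs != 0]).astype(float)
    first = (m / 10.0 ** np.floor(np.log10(m))).astype(int)
    first = np.clip(first, 1, 9)  # guard against float rounding at powers of 10
    return np.bincount(first, minlength=10)[1:] / first.size

def generalized_benford(d, N=1.0, q=1.0, s=0.0):
    """Generalized Benford law p(d) = N * log10(1 + 1/(s + d**q));
    N, q, s are fit parameters (N=1, q=1, s=0 gives the standard law)."""
    d = np.asarray(d, dtype=float)
    return N * np.log10(1.0 + 1.0 / (s + d ** q))
```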

    Statistical Tools for Digital Image Forensics

    A digitally altered image, often leaving no visual clues of having been tampered with, can be indistinguishable from an authentic image. The tampering, however, may disturb some underlying statistical properties of the image. Under this assumption, we propose five techniques that quantify and detect statistical perturbations found in different forms of tampered images: (1) re-sampled images (e.g., scaled or rotated); (2) manipulated color filter array interpolated images; (3) double JPEG compressed images; (4) images with duplicated regions; and (5) images with inconsistent noise patterns. These techniques work in the absence of any embedded watermarks or signatures. For each technique we develop the theoretical foundation, show its effectiveness on credible forgeries, and analyze its sensitivity and robustness to simple counter-attacks.
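
    To illustrate technique (4), here is a deliberately naive duplicated-region detector based on exact matching of coarsely quantized blocks. The technique described in the source is more robust (tolerating noise and recompression), so this only conveys the underlying block-matching idea.

```python
import numpy as np

def find_duplicate_blocks(gray, b=16):
    """Naive duplicated-region detector: hash every b x b window after
    coarse quantization and report exactly matching pairs of positions.
    Flat regions produce trivial matches and would need filtering, and
    robust detectors use a tolerant representation (e.g., a PCA
    projection of each block) with lexicographic sorting instead of
    exact hashing."""
    h, w = gray.shape
    seen = {}
    matches = []
    for i in range(h - b + 1):
        for j in range(w - b + 1):
            key = (gray[i:i+b, j:j+b] // 8).tobytes()  # coarse quantization
            if key in seen:
                matches.append((seen[key], (i, j)))
            else:
                seen[key] = (i, j)
    return matches
```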