
    Detecting Digital Forgeries Using Bispectral Analysis

    With the rapid increase in low-cost and sophisticated digital technology, the need for techniques to authenticate digital material is becoming more urgent. In this paper we address the problem of authenticating digital signals assuming no explicit prior knowledge of the original. The basic approach we take is to assume that, in the frequency domain, a "natural" signal has weak higher-order statistical correlations. We then show that "un-natural" correlations are introduced when such a signal is passed through a non-linearity (which would almost surely occur in the creation of a forgery). Techniques from polyspectral analysis are then used to detect the presence of these correlations. We review the basics of polyspectral analysis, show how and why these tools can be used to detect forgeries, and demonstrate their effectiveness in analyzing human speech.
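    A minimal sketch of the kind of higher-order test the abstract describes: a bicoherence estimate (one standard polyspectral tool) applied to a signal before and after a non-linearity. The segment length, window, and quadratic non-linearity below are illustrative assumptions, not the paper's exact procedure; a quadratic distortion introduces phase coupling that shows up as elevated bicoherence.

```python
# Illustrative bicoherence estimate: a normalized bispectrum averaged
# over signal segments. All parameters here are assumptions for the sketch.
import numpy as np

def bicoherence(x, nfft=64):
    """Average the normalized bispectrum over non-overlapping segments."""
    f1, f2 = np.meshgrid(np.arange(nfft), np.arange(nfft))
    num = np.zeros((nfft, nfft), dtype=complex)
    d12 = np.zeros((nfft, nfft))
    d3 = np.zeros((nfft, nfft))
    for i in range(0, len(x) - nfft + 1, nfft):
        X = np.fft.fft(x[i:i + nfft] * np.hanning(nfft))
        X12 = X[f1] * X[f2]
        X3 = X[(f1 + f2) % nfft]
        num += X12 * np.conj(X3)
        d12 += np.abs(X12) ** 2
        d3 += np.abs(X3) ** 2
    return np.abs(num) / np.sqrt(d12 * d3 + 1e-12)

rng = np.random.default_rng(0)
natural = rng.standard_normal(2 ** 14)   # weak higher-order correlations
forged = natural + 0.5 * natural ** 2    # a non-linearity introduces them
print(bicoherence(natural).mean(), bicoherence(forged).mean())
```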

    A 3-D Photo Forensic Analysis of the Lee Harvey Oswald Backyard Photo

    More than forty-five years after the assassination of U.S. President Kennedy, theories continue to circulate suggesting that the accused assassin, Lee Harvey Oswald, acted as part of a larger conspiracy. It has been argued, for example, that incriminating photographs of Oswald were manipulated, and are hence evidence of a broader plot. We describe a detailed 3-D analysis of the Oswald photos to determine whether such claims of tampering are warranted.

    Digital Image Ballistics from JPEG Quantization: A Followup Study

    The lossy JPEG compression scheme employs a quantization table that controls the amount of compression achieved. Because different cameras typically employ different tables, a comparison of an image's quantization scheme to a database of known cameras affords a simple technique for confirming or denying an image's source. This report describes the analysis of quantization tables extracted from 1,000,000 images downloaded from Flickr.com.
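    As a concrete illustration of the extraction step, the sketch below reads the quantization tables from a JPEG file using Pillow's quantization attribute; the tuple-based signature format is an assumption made here for illustration, not the report's representation.

```python
# Minimal sketch: read JPEG quantization tables with Pillow and form a
# hashable signature. The signature format is an illustrative assumption.
from PIL import Image

def jpeg_signature(path):
    """Return the file's quantization tables as a nested tuple."""
    with Image.open(path) as im:
        if im.format != "JPEG":
            raise ValueError("not a JPEG file")
        # im.quantization maps a table id to its 64 quantization values
        return tuple(tuple(im.quantization[k]) for k in sorted(im.quantization))

sig = jpeg_signature("photo.jpg")  # "photo.jpg" is a placeholder path
print(len(sig), "table(s); first values:", sig[0][:8])
```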

    Digital Image Ballistics from JPEG Quantization

    Most digital cameras export images in the JPEG file format. This lossy compression scheme employs a quantization table that controls the amount of compression achieved. Different cameras typically employ different tables, so a comparison of an image's quantization scheme to a database of known cameras affords a simple technique for confirming or denying an image's source. Similarly, comparison to a database of photo-editing software can be used in a forensic setting to determine if an image was edited after its original recording.
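    The comparison step can then be sketched as a lookup against a database of known signatures. The entries below are empty placeholders, and jpeg_signature refers to the extraction sketch above; neither is the paper's actual database or code.

```python
# Sketch of matching a quantization signature against known sources.
# KNOWN_SOURCES is an empty placeholder, not the paper's camera database.
KNOWN_SOURCES = {
    # (table_0, table_1, ...) -> ["Camera model", "Editing software", ...]
}

def possible_sources(signature):
    """Return the known cameras/software consistent with a signature."""
    return KNOWN_SOURCES.get(signature, [])

# Note: distinct cameras can share tables, so a match only confirms
# consistency with a source; it does not uniquely identify one.
```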

    Creating and Detecting Doctored and Virtual Images: Implications to The Child Pornography Prevention Act

    The 1996 Child Pornography Prevention Act (CPPA) extended the existing federal criminal laws against child pornography to include certain types of "virtual porn". In 2002, the United States Supreme Court found that portions of the CPPA, being overly broad and restrictive, violated First Amendment rights. The Court ruled that images containing an actual minor or portions of a minor are not protected, while computer-generated images depicting a fictitious computer-generated minor are constitutionally protected. In this report I outline various forms of digital tampering, placing them in the context of this recent ruling. I also review computational techniques for detecting doctored and virtual (computer-generated) images.

    A JPEG Corner Artifact from Directed Rounding of DCT Coefficients

    JPEG compression introduces a number of well-known artifacts, including blocking and ringing. We describe a lesser-known artifact consisting of a slightly darker or lighter pixel in the corner of 8 x 8 pixel blocks. This artifact is introduced by the directed rounding of DCT coefficients. In particular, we show that DCT coefficients that are uniformly rounded down or up (rather than to the nearest integer) give rise to this artifact. An analysis of thousands of different camera models reveals that this artifact is present in approximately 61% of cameras. We also propose a simple filtering technique for removing this artifact.
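    The effect is easy to reproduce. The sketch below (an illustration, not the paper's code) applies floor rounding versus round-to-nearest to the DCT coefficients of a random 8 x 8 block and compares the corner-pixel error; because every DCT basis function is positive at the corner pixel, uniformly shrinking the coefficients systematically darkens that pixel.

```python
# Directed rounding of DCT coefficients vs. round-to-nearest: the
# floor-induced error concentrates in one corner of the 8x8 block.
import numpy as np
from scipy.fftpack import dct, idct

def dct2(b):
    return dct(dct(b, axis=0, norm="ortho"), axis=1, norm="ortho")

def idct2(b):
    return idct(idct(b, axis=0, norm="ortho"), axis=1, norm="ortho")

rng = np.random.default_rng(1)
block = rng.uniform(0, 255, (8, 8))
coeffs = dct2(block)

nearest = idct2(np.round(coeffs))  # unbiased: errors average out
floored = idct2(np.floor(coeffs))  # directed: every coefficient shrinks

print("corner error (floor):  ", (floored - block)[0, 0])
print("corner error (nearest):", (nearest - block)[0, 0])
```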

    A Statistical Prior for Photo Forensics: Object Removal

    If we consider photo forensics within a Bayesian framework, then the probability that an image has been manipulated, given the results of a forensic test, can be expressed as the product of a likelihood term (the probability of a forensic test detecting manipulation given that an image was manipulated) and a prior term (the probability that an image was manipulated). Despite the success of many forensic techniques, the incorporation of a statistical prior has not previously been considered. We describe a framework for incorporating statistical priors into any forensic analysis, and specifically address the problem of quantifying the probability that a portion of an image is the result of content-aware fill, cloning, or some other form of information removal. We posit that the incorporation of such a prior will improve the overall accuracy of a broad range of forensic techniques.
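    A small worked example of the Bayesian product described above (all numbers are invented placeholders, not estimates from the paper):

```python
# Posterior probability of manipulation given a positive forensic test.
# likelihood = P(test fires | manipulated); fpr = P(test fires | authentic);
# prior = P(manipulated). All values below are made-up placeholders.
def posterior(likelihood, fpr, prior):
    evidence = likelihood * prior + fpr * (1 - prior)
    return likelihood * prior / evidence

print(posterior(0.95, 0.05, 0.50))  # agnostic prior  -> 0.95
print(posterior(0.95, 0.05, 0.01))  # forgeries rare  -> ~0.16
```

    The second case illustrates why the prior matters: the same positive test is far less conclusive when manipulation is rare.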

    The Accuracy, Fairness, and Limits of Predicting Recidivism

    Algorithms for predicting recidivism are commonly used to assess a criminal defendant’s likelihood of committing a crime. These predictions are used in pretrial, parole, and sentencing decisions. Proponents of these systems argue that big data and advanced machine learning make these analyses more accurate and less biased than humans. We show, however, that the widely used commercial risk assessment software COMPAS is no more accurate or fair than predictions made by people with little or no criminal justice expertise. We further show that a simple linear predictor provided with only two features is nearly equivalent to COMPAS with its 137 features.
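    As a sketch of what such a two-feature linear predictor looks like (the paper's two features were the defendant's age and number of prior convictions; the data below are synthetic placeholders):

```python
# Two-feature logistic-regression predictor on synthetic data. The data
# and coefficients are placeholders; only the model form mirrors the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1000
age = rng.integers(18, 70, n)
priors = rng.poisson(2.0, n)
# Synthetic labels for illustration only:
p = 1 / (1 + np.exp(0.05 * age - 0.4 * priors))
y = (rng.random(n) < p).astype(int)

X = np.column_stack([age, priors])
clf = LogisticRegression().fit(X, y)
print("training accuracy:", clf.score(X, y))
```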

    Separating Reflections from Images Using Independent Components Analysis

    The image of an object can vary dramatically depending on lighting, specularities/reflections, and shadows. It is often advantageous to separate these incidental variations from the intrinsic aspects of an image. Along these lines, this paper describes a method for photographing objects behind glass and digitally removing the reflections off the glass, leaving the image of the objects behind the glass intact. We describe the details of this method, which employs simple optical techniques and independent components analysis (ICA), and show its efficacy with several examples.
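    A minimal sketch of the unmixing step using scikit-learn's FastICA, assuming two aligned photographs of the same scene taken through a polarizer at different orientations; the file names are placeholders, and this is not the paper's implementation.

```python
# Unmix "scene" and "reflection" from two linear mixtures with FastICA.
# Input images must be aligned and the same size; names are placeholders.
import numpy as np
from PIL import Image
from sklearn.decomposition import FastICA

im1 = np.asarray(Image.open("polarizer_0.png").convert("L"), dtype=float)
im2 = np.asarray(Image.open("polarizer_90.png").convert("L"), dtype=float)

X = np.column_stack([im1.ravel(), im2.ravel()])  # each pixel: two mixtures
sources = FastICA(n_components=2, random_state=0).fit_transform(X)

# ICA recovers the sources only up to order, sign, and scale.
scene = sources[:, 0].reshape(im1.shape)
reflection = sources[:, 1].reshape(im1.shape)
```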