
    Statistical Tools for Digital Image Forensics

    A digitally altered image, often leaving no visual clues of having been tampered with, can be indistinguishable from an authentic image. The tampering, however, may disturb some underlying statistical properties of the image. Under this assumption, we propose five techniques that quantify and detect the statistical perturbations found in different forms of tampered images: (1) re-sampled images (e.g., scaled or rotated); (2) manipulated color filter array interpolated images; (3) double JPEG compressed images; (4) images with duplicated regions; and (5) images with inconsistent noise patterns. These techniques work in the absence of any embedded watermarks or signatures. For each technique we develop the theoretical foundation, show its effectiveness on credible forgeries, and analyze its sensitivity and robustness to simple counter-attacks.
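
    As a rough illustration of technique (4), duplicated-region detection can be prototyped by exhaustively matching fixed-size blocks. The sketch below is a minimal, assumption-laden version: the block size, stride, and exact byte-for-byte match criterion are illustrative choices, not the authors' method, which matches blocks in a robust feature space.

```python
import hashlib
import numpy as np

def find_duplicated_blocks(gray: np.ndarray, block: int = 16, stride: int = 4):
    """Sketch of duplicated-region (copy-move) detection: hash every
    block-sized patch and report pairs of identical, non-overlapping
    patches. A production detector would match on robust features
    (e.g. quantized DCT coefficients) to survive recompression."""
    h, w = gray.shape
    seen = {}      # patch hash -> first (y, x) where it appeared
    matches = []
    for y in range(0, h - block + 1, stride):
        for x in range(0, w - block + 1, stride):
            key = hashlib.md5(gray[y:y + block, x:x + block].tobytes()).hexdigest()
            if key in seen:
                sy, sx = seen[key]
                # Ignore trivially overlapping self-matches.
                if abs(sy - y) >= block or abs(sx - x) >= block:
                    matches.append(((sy, sx), (y, x)))
            else:
                seen[key] = (y, x)
    return matches
```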

    Digital Forensics Tool Selection with Multi-armed Bandit Problem

    Digital forensics investigation is, in general, a long and tedious process for an investigator. There are many tools that investigators must consider, both proprietary and open source. Forensic investigators must choose the best tool available on the market for their cases to make sure they do not overlook any evidence residing in a suspect device within a reasonable time frame. This is, however, a hard decision to make, since learning and testing all available tools only makes their job harder. In this project, we define digital forensics tool selection for a specific investigative task as a multi-armed bandit problem, assuming that multiple tools are available for an investigator's use. In addition, we created a set of disk images in order to provide a real dataset for experiments. This dataset can be used by digital forensics researchers and tool developers for testing and validation purposes. In this paper, we also simulated multi-armed bandit algorithms to test whether using these algorithms would be more successful than using simple randomization during the tool selection process. Our results show that bandit-based strategies successfully analyzed up to 57% more disk images over 1,000 simulations. Finally, we also show that our findings satisfy a high level of statistical confidence. This work will help investigators spend more time on the analysis of evidence rather than on learning and testing different tools to see which one performs better.
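
    The abstract does not say which bandit algorithms were simulated, but the framing is easy to sketch. Below is a minimal epsilon-greedy strategy, one standard bandit algorithm, compared against uniform random tool choice; the per-tool success probabilities are hypothetical placeholders, not measurements from the paper's dataset.

```python
import random

def epsilon_greedy_tool_selection(success_probs, rounds=1000, eps=0.1):
    """Simulate selecting among forensic tools framed as bandit arms.
    success_probs are hypothetical per-tool probabilities of extracting
    the evidence from a disk image; the paper's actual reward model and
    algorithms may differ."""
    n = len(success_probs)
    counts = [0] * n
    values = [0.0] * n              # running mean reward per tool
    total = 0
    for _ in range(rounds):
        if random.random() < eps:
            arm = random.randrange(n)                      # explore
        else:
            arm = max(range(n), key=lambda i: values[i])   # exploit
        reward = 1 if random.random() < success_probs[arm] else 0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        total += reward
    return total

# Example: three tools with different (hypothetical) success rates.
random.seed(0)
bandit = epsilon_greedy_tool_selection([0.3, 0.5, 0.7])
uniform = sum(1 for _ in range(1000)
              if random.random() < random.choice([0.3, 0.5, 0.7]))
print(f"bandit: {bandit} successes, random: {uniform} successes")
```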

    Web Based Cyber Forensics Training For Law Enforcement

    Training and education are two of the most important aspects of cyber forensics, and they have been of concern since the inception of the field. Training law enforcement is particularly important to ensure proper execution of the digital forensics process, and because the proliferation of technology into society continues to grow at an exponential rate. Just as technology is used for good, there are those who choose to use it for criminal gain. It is therefore critical that law enforcement has the tools and training in cyber forensics. This research sought to determine whether web-based training is a feasible platform for cyber forensics training. A group of Indiana State Police troopers was asked to participate in an online study in which they were presented with cyber forensics training material. The study found a statistically significant difference between the treatment groups and the control group. The results show that web-based training is an effective means of training a large group of law enforcement officers.

    Image statistical frameworks for digital image forensics

    The advances of digital cameras, scanners, printers, image editing tools, smartphones, tablet personal computers, as well as high-speed networks, have made the digital image a conventional medium for visual information. Creation, duplication, distribution, or tampering of such a medium can be easily done, which makes it necessary to be able to trace the authenticity or history of the medium. Digital image forensics is an emerging research area that aims to resolve this problem and has grown in popularity over the past decade. On the other hand, anti-forensics has emerged over the past few years as a relatively new branch of research, aiming at revealing the weaknesses of forensic technology. These two sides of research push digital image forensic technologies to the next level. Three major contributions are presented in this dissertation. First, an effective multi-resolution image statistical framework for digital image forensics of a passive-blind nature is presented in the frequency domain. The image statistical framework is generated by applying the Markovian rake transform to the image luminance component. The Markovian rake transform is the application of a Markov process to difference arrays derived from the quantized block discrete cosine transform 2-D arrays with multiple block sizes. The efficacy and universality of the framework are then evaluated in two major applications of digital image forensics: 1) digital image tampering detection; 2) classification of computer graphics and photographic images. Second, a simple yet effective anti-forensic scheme is proposed, capable of obfuscating double JPEG compression artifacts, which may carry vital information for image forensics, for instance, digital image tampering detection. The shrink-and-zoom (SAZ) attack, the proposed scheme, is simply based on image resizing and bilinear interpolation. The effectiveness of SAZ has been evaluated over two promising double JPEG compression detection schemes, and the outcome reveals that the proposed scheme is effective, especially in cases where the first quality factor is lower than the second quality factor. Third, an advanced textural image statistical framework in the spatial domain is proposed, utilizing local binary pattern (LBP) schemes to model local image statistics on various kinds of residual images, including higher-order ones. The proposed framework can be implemented in either a single- or multi-resolution setting depending on the application of interest. The efficacy of the proposed framework is evaluated on two forensic applications: 1) steganalysis, with emphasis on HUGO (Highly Undetectable Steganography), an advanced steganographic scheme that embeds hidden data in a content-adaptive manner into image regions whose statistics are difficult to model; 2) image recapture detection (IRD). The outcomes of the evaluations suggest that the proposed framework is effective, not only for detecting local changes, which is in line with the nature of HUGO, but also for detecting global differences (the nature of IRD).
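
    Of the three contributions, the SAZ attack is simple enough to sketch directly from its description (resizing plus bilinear interpolation). In the minimal version below, the scale factor and re-save quality are illustrative assumptions, not values from the dissertation.

```python
from PIL import Image

def shrink_and_zoom(path_in, path_out, factor=0.9, quality=75):
    """Sketch of the shrink-and-zoom (SAZ) idea: downscale the
    decompressed image and bilinearly resize it back before re-saving,
    disturbing the periodic DCT-histogram artifacts that double-JPEG
    detectors rely on. factor and quality are illustrative choices."""
    img = Image.open(path_in).convert("RGB")
    w, h = img.size
    small = img.resize((int(w * factor), int(h * factor)), Image.BILINEAR)
    restored = small.resize((w, h), Image.BILINEAR)
    restored.save(path_out, "JPEG", quality=quality)
```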

    A study on the false positive rate of Stegdetect

    In this paper we analyse Stegdetect, one of the well-known image steganalysis tools, to study its false positive rate. In doing so, we process more than 40,000 images randomly downloaded from the Internet using Google Images, together with 25,000 images from the ASIRRA (Animal Species Image Recognition for Restricting Access) public corpus. The aim of this study is to help digital forensic analysts, who may need to examine a large number of image files during an investigation, better understand the capabilities and the limitations of steganalysis tools like Stegdetect. The results show that the rate of false positives generated by Stegdetect depends highly on the chosen sensitivity value, and is generally quite high. This should help forensic experts better interpret their results and take false positive rates into consideration. Additionally, we provide a detailed statistical analysis of the results to study the difference in detection among selected groups, close groups and different groups of images. This method can be applied to any steganalysis tool, giving the analyst a better understanding of the detection results, especially when there is no prior information about the tool's false positive rate.
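
    A false positive rate estimated from a finite clean-image corpus should carry a confidence interval. The sketch below uses the Wilson score interval, one standard choice for attaching statistical confidence to an observed proportion; the counts in the example are placeholders, not figures from this study.

```python
import math

def wilson_interval(fp, n, z=1.96):
    """95% Wilson score interval for a binomial proportion -- one
    standard way to attach confidence bounds to an observed false
    positive rate (the paper's exact analysis may differ)."""
    p = fp / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Example with placeholder counts: 1,200 clean images flagged out of 40,000.
lo, hi = wilson_interval(1200, 40000)
print(f"observed FP rate 3.00%, 95% CI [{lo:.4f}, {hi:.4f}]")
```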

    A framework for the forensic investigation of unstructured email relationship data

    Our continued reliance on email communications ensures that it remains a major source of evidence during a digital investigation. Emails comprise both structured and unstructured data. Structured data provides qualitative information to the forensics examiner and is typically viewed through existing tools. Unstructured data is more complex, as it comprises information associated with social networks, such as relationships within the network, identification of key actors and power relations, and there are currently no standardised tools for its forensic analysis. Moreover, email investigations may involve many hundreds of actors and thousands of messages. This paper posits a framework for the forensic investigation of email data. In particular, it focuses on the triage and analysis of unstructured data to identify key actors and relationships within an email network. The paper demonstrates the applicability of the approach by applying relevant stages of the framework to the Enron email corpus, and illustrates the advantage of triaging this data to identify (and discount) actors and potential sources of further evidence. It then applies social network analysis techniques to key actors within the data set. Finally, the paper argues that visualisation of unstructured data can greatly aid the examiner in the analysis of evidence discovered during an investigation.
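
    The key-actor stage of such a framework can be sketched with standard social network analysis primitives. In the sketch below, the edge list is hypothetical, and degree and betweenness centrality are two common ranking measures assumed for illustration, not necessarily the ones the authors use.

```python
import networkx as nx

# Hypothetical (sender, recipient) pairs extracted from message headers;
# in practice these would come from parsing a corpus such as Enron's.
edges = [("alice", "bob"), ("alice", "carol"),
         ("bob", "carol"), ("dave", "alice"), ("carol", "alice")]

G = nx.DiGraph()
for sender, recipient in edges:
    # Accumulate a weight per directed pair to record message volume.
    if G.has_edge(sender, recipient):
        G[sender][recipient]["weight"] += 1
    else:
        G.add_edge(sender, recipient, weight=1)

# Rank candidate key actors by two standard SNA centrality measures.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
for actor in sorted(G, key=lambda a: degree[a], reverse=True):
    print(f"{actor}: degree={degree[actor]:.2f} "
          f"betweenness={betweenness[actor]:.2f}")
```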

    An effective and efficient testing methodology for correctness testing for file recovery tools

    We develop an effective and efficient methodology for correctness testing of file recovery tools across different file systems. We assume that the tool tester is familiar with the formats of common file types and is able to use the tools correctly. Our methodology first derives a testing plan that minimizes the number of runs required to identify differences between tools with respect to correctness. We also present a case study on correctness testing of file carving tools, which confirms that the number of necessary testing runs is bounded and that our results are statistically sound.
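
    A minimal sketch of the correctness-scoring step such a methodology implies: plant known files in a test disk image, run each recovery tool over it, and score the recovered output against ground-truth hashes. The paths and the scoring rule below are illustrative assumptions, not the paper's actual test plan.

```python
import hashlib
from pathlib import Path

def recovered_hashes(output_dir):
    """SHA-256 digest of every file a recovery tool wrote to its
    output directory."""
    return {hashlib.sha256(p.read_bytes()).hexdigest()
            for p in Path(output_dir).rglob("*") if p.is_file()}

def score_tool(output_dir, ground_truth):
    """Fraction of ground-truth files a tool recovered byte-for-byte.
    ground_truth is a set of SHA-256 digests of the files planted in
    the test disk image (scoring rule is illustrative only)."""
    got = recovered_hashes(output_dir)
    return len(got & ground_truth) / len(ground_truth)
```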