
    Computer Forensic Functions Testing: Media Preparation, Write Protection and Verification

    The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematized description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. This paper focuses on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can develop corresponding reference sets to test any tools that implement these functions.
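The verification function discussed above typically reduces to recomputing a cryptographic hash of the acquired media and comparing it against the hash recorded at acquisition time. A minimal sketch in Python, assuming a SHA-256 hash was recorded (the function and parameter names are illustrative, not taken from the paper's framework):

```python
import hashlib

def verify_image(image_path: str, expected_sha256: str, chunk_size: int = 1 << 20) -> bool:
    """Recompute a forensic image's SHA-256 in chunks and compare it
    to the hash recorded when the media was acquired."""
    digest = hashlib.sha256()
    with open(image_path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256.lower()
```

Chunked reading keeps memory use constant even for multi-terabyte images; any mismatch indicates the image no longer matches the acquired media.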

    Mobile Data Analysis using Dynamic Binary Instrumentation and Static Analysis

    Mobile classified data leakage poses a threat to DoD programs and missions. Security experts must know the format of application data in order to properly classify mobile applications. This research presents the DBIMAFIA methodology to identify stored data formats. DBIMAFIA uses dynamic binary instrumentation (DBI) and static analysis to uncover the structure of mobile application data and validates the results with traditional reverse engineering methods. DBIMAFIA was applied to fifteen popular Android applications and revealed the format of stored data. Notably, user PII leakage is identified in the Rago Games application. The application's messaging service exposes the full name, birthday, and city of any user of the Rago Games application. These findings on how Rago Games uses the ObjectBox library to store data in custom file formats can be applied more broadly to any mobile, IoT, or SCADA device or application using the ObjectBox library. Furthermore, the DBIMAFIA methodology can be generalized to identify stored data within any Android application.
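To illustrate the kind of result such format recovery can yield, here is a minimal Python sketch that parses a hypothetical length-prefixed record holding the name, birthday, and city fields mentioned above. The layout is invented for illustration only; it is not the actual ObjectBox or Rago Games format:

```python
import struct

# Hypothetical record layout, for illustration only: three UTF-8 fields
# (name, birthday, city), each preceded by a 4-byte little-endian length.
# Real ObjectBox storage layouts differ.
def parse_record(buf: bytes) -> dict:
    fields, offset = [], 0
    for _ in range(3):
        (length,) = struct.unpack_from("<I", buf, offset)
        offset += 4
        fields.append(buf[offset:offset + length].decode("utf-8"))
        offset += length
    return dict(zip(("name", "birthday", "city"), fields))
```

Once DBI reveals where an application writes such records, a parser like this turns raw file bytes back into the leaked PII fields for classification.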

    The utilization of forensic corpora in validation of data carving on SATA drives

    The field of digital forensics has become more prevalent in the court of law due to the increased availability of technology. With digital evidence appearing in court consistently, digital forensics and its tools are coming under scrutiny and being held to the standards of more established disciplines. Validation and verification of tools is vital to maintaining the integrity of the evidence they process. Utilizing standardized data sets, or forensic corpora, as part of validation and verification techniques has been shown to be effective. The goal of this study is to assess the use of forensic corpora in the validation and verification of one of the most commonly used digital forensic tools.
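Data carving, the function under validation here, is commonly demonstrated with simple header/footer carving. A minimal sketch assuming JPEG signatures (the specific tool and corpora the study evaluates are not named here):

```python
JPEG_HEADER = b"\xff\xd8\xff"  # JPEG start-of-image marker prefix
JPEG_FOOTER = b"\xff\xd9"      # JPEG end-of-image marker

def carve_jpegs(image: bytes) -> list[bytes]:
    """Naive header/footer carving: scan a raw disk image for JPEG
    start markers and cut each candidate at the next end marker."""
    carved, pos = [], 0
    while (start := image.find(JPEG_HEADER, pos)) != -1:
        end = image.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1:
            break
        carved.append(image[start:end + len(JPEG_FOOTER)])
        pos = end + len(JPEG_FOOTER)
    return carved
```

A forensic corpus with a known number of embedded files gives a ground truth against which a carver's recovery rate, and thus its validity, can be measured.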

    A Domain Specific Language for Digital Forensics and Incident Response Analysis

    One of the longstanding conceptual problems in digital forensics is the dichotomy between the need for verifiable and reproducible forensic investigations, and the lack of practical mechanisms to accomplish them. After nearly four decades of professional digital forensic practice, investigator notes are still the primary source of reproducibility information, and much of that information is tied to the functions of specific, often proprietary, tools. The lack of a formal means of specification for digital forensic operations results in three major problems. Specifically, there is a critical lack of: a) standardized and automated means to scientifically verify the accuracy of digital forensic tools; b) methods to reliably reproduce forensic computations (and their results); and c) a framework for interoperability among forensic tools. Additionally, there is no standardized means for communicating software requirements between users, researchers and developers, resulting in a mismatch in expectations. Combined with the exponential growth in data volume and the complexity of applications and systems to be investigated, all of these concerns result in major case backlogs and inherently reduce the reliability of digital forensic analyses. This work proposes a new approach to the specification of forensic computations, such that the above concerns can be addressed on a scientific basis with a new domain specific language (DSL) called nugget. DSLs are specialized languages that aim to address the concerns of particular domains by providing practical abstractions. Successful DSLs, such as SQL, can transform an application domain by providing a standardized way for users to communicate what they need without specifying how the computation should be performed.
This is the first effort to build a DSL for (digital) forensic computations with the following research goals: 1) provide an intuitive formal specification language that covers core types of forensic computations and common data types; 2) provide a mechanism to extend the language that can incorporate arbitrary computations; 3) provide a prototype execution environment that allows the fully automatic execution of the computation; 4) provide a complete, formal, and auditable log of computations that can be used to reproduce an investigation; 5) demonstrate cloud-ready processing that can match the growth in data volumes and complexity.
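For intuition only, the following toy interpreter evaluates a single declarative statement and emits an auditable log entry for each computation, in the spirit of goals 3 and 4. The statement form, names, and logging scheme are invented for illustration; this is not nugget's actual syntax or implementation:

```python
import hashlib
import json
import time

# Hypothetical declarative form for illustration: "hash <algo> <path>".
# The user states WHAT to compute; the runtime decides how, and records
# an auditable log entry so the computation can be reproduced later.
def run_statement(stmt: str) -> dict:
    op, algo, path = stmt.split()
    if op != "hash":
        raise ValueError(f"unsupported operation: {op}")
    with open(path, "rb") as f:
        digest = hashlib.new(algo, f.read()).hexdigest()
    entry = {"statement": stmt, "result": digest, "timestamp": time.time()}
    print(json.dumps(entry))  # in practice: an append-only audit log
    return entry
```

Because the statement is declarative, the same specification could be re-executed by a different backend (local or cloud) and checked against the logged result.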

    Current state of validation and testing of digital forensic tools in the United States.

    The Federal courts' decision in Daubert v. Merrell Dow Pharmaceuticals, Inc. (1993) requires forensic testing protocols and tools to be validated and tested for reliability before they can be used to support expert witness testimony. Digital forensic labs and individual examiners in the United States should be performing their own validation and verification tests on their digital forensic tools. The Scientific Working Group on Digital Evidence (SWGDE) recommends that examiners perform validation testing whenever there are new, revised, or reconfigured tools, techniques, or procedures. This study surveyed digital forensics examiners in the U.S. to describe the current state of validation and testing of digital forensic tools, the protocols currently used for validation, and the barriers to performing these tests. Among the findings: 95% of respondents validate and test their digital forensic tools; 80.3% document the validation and testing process and its results; and 53.6% validate and test each function when a forensic tool performs several different functions. Examiners should test their digital forensic tools to make sure they are working properly and producing accurate results; findings and testimony can be dismissed in court if the examiner does not follow set standards.

    Forensic Memory Analysis for Apple OS X

    Analysis of raw memory dumps has become a critical capability in digital forensics because it gives insight into the state of a system that cannot be fully represented through traditional disk analysis. Interest in memory forensics has grown steadily in recent years, with a focus on the Microsoft Windows operating systems. However, similar capabilities for Linux and Apple OS X have lagged by comparison. The volafox open source project has begun work on structured memory analysis for OS X. The tool currently supports a limited set of kernel structures to parse hardware information, the system build number, the process listing, loaded kernel modules, the syscall table, and socket connections. This research addresses one memory analysis deficiency on OS X by introducing a new volafox module for parsing file handles. When open files are mapped to a process, an examiner can learn which resources the process is accessing on disk. This listing is useful for determining what information may have been the target of exfiltration or modification on a compromised system. Comparing the output of the developed module against the UNIX lsof (list open files) command on two versions of OS X and two kernel architectures validates the methodology used to extract file handle information.
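The validation step described above, comparing the module's output with lsof, amounts to a set comparison of file paths. A minimal sketch, with names that are illustrative and not taken from volafox:

```python
# Cross-validation sketch: compare file paths recovered from a memory
# dump (e.g. by a file-handle parsing module) against a ground-truth
# listing such as lsof output captured on the live system.
def compare_handles(memory_paths: set[str], lsof_paths: set[str]) -> dict:
    return {
        "matched": sorted(memory_paths & lsof_paths),          # agreement
        "missing_from_memory": sorted(lsof_paths - memory_paths),
        "extra_in_memory": sorted(memory_paths - lsof_paths),
    }
```

A high match rate, with discrepancies explained (e.g. handles opened or closed between captures), supports the correctness of the extraction methodology.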