Computer Forensic Functions Testing: Media Preparation, Write Protection and Verification
The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e., capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for the validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e., what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can develop corresponding reference sets to test any tools that provide these functions.
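In practice, the verification function described here usually reduces to cryptographic hashing: the media is hashed before the tool touches it, and again afterwards, and a working write blocker yields identical digests. A minimal sketch of that check, assuming the framework verifies images this way (the function names are illustrative, not taken from the paper):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, streaming it in chunks
    so large disk images do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_write_protection(image_path, baseline_digest):
    """A write blocker passes this check only if the image is bit-identical
    to the baseline digest taken before the tool accessed the media."""
    return sha256_of(image_path) == baseline_digest
```

A reference set for this function would pair known media images with their expected digests, so any tool claiming write protection can be tested against them.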
Mobile Data Analysis using Dynamic Binary Instrumentation and Static Analysis
Mobile classified data leakage poses a threat to DoD programs and missions. Security experts must know the format of application data in order to properly classify mobile applications. This research presents the DBIMAFIA methodology to identify stored data formats. DBIMAFIA uses dynamic binary instrumentation (DBI) and static analysis to uncover the structure of mobile application data and validates the results with traditional reverse engineering methods. DBIMAFIA was applied to fifteen popular Android applications and revealed the format of stored data. Notably, user PII leakage is identified in the Rago Games application: its messaging service exposes the full name, birthday, and city of any user of the application. These findings on how Rago Games uses the ObjectBox library to store data in custom file formats can be applied more broadly to any mobile, IoT, or SCADA device or application using the ObjectBox library. Furthermore, the DBIMAFIA methodology can be generalized to identify stored data within any Android application.
The utilization of forensic corpora in validation of data carving on SATA drives
The field of digital forensics has become more prevalent in the court of law due to the increased availability of technology. With digital evidence coming up in court consistently, digital forensics and its tools are coming under scrutiny and being held against disciplines that are more standardized. Validation and verification of tools is vital to maintaining the integrity of the evidence they produce. Utilizing standardized data sets, or forensic corpora, as part of validation and verification techniques has been shown to be effective. The goal of the study is to assess the use of forensic corpora in the validation and verification of one of the most commonly used digital forensic tools.
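Data carving, the technique under validation here, typically scans raw bytes for file signatures rather than relying on file system metadata. A minimal sketch of signature-based carving for JPEG files, which a corpus with known embedded files can validate against (this illustrates the general technique, not the specific tool under study):

```python
# JPEG files begin with the SOI marker and end with the EOI marker.
JPEG_HEADER = b"\xff\xd8\xff"
JPEG_FOOTER = b"\xff\xd9"

def carve_jpegs(raw):
    """Return candidate JPEG byte ranges carved from a raw buffer by
    scanning for header/footer signature pairs."""
    results = []
    start = raw.find(JPEG_HEADER)
    while start != -1:
        end = raw.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1:
            break  # header with no closing footer: unrecoverable fragment
        results.append(raw[start:end + len(JPEG_FOOTER)])
        start = raw.find(JPEG_HEADER, end + len(JPEG_FOOTER))
    return results
```

A forensic corpus validates such a tool by embedding files at known offsets in an image and checking that carving recovers exactly those files.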
A Domain Specific Language for Digital Forensics and Incident Response Analysis
One of the longstanding conceptual problems in digital forensics is the dichotomy between the need for verifiable and reproducible forensic investigations, and the lack of practical mechanisms to accomplish them. With nearly four decades of professional digital forensic practice, investigator notes are still the primary source of reproducibility information, and much of it is tied to the functions of specific, often proprietary, tools.
The lack of a formal means of specification for digital forensic operations results in three major problems. Specifically, there is a critical lack of:
a) standardized and automated means to scientifically verify the accuracy of digital forensic tools;
b) methods to reliably reproduce forensic computations (and their results); and
c) a framework for interoperability among forensic tools.
Additionally, there is no standardized means for communicating software requirements between users, researchers and developers, resulting in a mismatch in expectations. Combined with the exponential growth in data volume and complexity of applications and systems to be investigated, all of these concerns result in major case backlogs and inherently reduce the reliability of the digital forensic analyses.
This work proposes a new approach to the specification of forensic computations, such that the above concerns can be addressed on a scientific basis with a new domain specific language (DSL) called nugget. DSLs are specialized languages that aim to address the concerns of particular domains by providing practical abstractions. Successful DSLs, such as SQL, can transform an application domain by providing a standardized way for users to communicate what they need without specifying how the computation should be performed.
This is the first effort to build a DSL for (digital) forensic computations with the following research goals:
1) provide an intuitive formal specification language that covers core types of forensic computations and common data types;
2) provide a mechanism to extend the language that can incorporate arbitrary computations;
3) provide a prototype execution environment that allows the fully automatic execution of the computation;
4) provide a complete, formal, and auditable log of computations that can be used to reproduce an investigation;
5) demonstrate cloud-ready processing that can match the growth in data volumes and complexity.
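Goal 4, an auditable log sufficient to reproduce an investigation, can be made concrete with a small sketch. The abstract does not specify nugget's log format, so the structure below is purely illustrative: each step records the operation, its parameters, and its inputs and output by content hash, so a re-run can be checked for equivalence step by step:

```python
import hashlib

def log_step(log, operation, inputs, params, output):
    """Append one computation step to an audit log. Inputs and output are
    recorded by SHA-256 content hash, so the entry is deterministic and a
    reproduced run can be verified against it."""
    entry = {
        "operation": operation,
        "inputs": {name: hashlib.sha256(data).hexdigest()
                   for name, data in inputs.items()},
        "params": params,
        "output_sha256": hashlib.sha256(output).hexdigest(),
    }
    log.append(entry)
    return entry
```

Because every entry is a pure function of its arguments, two independent runs of the same computation over the same evidence produce byte-identical log entries, which is the property reproducibility requires.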
Validating digital forensic evidence
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This dissertation focuses on the forensic validation of computer evidence. It is, by necessity, a burgeoning field, and there have been significant advances in the detection and gathering of evidence related to electronic crimes. What makes the computer forensics field similar to other forensic fields is that considerable emphasis is placed on the validity of the digital evidence. It is not just the methods used to collect the evidence that are a concern; perpetrators of digital crimes may also engage in what is called anti-forensics, in which digital forensic techniques are deliberately thwarted and corrupted by those under investigation. In traditional forensics the link between evidence and a perpetrator's actions is often straightforward: a fingerprint on an object indicates that someone has touched the object. Anti-forensic activity would be the equivalent of having the ability to change the nature of the fingerprint before or during the investigation, thus making the forensic evidence collected invalid or less reliable. This thesis reviews existing security models and digital forensics, paying particular attention to anti-forensic activity that affects the validity of data collected as digital evidence. It builds on the current models in this field and suggests a tentative first-step model to manage and detect the possibility of anti-forensic activity. The model is concerned with stopping anti-forensic activity and is thus not a forensic model in the normal sense; it is what will be called a "meta-forensic" model: an approach intended to stop attempts to invalidate digital forensic evidence. This thesis proposes a formal procedure and guides forensic examiners to look at evidence in a meta-forensic way.
Current state of validation and testing of digital forensic tools in the United States.
The Federal courts' decision in Daubert v. Merrell Dow Pharmaceuticals, Inc. (1993) requires forensic testing protocols and tools to be validated and tested for reliability before they can be used to support expert witness testimony. Digital forensic labs and individual examiners in the United States should be performing their own validation and verification tests on their digital forensic tools. The Scientific Working Group on Digital Evidence (SWGDE) recommends that examiners perform validation testing whenever there are new, revised, or reconfigured tools, techniques, or procedures. This study surveyed digital forensics examiners in the U.S. to describe the current state of validation and testing of digital forensic tools, the protocols used for validation, and the barriers to performing these tests. Among the findings: 95% of respondents validate and test their digital forensic tools; 80.3% document the validation and testing process and their results; and 53.6% validate and test each function when a forensic tool performs several different functions. Examiners should test their digital forensic tools to make sure they are working properly and producing accurate results; findings and testimony can be dismissed in court if the examiner does not follow established standards.
Forensic Memory Analysis for Apple OS X
Analysis of raw memory dumps has become a critical capability in digital forensics because it gives insight into the state of a system that cannot be fully represented through traditional disk analysis. Interest in memory forensics has grown steadily in recent years, with a focus on the Microsoft Windows operating systems. However, similar capabilities for Linux and Apple OS X have lagged by comparison. The volafox open source project has begun work on structured memory analysis for OS X. The tool currently supports a limited set of kernel structures to parse hardware information, system build number, process listing, loaded kernel modules, syscall table, and socket connections. This research addresses one memory analysis deficiency on OS X by introducing a new volafox module for parsing file handles. When open files are mapped to a process, an examiner can learn which resources the process is accessing on disk. This listing is useful for determining what information may have been the target of exfiltration or modification on a compromised system. Comparing output of the developed module against the UNIX lsof (list open files) command on two versions of OS X and two kernel architectures validates the methodology used to extract file handle information.
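The validation step above compares the module's output against lsof, which implies normalizing both into the same PID-to-open-files mapping. The volafox module itself walks kernel structures in the memory image; as a rough sketch of the lsof side of the comparison, the live tool's default column output (COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME) can be parsed like this:

```python
def parse_lsof(output):
    """Map process IDs to the file names they hold open, given text in
    lsof's default nine-column format. NAME is the last column and may
    contain spaces, so we split into at most nine fields."""
    handles = {}
    for line in output.splitlines()[1:]:  # skip the header row
        fields = line.split(None, 8)
        if len(fields) < 9:
            continue  # skip truncated or malformed rows
        pid, name = int(fields[1]), fields[8]
        handles.setdefault(pid, []).append(name)
    return handles
```

Diffing this mapping against the one extracted from the memory image surfaces any file handles the memory-analysis module missed or invented.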
BCFL logging: an approach to acquire and preserve admissible digital forensics evidence in cloud ecosystem
Log files are the primary record of user, application and protocol activity in the cloud ecosystem. Cloud forensic investigators can use log evidence to ascertain when, why and how a cyber adversary or an insider compromised a system, by establishing the crime scene and reconstructing how the incident occurred. However, digital evidence acquisition in a cloud ecosystem is complicated and has proven difficult, even with modern forensic acquisition toolkits. Multi-tenancy, geo-location and Service-Level Agreements add another layer of complexity to acquiring digital log evidence from a cloud ecosystem. To mitigate these complexities, we need a framework that can forensically maintain the trustworthiness and integrity of log evidence. In this paper, we design and implement a Blockchain Cloud Forensic Logging (BCFL) framework, using a Design Science Research Methodology (DSRM) approach. BCFL operates in four stages: (1) process transaction logs using Blockchain distributed ledger technology (DLT); (2) use a Blockchain smart contract to maintain the integrity of logs and establish a clear chain of custody; (3) validate all transaction logs; and (4) maintain transaction log immutability. BCFL will also enhance and strengthen compliance with the European Union (EU) General Data Protection Regulation (GDPR). The results from our single case study demonstrate that BCFL mitigates the challenges and complexities faced by digital forensics investigators in acquiring admissible digital evidence from the cloud ecosystem. Furthermore, the runtime performance of the proposed framework was evaluated. BCFL ensures the trustworthiness, integrity, authenticity and non-repudiation of log evidence in the cloud.
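The immutability property in stage (4) rests on the basic ledger mechanism of hash chaining: each block's hash covers the previous block's hash, so altering or reordering any earlier entry invalidates every later link. A minimal single-node sketch of that mechanism (deliberately omitting BCFL's distributed consensus and smart contracts):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash preceding the first block

def append_log(chain, entry):
    """Append a log entry as a block whose hash covers the previous
    block's hash, chaining all entries together."""
    prev = chain[-1]["hash"] if chain else GENESIS
    payload = json.dumps(entry, sort_keys=True)  # canonical serialization
    block = {
        "prev": prev,
        "entry": entry,
        "hash": hashlib.sha256((prev + payload).encode()).hexdigest(),
    }
    chain.append(block)
    return block

def verify_chain(chain):
    """Recompute every link; True only if no block was altered,
    removed, or reordered."""
    prev = GENESIS
    for block in chain:
        payload = json.dumps(block["entry"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True
```

In BCFL the same guarantee comes from the distributed ledger rather than a single verifier, which is what lets the chain of custody survive a compromised cloud node.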