
    Selecting Keyword Search Terms in Computer Forensics Examinations Using Domain Analysis and Modeling

    The motivation for computer forensics research includes the increase in crimes that involve the use of computers, the increasing capacity of digital storage media, a shortage of trained computer forensics technicians, and a lack of computer forensics standard practices. The hypothesis of this dissertation is that domain modeling of the computer forensics case environment can serve as a methodology for selecting keyword search terms and planning forensics examinations. This methodology can increase the quality of forensics examinations without significantly increasing the combined effort of planning and executing keyword searches. The contributions of this dissertation include: (1) a computer forensics examination planning method that utilizes the analytical strengths and knowledge-sharing abilities of domain modeling in artificial intelligence and software engineering; (2) a planning method that provides investigators and analysts with a tool for deriving keyword search terms from a case domain model; and (3) the design and execution of experiments that illustrate the utility of the case domain modeling method. Three experiment trials were conducted to evaluate the effectiveness of case domain modeling, each using a distinct computer forensics case scenario: an identity theft case, a burglary and money laundering case, and a threatening email case. Analysis of the experiments supports the hypothesis that case domain modeling results in more evidence found during an examination through more effective keyword searching. Additionally, the experimental data indicate that case domain modeling is most useful when the evidence disk has a relatively high occurrence of text-based documents and when vivid case background details are available. A pilot study and a case study were also performed to evaluate the utility of case domain modeling for typical law enforcement investigators. In these studies the subjects used case domain models in a computer forensics service solicitation activity. The results of these studies indicate that typical law enforcement officers have a moderate comprehension of the case domain modeling method and that they recognize a moderate amount of utility in it. Case study subjects also indicated that the method would be more useful if supported by a semi-automated tool.
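The core idea of deriving keyword search terms from a case domain model can be illustrated with a minimal sketch. The model structure below (entities with attributes and aliases) and the identity-theft example values are assumptions for illustration only, not the dissertation's actual notation.

```python
# Minimal sketch: flatten a simple case domain model into search terms.
# Entity/attribute/alias structure is an illustrative assumption.

def derive_keywords(domain_model):
    """Collect a deduplicated, lowercased set of search terms from the model."""
    terms = set()
    for entity in domain_model:
        terms.add(entity["name"].lower())
        for value in entity.get("attributes", {}).values():
            terms.add(str(value).lower())
        for alias in entity.get("aliases", []):
            terms.add(alias.lower())
    return sorted(terms)

# Hypothetical model echoing the identity-theft experiment scenario.
model = [
    {"name": "Suspect", "aliases": ["J. Doe"], "attributes": {"ssn": "078-05-1120"}},
    {"name": "Victim", "attributes": {"bank": "First National"}},
]

keywords = derive_keywords(model)
```

The resulting term list would then seed the examiner's keyword search, which is where the method claims its efficiency gain: the terms come from a structured model rather than ad-hoc brainstorming.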

    A Unified Forensics Analysis Approach to Digital Investigation

    Digital forensics is now essential in addressing cybercrime and cyber-enabled crime but potentially it can have a role in almost every other type of crime. Given technology's continuous development and prevalence, the widespread adoption of technologies among society and the subsequent digital footprints that exist, the analysis of these technologies can help support investigations. The abundance of interconnected technologies and telecommunication platforms has significantly changed the nature of digital evidence. Subsequently, the nature and characteristics of digital forensic cases involve an enormous volume of data heterogeneity, scattered across multiple evidence sources, technologies, applications, and services. It is indisputable that the outspread and connections between existing technologies have raised the need to integrate, harmonise, unify and correlate evidence across data sources in an automated fashion. Unfortunately, the current state of the art in digital forensics leads to siloed approaches focussed upon specific technologies or support of a particular part of digital investigation. Due to this shortcoming, the digital investigator examines each data source independently, trawls through interconnected data across various sources, and often has to conduct data correlation manually, thus restricting the digital investigator’s ability to answer high-level questions in a timely manner with a low cognitive load. Therefore, this research paper investigates the limitations of the current state of the art in the digital forensics discipline and categorises common investigation crimes with the necessary corresponding digital analyses to define the characteristics of the next-generation approach. 
Based on these observations, it discusses the future capabilities of the next-generation unified forensics analysis tool (U-FAT), with a workflow example that illustrates the data unification, correlation and visualisation processes within the proposed method.
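The kind of data unification and correlation described above can be sketched in a few lines: normalise artefacts from heterogeneous sources into one time-ordered timeline, then pull out every event tied to a single actor. All field names and the example data are assumptions for illustration; the paper's actual U-FAT design is far richer.

```python
# Illustrative sketch of cross-source unification and correlation.
from datetime import datetime

def unify(sources):
    """Merge per-source artefact lists into a single time-ordered timeline."""
    timeline = []
    for source_name, artefacts in sources.items():
        for a in artefacts:
            timeline.append({
                "time": datetime.fromisoformat(a["time"]),
                "source": source_name,
                "actor": a["actor"],
                "event": a["event"],
            })
    return sorted(timeline, key=lambda e: e["time"])

def correlate(timeline, actor):
    """Gather every event attributed to one actor across all sources."""
    return [e for e in timeline if e["actor"] == actor]

sources = {
    "browser": [{"time": "2023-05-01T10:02:00", "actor": "alice", "event": "visited bank site"}],
    "chat":    [{"time": "2023-05-01T09:55:00", "actor": "alice", "event": "sent message"}],
}
timeline = unify(sources)
alice_events = correlate(timeline, "alice")
```

Automating exactly this normalise-then-correlate step is what the paper argues would relieve the investigator of manual cross-source trawling.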

    Using a Goal-Driven Approach in the Investigation of a Questioned Contract

    Part 3: FORENSIC TECHNIQUES. This paper presents a systematic process for describing digital forensic investigations. It focuses on forensic goals and anti-forensic obstacles and their operationalization in terms of human and software actions. The paper also demonstrates how the process can be used to capture the various forensic and anti-forensic aspects of a real-world case involving document forgery.

    An Automated Approach for Digital Forensic Analysis of Heterogeneous Big Data

    The major challenges with big data examination and analysis are volume, complex interdependence across content, and heterogeneity. The examination and analysis phases are considered essential to a digital forensics process. However, traditional forensic investigation techniques use one or more forensic tools to examine and analyse each resource. In addition, when multiple resources are included in one case, the inability to cross-correlate findings often leads to inefficiencies in processing and identifying evidence. Furthermore, most current forensics tools cannot cope with large volumes of data. This paper develops a novel framework for digital forensic analysis of heterogeneous big data. The framework focuses upon three core issues: data volume, data heterogeneity, and the investigator's cognitive load in understanding the relationships between artefacts. The proposed approach uses metadata to address the data volume problem, semantic web ontologies to handle heterogeneous data sources, and artificial intelligence models to support the automated identification and correlation of artefacts, reducing the burden placed upon the investigator to understand the nature and relationships of the artefacts.
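The metadata-first idea for tackling data volume can be shown with a small triage sketch: filter files on cheap metadata (extension, modification time) before any content examination. The field names, thresholds, and example records are illustrative assumptions, not the framework's actual design.

```python
# Sketch: triage a file listing on metadata before content analysis.
from datetime import datetime

def triage(file_records, extensions, modified_after):
    """Keep only files whose metadata matches the case profile."""
    cutoff = datetime.fromisoformat(modified_after)
    return [
        f for f in file_records
        if f["name"].rsplit(".", 1)[-1].lower() in extensions
        and datetime.fromisoformat(f["modified"]) >= cutoff
    ]

records = [
    {"name": "ledger.xlsx", "modified": "2023-04-02T12:00:00"},
    {"name": "photo.jpg",   "modified": "2023-04-03T08:00:00"},
    {"name": "old.doc",     "modified": "2019-01-01T00:00:00"},
]
hits = triage(records, {"xlsx", "doc", "docx"}, "2023-01-01T00:00:00")
```

Because only metadata is touched, such a pass scales to large evidence sets and shrinks the corpus that the heavier ontology and AI stages must process.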

    From Digital Forensics to Intelligent Forensics

    In this paper we posit that current investigative techniques, particularly as deployed by law enforcement, are becoming unsuitable for most types of crime investigation. The growth in cybercrime and the complexity of its forms, coupled with limitations in time and resources, both computational and human, put an increasing strain on the ability of digital investigators to apply the processes of digital forensics and digital investigations to obtain timely results. To combat these problems, there is a need to make better use of the available resources and to move beyond the capabilities and constraints of the forensic tools currently in use. We argue that more intelligent techniques are necessary and should be used proactively. The paper makes the case for such tools and techniques, investigates and discusses the opportunities afforded by applying the principles and procedures of artificial intelligence to digital forensics intelligence and to intelligent forensics, and suggests that applying new techniques to digital investigations offers the opportunity to address the challenges of the larger and more complex domains in which cybercrimes are taking place.

    Packet analysis for network forensics: A comprehensive survey

    Packet analysis is a primary traceback technique in network forensics which, provided the captured packet details are sufficiently complete, can play back even the entire network traffic for a particular point in time. It can be used to find traces of nefarious online behavior, data breaches, unauthorized website access, malware infection, and intrusion attempts, and to reconstruct image files, documents, email attachments, and other artifacts sent over the network. This paper is a comprehensive survey of the utilization of packet analysis, including deep packet inspection, in network forensics, and provides a review of AI-powered packet analysis methods with advanced network traffic classification and pattern identification capabilities. Considering that not all network information can be used in court, the types of digital evidence that might be admissible are detailed. The properties of both hardware appliances and packet analyzer software are reviewed from the perspective of their potential use in network forensics.
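The reconstruction capability the survey describes rests on a simple operation: grouping captured packets by connection and reassembling their payloads in sequence order. The sketch below shows that operation on pre-parsed packet dicts, an assumption made to keep the example self-contained; real tools parse pcap captures.

```python
# Toy flow reassembly: group packets by (src, dst) and join payloads by sequence.
from collections import defaultdict

def reassemble(packets):
    """Return {(src, dst): payload} with each flow's bytes in sequence order."""
    flows = defaultdict(list)
    for p in packets:
        flows[(p["src"], p["dst"])].append((p["seq"], p["data"]))
    return {k: b"".join(d for _, d in sorted(v)) for k, v in flows.items()}

packets = [
    {"src": "10.0.0.2", "dst": "10.0.0.9", "seq": 2, "data": b"WORLD"},
    {"src": "10.0.0.2", "dst": "10.0.0.9", "seq": 1, "data": b"HELLO "},
]
streams = reassemble(packets)
```

Reassembled streams are the raw material from which documents, images, and attachments mentioned in the abstract can then be carved.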

    Digital Forensics Event Graph Reconstruction

    Ontological data representation and data normalization can provide a structured way to correlate digital artifacts. This can reduce the amount of data that a forensics examiner needs to process in order to understand the sequence of events that happened on the system. However, ontology processing suffers from high disk consumption and high computational cost. This paper presents Property Graph Event Reconstruction (PGER), a novel data normalization and event correlation system that leverages a native graph database to improve the speed of queries common in ontological data. PGER reduces the processing time of event correlation grammars while maintaining accuracy relative to a relational database storage format.
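The property-graph data model PGER builds on can be illustrated in miniature: artifacts become nodes with properties, correlations become labelled edges, and questions about event sequences become edge traversals. PGER itself uses a native graph database; the stdlib class below, with invented node IDs and labels, only illustrates the shape of the data model.

```python
# Minimal property graph: nodes carry properties, edges carry labels.
class PropertyGraph:
    def __init__(self):
        self.nodes = {}   # node_id -> property dict
        self.edges = []   # (src, label, dst) triples

    def add_node(self, node_id, **props):
        self.nodes[node_id] = props

    def add_edge(self, src, label, dst):
        self.edges.append((src, label, dst))

    def neighbours(self, node_id, label):
        """Follow every edge with the given label out of node_id."""
        return [d for s, l, d in self.edges if s == node_id and l == label]

g = PropertyGraph()
g.add_node("proc:42", kind="process", name="winword.exe")
g.add_node("file:7", kind="file", path=r"C:\invoice.docm")
g.add_edge("proc:42", "OPENED", "file:7")

opened = g.neighbours("proc:42", "OPENED")
```

A native graph store answers such traversals directly over adjacency, which is the source of the query speedup the paper reports over relational joins.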

    The use of Artificial Intelligence in digital forensics: An introduction
