
    Timeline2GUI: A Log2Timeline CSV Parser and Training Scenarios

    Crimes involving digital evidence are becoming more complex due to increasing storage capacities and device utilization. Event reconstruction (i.e., understanding the timeline) is an essential step for investigators in understanding a case, and a prominent tool for this task is Log2Timeline, which creates super timelines: combined views of the many log files and events throughout a system. While these timelines provide valuable evidence and help investigators understand a case, they are complex and require both tools and training scenarios. In this paper we present Timeline2GUI, an easy-to-use Python implementation to analyze CSV log files created by Log2Timeline. Additionally, we present three training scenarios – beginner, intermediate and advanced – to practice timeline analysis skills as well as familiarity with visualization tools. Lastly, we provide a comprehensive overview of tools.
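
    As a minimal illustration of the kind of processing Timeline2GUI performs (this is not the authors' code), the sketch below loads a Log2Timeline CSV with pandas and filters it to a window of interest. It assumes the classic l2tcsv column layout with 'date', 'time', 'source' and 'short' fields, and a hypothetical file path; adjust the names if your export differs.

        # Hypothetical sketch: load an l2tcsv super timeline and filter it.
        import pandas as pd

        def load_super_timeline(path):
            """Load an l2tcsv file and index it by a combined timestamp."""
            df = pd.read_csv(path, dtype=str, on_bad_lines="skip")
            # l2tcsv stores the date and time in separate columns.
            df["timestamp"] = pd.to_datetime(df["date"] + " " + df["time"],
                                             errors="coerce")
            return df.dropna(subset=["timestamp"]).sort_values("timestamp")

        # Usage: narrow a case to filesystem events in a one-day window.
        timeline = load_super_timeline("case01.l2tcsv")  # path is hypothetical
        window = timeline[(timeline["timestamp"] >= "2019-03-01")
                          & (timeline["timestamp"] < "2019-03-02")
                          & (timeline["source"] == "FILE")]
        print(window[["timestamp", "short"]].head())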

    Digital Forensics Event Graph Reconstruction

    Ontological data representation and data normalization can provide a structured way to correlate digital artifacts. This can reduce the amount of data that a forensic examiner needs to process in order to understand the sequence of events that happened on the system. However, ontology processing suffers from large disk consumption and a high computational cost. This paper presents Property Graph Event Reconstruction (PGER), a novel data normalization and event correlation system that leverages a native graph database to improve the speed of queries common in ontological data. PGER reduces the processing time of event correlation grammars and maintains accuracy over a relational database storage format.
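
    To make the property-graph idea concrete, here is an illustrative sketch (not PGER itself) in which events become nodes and correlations become edges, so a correlation query is a one-hop traversal rather than a relational join. The events, attributes and relation names are invented for the example; PGER targets a native graph database rather than an in-memory library.

        # Illustrative property graph of correlated events using networkx.
        import networkx as nx

        g = nx.Graph()
        g.add_node("evt1", type="process_start", artefact="cmd.exe")
        g.add_node("evt2", type="file_write", artefact="payload.dll")
        g.add_node("evt3", type="registry_set", artefact="Run key")
        # Edges record why two events are considered correlated.
        g.add_edge("evt1", "evt2", relation="same_process")
        g.add_edge("evt2", "evt3", relation="same_minute")

        # "What correlates with evt2?" is a neighbourhood traversal:
        for neighbour in g.neighbors("evt2"):
            print(neighbour, g.edges["evt2", neighbour]["relation"])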

    The Forensic Curator: Digital Forensics as a Solution to Addressing the Curatorial Challenges Posed by Personal Digital Archives

    The growth of computing technology during the previous three decades has resulted in a large amount of content being created in digital form. As their creators retire or pass away, an increasing number of personal data collections, in the form of digital media and complete computer systems, are being offered to academic institutional archives. For the digital curator or archivist, the handling and processing of such digital material represents a considerable challenge, requiring the development of new processes and procedures. This paper outlines how digital forensic methods, developed by the law enforcement and legal communities, may be applied by academic digital archives. It goes on to describe the strategic and practical decisions that should be made to introduce forensic methods within an existing curatorial infrastructure, and how different techniques, such as forensic hashing, timeline analysis and data carving, may be used to collect information of a greater breadth and scope than could be gathered through manual activities.
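
    Of the techniques named above, forensic hashing is the most directly transferable to a curatorial workflow. The sketch below (a minimal example, with a hypothetical accession directory) streams each file through SHA-256 to record fixity values and flag duplicates.

        # Minimal forensic-hashing sketch for fixity and duplicate detection.
        import hashlib
        from pathlib import Path

        def sha256_of(path, chunk_size=1 << 20):
            """Stream a file through SHA-256 without loading it whole."""
            digest = hashlib.sha256()
            with open(path, "rb") as fh:
                for chunk in iter(lambda: fh.read(chunk_size), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        seen = {}
        for item in Path("accession_2024_001").rglob("*"):  # hypothetical path
            if item.is_file():
                fixity = sha256_of(item)
                if fixity in seen:
                    print(f"duplicate: {item} == {seen[fixity]}")
                else:
                    seen[fixity] = item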

    Automated Artefact Relevancy Determination from Artefact Metadata and Associated Timeline Events

    The 2020 IEEE International Conference on Cyber Security and Protection of Digital Services (Cyber Security 2020), Dublin City University, Ireland (held online due to the coronavirus outbreak, 15-17 June 2020).
    Case-hindering, multi-year digital forensic evidence backlogs have become commonplace in law enforcement agencies throughout the world. This is due to an ever-growing number of cases requiring digital forensic investigation, coupled with the growing volume of data to be processed per case. Leveraging previously processed digital forensic cases and their component artefact relevancy classifications presents an opportunity for training automated, artificial intelligence based evidence processing systems. These can significantly aid investigators in the discovery and prioritisation of evidence. This paper presents one approach to file artefact relevancy determination, building on the growing trend towards a centralised Digital Forensics as a Service (DFaaS) paradigm. This approach enables the use of previously encountered pertinent files to classify newly discovered files in an investigation. Trained models can aid in the detection of these files during the acquisition stage, i.e., during their upload to a DFaaS system. The technique generates a relevancy score for file similarity using each artefact's filesystem metadata and associated timeline events. The approach presented is validated against three experimental usage scenarios.
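
    The general shape of this approach can be sketched as follows: train a supervised model on artefact features from previously labelled cases, then score newly uploaded files. The features, labels and model choice below are invented for illustration and are not the paper's actual feature set.

        # Hedged sketch of metadata-based relevancy scoring with scikit-learn.
        from sklearn.ensemble import RandomForestClassifier

        # Per-artefact features (invented):
        # [size_kb, n_timeline_events, hours_from_case_start]
        X_train = [[12, 3, 1.0], [4096, 41, 0.2], [8, 0, 72.0], [2048, 17, 0.5]]
        y_train = [0, 1, 0, 1]  # 1 = marked relevant in a prior case

        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(X_train, y_train)

        # At acquisition time, score a newly uploaded artefact:
        new_artefact = [[1024, 22, 0.3]]
        print("relevancy score:", model.predict_proba(new_artefact)[0][1])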

    Graph-based Temporal Analysis in Digital Forensics

    Establishing a timeline as part of a digital forensics investigation is a vital part of understanding the order in which system events occurred. However, most digital forensics tools present timelines as histograms or as raw artifacts. Consequently, digital forensics examiners are forced to rely on manual, labor-intensive practices to reconstruct system events. Current digital forensics analysis tools are reaching their technological limits as the storage capacity and complexity of data increase. A graph-based timeline can present digital forensics evidence in a structure that can be immediately understood and effortlessly focused. This paper presents the Temporal Analysis Integration Management Application (TAIMA) to enhance digital forensics analysis via information visualization (infovis) techniques. TAIMA is a prototype application that provides a graph-based timeline for event reconstruction using abstraction and visualization techniques. A workflow illustration and a pilot usability study provided evidence that TAIMA assisted digital forensics specialists in identifying key system events during digital forensics analysis.
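
    The abstraction step that makes such a graph-based timeline readable can be illustrated with a toy example (not TAIMA's code): consecutive low-level events of the same kind are collapsed into one high-level node, and the high-level nodes are then chained in temporal order. The event data and grouping rule below are invented.

        # Toy event-abstraction sketch: collapse bursts of low-level events.
        from itertools import groupby

        raw_events = [
            ("09:00", "browser"), ("09:01", "browser"), ("09:02", "browser"),
            ("09:10", "usb"), ("09:11", "usb"),
            ("09:30", "delete"), ("09:31", "delete"),
        ]

        # Group consecutive events of the same kind into one abstract node.
        abstract_timeline = []
        for kind, group in groupby(raw_events, key=lambda e: e[1]):
            times = [t for t, _ in group]
            abstract_timeline.append((kind, times))

        for kind, times in abstract_timeline:
            print(f"{times[0]}-{times[-1]}: {kind} activity ({len(times)} events)")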

    Methodology for the Automated Metadata-Based Classification of Incriminating Digital Forensic Artefacts

    The ever-increasing volume of data in digital forensic investigations is one of the most discussed challenges in the field. Usually, most of the file artefacts on seized devices are not pertinent to the investigation, and manually retrieving suspicious files relevant to the investigation is akin to finding a needle in a haystack. In this paper, a methodology for the automatic prioritisation of suspicious file artefacts (i.e., file artefacts that are pertinent to the investigation) is proposed to reduce the manual analysis effort required. This methodology is designed to work in a human-in-the-loop fashion; in other words, it predicts/recommends that an artefact is likely to be suspicious rather than giving the final analysis result. A supervised machine learning approach is employed, which leverages the recorded results of previously processed cases. The processes of feature extraction, dataset generation, training and evaluation are presented in this paper. In addition, a toolkit for data extraction from disk images is outlined, which enables this method to be integrated with the conventional investigation process and to work in an automated fashion.
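
    As a minimal sketch of the feature-extraction step (the paper's actual feature set may differ), ordinary filesystem metadata can be turned into per-artefact features like the ones below; string-valued fields such as the extension would be encoded before training.

        # Hypothetical metadata feature extraction for one file artefact.
        import os
        import tempfile
        import time
        from pathlib import Path

        def metadata_features(path):
            """Turn one file's metadata into features for a classifier."""
            st = os.stat(path)
            return {
                "size_bytes": st.st_size,
                "age_days": (time.time() - st.st_mtime) / 86400,
                "is_hidden": int(Path(path).name.startswith(".")),
                "ext": Path(path).suffix.lower(),
            }

        # Demo on a throwaway file; real input would come from a disk image.
        with tempfile.NamedTemporaryFile(suffix=".docx", delete=False) as fh:
            fh.write(b"demo")
        print(metadata_features(fh.name))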

    Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations

    Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations, and on plans for recovering from them, using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans.
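
    A drastically simplified sketch of the "trajectory options to meet scheduling constraints" step (all values invented; TRAC's models are far richer): given candidate route/speed options with estimated times of arrival, pick the option closest to the scheduled time of arrival.

        # Toy recovery-option selection against a scheduled time of arrival.
        options = [
            {"route": "direct", "eta_s": 1180},
            {"route": "path_stretch", "eta_s": 1245},
            {"route": "speed_reduction", "eta_s": 1230},
        ]
        sta_s = 1240  # scheduled time of arrival, seconds from now

        # Choose the option whose ETA best matches the schedule.
        best = min(options, key=lambda o: abs(o["eta_s"] - sta_s))
        print(f"recovery plan: {best['route']} "
              f"(ETA error {best['eta_s'] - sta_s:+d} s)")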

    Exploring Digital Evidence with Graph Theory

    The analysis phase of the digital forensic process is the most complex, and it remains highly subjective to the views of the forensic practitioner. There are many tools dedicated to assisting the investigator during the analysis process; however, they do not address this challenge. Digital forensics is in need of a consistent approach to draw the most judicious conclusions from the digital evidence. The objective of this paper is to discuss the ability of graph theory, the study of mathematical structures that model pairwise relations between objects, to aid in the analysis phase of the digital forensic process. We develop a graph-based representation of digital evidence and evaluate the relations between pieces of evidence. We identify techniques investigators can use to examine digital evidence, as well as explore how graph theory can serve as a basis for further analysis. Lastly, we demonstrate the potential of graph theory through its implementation in a case study.
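
    Two of the graph-theoretic techniques this line of work suggests are easy to illustrate (the evidence items and relations below are invented, and this is not the paper's implementation): connected components separate unrelated clusters of evidence, and degree centrality flags the most connected, often pivotal, item.

        # Illustrative evidence graph analysed with standard graph measures.
        import networkx as nx

        g = nx.Graph()
        g.add_edges_from([
            ("email_draft", "usb_image"),    # draft recovered from the stick
            ("usb_image", "laptop_disk"),    # stick mounted on the laptop
            ("usb_image", "browser_history"),
            ("phone_backup", "chat_export"), # separate cluster of evidence
        ])

        # Connected components separate unrelated evidence clusters.
        for component in nx.connected_components(g):
            print("cluster:", sorted(component))

        # Degree centrality highlights the most connected item.
        central = max(nx.degree_centrality(g).items(), key=lambda kv: kv[1])
        print("most connected item:", central[0])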

    IV. Effective Exhibits and Courtroom Technology

    Have you perfected your process of trial preparation into an art of war? Are you getting all you can out of the opportunities witnesses and opposing counsel let slip during trial? Do your juries leave the courtroom wanting to reach the verdict that you've clearly stated you want? Join our panel of seasoned trial lawyers for an engaging day of learning and tactics exchange, and take your courtroom presentation skills to the next level of excellence.