Timeline2GUI: A Log2Timeline CSV Parser and Training Scenarios
Crimes involving digital evidence are becoming more complex due to increasing storage capacities and device usage. Event reconstruction (i.e., understanding the timeline) is an essential step for investigators to understand a case, and a prominent tool for it is Log2Timeline, which creates super timelines: a combination of several log files and events from throughout a system. While these timelines provide valuable evidence and help investigators understand a case, they are complex and require both tooling and training scenarios. In this paper we present Timeline2GUI, an easy-to-use Python implementation to analyze CSV log files created by Log2Timeline. Additionally, we present three training scenarios – beginner, intermediate and advanced – to practice timeline analysis skills as well as familiarity with visualization tools. Lastly, we provide a comprehensive overview of tools
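The kind of parsing Timeline2GUI performs can be sketched with the standard library alone. The snippet below reads a tiny, hypothetical excerpt of a Log2Timeline CSV (a real l2tcsv export has 17 columns; only a few are shown here, and the sample events are invented for illustration), orders the events chronologically, and lists the artefact sources in timeline order:

```python
import csv
import io
from datetime import datetime

# Hypothetical excerpt of a Log2Timeline (l2tcsv) export; a real export
# has 17 columns -- only a few illustrative ones are shown here.
csv_text = """date,time,timezone,source,sourcetype,desc
01/15/2019,09:02:05,UTC,LOG,Windows Event Log,User logon (ID 4624)
01/15/2019,08:30:12,UTC,WEBHIST,Chrome History,https://example.com visited
01/15/2019,08:31:40,UTC,FILE,NTFS MFT,report.docx modified
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))

# Attach a sortable timestamp to each event and order the timeline.
for row in rows:
    row["ts"] = datetime.strptime(row["date"] + " " + row["time"],
                                  "%m/%d/%Y %H:%M:%S")
rows.sort(key=lambda r: r["ts"])

# Simple triage view: the chronological order of artefact sources.
sources = [r["source"] for r in rows]
print(sources)  # ['WEBHIST', 'FILE', 'LOG']
```

In practice a GUI like Timeline2GUI layers filtering and highlighting on top of exactly this kind of sorted record list.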
Digital forensic analysis of internet history using principal component analysis
A modern digital forensic examination, even of a small-scale home computer, typically involves searching large hard disk drives, a variety of host and web-based applications which may or may not be known to the investigator, and a proliferation of web-based Internet history artefacts that may be highly significant in showing the motivation of a suspect. Faster keyword searching and larger, more accurate sets of file hashes may point the investigator to relevant artefacts, but when dealing with the new or the unknown, or when there is a need to holistically profile the activity of the computer, the investigator is left with a manual and labour-intensive investigation. This paper proposes using an unsupervised statistical learning technique, Principal Component Analysis, as a novel approach to the analysis of digital forensic Internet history. The approach groups and analyses artefacts to produce a high-level contextual view of the timeline data. The paper describes the Principal Component Analysis approach, with the appropriate number of Principal Components selected using the Scree test. A case study of the approach is shown, first using a simulated data set comprising 820 Mozilla Internet history artefacts and then using a set of 5900 Internet Explorer history artefacts from real-world browser data. The results of the analysis are presented in a tabular format that provides an accessible overall view of the activity within the timeline. They show a promising approach to effectively and simply representing large quantities of timeline data at a high level where basic patterns of usage can be determined. Further work on enhancing the proposed approach to include low-level pattern rules is discussed
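The core technique here can be sketched in a few lines of NumPy. The features below (and the 90%-variance cut-off used as a proxy for the Scree test's "elbow") are assumptions for illustration, not the paper's actual feature set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical numeric features derived from Internet-history artefacts,
# e.g. hour-of-day, visit count, URL length, dwell time (illustrative).
X = rng.normal(size=(200, 4))
X[:, 1] = 2.0 * X[:, 0] + 0.1 * X[:, 1]    # make two features correlated

# PCA via eigendecomposition of the covariance of the centred data.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # largest variance first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Scree-style selection: keep components up to a variance threshold
# (a common proxy for eyeballing where the eigenvalue curve flattens).
explained = eigvals / eigvals.sum()
k = int(np.searchsorted(np.cumsum(explained), 0.90) + 1)

scores = Xc @ eigvecs[:, :k]               # artefacts in component space
print(k, scores.shape)
```

Grouping the projected `scores` (e.g. by sign or clustering on the first components) then yields the high-level activity view the abstract describes.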
Digital Forensics Event Graph Reconstruction
Ontological data representation and data normalization can provide a structured way to correlate digital artifacts. This can reduce the amount of data that a forensics examiner needs to process in order to understand the sequence of events that happened on the system. However, ontology processing suffers from large disk consumption and a high computational cost. This paper presents Property Graph Event Reconstruction (PGER), a novel data normalization and event correlation system that leverages a native graph database to improve the speed of queries common in ontological data. PGER reduces the processing time of event correlation grammars and maintains accuracy over a relational database storage format
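The advantage PGER gets from a property graph is that event correlation becomes an edge traversal instead of a relational join. The sketch below emulates that idea with an in-memory graph; the node kinds, relation names, and query are illustrative assumptions, not PGER's actual schema:

```python
from collections import defaultdict

# A minimal in-memory property graph standing in for a native graph
# database such as Neo4j.
nodes = {}                    # node_id -> property dict
edges = defaultdict(list)     # node_id -> [(relation, node_id), ...]

def add_node(nid, **props):
    nodes[nid] = props

def add_edge(src, rel, dst):
    edges[src].append((rel, dst))

# Normalised events linked to the artefact they reference.
add_node("evt1", kind="event", action="download", t=100)
add_node("evt2", kind="event", action="execute", t=160)
add_node("file1", kind="artefact", path="tool.exe")
add_edge("evt1", "WROTE", "file1")
add_edge("evt2", "RAN", "file1")

def correlated_events(artefact_id):
    """Events touching the same artefact, in time order -- one edge hop
    in a graph store, but a join in a relational layout."""
    hits = [nid for nid, props in nodes.items()
            if props.get("kind") == "event"
            and any(dst == artefact_id for _, dst in edges[nid])]
    return sorted(hits, key=lambda nid: nodes[nid]["t"])

print(correlated_events("file1"))  # ['evt1', 'evt2']
```

A native graph database indexes these adjacency lists directly, which is why the traversal-heavy correlation grammars the paper mentions run faster there.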
The Forensic Curator: Digital Forensics as a Solution to Addressing the Curatorial Challenges Posed by Personal Digital Archives
The growth of computing technology during the previous three decades has resulted in a large amount of content being created in digital form. As their creators retire or pass away, an increasing number of personal data collections, in the form of digital media and complete computer systems, are being offered to the academic institutional archive. For the digital curator or archivist, the handling and processing of such digital material represents a considerable challenge, requiring development of new processes and procedures. This paper outlines how digital forensic methods, developed by the law enforcement and legal community, may be applied by academic digital archives. It goes on to describe the strategic and practical decisions that should be made to introduce forensic methods within an existing curatorial infrastructure and how different techniques, such as forensic hashing, timeline analysis and data carving, may be used to collect information of a greater breadth and scope than may be gathered through manual activities
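Of the forensic methods named above, fixity hashing is the simplest to illustrate. The sketch below (file content and workflow are invented for illustration) records a SHA-256 hash at ingest and re-verifies it after processing, the basic guarantee an archive needs that curation did not alter the material:

```python
import hashlib

# Minimal sketch of forensic (fixity) hashing in an accession workflow.
def sha256_bytes(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Illustrative content from a donated personal digital archive.
original = b"personal archive: draft memoir, chapter 1"
hash_on_ingest = sha256_bytes(original)

# ... curatorial processing happens; re-hash to verify fixity ...
hash_after = sha256_bytes(original)
assert hash_on_ingest == hash_after   # any change would break the match
print(hash_on_ingest[:16])
```

In practice the ingest hash is stored in the archive's metadata record so fixity can be re-checked on every future access.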
Automated Artefact Relevancy Determination from Artefact Metadata and Associated Timeline Events
The 2020 IEEE International Conference on Cyber Security And Protection Of Digital Services (Cyber Security 2020), Dublin City University, Ireland (held online due to the coronavirus outbreak, 15-17 June 2020). Case-hindering, multi-year digital forensic evidence backlogs have become commonplace in law enforcement agencies throughout the world. This is due to an ever-growing number of cases requiring digital forensic investigation coupled with the growing volume of data to be processed per case. Leveraging previously processed digital forensic cases and their component artefact relevancy classifications can facilitate an opportunity for training automated artificial intelligence based evidence processing systems. These can significantly aid investigators in the discovery and prioritisation of evidence. This paper presents one approach for file artefact relevancy determination building on the growing trend towards a centralised, Digital Forensics as a Service (DFaaS) paradigm. This approach enables the use of previously encountered pertinent files to classify newly discovered files in an investigation. Trained models can aid in the detection of these files during the acquisition stage, i.e., during their upload to a DFaaS system. The technique generates a relevancy score for file similarity using each artefact's filesystem metadata and associated timeline events. The approach presented is validated against three experimental usage scenarios
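A relevancy score of the general shape described, similarity between a newly uploaded file and previously pertinent files, can be sketched as follows. The feature choices and the cosine-similarity scorer are assumptions for illustration, not the paper's actual model:

```python
import math

# Illustrative metadata features for a file artefact (assumed names):
# size in KB, count of associated timeline events, file-type code.
def features(artefact):
    return [artefact["size_kb"], artefact["n_events"], artefact["ext_code"]]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Files marked pertinent in previously processed cases (invented values).
known_relevant = [
    {"size_kb": 900, "n_events": 14, "ext_code": 3},
    {"size_kb": 850, "n_events": 11, "ext_code": 3},
]

def relevancy_score(new_file):
    """Score a newly uploaded file by its best similarity to any
    previously pertinent file -- a stand-in for the trained model."""
    return max(cosine(features(new_file), features(k)) for k in known_relevant)

candidate = {"size_kb": 880, "n_events": 12, "ext_code": 3}
score = relevancy_score(candidate)
print(round(score, 3))
```

In a DFaaS pipeline this score would be computed at upload time, so high-scoring files surface to the investigator before manual analysis begins.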
Graph-based Temporal Analysis in Digital Forensics
Establishing a timeline as part of a digital forensics investigation is vital to understanding the order in which system events occurred. However, most digital forensics tools present timelines as histograms or as raw artifacts. Consequently, digital forensics examiners are forced to rely on manual, labor-intensive practices to reconstruct system events. Current digital forensics analysis tools are at their technological limits given the increasing storage capacity and complexity of data. A graph-based timeline can present digital forensics evidence in a structure that can be immediately understood and easily focused. This paper presents the Temporal Analysis Integration Management Application (TAIMA), which enhances digital forensics analysis via information visualization (infovis) techniques. TAIMA is a prototype application that provides a graph-based timeline for event reconstruction using abstraction and visualization techniques. A workflow illustration and a pilot usability study provided evidence that TAIMA assisted digital forensics specialists in identifying key system events during analysis
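The abstraction step that makes a graph-based timeline readable, collapsing bursts of low-level events into a few high-level nodes, can be sketched like this. The grouping rule and the 30-second gap threshold are assumptions for illustration, not TAIMA's actual algorithm:

```python
# Invented low-level events: an install burst, then a later web visit.
low_level = [
    {"t": 10, "src": "FILE",    "msg": "setup.exe created"},
    {"t": 11, "src": "FILE",    "msg": "setup.exe accessed"},
    {"t": 12, "src": "REG",     "msg": "Run key added"},
    {"t": 95, "src": "WEBHIST", "msg": "mail.example.com visited"},
]

def abstract_events(events, gap=30):
    """Group events whose timestamps fall within `gap` seconds of the
    previous event; each group becomes one high-level timeline node."""
    groups, current = [], [events[0]]
    for ev in events[1:]:
        if ev["t"] - current[-1]["t"] <= gap:
            current.append(ev)
        else:
            groups.append(current)
            current = [ev]
    groups.append(current)
    return [{"start": g[0]["t"], "end": g[-1]["t"], "size": len(g)}
            for g in groups]

nodes = abstract_events(sorted(low_level, key=lambda e: e["t"]))
print(nodes)  # two high-level nodes: the install burst and the web visit
```

An examiner sees two nodes instead of four raw artifacts, and expands a node only when its burst looks relevant.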
Methodology for the Automated Metadata-Based Classification of Incriminating Digital Forensic Artefacts
The ever-increasing volume of data in digital forensic investigations is one of the most discussed challenges in the field. Usually, most of the file artefacts on seized devices are not pertinent to the investigation. Manually retrieving suspicious files relevant to the investigation is akin to finding a needle in a haystack. In this paper, a methodology for the automatic prioritisation of suspicious file artefacts (i.e., file artefacts that are pertinent to the investigation) is proposed to reduce the manual analysis effort required. This methodology is designed to work in a human-in-the-loop fashion. In other words, it predicts/recommends that an artefact is likely to be suspicious rather than giving the final analysis result. A supervised machine learning approach is employed, which leverages the recorded results of previously processed cases. The processes of feature extraction, dataset generation, training and evaluation are presented in this paper. In addition, a toolkit for data extraction from disk images is outlined, which enables this method to be integrated with the conventional investigation process and work in an automated fashion
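A supervised classifier trained on metadata from past cases can be sketched as below. The feature names, values, and the 1-nearest-neighbour rule are assumptions chosen to keep the example self-contained, not the paper's actual features or model:

```python
import math

# Toy metadata feature vectors (size_kb, path_depth, is_hidden) with
# labels taken from previously processed cases; all values invented.
train = [
    ((2048, 6, 1), "suspicious"),
    ((1900, 7, 1), "suspicious"),
    ((12, 2, 0),   "benign"),
    ((34, 3, 0),   "benign"),
]

def predict(x):
    """1-nearest-neighbour over past cases: a minimal stand-in for the
    supervised model. The output is a recommendation for the examiner,
    not a final verdict (human-in-the-loop)."""
    return min(train, key=lambda pair: math.dist(pair[0], x))[1]

print(predict((2000, 6, 1)))   # -> suspicious
print(predict((20, 2, 0)))     # -> benign
```

Ranking artefacts by such predictions lets the examiner start with the files most likely to matter instead of triaging the whole image.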
Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations
Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations and plans for recovering from them using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans
Exploring Digital Evidence with Graph Theory
The analysis phase of the digital forensic process is the most complex, and it remains highly subjective to the views of the forensic practitioner. There are many tools dedicated to assisting the investigator during the analysis process, but they do not fully address these challenges. Digital forensics needs a consistent approach for drawing the most judicious conclusions from the digital evidence. The objective of this paper is to discuss the ability of graph theory, the study of mathematical structures modelling pairwise relations, to aid in the analysis phase of the digital forensic process. We develop a graph-based representation of digital evidence and evaluate the relations between pieces of evidence. We identify techniques investigators can use to examine digital evidence, and explore how graph theory can serve as a basis for further analysis. Lastly, we demonstrate the potential of applying graph theory through its implementation in a case study
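One simple graph-theoretic aid of the kind described is degree centrality: evidence connected to many other items is a natural starting point for analysis. The evidence items and edges below are invented for illustration, not the paper's case study:

```python
from collections import defaultdict

# Evidence items as nodes; edges represent shared attributes such as a
# common user, overlapping timestamps, or the same device (illustrative).
edge_list = [
    ("email_1", "doc_A"), ("doc_A", "usb_X"),
    ("usb_X", "doc_B"),   ("email_1", "usb_X"),
]

adj = defaultdict(set)
for u, v in edge_list:
    adj[u].add(v)
    adj[v].add(u)

# Rank evidence by degree: the most connected item first.
ranked = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
print(ranked[0], len(adj[ranked[0]]))  # usb_X 3
```

Richer graph measures (paths, components, centrality variants) follow the same pattern once the evidence is in adjacency form.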
IV. Effective Exhibits and Courtroom Technology
Have you perfected your process of trial preparation into an art of war? Are you getting all you can out of the opportunities witnesses and opposing counsel let slip during trial? Do your juries leave the courtroom wanting to reach the verdict that you've clearly stated you want? Join our panel of seasoned trial lawyers for an engaging day of learning and tactics exchange and take your courtroom presentation skills to the next level of excellence