
    The Potential for cross-drive analysis using automated digital forensic timelines

    Cross-Drive Analysis (CDA) is a technique designed to allow an investigator to “simultaneously consider information from across a corpus of many data sources”. Existing approaches include multi-drive correlation using text searching, e.g. email addresses, message IDs, credit card numbers or social security numbers. Such techniques have the potential to identify drives of interest from a large set, provide additional information about events that occurred on a single disk, and potentially determine social network membership. Another analysis technique that has significantly advanced in recent years is the use of timelines. Tools currently exist that can extract dates and times from file system metadata (i.e. MACE times) and can also examine the content of certain file types and extract metadata from within. This approach provides a great deal of data that can assist with an investigation, but also compounds the problem of having too much data to examine. A recent paper adds a further timeline analysis capability: automatically producing a high-level summary of the activity on a computer system by combining sets of low-level events into high-level events, for example reducing a setupapi event and several Windows Registry events to the single event ‘a USB stick was connected’. This paper investigates the extent to which events in such a high-level timeline have properties suitable to assist with Cross-Drive Analysis. The paper provides several examples that use timelines generated from multiple disk images, including USB stick connections, Skype calls, and access to files on a memory card.
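
    As an illustration of the low-level-to-high-level reduction described above, the following is a minimal sketch in Python; the Event fields, the matching rule, and the time window are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of reducing low-level timeline events to a single
# high-level event. Field names, the matching rule, and the 30-second
# window are assumptions for illustration only.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Event:
    timestamp: datetime
    source: str   # e.g. "setupapi" or "registry"
    message: str

def summarise_usb_connections(events, window=timedelta(seconds=30)):
    """Collapse a setupapi event plus nearby registry events into one
    high-level 'a USB stick was connected' event."""
    events = sorted(events, key=lambda e: e.timestamp)
    high_level = []
    for e in events:
        if e.source == "setupapi" and "USB" in e.message:
            related = [r for r in events
                       if r.source == "registry"
                       and abs(r.timestamp - e.timestamp) <= window]
            if related:  # only summarise when corroborating events exist
                high_level.append(Event(e.timestamp, "summary",
                                        "a USB stick was connected"))
    return high_level
```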

    A framework for the forensic investigation of unstructured email relationship data

    Our continued reliance on email communications ensures that it remains a major source of evidence during a digital investigation. Emails comprise both structured and unstructured data. Structured data provides qualitative information to the forensic examiner and is typically viewed through existing tools. Unstructured data is more complex, as it comprises information associated with social networks, such as relationships within the network, the identification of key actors, and power relations, and there are currently no standardised tools for its forensic analysis. Moreover, email investigations may involve many hundreds of actors and thousands of messages. This paper posits a framework for the forensic investigation of email data. In particular, it focuses on the triage and analysis of unstructured data to identify key actors and relationships within an email network. The paper demonstrates the applicability of the approach by applying relevant stages of the framework to the Enron email corpus, and illustrates the advantage of triaging this data to identify (and discount) actors and potential sources of further evidence. It then applies social network analysis techniques to key actors within the data set. The paper argues that visualisation of unstructured data can greatly aid the examiner in their analysis of evidence discovered during an investigation.
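
    The key-actor identification step described above can be illustrated with a small sketch using the networkx library, assuming email traffic has already been reduced to (sender, recipient) pairs; the data and the choice of degree centrality as the ranking measure are illustrative, not details taken from the framework itself.

```python
# A minimal sketch of ranking email actors by network centrality.
# The edge list is toy data, not drawn from the Enron corpus.
import networkx as nx

edges = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
         ("dave", "alice"), ("dave", "bob")]

g = nx.DiGraph()
g.add_edges_from(edges)

# Rank actors by degree centrality; in triage, low-centrality actors
# could be discounted and high-centrality actors prioritised for review.
centrality = nx.degree_centrality(g)
for actor, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{actor}: {score:.2f}")
```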

    Graph-based Temporal Analysis in Digital Forensics

    Establishing a timeline as part of a digital forensics investigation is a vital part of understanding the order in which system events occurred. However, most digital forensics tools present timelines as histograms or as raw artifacts. Consequently, digital forensics examiners are forced to rely on manual, labor-intensive practices to reconstruct system events. Current digital forensics analysis tools are at their technological limit with the increasing storage and complexity of data. A graph-based timeline can present digital forensics evidence in a structure that can be immediately understood and effortlessly focused. This paper presents the Temporal Analysis Integration Management Application (TAIMA) to enhance digital forensics analysis via information visualization (infovis) techniques. TAIMA is a prototype application that provides a graph-based timeline for event reconstruction using abstraction and visualization techniques. A workflow illustration and a pilot usability study provided evidence that TAIMA assisted digital forensics specialists in identifying key system events during digital forensics analysis.
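
    As a rough illustration of a graph-based timeline, the following sketch chains time-ordered events into a directed graph with networkx; the event data and node structure are assumptions for illustration, not TAIMA's actual data model.

```python
# A minimal sketch of a graph-based timeline: events become nodes and
# consecutive events are linked, so traversing the graph replays the
# reconstructed sequence. The events shown are invented examples.
import networkx as nx

events = [("2021-03-01T09:00", "user logon"),
          ("2021-03-01T09:05", "USB device connected"),
          ("2021-03-01T09:07", "file copied to USB"),
          ("2021-03-01T09:15", "user logoff")]

g = nx.DiGraph()
ordered = sorted(events)  # ISO timestamps sort chronologically
g.add_nodes_from(label for _, label in ordered)

# Chain each event to the one that follows it in time.
for (_, a), (_, b) in zip(ordered, ordered[1:]):
    g.add_edge(a, b)

print(list(nx.topological_sort(g)))
```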

    A Conceptual Cloud Forensic Investigation Process Model for Software as a Service(SaaS) Applications

    This paper explores a structured and systematic cloud forensic investigation process model for SaaS applications, designed to investigate digital crimes in the cloud environment and to enhance the security and privacy of acquired data during a forensic investigation. The proposed model accounts for the distinctive characteristics of cloud environments and the varying levels of access and control within them. The systematic forensic investigation process is detailed in four phases: the initial phase, the acquisition phase, the analysis phase, and the reporting phase. Ultimately, this research aims to enhance the overall trustworthiness and reliability of SaaS application forensics, fostering a safer and more secure cloud forensic investigation landscape through the use of the chain of custody.
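
    As a loose illustration of how a chain of custody might be recorded across the four phases, the following Python sketch tracks hashed evidence entries; the record structure, field names, and hashing scheme are assumptions, not part of the proposed model.

```python
# A minimal sketch of chain-of-custody record keeping for acquired
# cloud data. All names and fields here are illustrative assumptions.
import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CustodyEntry:
    phase: str     # one of: initial, acquisition, analysis, reporting
    handler: str
    action: str
    timestamp: str = field(default_factory=lambda:
                           datetime.now(timezone.utc).isoformat())

def evidence_hash(data: bytes) -> str:
    """Hash acquired data so later phases can verify its integrity."""
    return hashlib.sha256(data).hexdigest()

chain = []
acquired = b"raw SaaS application export"
chain.append(CustodyEntry("acquisition", "examiner-1",
                          f"acquired data, sha256={evidence_hash(acquired)}"))
chain.append(CustodyEntry("analysis", "examiner-2",
                          "verified hash before beginning analysis"))
```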

    Calm before the storm: the challenges of cloud computing in digital forensics

    Cloud computing is a rapidly evolving information technology (IT) phenomenon. Rather than procure, deploy and manage a physical IT infrastructure to host their software applications, organizations are increasingly deploying their infrastructure into remote, virtualized environments, often hosted and managed by third parties. This development has significant implications for digital forensic investigators, equipment vendors, law enforcement, as well as corporate compliance and audit departments (among others). Much of digital forensic practice assumes careful control and management of IT assets (particularly data storage) during the conduct of an investigation. This paper summarises the key aspects of cloud computing and analyses how established digital forensic procedures will be invalidated in this new environment. Several new research challenges addressing this changing context are also identified and discussed.

    Facilitating forensic examinations of multi-user computer environments through session-to-session analysis of internet history

    This paper proposes a new approach to the forensic investigation of Internet history artefacts by aggregating the history from a recovered device into sessions and comparing those sessions to one another to determine whether they are one-time events or form a repetitive or habitual pattern. We describe two approaches for performing the session aggregation: fixed-length sessions and variable-length sessions. We also describe an approach for identifying repetitive pattern-of-life behaviour and show how such patterns can be extracted and represented as binary strings. Using the Jaccard similarity coefficient, a session-to-session comparison can be performed and the sessions can be analysed to determine to what extent a particular session is similar to any other session in the Internet history, and thus is highly likely to correspond to the same user. Experiments have been conducted using two sets of test data, where multiple users have access to the same computer. By identifying patterns of Internet usage that are unique to each user, our approach exhibits a high success rate in attributing particular sessions of the Internet history to the correct user. This can provide considerable help to a forensic investigator trying to establish which user was using the computer when a web-related crime was committed.
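
    The session-to-session comparison step lends itself to a short sketch: assuming each session has already been encoded as a fixed-length binary string (one bit per behavioural feature), the Jaccard similarity coefficient can be computed over the set bits, as below; the encodings shown are illustrative.

```python
# A minimal sketch of comparing two sessions encoded as binary strings.
def jaccard(a: str, b: str) -> float:
    """Jaccard coefficient over the set bits of two equal-length binary
    strings: |intersection| / |union| of the positions set to '1'."""
    inter = sum(1 for x, y in zip(a, b) if x == "1" and y == "1")
    union = sum(1 for x, y in zip(a, b) if x == "1" or y == "1")
    return inter / union if union else 0.0

# Two sessions encoded over the same ten behavioural features
# (invented values for illustration).
session_a = "1101001010"
session_b = "1100001011"
print(f"similarity: {jaccard(session_a, session_b):.2f}")  # 0.67
```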

    Methodology for the Automated Metadata-Based Classification of Incriminating Digital Forensic Artefacts

    The ever-increasing volume of data in digital forensic investigations is one of the most discussed challenges in the field. Usually, most of the file artefacts on seized devices are not pertinent to the investigation. Manually retrieving suspicious files relevant to the investigation is akin to finding a needle in a haystack. In this paper, a methodology for the automatic prioritisation of suspicious file artefacts (i.e., file artefacts that are pertinent to the investigation) is proposed to reduce the manual analysis effort required. This methodology is designed to work in a human-in-the-loop fashion; in other words, it predicts/recommends that an artefact is likely to be suspicious, rather than giving a final analysis result. A supervised machine learning approach is employed, which leverages the recorded results of previously processed cases. The processes of feature extraction, dataset generation, training, and evaluation are presented in this paper. In addition, a toolkit for data extraction from disk images is outlined, which enables this method to be integrated with the conventional investigation process and to work in an automated fashion.
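
    A minimal sketch of the supervised prioritisation step follows, assuming file artefacts are described by a few simple metadata features; the features, labels, and model choice (a scikit-learn random forest) are illustrative assumptions, not the paper's actual toolkit.

```python
# A minimal sketch of recommending suspicious file artefacts from
# metadata features. Features, labels, and model are assumptions.
from sklearn.ensemble import RandomForestClassifier

# Each row: [size_kb, is_hidden, path_depth, extension_mismatch]
X_train = [[120, 0, 3, 0], [4096, 1, 7, 1], [15, 0, 2, 0], [2048, 1, 6, 1]]
y_train = [0, 1, 0, 1]   # 1 = marked suspicious in a previous case

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

# Score new artefacts; the examiner stays in the loop and reviews the
# recommendations rather than accepting them as a final result.
X_new = [[3500, 1, 6, 1], [60, 0, 2, 0]]
for features, score in zip(X_new, model.predict_proba(X_new)[:, 1]):
    print(features, f"suspicion={score:.2f}")
```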