381 research outputs found

    Dealing with temporal inconsistency in automated computer forensic profiling

    Computer profiling is the automated forensic examination of a computer system in order to provide a human investigator with a characterisation of the activities that have taken place on that system. As part of this process, the logical components of the computer system – components such as users, files and applications – are enumerated and the relationships between them discovered and reported. This information is enriched with traces of historical activity drawn from system logs and from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work examines the impact of temporal inconsistency in such information and discusses two types of temporal inconsistency that may arise – inconsistency arising from the normal errant behaviour of a computer system, and inconsistency arising from deliberate tampering by a suspect – together with techniques for dealing with inconsistencies of the latter kind. We examine the impact of deliberate tampering through experiments conducted with prototype computer profiling software. Based on the results of these experiments, we discuss techniques which can be employed in computer profiling to deal with such temporal inconsistencies.
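
    The abstract does not include the authors' detection code; the following is a minimal illustrative sketch (the record fields and consistency rules are assumptions, not the paper's implementation) of the kind of check described, flagging file timestamps that contradict one another:

```python
# Minimal sketch (not from the paper): flag file records whose timestamps
# contradict each other, e.g. a modification time earlier than the creation time.
# The FileRecord fields and the rule set are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FileRecord:
    path: str
    created: datetime
    modified: datetime
    accessed: datetime

def temporal_inconsistencies(records):
    """Yield (record, reason) pairs for timestamps that cannot all be true."""
    for r in records:
        if r.modified < r.created:
            yield r, "modified before created"
        if r.accessed < r.created:
            yield r, "accessed before created"

records = [
    FileRecord("report.doc",
               created=datetime(2021, 5, 4, 10, 0),
               modified=datetime(2021, 5, 1, 9, 0),   # earlier than creation: suspicious
               accessed=datetime(2021, 5, 4, 11, 0)),
]
for rec, reason in temporal_inconsistencies(records):
    print(f"{rec.path}: {reason}")
```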

    The Forensic Curator: Digital Forensics as a Solution to Addressing the Curatorial Challenges Posed by Personal Digital Archives

    The growth of computing technology over the previous three decades has resulted in a large amount of content being created in digital form. As their creators retire or pass away, an increasing number of personal data collections, in the form of digital media and complete computer systems, are being offered to academic institutional archives. For the digital curator or archivist, the handling and processing of such digital material represents a considerable challenge, requiring the development of new processes and procedures. This paper outlines how digital forensic methods, developed by the law enforcement and legal community, may be applied by academic digital archives. It goes on to describe the strategic and practical decisions that should be made to introduce forensic methods within an existing curatorial infrastructure, and how different techniques, such as forensic hashing, timeline analysis and data carving, may be used to collect information of a greater breadth and scope than can be gathered through manual activities.
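
    As an illustration of one technique named above, forensic (fixity) hashing, the following minimal sketch walks an acquired directory tree and builds a SHA-256 manifest; the directory name and manifest layout are assumptions, not part of the paper:

```python
# Minimal sketch (not from the paper): compute fixity hashes for every file in
# an acquired directory tree so later copies can be verified against a manifest.
import hashlib
from pathlib import Path

def hash_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks and return its SHA-256 digest."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    root_path = Path(root)
    return {str(p.relative_to(root_path)): hash_file(p)
            for p in root_path.rglob("*") if p.is_file()}

if __name__ == "__main__":
    # "acquired_media" is a hypothetical directory of files received by the archive.
    for rel_path, digest in build_manifest("acquired_media").items():
        print(digest, rel_path)
```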

    Digital Forensics Event Graph Reconstruction

    Ontological data representation and data normalization can provide a structured way to correlate digital artifacts. This can reduce the amount of data that a forensics examiner needs to process in order to understand the sequence of events that happened on the system. However, ontology processing suffers from large disk consumption and a high computational cost. This paper presents Property Graph Event Reconstruction (PGER), a novel data normalization and event correlation system that leverages a native graph database to improve the speed of queries common in ontological data. PGER reduces the processing time of event correlation grammars and maintains accuracy over a relational database storage format.
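
    PGER's own implementation is not reproduced in the abstract; the sketch below only illustrates the general idea of normalising artifacts into property-graph nodes and correlating events along edges, with networkx standing in for a native graph database (node labels and properties are illustrative assumptions):

```python
# Minimal sketch (not PGER itself): represent normalized artifacts as nodes in a
# property graph, correlate events via edges, then query the neighbourhood of a
# file of interest. networkx stands in for a native graph database.
import networkx as nx

g = nx.MultiDiGraph()

# Nodes carry properties describing each normalized artifact.
g.add_node("user:alice", kind="user")
g.add_node("proc:winword.exe", kind="process")
g.add_node("file:report.doc", kind="file")

# Edges carry the correlated event and its timestamp.
g.add_edge("user:alice", "proc:winword.exe", event="launched", ts="2021-05-04T10:00Z")
g.add_edge("proc:winword.exe", "file:report.doc", event="modified", ts="2021-05-04T10:05Z")

# Event reconstruction query: everything that acted on the file of interest.
for src, dst, data in g.in_edges("file:report.doc", data=True):
    print(f"{src} {data['event']} {dst} at {data['ts']}")
```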

    Improving the measurement of system time on remote hosts

    The tools and techniques of digital forensics are useful in investigating system failures, gathering evidence of illegal activities, and analyzing computer systems after cyber attacks. Constructing an accurate timeline of digital events is essential to forensic analysis, and developing a correlation between a computer’s system time and a standard time such as Coordinated Universal Time (UTC) is key to building such a timeline. In addition to local temporal data, such as file MAC (Modified, Accessed, and Changed/Created) times and event logs, a computer may hold timestamps from other machines, such as email headers, HTTP cookies, and downloaded files. To fully understand the sequence of events on a single computer, investigators need dependable tools for building clock models of all other computers that have contributed to its timestamps. Building clock models involves measuring the system times on remote hosts and correlating them to the time on the local machine. Sending ICMP or IP timestamp requests and analyzing the responses is one way to take this measurement. The Linux program clockdiff utilizes this method, but it is slow and sometimes inaccurate. Using a series of 50 packets, clockdiff consumes an average of 11 seconds in measuring one target. Also, clockdiff assumes that the time difference between the local and target hosts is never greater than 12 hours. When it receives a timestamp showing a greater difference, it manipulates this value without alerting the user, reporting a result that could make the target appear to be more tightly synchronized with the local host than it actually is. Thus, clockdiff is not the best choice for forensic investigators. As a better alternative, we have designed and implemented a program called clockvar, which also uses ICMP and IP timestamp messages. We show by experiment that clockvar maintains precision when system times on the local and target hosts differ by twelve to twenty-four hours, and we demonstrate that clockvar is capable of making measurements up to 1400 times faster than clockdiff.
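
    The clockvar tool itself is not included in the abstract; the sketch below only illustrates the general measurement technique it describes, sending an ICMP timestamp request (type 13) and estimating the remote clock offset from the reply. It uses scapy and a simple midpoint round-trip estimator, which is an assumption rather than the paper's exact method, and it requires raw-socket privileges:

```python
# Minimal sketch (not clockvar): estimate a remote host's clock offset via an
# ICMP timestamp request/reply exchange, in the spirit of clockdiff/clockvar.
# ICMP timestamps are expressed as milliseconds since midnight UTC.
from datetime import datetime, timezone
from scapy.all import IP, ICMP, sr1

def ms_since_midnight_utc() -> int:
    now = datetime.now(timezone.utc)
    return ((now.hour * 60 + now.minute) * 60 + now.second) * 1000 + now.microsecond // 1000

def icmp_clock_offset(target: str, timeout: float = 1.0):
    """Return an estimate of (target_clock - local_clock) in milliseconds, or None."""
    t_orig = ms_since_midnight_utc()
    reply = sr1(IP(dst=target) / ICMP(type=13, ts_ori=t_orig, ts_rx=0, ts_tx=0),
                timeout=timeout, verbose=False)
    if reply is None or reply[ICMP].type != 14:   # 14 = ICMP timestamp reply
        return None
    t_back = ms_since_midnight_utc()
    rtt = t_back - t_orig
    # Midpoint estimate: remote transmit time minus our clock halfway through the exchange.
    return reply[ICMP].ts_tx - (t_orig + rtt // 2)

if __name__ == "__main__":
    print(icmp_clock_offset("192.0.2.1"))   # example target address
```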

    Digital evidence bags

    This thesis analyses the traditional approach and methodology used to conduct digital forensic information capture, analysis and investigation. The predominant toolsets and utilities that are used, and the features that they provide, are reviewed. This is used to highlight the difficulties that are encountered due to both technological advances and the methodologies employed. It is suggested that these difficulties are compounded by the archaic methods and proprietary formats that are used. An alternative framework for the capture and storage of information used in digital forensics is defined, named the 'Digital Evidence Bag' (DEB). A DEB is a universal extensible container for the storage of digital information acquired from any digital source, the format of which can be manipulated to meet the requirements of the particular information that is to be stored. The format definition is extensible, thereby allowing it to encompass new sources of data, cryptographic and compression algorithms, and protocols as they are developed, whilst also providing the flexibility for some degree of backwards compatibility as the format develops. The DEB framework uses terminology to define its various components that is analogous to the evidence bags, tags and seals used for traditional physical evidence storage and continuity. This is crucial for ensuring that the functionality provided by each component is comprehensible to the general public, judiciary and law enforcement personnel without detracting from or obscuring the evidential information contained within. Furthermore, information can be acquired from a dynamic or more traditional static environment and from a disparate range of digital devices. The flexibility of the DEB framework permits selective and/or intelligent acquisition methods to be employed, together with enhanced provenance and continuity audit trails to be recorded. Evidential integrity is assured using accepted cryptographic techniques and algorithms. The DEB framework is implemented in a number of tool demonstrators and applied to a number of typical scenarios that illustrate the flexibility of the DEB framework and format. The DEB framework has also formed the basis of a patent application.
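
    The DEB format itself is defined in the thesis; the following minimal sketch only illustrates the general shape of such a container, a tagged, extensible structure with a provenance audit trail sealed by a cryptographic hash. All field names are illustrative assumptions, not the DEB specification:

```python
# Minimal sketch (not the DEB specification): a tagged, extensible container for
# acquired digital evidence whose contents are sealed with a SHA-256 digest.
import hashlib
import json
from datetime import datetime, timezone

def seal(bag: dict) -> str:
    """Compute an integrity seal over the bag's canonical JSON form."""
    canonical = json.dumps(bag, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

bag = {
    "tag": {                      # analogous to the label on a physical evidence bag
        "case_id": "CASE-0001",
        "examiner": "J. Doe",
        "acquired_at": datetime.now(timezone.utc).isoformat(),
        "source": "USB flash drive, serial 1234",
    },
    "index": [                    # what the bag contains and how it was captured
        {"name": "image.dd", "bytes": 16_000_000, "sha256": "..."},
    ],
    "audit_trail": [              # provenance / continuity entries
        {"action": "acquired", "tool": "dd", "at": "2021-05-04T10:00:00Z"},
    ],
}
print("seal:", seal(bag))
```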

    Decentralized Identity and Access Management Framework for Internet of Things Devices

    The emerging Internet of Things (IoT) domain is about connecting people, devices, and systems together via sensors and actuators to collect meaningful information from the devices' surrounding environment and take actions that enhance productivity and efficiency. The proliferation of IoT devices, from a few billion devices today to over 25 billion in the next few years, spanning heterogeneous networks, defines a new paradigm shift for many industrial and smart connectivity applications. Existing IoT networks face a number of operational challenges linked to device management and to the devices' capability for mutual authentication and authorization. While significant progress has been made in adopting existing connectivity and management frameworks, most of these frameworks are designed to work for unconstrained devices connected in centralized networks. IoT devices, on the other hand, are constrained devices with a tendency to operate in decentralized, peer-to-peer arrangements. This tendency towards peer-to-peer service exchange means that many of the existing frameworks fail to address the main challenge of giving ownership of devices, and of the data they generate, to the actual users. Moreover, the diversity of devices and offered services requires more granular access control mechanisms that limit the exposure of the devices to external threats and provide finer access control policies under the control of the device owner, without the need for a middleman. This work addresses these challenges by utilizing the concepts of decentralization introduced in Distributed Ledger Technologies (DLT) and their capability to automate business flows through smart contracts. The proposed work utilizes decentralized identifiers (DIDs) to establish a decentralized device identity management framework, and exploits blockchain tokenization, through both fungible and non-fungible tokens (NFTs), to build a self-controlled and self-contained access control policy based on the capability-based access control (CapBAC) model. The defined framework provides a layered approach that builds on identity management as the foundation for authentication and authorization processes, and establishes a mechanism for accounting through the adoption of a standardized DLT tokenization structure. The proposed framework is demonstrated by implementing a number of use cases that address issues related to identity management in industries that suffer losses in the billions of dollars due to counterfeiting and the lack of global, immutable identity records. The framework's extension to support applications that build verifiable data paths in the application layer is addressed through two simple examples. The system has been analyzed for the case of issuing authorization tokens, where DLT consensus mechanisms are expected to introduce major performance hurdles. A proof of concept emulating concurrent connections to a single device showed no timed-out requests at 200 concurrent connections and a rise in the timed-out request ratio to 5% at 600 connections. The analysis also showed that the use of a self-contained policy token adds a considerable overhead of 10.4% to the data link budget, a trade-off between building self-contained access tokens with no middleman and the link cost.
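
    The thesis anchors its tokens in a DLT; the sketch below is only a loose illustration of a self-contained capability token in the CapBAC spirit, binding a subject DID to the actions it may invoke on a device and verified locally. An HMAC stands in for the smart-contract and tokenization machinery, and all structure and field names are assumptions rather than the framework's specification:

```python
# Minimal sketch (not the proposed framework): a self-contained capability token
# binding a subject DID to permitted actions on a device. HMAC with a shared
# secret stands in for the DLT-based issuance described in the thesis.
import hashlib
import hmac
import json
import time

DEVICE_KEY = b"device-owner-secret"   # illustrative secret held by the device/owner

def issue_token(subject_did: str, device_did: str, actions: list, ttl: int = 3600) -> dict:
    """Issue a capability token for `subject_did` on `device_did`."""
    payload = {
        "iss": device_did,            # the device owner issues the capability
        "sub": subject_did,           # who may exercise it
        "cap": actions,               # e.g. ["read:temperature", "write:config"]
        "exp": int(time.time()) + ttl,
    }
    mac = hmac.new(DEVICE_KEY, json.dumps(payload, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    return {"payload": payload, "mac": mac}

def authorize(token: dict, action: str) -> bool:
    """Verify the token locally and check the requested action against its capabilities."""
    expected = hmac.new(DEVICE_KEY, json.dumps(token["payload"], sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["mac"])
            and action in token["payload"]["cap"]
            and token["payload"]["exp"] > time.time())

token = issue_token("did:example:alice", "did:example:sensor42", ["read:temperature"])
print(authorize(token, "read:temperature"))   # True
print(authorize(token, "write:config"))       # False
```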

    CeFF: A Framework for Forensics-Enabled Cloud Investigation

    Today, cloud computing has developed into a transformative model for organisations, businesses and governments that brings huge potential and has become popular for its pay-as-you-go, on-demand, scalable and efficient services. However, cloud computing raises concerns for forensic data because the architecture of cloud systems is not adequately taken into account. Due to the distributed nature of the cloud system, many aspects of a forensic investigation, such as data collection, data storage, crime targets and data violations, are difficult to handle. Investigating incidents in the cloud environment is a challenging task because the forensic investigator still needs to rely on third parties, such as the cloud service provider, to perform investigation tasks. This makes the overall forensic process difficult to complete within a given duration and to present to the court. Recently, some cloud forensics studies have addressed challenges such as evidence collection, data acquisition and incident identification. However, there is still a research gap in terms of the consistency of analysing forensic evidence from a distributed environment and a methodology for analysing forensic data in the cloud. This thesis contributes towards addressing these research gaps. In particular, this work proposes a forensic investigation framework, CeFF: a framework for forensics-enabled cloud investigation, to investigate evidence in the cloud computing environment. The framework includes a set of concepts from organisational, technical and legal perspectives, which gives a holistic view of analysing cybercrime from the organisational context where the crime has occurred, through the technical context, to the legal impact. CeFF also includes a systematic process that uses these concepts to perform the investigation. The cloud-enabled forensics framework meets the forensics-related requirements, such as collecting and examining data and presenting the report, and identifies the potential risks to consider while investigating evidence in the cloud computing environment. Finally, the proposed CeFF is applied to a real-life example to validate its applicability. The result shows that CeFF supports analysing the forensic data for a crime that occurred in a cloud-based system in a systematic way.

    Forensic investigation of small-scale digital devices: a futuristic view

    Small-scale digital devices (SSDDs) like smartphones, smart toys, drones, gaming consoles, tablets, and other personal digital assistants have now become ingrained constituents of our daily lives. These devices store massive amounts of data related to the individual traits of users, their routine operations, medical histories, and financial information. At the same time, with continuously evolving technology, the diversity in operating systems, client storage localities, remote/cloud storage and backups, and encryption practices renders the forensic analysis task multi-faceted, leaving forensic investigators to deal with an array of novel challenges. This study reviews the forensic frameworks and procedures used in investigating small-scale digital devices. While highlighting the challenges faced by digital forensics, we explore how cutting-edge technologies like Blockchain, Artificial Intelligence, Machine Learning, and Data Science may play a role in remedying these concerns. The review aims to accumulate the state of the art and identify a futuristic approach for investigating SSDDs.