2,029 research outputs found

    A Simple Experiment with Microsoft Office 2010 and Windows 7 Utilizing Digital Forensic Methodology

    Digital forensic examiners are tasked with retrieving data from digital storage devices, and they are frequently expected to explain the circumstances that led to the data being in its current state. Through written reports or verbal expert testimony delivered in court, digital forensic examiners are expected to describe whether data have been altered and, if so, to what extent. Meeting these expectations depends on the opinions digital forensic examiners form from their understanding of electronic storage and retrieval methods. The credibility of these opinions derives from the scientific basis on which they rest, applied through forensic methodology. Digital forensic methodology, being a scientific process, is built on observations and repeatable findings in controlled environments. Furthermore, scientific research methods have established that causal conclusions can be drawn only from controlled experiments. With this in mind, it is beneficial for digital forensic examiners to have a library of experiments they can perform, observe, and draw conclusions from. After conducting an experiment on a specific topic, a digital forensic examiner is in a better position to state with confidence the current state of the data and perhaps the conditions that led to it. This study provides a simple experiment using contemporary versions of the most widely used software applications running on the most commonly installed operating system. Using the Microsoft Office 2010 applications, a simple Word document, an Excel spreadsheet, a PowerPoint presentation, and an Access database are created and then modified. A forensic analysis is performed to determine the extent to which the changes to the data can be identified. The value of this study is not that it yields new forensic analysis techniques, but that it illustrates a methodology other digital forensic examiners can apply to develop experiments representing their own specific data challenges
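The before-and-after comparison at the heart of such an experiment can be reproduced with nothing more than cryptographic hashes. The sketch below is not the study's actual tooling; it hashes a byte string standing in for a document before and after a modification, illustrating the repeatable-observation principle the abstract describes.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for the bytes of a document before and after editing.
original = b"Quarterly report, draft 1"
modified = b"Quarterly report, draft 2"

# A changed digest proves the content was altered; a matching digest on
# repeated runs shows the observation itself is repeatable.
print(sha256_of(original) == sha256_of(modified))  # False: the edit is detectable
print(sha256_of(original) == sha256_of(original))  # True: the measurement repeats
```

In practice an examiner would hash the files on disk before and after each controlled modification, which also captures metadata-only changes that are invisible in the rendered document.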

    Calm before the storm: the challenges of cloud computing in digital forensics

    Cloud computing is a rapidly evolving information technology (IT) phenomenon. Rather than procure, deploy and manage a physical IT infrastructure to host their software applications, organizations are increasingly deploying their infrastructure into remote, virtualized environments, often hosted and managed by third parties. This development has significant implications for digital forensic investigators, equipment vendors, law enforcement, as well as corporate compliance and audit departments (among others). Much of digital forensic practice assumes careful control and management of IT assets (particularly data storage) during the conduct of an investigation. This paper summarises the key aspects of cloud computing and analyses how established digital forensic procedures will be invalidated in this new environment. Several new research challenges addressing this changing context are also identified and discussed

    The Electronic Evidence Examination Reporting System by the Example of West Prefecture

    The master's thesis examined ways to speed up the content analysis of electronic evidence within the examination process. Making police work with electronic evidence more effective saves examiners' time and, in turn, taxpayers' money, which benefits society. The problem is a constant lack of time during the electronic evidence examination procedure, compounded by the fact that examiners' notes are currently collected in a disorderly fashion. This work focuses on practical issues such as how to speed up the drafting of an electronic evidence examination protocol. It is based on examination data collected in the West Prefecture, drawn from real work statistics with the permission of the Police and Border Guard Board. The proposed data collection model can make the examiner more productive, and quantitative and qualitative analysis indicates that it is one viable way to accelerate electronic evidence examination. As part of the work, the author developed a practical Microsoft Access application to help examiners collect examination data. The thesis concludes that using a database to organize examiners' notes makes their work faster and more productive, and the resulting application can be used in electronic evidence examination proceedings
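The thesis's tool is a Microsoft Access application; the same design idea, structured and queryable examination notes instead of free-form text, can be sketched with Python's built-in sqlite3 module. The schema, case number, and note text below are illustrative inventions, not the thesis's actual data model.

```python
import sqlite3

# In-memory database standing in for the thesis's note-collection application.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE exam_notes (
        id          INTEGER PRIMARY KEY,
        case_no     TEXT NOT NULL,
        device      TEXT NOT NULL,
        note        TEXT NOT NULL,
        recorded_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.execute("INSERT INTO exam_notes (case_no, device, note) VALUES (?, ?, ?)",
             ("2023-114", "mobile phone", "Browser history extracted"))
conn.execute("INSERT INTO exam_notes (case_no, device, note) VALUES (?, ?, ?)",
             ("2023-114", "USB stick", "Deleted files carved"))
conn.commit()

# Structured notes can be pulled back per case, in order, when the
# examination protocol is drafted.
rows = conn.execute(
    "SELECT device, note FROM exam_notes WHERE case_no = ? ORDER BY id",
    ("2023-114",)).fetchall()
print(rows)
```

Because every note carries a case number and timestamp, assembling the protocol becomes a query rather than a manual collation of scattered notes.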

    Investigation of IndexedDB Persistent Storage for Digital Forensics

    The dependency on electronic services is increasing at a rapid rate in every aspect of our daily lives. While the COVID-19 pandemic remolded how we conduct business through remote collaboration applications, social media is entrenching itself ever more deeply in our day-to-day activities. Every day, a substantial amount of data is left behind in both desktop and web-based applications. As the size and sophistication of stored data increase, so does the complexity of the technology that handles it. Consequently, forensic investigators face the challenge of constantly adapting to emerging technologies, which now form the base for handling the vast volume of data in the modern era of information technology. In the scope of this dissertation, the emerging client-side storage technology IndexedDB is scrutinized for forensic value and for practices of extraction, processing, presentation, and verification. A series of single-case pretest-posttest quasi-experiments is conducted to populate artifacts in the underlying storage technologies of IndexedDB. The populated artifacts are then extracted and processed based on signature patterns and evaluated for their significance. Additionally, the artifacts are characterized, verified, and presented with the help of cornerstone tools implemented in this scope. Furthermore, a time-frame analysis is constructed that can display ordered sequences of events to investigators in a suitable format
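Chromium-based browsers back IndexedDB with LevelDB files on disk, so a signature-pattern pass of the kind the abstract describes can be sketched as a raw byte scan. The file bytes and the e-mail-address signature below are fabricated stand-ins, not real artifacts or the dissertation's actual tooling.

```python
def scan_for_signature(blob: bytes, signature: bytes, context: int = 16):
    """Yield (offset, surrounding bytes) for each occurrence of `signature`."""
    start = 0
    while (hit := blob.find(signature, start)) != -1:
        lo = max(0, hit - context)
        yield hit, blob[lo:hit + len(signature) + context]
        start = hit + 1

# Stand-in for the raw bytes of a LevelDB .ldb file backing an IndexedDB store.
raw = b"\x00\x17leveldb\x01user@example.org\x00padding\x01user@example.org\xff"

hits = list(scan_for_signature(raw, b"user@example.org"))
print([offset for offset, _ in hits])  # two hits, at offsets 10 and 35
```

The surrounding context bytes matter in practice: IndexedDB values are serialized and compressed in the backing store, so a raw-byte hit is a lead to be decoded and verified, not a finished artifact.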

    Whitelisting System State In Windows Forensic Memory Visualizations

    Examiners in the field of digital forensics regularly encounter enormous amounts of data and must identify the few artifacts of evidentiary value. The most pressing challenge these examiners face is manual reconstruction of complex datasets with both hierarchical and associative relationships. The complexity of this data requires significant knowledge, training, and experience to examine correctly and efficiently. Current methods provide primarily text-based representations or low-level visualizations, but leave the task of maintaining global context of system state to the examiner. This research presents a visualization tool that improves analysis through simultaneous representation of the hierarchical and associative relationships and local detailed data within a single-page application. A novel whitelisting feature further improves analysis by eliminating items of little interest from view, allowing examiners to identify artifacts more quickly and accurately. Results from two pilot studies demonstrate that the visualization tool can help examiners identify artifacts of interest more accurately and quickly
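At its core, the whitelisting feature reduces to a set difference over system state. The baseline and captured process names below are hypothetical examples, not output of the tool described above.

```python
# Known-good system state, e.g. captured from a clean baseline image.
whitelist = {"System", "smss.exe", "csrss.exe", "services.exe", "svchost.exe"}

# Process list reconstructed from the memory capture under examination.
observed = ["System", "smss.exe", "svchost.exe", "mimikatz.exe",
            "csrss.exe", "dropper.exe"]

# Whitelisting removes expected entries so only anomalies remain in view.
of_interest = [p for p in observed if p not in whitelist]
print(of_interest)  # the examiner now triages 2 items instead of 6
```

Hiding the whitelisted majority is what preserves the "global context" the abstract mentions: the hierarchy stays intact while only unexplained nodes draw attention.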

    Defensive Cyber Battle Damage Assessment Through Attack Methodology Modeling

    Due to the growing sophistication of advanced persistent cyber threats, it is necessary to understand and accurately assess cyber attack damage to digital assets. This thesis proposes a Defensive Cyber Battle Damage Assessment (DCBDA) process which utilizes a comprehensive understanding of all possible cyber attack methodologies, captured in a Cyber Attack Methodology Exhaustive List (CAMEL). This research proposes CAMEL to provide detailed knowledge of cyber attack actions, methods, capabilities, forensic evidence and evidence collection methods. This product is modeled as an attack tree called the Cyber Attack Methodology Attack Tree (CAMAT). The proposed DCBDA process uses CAMAT to analyze potential attack scenarios used by an attacker. These scenarios are matched to the associated digital forensic methods in CAMEL to correctly collect and analyze the damage from a cyber attack. Experimental results show that the DCBDA process can be successfully applied to cyber attack scenarios to correctly assess the extent, method and damage caused by a cyber attack
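An attack tree such as CAMAT lends itself to simple recursive enumeration of attack scenarios. The toy tree below uses invented step names purely to illustrate the root-to-leaf traversal; it is not drawn from the thesis's actual CAMEL content.

```python
# Toy CAMAT-style attack tree: each node maps to its child techniques;
# empty dicts are leaves, i.e. concrete attacker actions.
tree = {
    "gain access": {
        "phishing": {"credential harvest": {}},
        "exploit service": {},
    },
    "persist": {
        "registry run key": {},
        "scheduled task": {},
    },
}

def attack_paths(node, prefix=()):
    """Enumerate root-to-leaf paths; each path is one candidate attack scenario."""
    if not node:
        yield prefix
        return
    for step, children in node.items():
        yield from attack_paths(children, prefix + (step,))

paths = list(attack_paths(tree))
print(len(paths))  # 4 candidate scenarios to map to forensic evidence
```

Each enumerated path is a scenario to which the associated evidence-collection methods can then be attached, which is the mapping step the DCBDA process performs against CAMEL.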

    Forensic Research on Solid State Drives using Trim Analysis

    The way we store data has changed tremendously over the past decade. Hard Disk Drives, long one of the major storage media, are being replaced with Solid State Drives owing to their higher efficiency and portability. Digital forensics has been very successful at recovering data from Hard Disk Drives and is well established for them. The shift from Hard Disk Drives to Solid State Drives poses many challenges to digital forensics, because the architecture and the way data is stored in Solid State Drives differ in crucial respects. This paper gives a detailed picture of the evolution from Hard Disk Drives to Solid State Drives. We examine the differences in their architecture and the ways data can be extracted from them. We further discuss in detail the various challenges Solid State Drives pose to the field of digital forensics and address two contradictory beliefs: 1) Is data permanently deleted on a Solid State Drive, destroying the forensic evidence required to solve a case? 2) Can data be recovered from a Solid State Drive using proper techniques and still be used as evidence? We introduce concepts such as the TRIM command and garbage collection and explain their importance, and we set up an experimental scenario in which we enable the TRIM command and attempt to extract data from different types of Solid State Drives. We compare and evaluate the results of the experiment and analyze the effect of the TRIM command across various Solid State Drives. The paper also discusses future work to make Solid State Drives more tractable for digital forensics
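One way to observe TRIM's effect in such an experiment is to carve a raw device image for a file signature before and after deletion. The sketch below simulates this with an in-memory image and the JPEG start-of-image marker; the zero-filled "after" image mirrors a controller that returns zeroes for trimmed LBAs, which is one common (but not universal) deterministic-read-after-trim behavior.

```python
JPEG_SOI = b"\xff\xd8\xff"  # JPEG start-of-image magic number

def carve_offsets(image: bytes, magic: bytes = JPEG_SOI):
    """Return every offset where the magic number appears in the image."""
    hits, start = [], 0
    while (i := image.find(magic, start)) != -1:
        hits.append(i)
        start = i + 1
    return hits

# Image taken before deletion: a JPEG header embedded among other sectors.
before_trim = b"\x00" * 32 + JPEG_SOI + b"fake jpeg body" + b"\x00" * 32

# After deletion on a TRIM-enabled SSD, a later read of the trimmed LBAs
# may return zeroes, so the signature vanishes from the acquired image.
after_trim = b"\x00" * len(before_trim)

print(carve_offsets(before_trim))  # [32]
print(carve_offsets(after_trim))   # []
```

On a Hard Disk Drive (or an SSD with TRIM disabled) the second carve would still find the signature, which is exactly the contrast the experimental scenario in the paper is designed to measure.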