
    OpenForensics: a digital forensics GPU pattern matching approach for the 21st century

    Pattern matching is a crucial component employed in many digital forensic (DF) analysis techniques, such as file carving. The capacity of storage available on modern consumer devices has increased substantially in recent years, making the pattern matching approaches of current-generation DF tools increasingly ineffective in performing timely analyses of data seized in a DF investigation. As pattern matching is a trivially parallelisable problem, general-purpose programming on graphics processing units (GPGPU) is a natural fit for it. This paper presents a pattern matching framework - OpenForensics - that demonstrates substantial performance improvements from the use of modern parallelisable algorithms and graphics processing units (GPUs) to search for patterns within forensic images and local storage devices.
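    The decomposition this abstract relies on can be sketched in a few lines: every offset of the target buffer is checked independently against the signature set, so the work can be split across as many execution units as are available. The sketch below only illustrates that idea; it is not the OpenForensics code, and the pattern set, worker count, and function names are invented. Python worker processes stand in for blocks of GPU threads.

    # Illustrative only: each worker scans its share of offsets against a small
    # signature set, mirroring a one-thread-per-offset GPU decomposition.
    # Call parallel_search() from under `if __name__ == "__main__":` on
    # platforms that spawn worker processes.
    from concurrent.futures import ProcessPoolExecutor

    PATTERNS = [b"\xff\xd8\xff", b"%PDF-"]        # example signatures: JPEG, PDF

    def scan_range(args):
        buf, start, end = args
        hits = []
        for offset in range(start, end):          # one "thread" per offset
            for pat in PATTERNS:
                if buf[offset:offset + len(pat)] == pat:
                    hits.append((offset, pat))
        return hits

    def parallel_search(buf, workers=4):
        step = max(1, -(-len(buf) // workers))    # ceiling division
        ranges = [(buf, s, min(len(buf), s + step)) for s in range(0, len(buf), step)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return sorted(h for hits in pool.map(scan_range, ranges) for h in hits)

    On a GPU the buffer would be copied to device memory once and each thread would evaluate a single offset; the process pool here only mimics that division of labour.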

    Using multiple GPUs to accelerate string searching for digital forensic analysis

    String searching within a large corpus of data is an important component of digital forensic (DF) analysis techniques such as file carving. The continuing increase in the capacity of consumer storage devices requires corresponding improvements to the performance of string searching techniques. As string searching is a trivially parallelisable problem, GPGPU approaches are a natural fit – but previous studies have found that local storage presents an insurmountable performance bottleneck. We show that this need not be the case with modern hardware, and demonstrate substantial performance improvements from the use of single and multiple GPUs when searching for strings within a typical forensic disk image.
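    One practical detail such a pipeline has to handle, however many GPUs are attached, is that the image is read in chunks and a string may straddle a chunk boundary. The sketch below is a hypothetical illustration of that dispatch pattern, not the implementation evaluated in the paper: chunks overlap by one byte less than the pattern length, worker threads stand in for GPUs, and the chunk size is arbitrary.

    # Hypothetical chunked dispatch: overlapping reads guarantee that a match
    # spanning a chunk boundary is reported exactly once.
    from concurrent.futures import ThreadPoolExecutor

    CHUNK_SIZE = 64 * 1024 * 1024                 # 64 MiB reads from the image

    def find_in_chunk(chunk, base, pattern, limit):
        """Return absolute offsets of matches whose start lies before `limit`."""
        hits, i = [], chunk.find(pattern)
        while i != -1:
            if i < limit:                         # starts inside the overlap belong to the next chunk
                hits.append(base + i)
            i = chunk.find(pattern, i + 1)
        return hits

    def search_image(path, pattern, workers=2):
        overlap = len(pattern) - 1
        futures, base = [], 0
        with open(path, "rb") as img, ThreadPoolExecutor(max_workers=workers) as pool:
            while True:
                img.seek(base)
                chunk = img.read(CHUNK_SIZE + overlap)
                if not chunk:
                    break
                futures.append(pool.submit(find_in_chunk, chunk, base, pattern, CHUNK_SIZE))
                base += CHUNK_SIZE
        return sorted(off for f in futures for off in f.result())

    The limit argument is what prevents double counting: a match that begins inside the overlap region is left for the chunk that starts there.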

    Advanced Techniques for Improving the Efficacy of Digital Forensics Investigations

    Digital forensics is the science concerned with discovering, preserving, and analyzing evidence on digital devices. The intent is to be able to determine what events have taken place, when they occurred, who performed them, and how they were performed. In order for an investigation to be effective, it must exhibit several characteristics. The results produced must be reliable, or else the theory of events based on the results will be flawed. The investigation must be comprehensive, meaning that it must analyze all targets which may contain evidence of forensic interest. Since any investigation must be performed within the constraints of available time, storage, manpower, and computation, investigative techniques must be efficient. Finally, an investigation must provide a coherent view of the events under question using the evidence gathered. Unfortunately the set of currently available tools and techniques used in digital forensic investigations does a poor job of supporting these characteristics. Many tools used contain bugs which generate inaccurate results; there are many types of devices and data for which no analysis techniques exist; most existing tools are woefully inefficient, failing to take advantage of modern hardware; and the task of aggregating data into a coherent picture of events is largely left to the investigator to perform manually. To remedy this situation, we developed a set of techniques to facilitate more effective investigations. To improve reliability, we developed the Forensic Discovery Auditing Module, a mechanism for auditing and enforcing controls on accesses to evidence. To improve comprehensiveness, we developed ramparser, a tool for deep parsing of Linux RAM images, which provides previously inaccessible data on the live state of a machine. To improve efficiency, we developed a set of performance optimizations, and applied them to the Scalpel file carver, yielding order-of-magnitude improvements in processing speed and storage requirements. Lastly, to facilitate more coherent investigations, we developed the Forensic Automated Coherence Engine, which generates a high-level view of a system from the data generated by low-level forensics tools. Together, these techniques significantly improve the effectiveness of digital forensic investigations conducted using them.
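    For readers unfamiliar with the file carver named above, the core operation it accelerates is simple to state: scan the raw image for a header signature and copy everything up to the matching footer (or a size cap) out as a recovered file. The sketch below illustrates only that basic header/footer carve; it is not Scalpel's implementation, and the JPEG signatures, size cap, and output naming are placeholders.

    # Minimal header/footer carving sketch (JPEG only); real carvers stream the
    # image and support many file types via a configuration of header/footer rules.
    HEADER, FOOTER = b"\xff\xd8\xff", b"\xff\xd9"   # JPEG start / end markers
    MAX_SIZE = 8 * 1024 * 1024                      # give up after 8 MiB

    def carve_jpegs(image_path, out_prefix="carved"):
        with open(image_path, "rb") as f:
            data = f.read()                         # whole-image read, fine for a sketch
        count, pos = 0, data.find(HEADER)
        while pos != -1:
            end = data.find(FOOTER, pos + len(HEADER), pos + MAX_SIZE)
            if end != -1:
                with open(f"{out_prefix}_{count:04d}.jpg", "wb") as out:
                    out.write(data[pos:end + len(FOOTER)])
                count += 1
            pos = data.find(HEADER, pos + 1)
        return count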

    Possibilities of Autopsy tool use for forensic purposes

    The rapid development and widespread use of information technology has brought dramatic changes to all spheres of human activity. At present it is difficult to imagine how the world functioned without these technologies. However, despite all the advantages it brings, information technology has opened various opportunities for misuse. This has driven the development of a new scientific discipline called digital forensics, which deals with the collection, preservation, analysis and presentation of digital evidence. Since digital evidence is very sensitive (easy to delete, modify, etc.), it usually cannot be detected or examined with conventional tools; specialized forensic tools that can reliably identify such evidence are required instead. A number of forensic tools, both commercial and non-commercial, are available on the market. Some are used for individual steps of the digital forensic investigation process, while others are multi-functional. When discussing the differences between commercial and non-commercial tools, a frequently asked question is which tools are better, more reliable, faster, more functional, and so on. This paper describes the use of Autopsy, one of the best-known non-commercial forensic tools, and compares its properties with those of the commercial tool FTK (Forensic Toolkit).

    Web Based Cyber Forensics Training For Law Enforcement

    Training and education are two of the most important aspects of cyber forensics, and they have been a concern since the inception of the field. Training law enforcement is particularly important to ensure proper execution of the digital forensics process, and because the proliferation of technology into society continues to grow at an exponential rate. Just as technology is used for good, there are those who will choose to use it for criminal gain, so it is critical that law enforcement have the tools and training in cyber forensics. This research sought to determine whether web-based training is a feasible platform for cyber forensics training. A group of Indiana State Police Troopers was asked to participate in an online study in which they were presented with cyber forensics training material. The study found a statistically significant difference between the treatment groups and the control group, and the results showed that web-based training is an effective means of training a large group of law enforcement officers.

    Reconstructing the progress of digital forensic evidence examination and analysis

    Abstract. Many commands and tools are used during the evidence examination and analysis stages of digital forensics. If the need to replicate the exact steps from these stages arises later, doing so without proper documentation can be an arduous task. This thesis therefore focuses on determining how the story of a digital forensic investigation's progression can be told. To tell that story, the thesis contributes a three-piece system consisting of an updated version of the data collection tool Hardtrace, an Application Programming Interface (API) for summarizing collected data and storing it in the cloud, and a visualizer application that allows forensic researchers to visually inspect the steps taken during examination and analysis. To obtain data on digital forensic progression and to test the system, a case study was conducted in which participants completed a memory forensics Capture the Flag challenge while using Hardtrace; the data collected from each participant was sent to the cloud API. The system's ability to reconstruct and detail the progression of the participants' work was tested by performing visual and statistical analysis on the summarized data, and system performance testing was also conducted. The results demonstrated that the system was able to detail, through visualization, the steps the case study participants took while solving the challenge. Statistical summary analysis provided a large quantity of information on how each participant worked, deepening the understanding gained from visual analysis alone. Finally, performance analysis showed that the system is able to summarize and visualize data in seconds. Updates to Hardtrace reduced command execution times significantly; nonetheless, the more system calls a tool or command performs, the more execution-time overhead Hardtrace still adds.
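    The thesis abstract does not reproduce Hardtrace's record format, so the sketch below is a purely hypothetical illustration of the kind of summary the cloud API is described as producing: given a log of timestamped commands and their durations, report how often each tool was run and how much time it consumed, the sort of aggregate a progression visualizer can plot. The log records, command lines, and function name are all invented.

    # Hypothetical records only: (seconds since session start, command line, duration in seconds)
    from collections import defaultdict

    log = [
        (12.0, "vol -f memory.raw windows.pslist", 41.5),
        (70.5, "vol -f memory.raw windows.netscan", 55.5),
        (140.1, "strings memory.raw", 23.4),
    ]

    def summarise(records):
        """Aggregate run counts and total runtime per tool."""
        per_tool = defaultdict(lambda: {"runs": 0, "seconds": 0.0})
        for _, cmdline, duration in records:
            tool = cmdline.split()[0]             # first token names the tool
            per_tool[tool]["runs"] += 1
            per_tool[tool]["seconds"] += duration
        return dict(per_tool)

    print(summarise(log))
    # {'vol': {'runs': 2, 'seconds': 97.0}, 'strings': {'runs': 1, 'seconds': 23.4}}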