
    Memory acquisition: A 2-Take approach

    As more and more people recognize the value of volatile data, live forensics is gaining weight in digital forensics. It is often used in parallel with traditional pull-the-plug forensics to provide a more reliable result in a forensic examination. One of the core components of live forensics is the collection and analysis of volatile memory data, during which the memory content is acquired to search for relevant evidential data or to investigate various computer processes and unveil the activities being performed by a user. However, this conventional method may have weaknesses because of the volatile nature of memory data and the absence of original data for validation. This may have implications for the admissibility of memory data in a court of law, which requires strict authenticity and reliability of evidence. In this paper, we discuss the impact of various memory acquisition methods and suggest a 2-Take approach which aims to enhance the confidence level of the acquired memory data for legal proceedings. © 2009 IEEE.
    The 2009 International Workshop on Forensics for Future Generation Communication Environments (F2GC-09) in conjunction with CSA 2009, Jeju Island, Korea, 10-12 December 2009. In Proceedings of CSA, 2009, p. 1-
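    The core idea of acquiring memory more than once and cross-checking the results can be illustrated with a minimal sketch. This is not the paper's tooling; the file names and the digest comparison are illustrative assumptions about how two "takes" of a memory image might be compared.

```python
import hashlib

def sha256_file(path, chunk_size=1 << 20):
    """Stream-hash a (potentially large) memory image file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def cross_validate(dump_a, dump_b):
    """Compare digests of two independently acquired images.

    Matching digests raise confidence in the acquired data;
    differing digests flag that memory changed between takes
    and the affected regions need closer examination.
    """
    da, db = sha256_file(dump_a), sha256_file(dump_b)
    return da == db, da, db
```

    In practice the two takes of a live system will rarely be bit-identical, so a real tool would compare at page or structure granularity rather than whole-image digests; the sketch only shows the validation principle.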

    Forensic Memory Analysis for Apple OS X

    Analysis of raw memory dumps has become a critical capability in digital forensics because it gives insight into the state of a system that cannot be fully represented through traditional disk analysis. Interest in memory forensics has grown steadily in recent years, with a focus on the Microsoft Windows operating systems. However, similar capabilities for Linux and Apple OS X have lagged by comparison. The volafox open source project has begun work on structured memory analysis for OS X. The tool currently supports a limited set of kernel structures to parse hardware information, system build number, process listing, loaded kernel modules, syscall table, and socket connections. This research addresses one memory analysis deficiency on OS X by introducing a new volafox module for parsing file handles. When open files are mapped to a process, an examiner can learn which resources the process is accessing on disk. This listing is useful for determining what information may have been the target for exfiltration or modification on a compromised system. Comparing output of the developed module and the UNIX lsof (list open files) command on two versions of OS X and two kernel architectures validates the methodology used to extract file handle information.
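    The validation methodology described above, comparing the module's output against a live lsof listing, amounts to a set comparison of the file paths each source reports. A minimal sketch, with hypothetical inputs standing in for real volafox and lsof output:

```python
def validate_listing(module_files, lsof_files):
    """Compare file paths recovered from a memory image against the
    live lsof listing for the same process (inputs are hypothetical).

    Returns three sets: paths both sources agree on, paths seen only
    in the memory dump, and paths seen only by lsof. Full agreement
    supports the extraction methodology; discrepancies point at
    parsing errors or state that changed between capture and listing.
    """
    mem, live = set(module_files), set(lsof_files)
    return mem & live, mem - live, live - mem
```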

    Consistency Issue on Live Systems Forensics

    Volatile data, being vital to digital investigation, have become part of the standard items targeted in the course of live response to a computer system. In traditional computer forensics, where investigation is carried out on a dead system (e.g. a hard disk), data integrity is the first and foremost issue for digital evidence validity in court. In the context of live system forensics, volatile data are acquired from a running system. Due to their ever-changing and volatile nature, it is impossible to verify the integrity of volatile data. Beyond the integrity issue, a more critical problem, data consistency, affects data collected on a live system. In this paper, we address and study the consistency issue in live systems forensics. By examining the memory data on a Unix system, we outline a model to distinguish integral data from inconsistent data in a memory dump.
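    The consistency problem stems from the fact that a memory dump is not an instantaneous snapshot: pages are captured over a window of time while the system keeps running. One way to model the distinction between integral and inconsistent data is to flag any parsed object that was modified during the acquisition window. This is a simplified illustration of the idea, not the paper's model; the record fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Record:
    """A parsed kernel object, with the (hypothetical) time its
    containing structure was last modified."""
    name: str
    last_modified: float

def partition_consistency(records, dump_start, dump_end):
    """Split parsed records into integral vs potentially inconsistent.

    Data untouched over the whole acquisition window can be treated
    as integral; data modified while the dump was in progress may
    reference structures captured in a different state, so it is
    flagged as potentially inconsistent.
    """
    integral, inconsistent = [], []
    for r in records:
        if dump_start <= r.last_modified <= dump_end:
            inconsistent.append(r)
        else:
            integral.append(r)
    return integral, inconsistent
```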

    Forensic Live Response: Why an Object May be Evidence in the Court of Law?

    Volatile data, being vital to digital investigation, have become part of the standard items targeted in the course of a forensic live response to a computer system. In traditional computer forensics, where investigation is carried out on a dead system (for example, a hard disk), data integrity is the first and foremost issue for digital evidence validity in the court of law. In the context of live system forensics, volatile data are acquired from a running system. Due to their ever-changing and volatile nature, it is impossible to verify the integrity of volatile data. Beyond the integrity issue, the more critical problems are the steadiness, accuracy, and validity of the data when proving whether an object found in volatile memory may be used as evidence in a court of law. This digital evidence is related to the data collected on a live system. In this paper, we concentrate on the consistency issue in live systems forensics, on the basis that an object gathered at the crime scene may be used as evidence in the court of law. We examine the memory data and the concept of an investigation to determine what is required in an event-based analysis of digital forensics, which includes an investigation process model. Physical crime scene data can be used to develop hypotheses and answer questions about an incident or crime, and in turn to argue the object-based evidence of an event.