
    GRANEF: Utilization of a Graph Database for Network Forensics

    Understanding the information in captured network traffic, extracting the necessary data, and performing incident investigations are principal tasks of network forensics. The analysis of such data is typically performed by tools that allow manual browsing, filtering, and aggregation, or by tools based on statistical analyses and visualizations that facilitate data comprehension. However, the human brain naturally perceives data through associations, which these tools can provide only in a limited form. We introduce the GRANEF toolkit, which demonstrates a new approach to exploratory network data analysis based on associations stored in a graph database. In this article, we describe the data transformation principles, the utilization of a scalable graph database, and data analysis techniques. We then discuss and evaluate our proposed approach using a realistic dataset. Although we are at the beginning of our research, the current results show the great potential of association-based analysis.
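
    To make the association idea concrete, the following minimal sketch (not taken from the GRANEF toolkit itself) loads Zeek-style connection records into an in-memory property graph with networkx and runs a simple associative query; the record fields and IP addresses are illustrative assumptions.

```python
# Minimal sketch of association-based analysis: network connections as a graph.
# Assumes Zeek-style connection records; field names are illustrative, not GRANEF's schema.
import networkx as nx

connections = [
    {"src": "10.0.0.5", "dst": "192.168.1.10", "dport": 445, "proto": "tcp"},
    {"src": "10.0.0.5", "dst": "192.168.1.20", "dport": 445, "proto": "tcp"},
    {"src": "192.168.1.20", "dst": "172.16.0.9", "dport": 22, "proto": "tcp"},
]

g = nx.MultiDiGraph()
for c in connections:
    # Hosts become nodes; each connection becomes a typed, attributed edge.
    g.add_edge(c["src"], c["dst"], dport=c["dport"], proto=c["proto"])

# Associative query: which hosts are reachable from 10.0.0.5 within two hops?
reachable = nx.single_source_shortest_path_length(g, "10.0.0.5", cutoff=2)
print({host: hops for host, hops in reachable.items() if hops > 0})
```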

    CuFA: A More Formal Definition for Digital Forensic Artifacts

    The term “artifact” currently does not have a formal definition within the domain of cyber/digital forensics, resulting in a lack of standardized reporting, linguistic understanding between professionals, and efficiency. In this paper, we propose a new definition based on a survey we conducted, literature usage, prior definitions of the word itself, and similarities with archival science. This definition includes required fields that all artifacts must have and encompasses the notion of curation. Thus, we propose a new term – curated forensic artifact (CuFA) – to address items which have been cleared for entry into a CuFA database (one implementation, the Artifact Genome Project, abbreviated as AGP, is under development and briefly outlined). An ontological model encapsulates these required fields while utilizing a lower-level taxonomic schema. We use the Cyber Observable eXpression (CybOX) project due to its rising popularity and rigorous classifications of forensic objects. Additionally, we suggest some improvements on its integration into our model and identify higher-level location categories to illustrate tracing an object from creation through investigative leads. Finally, a step-wise procedure for researching and logging CuFAs is devised to accompany the model.
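
    As a rough illustration of the ontological idea, the sketch below models a CuFA record whose required fields must all be present before entry into a database such as AGP; since the abstract does not enumerate the actual required fields, the field names here are hypothetical placeholders.

```python
# Hypothetical sketch of a curated forensic artifact (CuFA) record.
# The required fields below are illustrative placeholders, not the paper's actual schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class CuFA:
    name: str             # human-readable artifact name
    location: str         # higher-level location category (e.g. filesystem, registry)
    cybox_type: str       # CybOX object classification the artifact maps to
    curated_by: str       # analyst or process that cleared the artifact for entry
    curated_at: datetime  # timestamp of curation

record = CuFA(
    name="Chrome visited-URL entry",
    location="filesystem",
    cybox_type="URLObject",
    curated_by="analyst-01",
    curated_at=datetime.now(timezone.utc),
)
print(record)
```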

    A generic database forensic investigation process model

    Database forensic investigation is a domain which deals with database contents and their metadata to reveal malicious activities on database systems. Although the domain is still new, its overwhelming challenges and issues have made database forensics a fast-growing and much sought-after research area. Based on our observations, database forensics suffers from the lack of a common standard that could unify knowledge of the domain. Therefore, in this paper we present the use of Design Science Research (DSR) as a research methodology to develop a Generic Database Forensic Investigation Process Model (DBFIPM). From the creation of the DBFIPM, five common forensic investigation processes have been proposed, namely i) identification, ii) collection, iii) preservation, iv) analysis, and v) presentation. The DBFIPM allows the reconciliation of concepts and terminologies across common database forensic investigation processes, which will potentially facilitate the sharing of knowledge on database forensic investigation among domain stakeholders.
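
    A minimal sketch of the five-process model as a strictly ordered pipeline is given below; the process names come from the DBFIPM itself, while the case identifier and handler logic are placeholder assumptions.

```python
# Minimal sketch of the five DBFIPM processes as an ordered pipeline.
# Process names come from the model; the handler body is a placeholder.
from enum import Enum

class Process(Enum):
    IDENTIFICATION = 1
    COLLECTION = 2
    PRESERVATION = 3
    ANALYSIS = 4
    PRESENTATION = 5

def run_investigation(case_id: str) -> None:
    # Each process must complete before the next begins.
    for process in Process:
        print(f"[{case_id}] entering {process.name.lower()} process")

run_investigation("DB-2024-001")
```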

    Making the Invisible Visible – Techniques for Recovering Deleted SQLite Data Records

    Forensic analysis and evidence collection for web browser activity is a recurring problem in digital investigation. It is not unusual for a suspect to cover his traces. Accordingly, the recovery of previously deleted data such as web cookies and browser history is important. Fortunately, many browsers and thousands of apps use the same database system to store their data: SQLite. Reason enough to take a closer look at this product. In this article, we address the question of how deleted content can be made visible again in an SQLite database. For this purpose, the technical background of the problem is examined first. We then present techniques with which it is possible to carve and recover deleted data records from a database at the binary level. A novel software solution called FQLite is presented that implements the proposed algorithms. The search quality, as well as the performance of the program, is tested using the standard forensic corpus. The results of a performance study are discussed as well. The article ends with a summary and identifies further research questions.
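
    The following simplified sketch illustrates the general idea of freeblock carving in an SQLite file (it is not FQLite's actual algorithm): deleted records often leave their bytes behind in per-page freeblock chains, which can be walked directly at the binary level. The file name is a placeholder.

```python
# Minimal sketch: walk SQLite b-tree pages and dump freeblock contents,
# which often still hold bytes of deleted records. A simplified illustration
# of freeblock carving, not FQLite's actual algorithm.
import struct

BTREE_PAGE_TYPES = {0x02, 0x05, 0x0A, 0x0D}  # interior/leaf index and table pages

def carve_freeblocks(path: str):
    with open(path, "rb") as f:
        data = f.read()
    page_size = struct.unpack(">H", data[16:18])[0]  # big-endian, at file offset 16
    if page_size == 1:                               # value 1 encodes a 65536-byte page
        page_size = 65536
    for page_no in range(1, len(data) // page_size + 1):
        base = (page_no - 1) * page_size
        hdr = base + (100 if page_no == 1 else 0)    # page 1 starts after the file header
        if data[hdr] not in BTREE_PAGE_TYPES:
            continue
        # Bytes 1-2 of the b-tree page header: offset of the first freeblock (0 = none).
        off = struct.unpack(">H", data[hdr + 1:hdr + 3])[0]
        seen = set()                                 # guard against corrupt, looping chains
        while off and off not in seen:
            seen.add(off)
            nxt, size = struct.unpack(">HH", data[base + off:base + off + 4])
            if size < 4:                             # malformed freeblock header
                break
            payload = data[base + off + 4:base + off + size]
            yield page_no, off, payload              # candidate deleted-record bytes
            off = nxt

for page, offset, blob in carve_freeblocks("evidence.db"):  # placeholder path
    print(f"page {page} offset {offset}: {blob[:32]!r}")
```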

    Graph-based Temporal Analysis in Digital Forensics

    Establishing a timeline as part of a digital forensics investigation is vital to understanding the order in which system events occurred. However, most digital forensics tools present timelines as histograms or as raw artifacts. Consequently, digital forensics examiners are forced to rely on manual, labor-intensive practices to reconstruct system events. Current digital forensics analysis tools are at their technological limit given the increasing storage and complexity of data. A graph-based timeline can present digital forensics evidence in a structure that can be immediately understood and effortlessly focused. This paper presents the Temporal Analysis Integration Management Application (TAIMA), which enhances digital forensics analysis via information visualization (infovis) techniques. TAIMA is a prototype application that provides a graph-based timeline for event reconstruction using abstraction and visualization techniques. A workflow illustration and a pilot usability study provided evidence that TAIMA assists digital forensics specialists in identifying key system events during digital forensics analysis.
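
    As a rough illustration of a graph-based timeline (not TAIMA's implementation), the sketch below turns timestamped events into nodes chained by temporal edges, so an examiner can traverse the sequence instead of scanning a flat histogram; the event data are invented.

```python
# Minimal sketch of a graph-based timeline: events become nodes linked in
# temporal order. Event data and edge semantics are illustrative, not TAIMA's.
import networkx as nx

events = [
    ("2021-03-01T09:02:11", "USB device connected"),
    ("2021-03-01T09:03:40", "setup.exe executed"),
    ("2021-03-01T09:03:42", "registry Run key modified"),
]

timeline = nx.DiGraph()
for ts, label in events:
    timeline.add_node(ts, label=label)

# Chain consecutive events with 'precedes' edges to form the timeline backbone.
# ISO-8601 timestamps sort lexicographically, so string order is time order.
ordered = sorted(timeline.nodes)
for a, b in zip(ordered, ordered[1:]):
    timeline.add_edge(a, b, relation="precedes")

for ts in ordered:
    print(ts, "->", timeline.nodes[ts]["label"])
```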

    Cyber-security internals of a Skoda Octavia vRS: a hands-on approach

    The convergence of information technology and vehicular technologies is a growing paradigm, allowing information to be sent by and to vehicles. This information can further be processed by the Electronic Control Unit (ECU) and the Controller Area Network (CAN) for in-vehicle communication, or through a mobile phone or server for out-of-vehicle communication. Information sent by or to the vehicle can be life-critical (e.g. braking, acceleration, cruise control, emergency communication). As vehicular technology advances, in-vehicle networks are connected to external networks through 3G and 4G mobile networks, enabling manufacturer and customer monitoring of different aspects of the car. While these services provide valuable information, they also increase the attack surface of the vehicle and can enable long- and short-range attacks. In this manuscript, we evaluate the security of the 2017 Skoda Octavia vRS 4x4. Both physical and remote attacks are considered: the key fob rolling code is successfully compromised, privacy attacks are demonstrated through the infotainment system, and the Volkswagen Transport Protocol 2.0 is reverse engineered. Additionally, in-car attacks are highlighted and described, providing an overview of potentially deadly threats posed by modifying ECU parameters, and components enabling digital forensics investigation are identified.
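
    As a hedged illustration of the kind of in-vehicle traffic capture such an evaluation relies on, the sketch below logs raw CAN frames with the python-can library; it assumes a Linux SocketCAN interface named "can0" and does not reproduce any vehicle-specific identifiers from the paper.

```python
# Minimal sketch: logging raw CAN traffic for later forensic analysis with
# python-can. Assumes a Linux SocketCAN interface named "can0" (an assumption,
# not a detail from the paper).
import can

def capture(channel: str = "can0", count: int = 100) -> None:
    with can.interface.Bus(channel=channel, interface="socketcan") as bus:
        for _ in range(count):
            msg = bus.recv(timeout=5.0)
            if msg is None:  # timeout: no traffic on the bus
                break
            # Record timestamp, arbitration ID, and payload for offline analysis.
            print(f"{msg.timestamp:.6f} {msg.arbitration_id:03X} {msg.data.hex()}")

if __name__ == "__main__":
    capture()
```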

    Digital Forensic Analysis of Telegram Messenger App in Android Virtual Environment

    The paper provides an in-depth analysis of the artifacts generated by the Telegram Messenger application on Android OS, which provides secure communication between individuals, groups, and channels. Over the past few years, the application has gone through major changes and updates, and the latest version’s artifacts differ from those of previous versions. Our methodology is based on a set of experiments designed to generate artifacts from various use cases in a virtualized environment. The acquired artifacts, such as messages, their locations, and the data structures relating them to one another, were studied and compared to those of older versions. Correlating the artifacts of the newer version with the older ones shows how the application has been upgraded behind the scenes, and incorporating those results can provide investigators with a better understanding of, and insight into, evidence in a potential cybercrime case.
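
    As an illustrative sketch of this style of artifact extraction (not the paper's actual procedure), the code below queries an app's SQLite database with Python's standard library; the database path, table, and column names are hypothetical, since Telegram's real schema varies between versions, which is exactly what the paper's comparison examines.

```python
# Minimal sketch: pulling message artifacts out of an Android app's SQLite
# database. The path, table, and column names below are hypothetical; they
# are not Telegram's actual schema.
import sqlite3

def extract_messages(db_path: str):
    con = sqlite3.connect(db_path)
    try:
        # Hypothetical schema: a 'messages' table with id, date, and message columns.
        cur = con.execute("SELECT id, date, message FROM messages ORDER BY date")
        for row in cur:
            yield row
    finally:
        con.close()

for msg_id, date, text in extract_messages("telegram_cache.db"):  # placeholder path
    print(msg_id, date, text)
```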