
    Machine Learning Aided Static Malware Analysis: A Survey and Tutorial

    Malware analysis and detection techniques have evolved over the last decade in response to the development of malware techniques designed to evade network-based and host-based security protections. The rapid growth in the variety and number of malware species has made it very difficult for forensic investigators to provide a timely response. Machine Learning (ML) aided malware analysis has therefore become a necessity to automate different aspects of static and dynamic malware investigation. We believe that machine learning aided static analysis can be used as a methodological approach in technical Cyber Threat Intelligence (CTI), rather than the resource-consuming dynamic malware analysis that has been thoroughly studied before. In this paper, we address this research gap by conducting an in-depth survey of different machine learning methods for classifying the static characteristics of 32-bit malicious Portable Executable (PE32) Windows files, and we develop a taxonomy for better understanding of these techniques. We then offer a tutorial on how different machine learning techniques can be utilized in the extraction and analysis of a variety of static characteristics of PE binaries, and we evaluate the accuracy and practical generalization of these techniques. Finally, the results of an experimental study of all the methods on a common dataset are given to demonstrate their accuracy and complexity. This paper may serve as a stepping stone for future researchers in the cross-disciplinary field of machine learning aided malware forensics.
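
    As an illustration of the kind of pipeline surveyed above, the following minimal sketch extracts a few static header features from PE files with the pefile library and trains a scikit-learn random forest classifier. The chosen features, directory layout, and libraries are assumptions for illustration, not the paper's exact method.

        # Minimal sketch (assumed libraries: pefile, scikit-learn); not the paper's pipeline.
        import os
        import pefile
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        def pe_static_features(path):
            """Extract a small static feature vector from one PE file's headers."""
            pe = pefile.PE(path, fast_load=True)
            return [
                len(pe.sections),                          # number of sections
                pe.OPTIONAL_HEADER.SizeOfCode,             # size of code
                pe.OPTIONAL_HEADER.AddressOfEntryPoint,    # entry-point RVA
                pe.FILE_HEADER.TimeDateStamp,              # compile timestamp
                pe.FILE_HEADER.Characteristics,            # file characteristic flags
            ]

        def load_dataset(benign_dir, malware_dir):
            """Label benign samples 0 and malicious samples 1, skipping non-PE files."""
            X, y = [], []
            for label, folder in ((0, benign_dir), (1, malware_dir)):
                for name in os.listdir(folder):
                    try:
                        X.append(pe_static_features(os.path.join(folder, name)))
                        y.append(label)
                    except pefile.PEFormatError:
                        continue
            return X, y

        # Hypothetical sample directories.
        X, y = load_dataset("samples/benign", "samples/malware")
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))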

    D2WFP: A Novel Protocol for Forensically Identifying, Extracting, and Analysing Deep and Dark Web Browsing Activities

    The use of the un-indexed web, commonly known as the deep web and dark web, to commit or facilitate criminal activity has drastically increased over the past decade. The dark web is an infamously dangerous place where all kinds of criminal activities take place [1-2]. Despite advances in web forensics techniques, tools, and methodologies, few studies have formally tackled deep and dark web forensics and the technical differences in terms of investigative techniques and artefact identification and extraction. This research proposes a novel and comprehensive protocol to guide and assist digital forensics professionals in investigating crimes committed on or via the deep and dark web. The protocol, named D2WFP, establishes a new sequential approach to performing investigative activities by observing the order of volatility and implementing a systematic approach covering all browsing-related hives and artefacts, which ultimately improves accuracy and effectiveness. Rigorous quantitative and qualitative research has been conducted by assessing D2WFP, following a scientifically sound and comprehensive process, in different scenarios; the obtained results show a clear increase in the number of artefacts recovered when adopting D2WFP, outperforming current industry and open-source browsing forensics tools. The second contribution of D2WFP is the robust formulation of artefact correlation and cross-validation, which enables digital forensics professionals to better document and structure their analysis of host-based deep and dark web browsing artefacts.
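
    The abstract gives no implementation detail, but a sequential, order-of-volatility-first collection step might be sketched as below; the stage ordering, artefact paths, and the external memory-imaging tool are illustrative assumptions rather than the published D2WFP steps.

        # Illustrative only: stage order, paths, and tooling are assumptions, not D2WFP itself.
        import pathlib
        import shutil
        import subprocess

        EVIDENCE = pathlib.Path("evidence")  # hypothetical output directory

        def acquire_memory(out_dir):
            """Most volatile first: image RAM with an external tool assumed to be on PATH."""
            try:
                subprocess.run(["winpmem.exe", str(out_dir / "memory.raw")], check=True)
            except (FileNotFoundError, subprocess.CalledProcessError):
                print("memory acquisition skipped: imaging tool not available")

        def copy_artefacts(sources, out_dir):
            """Less volatile artefacts: copy registry hives and browser profiles as-is."""
            for src in map(pathlib.Path, sources):
                src = src.expanduser()
                if src.exists():
                    shutil.copytree(src, out_dir / src.name, dirs_exist_ok=True)

        EVIDENCE.mkdir(exist_ok=True)
        acquire_memory(EVIDENCE)                                             # 1. volatile memory
        copy_artefacts([r"C:\Windows\System32\config"], EVIDENCE / "hives")  # 2. registry hives
        copy_artefacts([r"~\AppData\Roaming\Mozilla\Firefox\Profiles",       # 3. browser artefacts
                        r"~\AppData\Roaming\tor browser"], EVIDENCE / "browsers")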

    Investigation of Malware and Forensic Tools on Internet

    Malware is software that is harmful to computer systems and the information they hold. Malware analysis is the process of analysing the behaviour of malicious code and then creating signatures to detect and defend against it. Malware such as Trojan horses, worms, and spyware severely threatens information security. This research observed that although malware and its variants may vary considerably in their content signatures, they share behavioural features at a higher level which are more precise in revealing the real intent of the malware. This paper investigates the various techniques of malware behaviour extraction and analysis. In addition, we discuss the implications of malware analysis tools for malware detection based on various techniques.
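
    As a small illustration of why behavioural features generalise where content signatures do not, the sketch below compares an exact hash signature with Jaccard similarity over API-call n-grams; the hash, traces, and n-gram size are invented for illustration and are not taken from the paper.

        # Illustration only: the hash, API traces, and n-gram size are made-up examples.
        import hashlib

        KNOWN_HASHES = {"44d88612fea8a8f36de82e1278abb02f"}  # hashes of previously seen samples

        def content_signature_hit(sample_bytes):
            """Exact content match: fails as soon as a single byte of the sample changes."""
            return hashlib.md5(sample_bytes).hexdigest() in KNOWN_HASHES

        def behaviour_ngrams(api_calls, n=3):
            """Represent behaviour as the set of n-grams over an API-call trace."""
            return {tuple(api_calls[i:i + n]) for i in range(len(api_calls) - n + 1)}

        def behaviour_similarity(trace_a, trace_b, n=3):
            """Jaccard similarity of two traces' n-gram sets."""
            a, b = behaviour_ngrams(trace_a, n), behaviour_ngrams(trace_b, n)
            return len(a & b) / max(len(a | b), 1)

        known   = ["CreateFile", "WriteFile", "RegSetValue", "InternetOpen", "HttpSendRequest"]
        variant = ["CreateFile", "WriteFile", "RegSetValue", "InternetOpen", "FtpPutFile"]
        print(behaviour_similarity(known, variant))  # substantial overlap despite different bytes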

    A Comprehensive Analysis of the Role of Artificial Intelligence and Machine Learning in Modern Digital Forensics and Incident Response

    In the dynamic landscape of digital forensics, the integration of Artificial Intelligence (AI) and Machine Learning (ML) stands as a transformative force, poised to amplify the efficiency and precision of digital forensics investigations. However, the use of ML and AI in digital forensics is still in its nascent stages. As a result, this paper gives a thorough and in-depth analysis that goes beyond a simple survey and review. The goal is to look closely at how AI and ML techniques are used in digital forensics and incident response. This research explores cutting-edge research initiatives that cross domains such as data collection and recovery, the intricate reconstruction of cybercrime timelines, robust big data analysis, pattern recognition, safeguarding the chain of custody, and orchestrating responsive strategies to hacking incidents. This endeavour digs far beneath the surface to unearth the intricate ways AI-driven methodologies are shaping these crucial facets of digital forensics practice. While the promise of AI in digital forensics is evident, the challenges arising from increasing database sizes and evolving criminal tactics necessitate ongoing collaborative research and refinement within the digital forensics profession. This study examines the contributions, limitations, and gaps in the existing research, shedding light on the potential and limitations of AI and ML techniques. By exploring these different research areas, we highlight the critical need for strategic planning, continual research, and development to unlock AI's full potential in digital forensics and incident response. Ultimately, this paper underscores the significance of AI and ML integration in digital forensics, offering insights into their benefits, drawbacks, and broader implications for tackling modern cyber threats.

    Rethinking Digital Forensics

    In the modern socially-driven, knowledge-based virtual computing environment in which organisations are operating, the current digital forensics tools and practices can no longer meet the need for scientific rigour. There has been an exponential increase in the complexity of networks, with the rise of the Internet of Things, cloud technologies, and fog computing altering business operations and models. Adding to the problem are the increased capacity of storage devices and the increased diversity of devices that are attached to networks, operating autonomously. We argue that the laws and standards that have been written, and the processes, procedures, and tools in common use, are increasingly incapable of ensuring the requirement for scientific integrity. This paper looks at a number of issues with current practice and discusses measures that can be taken to improve the potential of achieving scientific rigour for digital forensics in the current and developing landscape.

    Automating the harmonisation of heterogeneous data in digital forensics

    Digital forensics has become an increasingly important tool in the fight against cyber and computer-assisted crime. However, with an increasing range of technologies at people's disposal, investigators find themselves having to process and analyse many systems (e.g. PC, laptop, tablet, smartphone) in a single case. Unfortunately, current tools operate in an isolated manner, investigating systems and applications on an individual basis. The heterogeneity of the evidence places time constraints and additional cognitive load upon the investigator. Examples of heterogeneity include applications such as messaging (e.g. iMessenger, Viber, Snapchat and Whatsapp), web browsers (e.g. Firefox and Chrome) and file systems (e.g. NTFS, FAT, and HFS). Being able to analyse and investigate evidence from across devices and applications based upon categories would enable investigators to query all data at once. This paper proposes a novel algorithm for merging datasets through a 'characterisation and harmonisation' process. The characterisation process analyses the nature of the metadata and the harmonisation process merges the data. A series of experiments using real-life forensic datasets was conducted to evaluate the algorithm across five different categories of dataset (i.e. messaging, graphical files, file system, Internet history, and emails), each containing data from different applications across different devices (a total of 22 disparate datasets). The results showed that the algorithm is able to merge all fields successfully, with the exception of some binary-based data found within the messaging datasets (contained within Viber and SMS). The error occurred due to a lack of information for the characterisation process to make a useful determination. However, upon further analysis it was found that the error had a minimal impact on the subsequently merged data.
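
    The abstract describes the approach only at a high level; a minimal sketch of a 'characterise then harmonise' merge over simple dictionary records might look like the following, where the field-name heuristics and unified schema are illustrative assumptions rather than the paper's algorithm.

        # Illustrative sketch only: heuristics and schema are assumptions, not the paper's algorithm.
        from datetime import datetime, timezone

        def characterise(field, value):
            """Guess what a source field represents from its name and a sample value."""
            name = field.lower()
            if isinstance(value, (int, float)) and value > 1_000_000_000:
                return "timestamp"                       # looks like a Unix epoch
            if any(k in name for k in ("time", "date")):
                return "timestamp"
            if any(k in name for k in ("from", "sender", "author", "address")):
                return "sender"
            if any(k in name for k in ("body", "text", "message")):
                return "content"
            return "other"

        def harmonise(records):
            """Map every record from every source onto one common schema and merge."""
            merged = []
            for rec in records:
                out = {"timestamp": None, "sender": None, "content": None}
                for field, value in rec.items():
                    role = characterise(field, value)
                    if role == "timestamp" and isinstance(value, (int, float)):
                        value = datetime.fromtimestamp(value, tz=timezone.utc).isoformat()
                    if role in out:
                        out[role] = value
                merged.append(out)
            return merged

        viber = [{"msg_date": 1577836800, "sender_name": "alice", "msg_text": "hi"}]
        sms   = [{"date": 1577836900, "address": "bob", "body": "hello"}]
        print(harmonise(viber + sms))  # both sources end up in the same schema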

    An Automated Approach for Digital Forensic Analysis of Heterogeneous Big Data

    The major challenges with big data examination and analysis are volume, complex interdependence across content, and heterogeneity. The examination and analysis phases are considered essential to a digital forensics process. However, traditional forensic investigation techniques use one or more forensic tools to examine and analyse each resource. In addition, when multiple resources are included in one case, there is an inability to cross-correlate findings, which often leads to inefficiencies in processing and identifying evidence. Furthermore, most current forensics tools cannot cope with large volumes of data. This paper develops a novel framework for digital forensic analysis of heterogeneous big data. The framework mainly focuses upon the investigation of three core issues: data volume, heterogeneous data, and the investigator's cognitive load in understanding the relationships between artefacts. The proposed approach focuses upon the use of metadata to solve the data volume problem, semantic web ontologies to address the heterogeneity of data sources, and artificial intelligence models to support the automated identification and correlation of artefacts, reducing the burden placed upon the investigator to understand the nature and relationship of the artefacts.
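
    As a loose illustration of the ontology-based part of such a framework, the sketch below maps metadata from two heterogeneous sources into a shared RDF vocabulary with rdflib and then correlates artefacts by actor; the namespace, properties, and data are assumptions rather than the ontology defined in the paper.

        # Illustrative only: vocabulary, properties, and data are assumptions, not the paper's ontology.
        from rdflib import Graph, Literal, Namespace, RDF

        DF = Namespace("http://example.org/forensics#")  # hypothetical shared vocabulary
        g = Graph()

        def add_artefact(artefact_id, source, timestamp, actor):
            """Normalise one artefact's metadata into triples under the shared vocabulary."""
            node = DF[artefact_id]
            g.add((node, RDF.type, DF.Artefact))
            g.add((node, DF.source, Literal(source)))
            g.add((node, DF.timestamp, Literal(timestamp)))
            g.add((node, DF.actor, Literal(actor)))

        # Metadata from two heterogeneous sources mapped onto the same vocabulary.
        add_artefact("email42", "pst_mailbox", "2020-01-01T10:00:00", "alice")
        add_artefact("chat17", "whatsapp_db", "2020-01-01T10:05:00", "alice")

        # Cross-correlate: every artefact attributed to the same actor, regardless of source.
        for node in g.subjects(DF.actor, Literal("alice")):
            print(node, g.value(node, DF.source), g.value(node, DF.timestamp))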