6 research outputs found

    Quantifying the need for supervised machine learning in conducting live forensic analysis of emergent configurations (ECO) in IoT environments

    © 2020 The Author(s). Machine learning has been shown to be a promising approach for mining large datasets, such as those comprising data from a broad range of Internet of Things (IoT) devices across complex environments, to solve different problems. This paper surveys the existing literature on the potential of supervised classical machine learning techniques, such as K-Nearest Neighbour, Support Vector Machines, Naive Bayes and Random Forest algorithms, for performing live digital forensics on different IoT configurations. The paper also discusses a number of challenges associated with the use of these machine learning techniques.
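
    For illustration only, the following is a minimal sketch of how such a comparison of the four named supervised classifiers might be set up, assuming scikit-learn and a synthetic feature matrix standing in for extracted IoT traffic features; the dataset, labels, and parameters are hypothetical and not taken from the paper.

        # Hypothetical sketch: comparing classical supervised classifiers on
        # synthetic data standing in for features extracted from IoT devices.
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC
        from sklearn.naive_bayes import GaussianNB
        from sklearn.ensemble import RandomForestClassifier

        # Synthetic stand-in for labelled IoT forensic records (benign vs. suspicious).
        X, y = make_classification(n_samples=1000, n_features=20,
                                   n_informative=10, random_state=42)

        classifiers = {
            "K-Nearest Neighbour": KNeighborsClassifier(n_neighbors=5),
            "Support Vector Machine": SVC(kernel="rbf"),
            "Naive Bayes": GaussianNB(),
            "Random Forest": RandomForestClassifier(n_estimators=100, random_state=42),
        }

        for name, clf in classifiers.items():
            scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation accuracy
            print(f"{name}: mean accuracy = {scores.mean():.3f}")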

    Common investigation process model for internet of things forensics

    Internet of Things Forensics (IoTFs) is a new discipline in digital forensic science concerned with the detection, acquisition, preservation, reconstruction, analysis, and presentation of evidence from IoT environments. The IoTFs discipline still suffers from several issues and challenges that have been documented in the recent past. For example, the heterogeneity of IoT infrastructures has been a key challenge: it makes IoTFs very complex and ambiguous across the various forensic domains. This paper proposes a common investigation process for IoTFs, developed using a metamodeling method and called the Common Investigation Process Model (CIPM). The proposed CIPM consists of four common investigation processes: i) preparation, ii) collection, iii) analysis, and iv) final reporting. The proposed CIPM can assist IoTFs users in facilitating, managing, and organizing investigation tasks.
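
    As a rough illustration (not code from the paper), the four common investigation processes could be represented as an ordered workflow; the phase names follow the abstract, while the class, function, and case names below are hypothetical.

        # Hypothetical sketch of the four CIPM processes as an ordered workflow.
        from enum import Enum

        class CipmProcess(Enum):
            PREPARATION = 1
            COLLECTION = 2
            ANALYSIS = 3
            FINAL_REPORT = 4

        def run_investigation(case_id: str) -> None:
            """Walk an IoT forensic case through the four CIPM processes in order."""
            for process in CipmProcess:
                # A real implementation would dispatch to tooling for each process;
                # here we only record the ordering defined by the model.
                print(f"[{case_id}] entering {process.name.lower()} process")

        run_investigation("iot-case-001")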

    Comparative analysis of network forensic tools and network forensics processes

    Network Forensics (NF) is a branch of digital forensics used to detect and capture potential digital crimes in computer networked environments. Network Forensic Tools (NFTs) and Network Forensic Processes (NFPs) can examine networks, collect normal and abnormal traffic/data, support network incident analysis, assist in creating appropriate incident detection and reaction, and help build a forensic hypothesis that can be used in a court of law. They also assist in examining internal incidents and the exploitation of assets, identifying attack goals, performing threat evaluation, and evaluating network performance. According to the existing literature, there are quite a number of NFTs and NFPs used for identifying, collecting, reconstructing, and analysing the chain of incidents that happen on networks. However, they vary and differ in their roles and functionalities. The main objective of this paper, therefore, is to assess the distinctions that exist between NFTs and NFPs. Specifically, this paper compares four well-known NFTs: Xplico, OmniPeek, NetDetector, and NetIntercept. The results of this paper show that the Xplico tool is better able to identify, collect, reconstruct, and analyse the chain of incidents that happen on networks than the other NF tools.

    CIPM: Common identification process model for database forensics field

    The Database Forensics (DBF) domain is a branch of digital forensics concerned with the identification, collection, reconstruction, analysis, and documentation of database crimes. Different researchers have introduced several identification models to handle database crimes. The majority of the proposed models are not specific and are redundant, which makes them problematic given the multidimensional nature and high diversity of database systems. Accordingly, using a metamodeling approach, the current study proposes a unified identification model applicable to the database forensics field. The model integrates and harmonizes all existing identification processes into a single abstract model, called the Common Identification Process Model (CIPM). The model comprises six phases: 1) notification of an incident, 2) responding to the incident, 3) identification of the incident source, 4) verification of the incident, 5) isolation of the database server, and 6) provision of an investigation environment. CIPM was found capable of helping practitioners and newcomers to the forensics domain to control database crimes.
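
    Purely as an illustration of the phase ordering (the paper defines the phases, not any code), the six CIPM phases could be encoded so that a recorded investigation log can be checked against the model; the function, list, and log entries below are hypothetical.

        # Hypothetical sketch: checking that recorded investigation steps follow
        # the six CIPM phases in the order the model prescribes.
        CIPM_PHASES = [
            "notification",            # 1) notification of an incident
            "response",                # 2) responding to the incident
            "source_identification",   # 3) identification of the incident source
            "verification",            # 4) verification of the incident
            "server_isolation",        # 5) isolation of the database server
            "environment_provision",   # 6) provision of an investigation environment
        ]

        def follows_cipm_order(recorded_phases: list[str]) -> bool:
            """Return True if the recorded phases appear in CIPM order (gaps allowed)."""
            indices = [CIPM_PHASES.index(p) for p in recorded_phases if p in CIPM_PHASES]
            return indices == sorted(indices)

        # Example: a log that skips verification but keeps the prescribed order.
        print(follows_cipm_order(["notification", "response", "server_isolation"]))  # True
        print(follows_cipm_order(["server_isolation", "notification"]))              # False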

    Classification Algorithms and Feature Selection Techniques for a Hybrid Diabetes Detection System

    Artificial intelligence is a promising and valuable tool for early disease recognition and for supporting patient condition monitoring. It can increase the reliability of treatment and decision making through useful systems and algorithms. Healthcare workers, especially nurses and physicians, are overworked due to a massive and unexpected increase in the number of patients during the coronavirus pandemic. In such situations, artificial intelligence techniques could be used to diagnose patients with life-threatening illnesses. In particular, diseases that increase the risk of hospitalization and death in coronavirus patients, such as high blood pressure, heart disease and diabetes, should be diagnosed at an early stage. This article focuses on diagnosing diabetic patients through data mining techniques. If diabetes can be diagnosed in the early stages of the disease, patients can be encouraged to stay home and care for their health, reducing the risk of coronavirus infection. The proposed method has three steps: preprocessing, feature selection and classification. Several combinations of the harmony search algorithm, the genetic algorithm, and the particle swarm optimization algorithm are examined together with K-means for feature selection. These combinations have not been examined before for diabetes diagnosis applications. K-nearest neighbor is used for classification of the diabetes dataset. Sensitivity, specificity, and accuracy are measured to evaluate the results. The results indicate that the proposed method, with an accuracy of 91.65%, outperformed the earlier methods examined in this article.
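
    A minimal sketch of the classification and evaluation step follows, assuming scikit-learn and a synthetic dataset standing in for the (already feature-selected) diabetes records; the metaheuristic feature-selection stage described in the abstract is omitted, and all names and parameters here are illustrative rather than the authors'.

        # Hypothetical sketch: KNN classification of a diabetes-style dataset with
        # sensitivity, specificity, and accuracy computed from the confusion matrix.
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.metrics import confusion_matrix

        # Synthetic stand-in for feature-selected diabetes records (binary outcome).
        X, y = make_classification(n_samples=768, n_features=8, n_informative=5,
                                   random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                            random_state=0)

        knn = KNeighborsClassifier(n_neighbors=5)
        knn.fit(X_train, y_train)
        y_pred = knn.predict(X_test)

        tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
        sensitivity = tp / (tp + fn)   # true positive rate
        specificity = tn / (tn + fp)   # true negative rate
        accuracy = (tp + tn) / (tp + tn + fp + fn)
        print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} "
              f"accuracy={accuracy:.3f}")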

    Categorization and organization of database forensic investigation processes

    Database forensic investigation (DBFI) is an important area of research within digital forensics. Its importance is growing as digital data becomes more extensive and commonplace. The challenges associated with DBFI are numerous, one of which is the lack of a harmonized DBFI process for investigators to follow. In this paper, therefore, we conduct a survey of the existing literature to understand the body of work already accomplished. We then build on the existing literature to present a harmonized DBFI process using a design science research methodology. This harmonized DBFI process has been developed around three key categories: (1) planning, preparation and pre-response; (2) acquisition and preservation; and (3) analysis and reconstruction. In addition, the process has been designed to avoid confusion and ambiguity, while providing practitioners with a systematic method of performing DBFI with a higher degree of certainty.