
    Watchword-Oriented and Time-Stamped Algorithms for Tamper-Proof Cloud Provenance Cognition

    Provenance is derivative journal information about the origin and activities of system data and processes. For a highly dynamic system like the cloud, provenance must be accurately detected and securely used in cloud digital forensic investigation activities. This paper proposes a watchword-oriented provenance cognition algorithm for the cloud environment. Additionally, a time-stamp-based buffer-verifying algorithm is proposed for securing access to the detected cloud provenance. Performance analysis of the novel algorithms proposed here yields a desirable detection rate of 89.33% and a miss rate of 8.66%. The securing algorithm successfully rejects 64% of malicious requests, yielding a cumulative frequency of 21.43 for MR.
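
    The abstract names two components but gives no implementation details, so the following is only a minimal sketch of the two ideas, assuming a simple watchword list and a fixed time-stamp tolerance; every name here (ProvenanceRecord, WATCHWORDS, BUFFER_SECONDS) is illustrative and not the authors' algorithm.

```python
# Illustrative sketch only (not the paper's algorithms):
# (1) flag provenance records whose action text contains an investigator-defined watchword;
# (2) accept an access request only if its timestamp falls inside a verification buffer.
from dataclasses import dataclass
import time

WATCHWORDS = {"delete", "chmod", "exfiltrate"}   # hypothetical watchword list
BUFFER_SECONDS = 300                             # hypothetical time-stamp tolerance

@dataclass
class ProvenanceRecord:
    actor: str
    action: str
    timestamp: float

def detect_provenance(records):
    """Return the records whose action text contains any watchword."""
    return [r for r in records if any(w in r.action.lower() for w in WATCHWORDS)]

def verify_request(request_timestamp, now=None):
    """Reject a request whose timestamp lies outside the allowed buffer window."""
    now = time.time() if now is None else now
    return abs(now - request_timestamp) <= BUFFER_SECONDS
```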

    Measuring Accuracy of Automated Parsing and Categorization Tools and Processes in Digital Investigations

    This work presents a method for measuring the accuracy of evidential artifact extraction and categorization tasks in digital forensic investigations. Instead of focusing on the measurement of accuracy and errors in the functions of digital forensic tools, this work proposes the application of information retrieval measurement techniques that allow the incorporation of errors introduced by tools and analysis processes. This method uses a 'gold standard': the collection of evidential objects determined by a digital investigator from suspect data with an unknown ground truth. This work proposes that the accuracy of tools and investigation processes can be evaluated against the derived gold standard using common precision and recall values. Two example case studies are presented showing the measurement of the accuracy of automated analysis tools as compared to an in-depth analysis by an expert. It is shown that such measurement can allow investigators to determine changes in the accuracy of their processes over time, and determine whether such a change is caused by their tools or their knowledge.
    Comment: 17 pages, 2 appendices, 1 figure, 5th International Conference on Digital Forensics and Cyber Crime; Digital Forensics and Cyber Crime, pp. 147-169, 201
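
    Since the paper evaluates tools against an investigator-derived gold standard using precision and recall, a minimal sketch of that comparison is shown below; the artifact names are invented for illustration.

```python
# Compare a tool's extracted artifact set against the investigator's gold standard.
def precision_recall(tool_artifacts, gold_standard):
    tool, gold = set(tool_artifacts), set(gold_standard)
    true_positives = len(tool & gold)
    precision = true_positives / len(tool) if tool else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical example: the tool found 3 artifacts, 2 of which the expert also marked.
p, r = precision_recall({"a.jpg", "b.doc", "c.log"}, {"a.jpg", "b.doc", "d.eml"})
print(f"precision={p:.2f} recall={r:.2f}")   # precision=0.67 recall=0.67
```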

    An Ontology-Based Forensic Analysis Tool

    The analysis of forensic investigation results has generally been identified as the most complex phase of a digital forensic investigation. This phase becomes more complicated and time-consuming as the storage capacity of digital devices increases while their prices decrease. Although there are some tools and techniques that assist the investigator in the analysis of digital evidence, they do not adequately address some of the serious challenges, particularly the time and effort required to conduct such tasks. In this paper, we consider the use of semantic web technologies, and in particular ontologies, to assist the investigator in analyzing digital evidence. A novel ontology-based framework is proposed for forensic analysis tools, which we believe has the potential to influence the development of such tools. The framework utilizes a set of ontologies to model the environment under investigation. The evidence extracted from the environment is initially annotated using the Resource Description Framework (RDF). The evidence is then merged from various sources to identify new and implicit information with the help of inference engines and classification mechanisms. In addition, we present the ongoing development of a forensic analysis tool to analyze content retrieved from Android smart phones. For this purpose, several ontologies have been created to model some concepts of the smart phone environment.
    Keywords: digital forensic investigation, digital forensic analysis tool, semantic web, ontology, android
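
    As a rough illustration of the annotation step described above, the sketch below (assuming the rdflib library and a hypothetical smart-phone ontology namespace) records one extracted artefact as RDF triples; a simple SPARQL query stands in for the inference and classification mechanisms.

```python
# Hedged sketch: annotate one piece of extracted evidence as RDF against a
# hypothetical smart-phone ontology, then query it back.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/forensics#")   # hypothetical ontology namespace
g = Graph()
g.bind("ex", EX)

# Annotate one artefact: an SMS message recovered from the handset.
msg = EX["sms_001"]
g.add((msg, RDF.type, EX.SMSMessage))
g.add((msg, EX.sender, Literal("+385911234567")))
g.add((msg, EX.sentAt, Literal("2023-05-01T10:22:00")))

# Query in place of the inference/classification step.
for row in g.query("SELECT ?m WHERE { ?m a <http://example.org/forensics#SMSMessage> }"):
    print(row.m)
```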

    Enhanced Document Clustering using K-Means with Support Vector Machine (SVM) Approach

    Today’s digital world contains a continuously growing volume of data, much of it important, complex, and unstructured. Many files consist of unstructured text, which is difficult for computer examiners to analyze. In forensic analysis, experts have to spend a lot of time and effort to identify criminals and the related evidence, yet the crime investigation process needs to be fast and efficient. Because a large amount of information is collected during a crime investigation, data mining is a useful approach in this setting: it extracts useful information from large amounts of crime data so that possible suspects can be identified efficiently. Clustering algorithms can support the learning of knowledge from the documents under analysis by applying different clustering algorithms to different datasets. Because clustering tends to induce clusters formed by either relevant or irrelevant documents, this work extends that approach by cascading the clustering technique with a Support Vector Machine, which supports the expert's job and speeds up the investigation process. DOI: 10.17762/ijritcc2321-8169.150612
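
    A minimal sketch of the cascade described above is given below, assuming scikit-learn: documents are clustered with K-Means and the resulting cluster labels are used to train an SVM that classifies new documents. The toy corpus and parameters are illustrative only, not those used in the paper.

```python
# K-Means clustering cascaded with an SVM (toy data, illustrative parameters).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

docs = ["wire transfer to offshore account", "holiday photos with family",
        "invoice for shell company", "birthday party invitation"]

vec = TfidfVectorizer()
X = vec.fit_transform(docs)

# Step 1: unsupervised grouping of the documents.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Step 2: the SVM learns the cluster structure and can label new documents.
svm = LinearSVC().fit(X, clusters)
print(svm.predict(vec.transform(["offshore invoice"])))
```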

    Forensic investigation method and tool based on the user behaviour analysis

    Today, people use a variety of digital devices, and the events taking place on them are stored in specific forms that mostly include data indicating when each event took place. Different methods have been continually researched and developed to analyse these events, but most of them also analyse event data that is unnecessary for a forensic investigation. As a result, investigators must carry out additional work to select the data needed for an actual investigation, making the analysis more difficult and longer; as the capacity of storage media grows and events become more diverse, this problem gradually worsens. Thus, this paper suggests a timeline-based method of checking users' behaviour patterns at a glance by analysing, interpreting and visualizing various user behaviour-based events in a short time, exploiting the time information that exists in digital devices. Moreover, the range of analysis can be widened, since investigators can analyse events from the computers and smartphones that are used most among digital devices, rather than from a single system.
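
    A minimal sketch of the timeline idea follows: time-stamped, behaviour-related events from different devices (here a hypothetical computer export and smartphone export) are merged into a single ordered timeline.

```python
# Merge user-behaviour events from several devices into one chronological timeline.
from datetime import datetime

computer_events = [("2023-05-01 09:12", "browser", "visited webmail"),
                   ("2023-05-01 09:40", "explorer", "opened report.docx")]
phone_events    = [("2023-05-01 09:25", "messenger", "sent chat message")]

def to_timeline(*sources):
    events = [e for source in sources for e in source]
    return sorted(events, key=lambda e: datetime.strptime(e[0], "%Y-%m-%d %H:%M"))

for ts, app, action in to_timeline(computer_events, phone_events):
    print(ts, app, action)
```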

    Development of National Digital Evidence Metadata

    The Industry 4.0 era has caused tremendous disruption in many sectors of life, and the rapid development of information and communication technology has made the global industrial world undergo a revolution. Cyber-crime in Indonesia that utilizes computer equipment and mobile phones is steadily increasing. The information in a file that describes the file itself is called metadata. Evidence items for cyber cases are divided into two types, namely physical evidence and digital evidence. Physical evidence and digital evidence have different characteristics, so concepts devised for physical evidence will very likely cause problems when applied to digital evidence. The management of national digital evidence and its associated metadata is still mostly carried out by researchers. Considering the importance of national digital evidence management solutions in the cyber-crime investigation process, this research focused on identifying and modelling correlations using a digital image metadata security approach. The correlation analysis reads the metadata characteristics of digital evidence such as document and audio files, using standard parameters (file maker, size, file type and time) combined with digital image metadata. For a nationally designed scheme, the highest level of security is needed; a security-enhancing solution is to encrypt the digital image metadata (EXIF). The EXIF metadata of the original digital image is read based on the EXIF 2.3 standard tag IDs, then encrypted and inserted into the last line of the file. The decryption process returns the EXIF information to the image header. This secures the EXIF metadata without changing the image quality.
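
    A hedged sketch of the EXIF-protection idea is shown below, assuming Pillow for reading EXIF tags and the cryptography library's Fernet cipher for encryption; the file name, key handling and "append after the image data" placement are illustrative assumptions, not the paper's exact scheme.

```python
# Read a JPEG's EXIF tags, encrypt them, and append the ciphertext to the file.
from PIL import Image
from cryptography.fernet import Fernet
import json

def protect_exif(path, key):
    exif = Image.open(path).getexif()                     # tag-id -> value mapping
    payload = json.dumps({k: str(v) for k, v in exif.items()}).encode()
    token = Fernet(key).encrypt(payload)                  # symmetric encryption
    with open(path, "ab") as f:                           # append after the image data
        f.write(b"\nEXIF-ENC:" + token)
    return token

key = Fernet.generate_key()
protect_exif("evidence.jpg", key)                         # hypothetical file name
```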

    Comprehensive forensic examination with Belkasoft Evidence Center

    The enhancement and proliferation of information and communication technology (ICT) has touched every aspect of human activity: work, leisure, sport, communication, medicine, etc. All around us we can see mobile phones and other connected devices that are now ubiquitous, changing trends in consumer behaviour. It is therefore no surprise that such technologies can play a significant role in committing or assisting a crime, since data held on digital devices can give a detailed insight into people’s lives, communications, contacts, friends, family and acquaintances. To help law enforcement investigate such crimes, digital forensics is performed with the aim of collecting crime-related evidence from various digital media and analysing it. Investigators use various forensic techniques to search hidden folders, retrieve deleted data, decrypt data, restore damaged files, and so on. Obtaining evidence such as location data, photos, messages or internet searches can be beneficial, if not crucial, in assisting the police with criminal investigations. Since advances in technology have led to an increase in the volume, variety, velocity, and veracity of data available for digital forensic analysis, such an investigation would require a tremendous amount of effort and time without efficient techniques and tools. That is the reason for the expansion of the market of digital forensic tools, both proprietary and free to use, available today. This paper gives an insight into the digital forensic process, emphasizing the role of digital forensic tools in providing digital evidence. The capability of one particular tool, Belkasoft Evidence Center (BEC), for the acquisition and analysis of digital evidence is briefly described.

    Using Computer Behavior Profiles to Differentiate between Users in a Digital Investigation

    Most digital crimes involve finding evidence on a computer and then linking it to a suspect using login information, such as a username and a password. However, login information is often shared or compromised. In such a situation, there needs to be a way to identify the user without relying exclusively on login credentials. This paper introduces the concept that users may show behavioral traits which might provide more information about the user on the computer. This hypothesis was tested by conducting an experiment in which subjects were required to perform common tasks on a computer over multiple sessions. The choices they made to complete each task were recorded and converted into a 'behavior profile' corresponding to each login session. Cluster analysis of all the profiles assigned identifiers to each profile such that 98% of profiles were attributed correctly. Similarity scores were also generated for each session pair to test whether the similarity analysis attributed profiles to the same user or to two different users. Using similarity scores, the user sessions were correctly attributed 93.2% of the time; sessions were incorrectly attributed to the same user 3.1% of the time and incorrectly attributed to different users 3.7% of the time. At a confidence level of 95%, the average correct attribution rate for the population was calculated to be between 92.98% and 93.42%. This shows that users exhibit uniqueness and consistency in the choices they make as they complete everyday tasks on a system, which can be useful for differentiating between them.
    Keywords: computer behavior, users, interaction, investigation, forensics, graphical interface, windows, digital
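
    As an illustration of the session-pair similarity idea, the sketch below represents each login session as a vector of behavioural choices and declares two sessions to come from the same user when their cosine similarity exceeds a threshold; the features and the 0.8 threshold are assumptions, not the study's parameters.

```python
# Compare two session "behavior profiles" with cosine similarity.
import numpy as np

def cosine(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

session_a = [1, 0, 3, 2]   # e.g. counts of browser, editor, shortcut, menu choices
session_b = [1, 1, 2, 2]

same_user = cosine(session_a, session_b) >= 0.8   # illustrative threshold
print(cosine(session_a, session_b), same_user)
```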

    The Venetian Ghetto: Semantic Modelling for an Integrated Analysis

    In the digital era, historians are embracing information technology as a research tool. New technologies offer investigation and interpretation, synthesis and communication tools that are more effective than traditional study methods, as they guarantee a multidisciplinary approach and the integration of analyses. Among the available technologies, the best suited for the study of urban phenomena are databases (DB), Geographic Information Systems (GIS), Building Information Modelling (BIM) and multimedia tools (video, apps) for the dissemination of results. The case study described here concerns the analysis of the part of Venice that changed its appearance from 1516 onwards with the creation of the Jewish Ghetto, an event that would have repercussions throughout Europe, changing the course of history. Our research confirms that the exclusive use of any one of the systems mentioned above (DB, GIS, BIM) makes it possible to manage the complexity of the subject matter only partially. Consequently, it became necessary to analyse the possible interactions between such tools, so as to create a link between an alphanumeric DB and a geographical DB. The combined use of GIS and BIM, which provide for 4D time management of objects, turned out to be able to manage information and geometry in an effective and scalable way, providing a starting point for the in-depth mapping of the historical analysis. Software products for digital modelling have changed in nature over time, going from simple viewing tools to simulation tools. The reconstruction of the time phases of the three Ghettos (Nuovo, Vecchio, and Nuovissimo) and their visualisation through digital narratives of the history of that specific area of the city, for instance through videos, is making it possible for an increasing number of scholars and the general public to access the results of the study.