1,337 research outputs found

    A user-oriented network forensic analyser: the design of a high-level protocol analyser

    Network forensics is becoming an increasingly important tool in the investigation of cyber and computer-assisted crimes. Unfortunately, whilst much effort has been undertaken in developing computer forensic file system analysers (e.g. EnCase and FTK), such focus has not been given to Network Forensic Analysis Tools (NFATs). The single biggest barrier to effective NFATs is handling large volumes of low-level traffic and being able to extract and interpret forensic artefacts and their context – for example, being able to extract and render application-level objects (such as emails, web pages and documents) from the low-level TCP/IP traffic, but also to understand how these applications/artefacts are being used. Whilst some studies and tools are beginning to achieve object extraction, results to date are limited to basic objects. No research has focused upon analysing network traffic to understand the nature of its use – not simply the fact that a person requested a webpage, but how long they spent on the application and what interactions they had whilst using the service (e.g. posting an image, or engaging in an instant message chat). This additional layer of information can provide an investigator with a far richer and more complete understanding of a suspect’s activities. To this end, this paper presents an investigation into the ability to derive high-level application usage characteristics from low-level network traffic metadata. The paper presents three application scenarios – web surfing, communications and social networking – and demonstrates that it is possible to derive the user interactions (e.g. page loading, chatting and file sharing) within these systems. The paper continues to present a framework that builds upon this capability to provide a robust, flexible and user-friendly NFAT that gives access to a greater range of forensic information in a far easier format
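    As a rough illustration of the kind of metadata-driven inference described above (not the authors’ implementation), the sketch below groups captured packets into flows and uses a simple burst heuristic to count candidate user interactions; the capture file name, the port filter and the one-second threshold are all assumptions:

```python
# Sketch: infer candidate user interactions (e.g. page loads) from
# low-level traffic metadata. The capture file, the HTTP/HTTPS port filter
# and the one-second burst gap are illustrative assumptions, not the
# paper's method.
from collections import defaultdict
from scapy.all import rdpcap, IP, TCP

packets = rdpcap("capture.pcap")            # hypothetical capture file
flows = defaultdict(list)                   # (src, dst, dport) -> packet times

for pkt in packets:
    if pkt.haslayer(IP) and pkt.haslayer(TCP):
        key = (pkt[IP].src, pkt[IP].dst, pkt[TCP].dport)
        flows[key].append(float(pkt.time))

# Treat a run of web-port packets separated by more than one second of
# silence as a distinct interaction (e.g. a page load).
for (src, dst, dport), times in flows.items():
    if dport not in (80, 443):
        continue
    times.sort()
    bursts, start = [], times[0]
    for prev, cur in zip(times, times[1:]):
        if cur - prev > 1.0:
            bursts.append((start, prev))
            start = cur
    bursts.append((start, times[-1]))
    print(f"{src} -> {dst}:{dport}: {len(bursts)} candidate interaction(s)")
```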

    Forensic investigation of cooperative storage cloud service: Symform as a case study

    Researchers envisioned Storage as a Service (StaaS) as an effective solution to the distributed management of digital data. Cooperative storage cloud forensics is a relatively new and under-explored area of research. Using Symform as a case study, we seek to determine the data remnants from the use of cooperative cloud storage services. In particular, we consider both mobile devices and personal computers running various popular operating systems, namely Windows 8.1, Mac OS X Mavericks 10.9.5, Ubuntu 14.04.1 LTS, iOS 7.1.2, and Android KitKat 4.4.4. Potential artefacts recovered during the research include data relating to the installation and uninstallation of the cloud applications, log-in to and log-out from the Symform account using the client application, and file synchronisation, together with the associated timestamp information. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of cooperative storage cloud on client devices
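    A minimal sketch of the kind of client-side artefact sweep this research motivates; the directory paths below are hypothetical examples, not the locations reported in the paper:

```python
# Sketch: sweep hypothetical client-application directories and record the
# path, SHA-256 hash and last-modified timestamp of every file found, as a
# starting point for identifying candidate artefacts. Paths are examples
# only; the paper examines specific locations per operating system.
import hashlib
import json
import os
from datetime import datetime, timezone

CANDIDATE_DIRS = [
    os.path.expanduser("~/AppData/Roaming/Symform"),   # hypothetical Windows path
    os.path.expanduser("~/Library/Logs"),              # hypothetical macOS path
]

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

artefacts = []
for base in CANDIDATE_DIRS:
    for root, _dirs, files in os.walk(base):
        for name in files:
            path = os.path.join(root, name)
            info = os.stat(path)
            artefacts.append({
                "path": path,
                "sha256": sha256(path),
                "modified": datetime.fromtimestamp(info.st_mtime, timezone.utc).isoformat(),
            })

print(json.dumps(artefacts, indent=2))
```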

    The Advanced Framework for Evaluating Remote Agents (AFERA): A Framework for Digital Forensic Practitioners

    Digital forensics experts need a dependable method for evaluating evidence-gathering tools. Limited research and resources challenge this process and the lack of multi-endpoint data validation hinders reliability in distributed digital forensics. A framework was designed to evaluate distributed agent-based forensic tools while enabling practitioners to self-evaluate and demonstrate evidence reliability as required by the courts. Grounded in Design Science, the framework features guidelines, data, criteria, and checklists. Expert review enhances its quality and practicality

    Assessing the evidential value of artefacts recovered from the cloud

    Cloud computing offers users low-cost access to computing resources that are scalable and flexible. However, it is not without its challenges, especially in relation to security. Cloud resources can be leveraged for criminal activities and the architecture of the ecosystem makes digital investigation difficult in terms of evidence identification, acquisition and examination. However, these same resources can be leveraged for the purposes of digital forensics, providing facilities for evidence acquisition, analysis and storage. Alternatively, existing forensic capabilities can be used in the Cloud as a step towards achieving forensic readiness. Tools can be added to the Cloud which can recover artefacts of evidential value. This research investigates whether artefacts that have been recovered from the Xen Cloud Platform (XCP) using existing tools have evidential value. To determine this, the research is broken into three distinct areas: adding existing tools to a Cloud ecosystem, recovering artefacts from that system using those tools, and then determining the evidential value of the recovered artefacts. From these experiments, three key steps for adding existing tools to the Cloud were determined: the identification of the specific Cloud technology being used, identification of existing tools and the building of a testbed. Stemming from this, three key components of artefact recovery are identified: the user, the audit log and the Virtual Machine (VM), along with two methodologies for artefact recovery in XCP. In terms of evidential value, this research proposes a set of criteria for the evaluation of digital evidence, stating that it should be authentic, accurate, reliable and complete. In conclusion, this research demonstrates the use of these criteria in the context of digital investigations in the Cloud and how each is met. This research shows that it is possible to recover artefacts of evidential value from XCP
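    As an illustration of how a recovered artefact might be recorded so that criteria such as authenticity can later be demonstrated, the sketch below uses assumed field names and file names and is not the methodology proposed in the thesis:

```python
# Sketch: record an acquired artefact alongside an integrity hash and basic
# provenance metadata, so it can later be checked against criteria such as
# authenticity and completeness. Field names and file names are illustrative,
# not the schema used in the research.
import hashlib
import json
from datetime import datetime, timezone

def acquire(path, examiner, source_vm):
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "artefact": path,
        "sha256": digest,
        "acquired_at": datetime.now(timezone.utc).isoformat(),
        "examiner": examiner,
        "source_vm": source_vm,   # VM the artefact was recovered from
    }

def verify(record):
    # Re-hash the artefact and compare digests as a basic authenticity check.
    with open(record["artefact"], "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == record["sha256"]

record = acquire("audit.log", examiner="analyst-01", source_vm="xcp-guest-07")
print(json.dumps(record, indent=2), verify(record))
```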

    An evaluation of the ‘open source internet research tool’: a user-centred and participatory design approach with UK law enforcement

    As part of their routine investigations, law enforcement conducts open source research; that is, investigating and researching using publicly available information online. Historically, the notion of collecting open sources of information is as ingrained as the concept of intelligence itself. However, utilising open source research in UK law enforcement is a relatively new concept, not generally, or practically, considered until after the civil unrest seen in the UK’s major cities in the summer of 2011. While open source research rests on the information being ‘publicly available’, there are legal, ethical and procedural issues that law enforcement must consider. This raises the following main research question: what constraints do law enforcement face when conducting open source research? From a legal perspective, law enforcement officials must ensure their actions are necessary and proportionate, more so where an individual’s privacy is concerned under human rights legislation and data protection laws such as the General Data Protection Regulation. Privacy issues arise, though, when considering the boom in social media usage, where the lines between what is public and what is private can easily blur. Guidance from the Association of Chief Police Officers (ACPO) and, now, the National Police Chiefs’ Council (NPCC) tends to be non-committal in tone, but nods towards obtaining legal authorisation under the Regulation of Investigatory Powers Act (RIPA) 2000 when conducting what may be ‘directed surveillance’. RIPA, however, pre-dates the modern era of social media by several years, so its applicability as the de facto piece of legislation for conducting higher levels of open source research is called into question. Twenty-two semi-structured interviews with law enforcement officials were conducted, and these revealed a grey area surrounding legal authorities when conducting open source research. From a technical and procedural perspective, officers used a variety of software tools that varied in both price and quality, with no standard toolset; this was evidenced by 20 questionnaire responses from 12 police forces within the UK. In an attempt to bring about standardisation, the College of Policing’s Research, Identifying and Tracing the Electronic Suspect (RITES) course recommended several capturing and productivity tools. Trainers on the RITES course, however, soon discovered the cognitive overload this placed on the cohort, who would often spend more time learning to use the tools than learning about open source research techniques. The problem highlighted above prompted the creation of the Open Source Internet Research Tool (OSIRT): an all-in-one browser for conducting open source research. OSIRT’s creation followed the user-centred design (UCD) method, with two phases of development using the software engineering methodologies ‘throwaway prototyping’, for the prototype version, and ‘incremental and iterative development’ for the release version. OSIRT has since been integrated into the RITES course, which trains over 100 officers a year and provides a feedback outlet for OSIRT. System Usability Scale questionnaires administered on RITES courses have shown OSIRT to be usable, with positive feedback. Beyond the RITES course, surveys, interviews and observations also show that OSIRT makes an impact on everyday policing and has reduced the burden officers faced when conducting open source research. OSIRT’s impact now reaches beyond the UK and sees usage across the globe.
    OSIRT contributes to law enforcement output in countries such as the USA, Canada, Australia and even Israel, demonstrating that OSIRT’s usefulness and necessity are not limited to UK law enforcement. This thesis makes several contributions, both academically and from a practical perspective, to law enforcement. The main contributions are:
    • Discussion and analysis of the constraints law enforcement within the UK face when conducting open source research, from a legal, ethical and procedural perspective.
    • Discussion, analysis and reflective discourse surrounding the development of a software tool for law enforcement and the challenges faced in what is a unique development.
    • An approach to collaborating with those who are in ‘closed’ environments, such as law enforcement, to create bespoke software. Additionally, this approach offers a method of measuring the value and usefulness of OSIRT with UK law enforcement.
    • The creation and integration of OSIRT into law enforcement and law enforcement training packages
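    For reference, System Usability Scale questionnaires such as those administered on the RITES course are scored with a standard formula; the sketch below shows that calculation with invented example responses (the thesis's actual data is not reproduced here):

```python
# Sketch: standard System Usability Scale (SUS) scoring. Each questionnaire
# has ten items answered on a 1-5 Likert scale; odd items contribute
# (response - 1), even items contribute (5 - response), and the sum is
# scaled by 2.5 to give a 0-100 score. The example responses are invented.
def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))   # -> 85.0
```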

    CuFA: A More Formal Definition for Digital Forensic Artifacts

    The term “artifact” currently does not have a formal definition within the domain of cyber/digital forensics, resulting in a lack of standardized reporting, linguistic understanding between professionals, and efficiency. In this paper we propose a new definition based on a survey we conducted, literature usage, prior definitions of the word itself, and similarities with archival science. This definition includes required fields that all artifacts must have and encompasses the notion of curation. Thus, we propose using a new term – curated forensic artifact (CuFA) – to address items which have been cleared for entry into a CuFA database (one implementation, the Artifact Genome Project, abbreviated as AGP, is under development and briefly outlined). An ontological model encapsulates these required fields while utilizing a lower-level taxonomic schema. We use the Cyber Observable eXpression (CybOX) project due to its rising popularity and rigorous classifications of forensic objects. Additionally, we suggest some improvements on its integration into our model and identify higher-level location categories to illustrate tracing an object from creation through investigative leads. Finally, a step-wise procedure for researching and logging CuFAs is devised to accompany the model
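    A minimal sketch of how a curated forensic artifact’s required fields might be represented in code; the fields and the example entry are illustrative and not the paper’s exact schema:

```python
# Sketch: one possible record type for a curated forensic artefact (CuFA).
# The field names here are illustrative; the paper defines the actual
# required fields and aligns them with a CybOX-based ontology.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CuFA:
    name: str                 # human-readable artefact name
    location: str             # where the artefact is found (e.g. a registry path)
    created_by: str           # application or process that produces it
    evidential_use: str       # investigative question the artefact helps answer
    references: list = field(default_factory=list)   # supporting literature/cases
    curated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

example = CuFA(
    name="Chrome visited URLs",
    location="History SQLite database in the user profile",
    created_by="Google Chrome",
    evidential_use="Reconstructing web browsing activity",
)
```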

    Remote access forensics for VNC and RDP on Windows platform

    There has been a greater implementation of remote access technologies in recent years. Many organisations are adopting remote technologies such as Virtual Network Computing (VNC) and Remote Desktop Protocol (RDP) applications as customer support applications. They use these applications to remotely configure computers and solve the client's computer and network issues on the spot. Therefore, the system administrator or desktop technician does not have to sit at the client computer physically to solve an issue. This increase in adoption of remote applications is of interest to forensic investigators, because illegal activities can be performed over the connection. The research will investigate whether remote protocols and applications produce and leave valuable artefacts behind on Windows systems. The research aims to determine and retrieve, in a forensic manner, any artefacts left behind by remote protocols and applications. Particular remote applications are selected for the research, and initial analysis will be performed on them to evaluate the potential forensic artefacts present on the computer system. The research will focus on Windows XP Service Packs 1, 2 & 3 for the analysis of the remote applications, to find out what artefacts, if any, are left behind on these systems
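    One commonly cited Windows artefact of RDP client usage is the Terminal Server Client registry key; the sketch below reads it with Python’s standard winreg module. It illustrates the kind of remnant such research looks for, and is not necessarily an artefact examined in this particular study, which targets Windows XP:

```python
# Sketch: list servers the current user has connected to with the built-in
# Windows RDP client, read from a commonly cited registry artefact.
# Windows-only; illustrative of the kind of artefact involved rather than
# the specific locations examined in the study.
import winreg

KEY = r"Software\Microsoft\Terminal Server Client\Servers"

def rdp_connection_targets():
    targets = []
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as servers:
            index = 0
            while True:
                try:
                    targets.append(winreg.EnumKey(servers, index))
                    index += 1
                except OSError:          # no more subkeys
                    break
    except FileNotFoundError:            # key absent: RDP client not used
        pass
    return targets

print(rdp_connection_targets())
```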