
    On the genesis of computer forensis

    This thesis presents a coherent set of research contributions to the new discipline of computer forensis. It analyses the emergence of computer forensis and defines the challenges facing the discipline, carries forward research advances in conventional methodology, introduces a novel approach to using virtual environments in forensis, and systemises the computer forensis body of knowledge, leading to the establishment of a tertiary curriculum. The emergence of computer forensis as a separate discipline of science was triggered by the evolution and growth of computer crime. Computer technology has reached a stage at which a conventional, mechanistic approach to collecting and analysing data is insufficient: the existing methodology must be formalised, and must embrace technologies and methods that enable the inclusion of transient data and live-system analysis. Further work is crucial to incorporate advances in related disciplines such as computer security and information systems audit, as well as developments in operating systems that make computer forensics concerns inherent in their design. For example, it is proposed that some of the features offered by persistent systems could be built into conventional operating systems to make illicit activities easier to identify and analyse. The analysis of permanent data storage is fundamental to computer forensics practice. Very little of the conventional computer forensics methodology is finalised, and much remains to be discovered. This thesis contributes to the formalisation and improved integrity of forensic handling of data storage by: formalising methods for data collection and analysis in the NTFS (Microsoft file system) environment; presenting a safe methodology for handling data backups that avoids information loss where Alternate Data Streams (ADS) are present; and formalising methods of hiding and extracting hidden and encrypted data.
A significant contribution of this thesis is in the application of virtualisation, the simulation of a computer in a virtual environment created by the underlying hardware and software, to computer forensics practice. Computer systems are not easily analysed for forensic purposes, and it is demonstrated that virtualisation applied in computer forensics allows more efficient and accurate identification and analysis of evidence. A new method is proposed in which two environments used in parallel can deliver faster, verifiable results that do not depend on proprietary, closed-source tools, and may lead to a gradual shift from commercial Windows software to open source software (OSS). The final contribution of this thesis is systemising the body of knowledge in computer forensics, a necessary condition for it to become an established discipline of science. This systemisation led to the design and development of a tertiary curriculum in computer forensics, illustrated here with a case study of the computer forensics major in the Bachelor of Computer Science at the University of Western Sydney. All genesis starts as an idea. A natural part of the scientific research process is replacing previous assumptions, concepts, and practices with new ones that better approximate the truth. This thesis advances the computer forensis body of knowledge in areas that are crucial to the further development of the discipline. Please note that the appendices to this thesis consist of separately published items which cannot be made available due to copyright restrictions. These items are listed in the PDF attachment for reference purposes.
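The safe-backup methodology described above aims to avoid silent information loss during data handling. As an illustrative sketch only (not the thesis's actual procedure), one OS-independent building block is a hash manifest taken before a backup and re-verified afterwards; on NTFS, the same idea would be applied per stream, since a naive copy can drop Alternate Data Streams:

```python
import hashlib
from pathlib import Path

def hash_file(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def build_manifest(root: Path) -> dict:
    """Map each file's relative path to its content digest."""
    return {str(p.relative_to(root)): hash_file(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def verify_backup(source: Path, backup: Path) -> list:
    """Return relative paths whose content was lost or altered in the backup.
    (On NTFS, a complete check would enumerate every named stream as well.)"""
    src, dst = build_manifest(source), build_manifest(backup)
    return [name for name, digest in src.items() if dst.get(name) != digest]
```

Any non-empty result signals that the backup is not a faithful copy and should not be relied upon as evidence.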

    Detection and Identification of Software Encryption Solutions in NT-based Microsoft Windows Operating Systems

    As encrypted information is very difficult or impossible to reconstruct, there are many situations in which it is critical to detect the presence of encryption software before a computer is shut down. Currently, no solution reliably identifies installed encryption software. For this investigation, thirty encryption software products for Microsoft Windows operating systems based on the NT kernel were identified and investigated. Operating-system-dependent factors, such as the registry, file attributes, operating system attributes, and process list analysis, and independent factors, such as file headers, keyword searches, Master Boot Record analysis, and hashing of software components, were investigated and allow the identification of these programs. The most reliable detection rate is achieved through a combination of the aforementioned factors.
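One of the operating-system-independent indicators of the kind this abstract describes can be sketched with a Shannon-entropy test: encrypted or compressed data is statistically close to uniform random bytes. This is a generic heuristic, not the thesis's detection method, and by itself it cannot distinguish encryption from compression, which is why the abstract stresses combining factors:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte; encrypted data approaches the maximum of 8.0."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    """A single heuristic indicator only; reliable detection needs several
    factors (file headers, registry entries, process lists, hashes, ...)."""
    return shannon_entropy(data) > threshold
```

Plain text typically scores around 4-5 bits per byte, so a high-entropy region with no recognised file header is a candidate for closer inspection.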

    Introductory Computer Forensics

    INTERPOL (the International Criminal Police Organization) has built cybercrime programs to keep up with emerging cyber threats, and aims to coordinate and assist international operations for fighting crimes involving computers. Although significant international efforts are being made in dealing with cybercrime and cyber-terrorism, finding effective, cooperative, and collaborative ways to deal with complicated cases that span multiple jurisdictions has proven difficult in practice.

    Digital evidence bags

    This thesis analyses the traditional approach and methodology used to conduct digital forensic information capture, analysis and investigation. The predominant toolsets and utilities that are used, and the features they provide, are reviewed. This review highlights the difficulties encountered due to both technological advances and the methodologies employed, difficulties compounded by the archaic methods and proprietary formats in use. An alternative framework for the capture and storage of information used in digital forensics is defined, named the `Digital Evidence Bag' (DEB). A DEB is a universal, extensible container for the storage of digital information acquired from any digital source; its format can be adapted to meet the requirements of the particular information to be stored. The format definition is extensible, allowing it to encompass new sources of data, cryptographic and compression algorithms, and protocols as they are developed, whilst also providing the flexibility for some degree of backwards compatibility as the format evolves. The DEB framework uses terminology for its various components that is analogous to the evidence bags, tags and seals used for traditional physical evidence storage and continuity. This is crucial for ensuring that the functionality provided by each component is comprehensible to the general public, the judiciary and law enforcement personnel, without detracting from or obscuring the evidential information contained within. Furthermore, information can be acquired from a dynamic or a more traditional static environment, and from a disparate range of digital devices. The flexibility of the DEB framework permits selective and/or intelligent acquisition methods to be employed, together with enhanced provenance and continuity audit trails. Evidential integrity is assured using accepted cryptographic techniques and algorithms.
The framework is implemented in a number of tool demonstrators and applied to typical scenarios that illustrate the flexibility of the DEB framework and format; it has also formed the basis of a patent application.
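The tag/seal analogy above can be sketched as a toy container: metadata (the "tag"), a payload, and a cryptographic "seal" over both. This is a minimal illustration of the concept, not the actual DEB format, whose definition is far richer (extensible metadata, audit trails, multiple algorithms):

```python
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class EvidenceBag:
    """Toy analogue of a DEB: a tag (metadata), a payload, and a seal."""
    tag: dict
    payload: bytes
    seal: str = field(init=False)

    def __post_init__(self):
        self.seal = self._digest()  # seal the bag at creation time

    def _digest(self) -> str:
        # Canonicalise the tag so the digest is stable across dict orderings.
        meta = json.dumps(self.tag, sort_keys=True).encode()
        return hashlib.sha256(meta + self.payload).hexdigest()

    def verify(self) -> bool:
        """True if neither tag nor payload has changed since sealing."""
        return self.seal == self._digest()
```

The point of the seal is continuity: any later change to the tag or the payload is detectable by recomputing the digest.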

    The development of an open-source forensics platform

    The rate at which technology evolves far outpaces the rate at which methods are developed to prevent and prosecute digital crime. This unfortunate situation may allow computer criminals to commit crimes using technologies for which no proper forensic investigative technique yet exists; such a scenario would ultimately allow criminals to go free for lack of evidence to prove their guilt. A solution would be for law enforcement agencies and governments to invest in the research and development of forensic technologies in an attempt to keep pace with the development of digital technologies. Such an investment could allow new forensic techniques to be developed and released more frequently, matching the appearance of new computing devices on the market. A key element in improving the situation is to produce more research results, using fewer resources, by performing research more efficiently. This can be achieved by improving the process used to conduct forensic research. One problem area in research and development is the construction of prototypes to prove a concept or test a hypothesis. An in-depth understanding of highly technical aspects of operating systems, such as file system structures and memory management, is required for forensic researchers to develop prototypes that prove their theories and techniques. The development of such prototypes is extremely challenging, complicated by minute details that, if ignored, may harm the accuracy of the results produced. If some of these complexities could be removed from the equation, researchers might produce more and better results with less effort, ultimately speeding up the forensic research process.
This dissertation describes the development of a platform that facilitates the rapid development of forensic prototypes, allowing researchers to produce such prototypes with less time and fewer resources. The purpose of the platform is to provide a rich set of features likely to be required by developers performing research prototyping, so that prototypes can be developed with fewer resources and at a faster pace. The development of the platform, as well as the various considerations that shaped its architecture and design, are the focus of this dissertation. Topics such as digital forensic investigations, open-source software development, and the development of the proposed forensic platform are discussed. The dissertation also serves as a proof of concept for the platform: the development of a selection of forensic prototypes, and the results obtained, are discussed as well. Dissertation (MSc), University of Pretoria, 2009.
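The idea of a platform absorbing low-level complexity so that researchers only write analysis logic can be illustrated with a hypothetical plugin registry; this is a generic sketch of the architectural pattern, not the actual platform's API:

```python
# Hypothetical sketch: the platform owns the registry and the plumbing
# (imaging, file system parsing, memory access); a researcher's prototype
# is just a registered function over raw bytes.
ANALYZERS = {}

def analyzer(name):
    """Decorator that registers a prototype analysis function by name."""
    def wrap(fn):
        ANALYZERS[name] = fn
        return fn
    return wrap

@analyzer("strings")
def ascii_strings(data: bytes, min_len: int = 4):
    """Trivial example prototype: extract printable ASCII runs."""
    out, run = [], bytearray()
    for b in data:
        if 32 <= b < 127:
            run.append(b)
        else:
            if len(run) >= min_len:
                out.append(run.decode())
            run = bytearray()
    if len(run) >= min_len:
        out.append(run.decode())
    return out

def run_analyzer(name, data):
    """The platform dispatches evidence data to a registered prototype."""
    return ANALYZERS[name](data)
```

The researcher never touches file system or memory internals; swapping in a new prototype is one decorated function.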

    ThumbScan: A lightweight thumbnail search tool

    Since the introduction of Windows 95B, Microsoft users have been able to select a thumbnail view of any system folder. This option prompts the operating system to create a miniature preview of each file. By default, these generated images are archived to a local thumbnail database for quick system retrieval. Once an image is placed in the database, it is never removed. By viewing the contents of thumbnail databases, a forensic investigator can easily examine the past and present media of a given system. Though this cache is not a perfect record, it is a good indicator of media storage locations and habits. For these reasons, we present ThumbScan, an automated search tool for locating and analyzing the archived thumbnails of modern Windows systems.
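The first step of any such tool is locating candidate cache files. A minimal sketch (not ThumbScan itself) walks a tree for the well-known Windows cache names: per-folder `Thumbs.db` on older systems and the centralized `thumbcache_*.db` files on Vista and later:

```python
import fnmatch
import os

# Well-known Windows thumbnail cache file names; parsing the records
# inside these databases is a separate, format-specific step.
CACHE_PATTERNS = ("Thumbs.db", "thumbcache_*.db")

def find_thumbnail_caches(root: str) -> list:
    """Walk a directory tree and return paths of candidate cache files."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if any(fnmatch.fnmatch(name, pat) for pat in CACHE_PATTERNS):
                hits.append(os.path.join(dirpath, name))
    return sorted(hits)
```

In practice this would run over a mounted forensic image rather than a live system, so the cache files themselves are never modified.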

    IPCFA: A Methodology for Acquiring Forensically-Sound Digital Evidence in the Realm of IAAS Public Cloud Deployments

    Cybercrimes and digital security breaches are on the rise; savvy businesses and organizations of all sizes must ready themselves for the worst. Cloud computing has become the new normal, opening even more doors for cybercriminals to commit crimes that are not easily traceable. The fast pace of technology adoption exceeds the speed at which the cybersecurity community and law enforcement agencies (LEAs) can invent countermeasures to investigate and prosecute such criminals. While presenting defensible digital evidence in courts of law is already complex, it becomes more complicated when the crime is tied to public cloud computing, where storage, network, and computing resources are shared and dispersed over multiple geographical areas. Investigating such crimes involves collecting evidence from the public cloud in a manner that is sound for court. Digital evidence admissibility in the U.S. is governed predominantly by the Federal Rules of Evidence and the Federal Rules of Civil Procedure. Evidence authenticity can be challenged under the Daubert test, which evaluates the forensic process used to generate the presented evidence. Existing digital forensics models, methodologies, and processes have not adequately addressed crimes that take place in the public cloud. Only in late 2020 did the Scientific Working Group on Digital Evidence (SWGDE) publish a document shedding light on best practices for collecting evidence from cloud providers, yet that publication does not address the gap between the technology and the legal system when it comes to evidence admissibility: the document is high level, focused on law enforcement processes such as issuing subpoenas and preservation orders to the cloud provider. This research proposes IaaS Public Cloud Forensic Acquisition (IPCFA), a methodology for acquiring forensically sound evidence from public cloud IaaS deployments.
IPCFA focuses on bridging the gap between the legal and technical sides of evidence authenticity, helping to produce admissible evidence that can withstand scrutiny in U.S. courts. Grounded in design science research (DSR), the work is rigorously evaluated using two hypothetical scenarios of crimes that take place in the public cloud. The first scenario takes place in AWS and is walked through hypothetically; the second demonstrates IPCFA's applicability and effectiveness on Azure Cloud. Both cases are evaluated using a rubric built from the federal and civil digital evidence requirements and international best practices for digital evidence, showing the effectiveness of IPCFA in generating cloud evidence sound enough to be considered admissible in court.
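Authenticity of cloud-acquired evidence hinges on a verifiable record of every acquisition step. As a generic illustration of that requirement (not IPCFA's own procedure), a custody log can be hash-chained so that altering any earlier entry invalidates every entry after it:

```python
import hashlib
import json

def chain_entries(entries: list) -> list:
    """Link custody log entries; any later edit breaks all following hashes."""
    prev = "0" * 64  # genesis value for the first link
    out = []
    for entry in entries:
        body = json.dumps(entry, sort_keys=True)  # canonical form
        digest = hashlib.sha256((prev + body).encode()).hexdigest()
        out.append({**entry, "prev": prev, "hash": digest})
        prev = digest
    return out

def chain_valid(chained: list) -> bool:
    """Recompute every link; False if any entry was altered after the fact."""
    prev = "0" * 64
    for rec in chained:
        entry = {k: v for k, v in rec.items() if k not in ("prev", "hash")}
        body = json.dumps(entry, sort_keys=True)
        expect = hashlib.sha256((prev + body).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expect:
            return False
        prev = rec["hash"]
    return True
```

A log like this gives an examiner something concrete to present when the forensic process itself is challenged, e.g. under the Daubert test.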

    Cognitive and affective processes associated with moral reasoning, and their relationship with behaviour in typical development

    Objective: Moral reasoning (MR) reflects rationalisation in the moral domain, which matures across development and is underpinned by cognitive and affective processes. Although MR is associated with offending behaviours, the mechanisms for this association are unknown. Examining the role of cognitive and affective processes in MR, and their influence on behaviour, may enhance existing psychological interventions that aim to reduce offending behaviours, and facilitate the development of novel targeted interventions. The current study investigated the hypothesis that MR would mediate the relationship between executive functions (EFs) and behaviour, and between empathy and behaviour. Method: In a cross-sectional design, typically developing adolescents (n = 72) individually completed an assessment battery, including the sociomoral reflection measure-short form, neuropsychological measures of working memory and cognitive flexibility/inhibition, and self-report questionnaires of empathy and behaviour. The battery also contained an assessment of intellectual functioning, and collected data on socioeconomic status and age as confounding variables. Results: MR was not associated with self-reported behaviour and, therefore, did not mediate the relationship between EFs/empathy and self-reported behaviour. A novel relationship was demonstrated between working memory and MR, and cognitive flexibility/inhibition was associated with MR. Self-reported empathy was not associated with MR. Exploratory analyses suggested that intelligence and EFs were significant unique predictors of MR, and that truth and law moral values were associated with self-reported behavioural difficulties. Conclusions: Findings suggest that global MR is not associated with self-reported behaviour in typically developing adolescents; however, there may be an association between some moral values and self-reported behaviour.
Findings also suggested that empathy is not associated with MR in this population, which warrants further investigation. These findings have implications for theoretical models of MR and for psychological intervention programmes. Recommendations for future research are presented.

    An Evaluation of Forensic Tools for Linux : Emphasizing EnCase and PyFlag

    This master's thesis provides an evaluation and comparison of several computer forensic tools, with a particular focus on two specific tools. The first is EnCase Forensics, a commercially available tool used by police and government agencies in many parts of the world. The second is PyFlag, an open-source alternative that was used in the winning submission to the Digital Forensics Research Workshop (DFRWS) challenge in 2008. Although the tools are evaluated as a whole, the main focus is on important search functionality. Considering that most research in the field is based on the Microsoft Windows platform, while less research has been carried out on the analysis of Linux systems, we examine these tools primarily in a Linux environment. With these tools we perform forensic acquisition and analysis of realistic data. In addition, a tool named dd is used to acquire data from Linux. This thesis contains detailed test procedures, problems encountered during the testing itself, and the final results.
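The `dd` acquisition step mentioned above copies a raw source block by block (e.g. `dd if=/dev/sda of=image.dd bs=4M conv=noerror,sync`). The same idea can be sketched in Python, with the digest computed during the copy so the image can be verified against the original; this is an illustration of the principle, not a replacement for a validated imaging tool:

```python
import hashlib

def image_device(source: str, image: str, block_size: int = 4 * 1024 * 1024) -> str:
    """Copy a raw source to an image file block by block (what `dd` does),
    hashing on the fly so the resulting image can be verified later."""
    h = hashlib.sha256()
    with open(source, "rb") as src, open(image, "wb") as dst:
        while True:
            block = src.read(block_size)
            if not block:
                break
            dst.write(block)
            h.update(block)
    return h.hexdigest()
```

In a real acquisition the source would be a block device opened read-only, ideally behind a hardware write blocker, and the digest would be recorded in the case notes.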