
    Evaluation and Identification of Authentic Smartphone Data

    Mobile technology continues to evolve in the 21st century, providing end-users with mobile devices that support improved capabilities and advanced functionality. This ever-improving technology allows smartphone platforms, such as Google Android and Apple iOS, to become prominent and popular among end-users. The reliance on and ubiquitous use of smartphones render these devices rich sources of digital data. This data becomes increasingly important when smartphones form part of regulatory matters, security incidents, or criminal or civil cases. Digital data is, however, susceptible to change and can be altered intentionally or accidentally by end-users or installed applications. It is therefore essential to evaluate the authenticity of data residing on smartphones before submitting the data as potential digital evidence. This thesis focuses on digital data found on smartphones that has been created by smartphone applications, and on the techniques that can be used to evaluate and identify authentic data. Identification of authentic smartphone data necessitates a better understanding of the smartphone, the related smartphone applications and the environment in which the smartphone operates. From the conducted research and gathered knowledge, the requirements for authentic smartphone data are derived. These requirements are captured in the smartphone data evaluation model to assist digital forensic professionals with the assessment of smartphone data. The smartphone data evaluation model, however, only stipulates how to evaluate the smartphone data, not what the outcome of the evaluation is. Therefore, a classification model is constructed using the identified requirements and the smartphone data evaluation model. The classification model presents a formal classification of the evaluated smartphone data as an ordered pair of values.
    The first value represents the grade of the authenticity of the data and the second value describes the completeness of the evaluation. Collectively, these models form the basis for the developed SADAC tool, a proof-of-concept digital forensic tool that assists with the evaluation and classification of smartphone data. To conclude, the evaluation and classification models are assessed to determine their effectiveness and efficiency in evaluating and identifying authentic smartphone data. The assessment involved two attack scenarios that manipulate smartphone data, and the subsequent evaluation of the effects of these attack scenarios using the SADAC tool. The results produced by evaluating the smartphone data associated with each attack scenario confirmed that classification of the authenticity of smartphone data is feasible. Digital forensic professionals can use the provided models and the developed SADAC tool to evaluate and identify authentic smartphone data. The outcome of this thesis provides a scientific and strategic approach for evaluating and identifying authentic smartphone data, offering needed assistance to digital forensic professionals. This research also adds to the field of digital forensics by providing insights into smartphone forensics, the architectural components of smartphone applications and the nature of authentic smartphone data. Thesis (PhD)--University of Pretoria, 2019. Computer Science. PhD. Unrestricted.
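The formal classification described above is an ordered pair of (authenticity grade, evaluation completeness). As a minimal illustration of that structure — the field names and grading scale here are assumptions chosen for the example, not the SADAC tool's actual implementation — it might be represented as:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SmartphoneDataClassification:
    """Ordered pair: (authenticity grade, completeness of the evaluation)."""
    authenticity_grade: str  # illustrative scale, e.g. "authentic" / "suspect" / "inauthentic"
    completeness: float      # fraction of evaluation requirements that could be checked

    def as_pair(self):
        return (self.authenticity_grade, self.completeness)

# A hypothetical evaluation in which 7 of 8 requirements could be assessed:
result = SmartphoneDataClassification("authentic", 7 / 8)
assert result.as_pair() == ("authentic", 0.875)
```

Keeping the completeness value alongside the grade lets an examiner distinguish "judged authentic after a full evaluation" from "judged authentic, but only part of the evaluation was possible".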

    Methodology and automated metadata extraction from multiple volume shadow copies

    Modern day digital forensics investigations rely on timelines as a principal method for normalizing and chronologically categorizing artifacts recovered from computer systems. Timelines provide investigators with a chronological representation of digital evidence so they can depict altered and unaltered digital forensics data in-context to drive conclusions about system events and/or user activities. While investigators rely on many system artifacts such as file system time/date stamps, operating system artifacts, program artifacts, logs, and/or registry artifacts as input for deriving chronological representations, using only the available or most recent version of the artifacts may provide a limited picture of historical changes on a system. For instance, if previous versions of artifacts and/or previous artifact metadata changes are overwritten and/or are not retained on a system, analysis of current versions of artifacts and artifact metadata, such as time/date stamps and operating system/program/registry artifacts, may provide only a limited picture of activities for the system. Recently, the Microsoft Windows Operating System implemented a backup mechanism that is capable of retaining multiple versions of data storage units for a system, effectively providing a highly-detailed record of system changes. This backup mechanism, the Windows Volume Shadow Copy Service (VSS), exists as a service of modern Microsoft Windows Operating Systems and allows data backups to be performed while applications on a system continue to write to the system's live volume(s). This allows a running system to preserve the system's state to backup media at any given point while the system continues to change in real-time.
    After multiple VSS backups are recorded, digital investigators now have the ability to incorporate multiple versions of a system's artifacts into a chronological representation, which provides a more comprehensive picture of the system's historical changes. In order to effectively incorporate VSS backup, or Volume Shadow Copy (VSC), data into a chronological representation, the data must be accessed and extracted in a consistent, repeatable, and, if possible, automated manner. Previous efforts have produced a variety of manual and semi-automated methods for accessing and extracting VSC data in a repeatable manner. These methods are time consuming and often require significant storage resources if dealing with multiple VSCs. The product of this research effort is the advancement of the methodology to automate accessing and extracting directory-tree and file attribute metadata from multiple VSCs of the Windows 7 Operating System. The approach extracts metadata from multiple VSCs and combines it as one conglomerate data set. By capturing the historical changes recorded within VSC metadata, this approach enhances timeline generation. Additionally, it supports other projects which could use the metadata to visualize change-over-time by depicting how the individual metadata and the conglomerate data set changed (or remained unchanged) throughout an arbitrary snapshot of time.
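On Windows, shadow copies can be enumerated with `vssadmin list shadows` and exposed as directory trees via symbolic links to `\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopyN\`. Once each VSC is reachable as a directory, the extraction-and-combination step described above reduces to walking each tree and tagging every metadata row with its snapshot of origin. The sketch below is a simplified, platform-neutral illustration of that step, not the thesis's actual tool:

```python
import os
from datetime import datetime, timezone

def extract_metadata(root, snapshot_id):
    """Walk one (mounted) volume shadow copy, collecting per-file metadata."""
    rows = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            rows.append({
                "snapshot": snapshot_id,
                "path": os.path.relpath(path, root),
                "size": st.st_size,
                "mtime": datetime.fromtimestamp(st.st_mtime, tz=timezone.utc),
            })
    return rows

def conglomerate(snapshots):
    """Combine rows from every snapshot into one conglomerate data set,
    so change-over-time for a single path can later be reconstructed."""
    rows = []
    for snapshot_id, root in snapshots:
        rows.extend(extract_metadata(root, snapshot_id))
    return rows
```

Because every row carries its snapshot identifier, grouping the conglomerate by `path` yields the per-file history that timeline generation needs.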

    Introductory Computer Forensics

    INTERPOL (the International Criminal Police Organization) has built cybercrime programs to keep up with emerging cyber threats, and aims to coordinate and assist international operations for fighting crimes involving computers. Although significant international efforts are being made in dealing with cybercrime and cyber-terrorism, finding effective, cooperative, and collaborative ways to deal with complicated cases that span multiple jurisdictions has proven difficult in practice.

    Toolbox application to support and enhance the mobile device forensics investigation process - breaking through the techniques available

    Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence. One of the main topics discussed today is how a person can leverage technology in a positive and secure way to enhance daily life, making it healthier, more productive, more joyful and easier. However, improvements in technology bring challenges for which there is not yet a stable and safe way to cope. One of the greatest challenges people face concerns their privacy and the safeguarding of the sensitive information stored on any device they use. In fact, one of the most used technologies is the mobile device, which can take several forms, features, shapes, and many other components. At the same time, cybercrime is growing rapidly, targeting the exploitation and retrieval of information from these gadgets. Mobile devices also bring several challenges, including a rapidly changing landscape, an ever-increasing diversity of mobile phone forms, and the integration of the information on a mobile device into the Cloud and IoT. As such, it is vital to have a stable and safe toolbox that enables a digital investigator to potentially prevent, detect and solve any issue related to Mobile Device Forensics while working through various investigations, be they criminal, civil, corporate or any other kind.

    Tamper-Evident Data Provenance

    Data Provenance describes what has happened to a user's data within a machine, as a form of digital evidence. However, this type of evidence is currently not admissible in courts of law, because the integrity of data provenance cannot be guaranteed. Tools which capture data provenance must either prevent, or be able to detect, changes to the information they produce, i.e. be tamper-proof or tamper-evident. Most current tools aim to be tamper-evident, and capture data provenance at kernel level or higher. However, these tools do not provide a secure mechanism for transferring data provenance to a centralised location while providing data integrity and confidentiality. In this thesis we propose a tamper-evident framework to fill this gap by using a widely-available hardware security chip: the Trusted Platform Module (TPM). We apply our framework to Progger, a cloud-based provenance logger, and demonstrate the completeness, confidentiality and admissibility requirements for data provenance, enabling the information to be used as digital evidence in courts of law.
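The thesis achieves tamper evidence with hardware support from the TPM; as a software-only analogy of the underlying idea — an analogy only, not Progger's actual mechanism — a hash chain over log entries makes any later modification of an earlier entry detectable at verification time:

```python
import hashlib, json

def append_entry(log, event):
    """Append an event; its digest covers the previous entry's digest,
    chaining every entry to all entries before it."""
    prev = log[-1]["digest"] if log else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    log.append({"event": event, "digest": digest})
    return log

def verify_chain(log):
    """Recompute every digest; any edited or reordered entry breaks the chain."""
    prev = "0" * 64
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True)
        if hashlib.sha256((prev + payload).encode()).hexdigest() != entry["digest"]:
            return False
        prev = entry["digest"]
    return True
```

The scheme is tamper-evident rather than tamper-proof: an attacker can still alter the log, but cannot do so without the verification step failing — which is why anchoring the chain head in hardware such as a TPM strengthens the guarantee.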

    Hardware-supported malware analysis (Análise de malware com suporte de hardware)

    Advisors: Paulo Lício de Geus, André Ricardo Abed Grégio. Dissertation (Master's) - Universidade Estadual de Campinas, Instituto de Computação. Today's world is driven by the usage of computer systems, which are present in all aspects of everyday life. Therefore, the correct working of these systems is essential to ensure the maintenance of the possibilities brought about by technological developments. However, ensuring the correct working of such systems is not an easy task, as many people attempt to subvert systems for their own benefit. The most common kind of subversion against computer systems is the malware attack, which can give an attacker complete control of a machine. The fight against this kind of threat is based on analysis of the collected malicious artifacts, allowing incident response and the development of future countermeasures. However, attackers have specialized in circumventing analysis systems and thus keeping their operations active. For this purpose, they employ a series of techniques called anti-analysis, able to prevent the inspection of their malicious code. Among these techniques, I highlight evasion of the analysis procedure, that is, the use of samples able to detect the presence of an analysis solution and then hide their malicious behavior. Evasive samples have become popular, and their impact on systems security is considerable, since analyses once performed automatically now require human supervision in order to find evasion signs, which significantly raises the cost of maintaining a protected system.
    The most common ways of detecting an analysis environment are: (i) injected-code detection, since injection is used by analysts to inspect applications; (ii) virtual-machine detection, since virtual machines are used in analysis environments for scalability reasons; (iii) detection of execution side effects, usually caused by emulators, also used by analysts. To handle evasive malware, analysts have relied on so-called transparent techniques, that is, those which do not require code injection nor cause execution side effects. A way to achieve transparency in an analysis process is to rely on hardware support. This work therefore covers the application of hardware support for the analysis of evasive threats. In the course of this text, I present an assessment of existing hardware support technologies, including hardware virtual machines, BIOS support, performance monitors and PCI cards. My critical evaluation of such technologies provides a basis for comparing different usage cases. In addition, I pinpoint development gaps that currently exist. More than that, I fill one of these gaps by proposing to expand the usage of performance monitors for malware monitoring purposes. More specifically, I propose the usage of the BTS monitor for the purpose of developing a tracer and a debugger. The proposed framework is also able to deal with ROP attacks, one of the techniques most commonly used for vulnerability exploitation. The framework evaluation shows that no side effects are introduced, thus allowing transparent analysis. Making use of this capability, I demonstrate how protected applications can be inspected and how evasion techniques can be identified. Dissertation (Master's). Computer Science. CAPES.
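Intel's BTS (Branch Trace Store) facility records source/destination address pairs for executed branches. One common heuristic built on such a trace — sketched here under simplifying assumptions (branch kinds already classified, a fixed 5-byte near-call encoding), and not the framework's actual detector — is a shadow call stack that flags `ret` branches whose target was never set up by a matching `call`, the signature of a ROP chain:

```python
def flag_rop_like_returns(trace):
    """Flag 'ret' branches whose target does not match the return address
    pushed by the corresponding 'call' (shadow-call-stack heuristic).
    trace: list of (kind, from_addr, to_addr) tuples, kind 'call' or 'ret'."""
    shadow_stack, suspicious = [], []
    for kind, frm, to in trace:
        if kind == "call":
            # Return address = instruction after the call; a 5-byte near call
            # is an illustrative assumption (real decoding inspects the opcode).
            shadow_stack.append(frm + 5)
        elif kind == "ret":
            expected = shadow_stack.pop() if shadow_stack else None
            if to != expected:
                suspicious.append((frm, to))
    return suspicious

# A legitimate call/return pair vs. a ROP-style return into a gadget:
assert flag_rop_like_returns([("call", 100, 500), ("ret", 520, 105)]) == []
assert flag_rop_like_returns([("call", 100, 500), ("ret", 520, 9999)]) == [(520, 9999)]
```

Because BTS writes the branch records itself, this kind of check can run on the recorded trace after the fact, without injecting code into the monitored process.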

    Analysing system behaviour by automatic benchmarking of system-level provenance

    Provenance is a term originating from the world of art. It aims to provide a chain of information about a piece of art from its creation to its current status, recording all the historic information relating to the piece, including storage locations, ownership, purchase prices, etc. It has a very similar definition in data processing and computer science, where it is used as the lineage of data, providing either reproducibility or the tracing of activities happening at runtime for different purposes. Like provenance in art, provenance in computer science and data processing describes how a piece of data was created, passed around, modified, and reached its current state. It also provides information on who is responsible for certain activities, and acts as metadata on components in a computer environment. As the concept of provenance is to record all information related to some data, the size of the provenance itself is generally proportional to the amount of data processing that took place. It tends to be a large set of data and is hard to analyse. Also, in the provenance collecting process, not all information is useful for all purposes. For example, if we just want to trace all previous owners of a file, then the storage location information may be ignored. To capture useful information without needing to handle a large amount of it, researchers and developers have built provenance recording tools that only record the information needed by particular applications, with different means and mechanisms throughout the systems. This yields a lighter set of information for analysis, but it results in non-standard provenance information, and general users may not have a clear view of which tools are better for particular purposes. For example, if we want to identify, for security analysis, whether certain action sequences have been performed in a process and who is accountable for these actions, we have no idea which tools should be trusted to provide the correct set of information. It is also hard to compare the tools, as there is little common standard. With these needs in mind, this thesis concentrates on providing an automated system, ProvMark, to benchmark the tools. This helps to show the strengths and weaknesses of their provenance results in different scenarios. It also allows tool developers to verify their tools, and allows end-users to compare the tools on the same level and choose a suitable one for their purpose. As a whole, benchmarking based on the expressiveness of the tools in different scenarios shows the right choice of provenance tool for a specific usage.
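A benchmark of this kind can derive a tool's "benchmark graph" for a target activity by recording provenance with and without that activity and subtracting the common background. The sketch below reduces that step to a set difference over labeled edges — a deliberate simplification for illustration, not ProvMark's actual graph-matching machinery:

```python
def benchmark_graph(background_edges, target_edges):
    """Edges that appear only when the target activity runs; each edge is a
    (src_node, relation, dst_node) triple. Set difference stands in for the
    graph-matching step a real benchmark would use."""
    return target_edges - background_edges

def expressiveness(tool_benchmarks):
    """Count how many scenarios a tool distinguishes at all, i.e. how many
    produced a nonempty benchmark graph."""
    return sum(1 for edges in tool_benchmarks if edges)

# Hypothetical recordings: the tool only emits an extra edge when a file is written.
background = {("proc:1", "used", "file:lib.so")}
with_write = background | {("file:out.txt", "wasGeneratedBy", "proc:1")}
assert benchmark_graph(background, with_write) == {("file:out.txt", "wasGeneratedBy", "proc:1")}
assert expressiveness([benchmark_graph(background, with_write), set()]) == 1
```

An empty benchmark graph for a scenario means the tool records nothing distinctive for that activity — exactly the kind of blind spot the comparison is meant to expose.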