8 research outputs found

    Ensuring data confidentiality via plausibly deniable encryption and secure deletion – a survey

    Ensuring confidentiality of sensitive data is of paramount importance, since data leakage may not only endanger data owners’ privacy, but also ruin the reputation of businesses and violate regulations such as HIPAA and the Sarbanes-Oxley Act. To provide a confidentiality guarantee, the data should be protected while they are preserved in personal computing devices (i.e., confidentiality during their lifetime), and they should be rendered irrecoverable after they are removed from the devices (i.e., confidentiality after their lifetime). Encryption and secure deletion are used to ensure data confidentiality during and after their lifetime, respectively. This work performs a thorough literature review of the techniques used to protect the confidentiality of data in personal computing devices, covering both encryption and secure deletion. For encryption, we focus mainly on the novel plausibly deniable encryption (PDE), which can ensure data confidentiality against both a coercive attacker (i.e., one who can coerce the data owner for the decryption key) and a non-coercive attacker.
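The secure-deletion half of the survey's topic can be illustrated with a minimal sketch: overwrite a file's contents before unlinking it, so the data is not merely dereferenced but destroyed. This is an illustrative example, not a technique taken from the survey itself, and the `passes` parameter is an assumption.

```python
import os
import secrets

def secure_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents in place, then unlink it.

    Plain deletion only removes the directory entry; the data blocks
    remain recoverable until they are overwritten.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # random pattern each pass
            f.flush()
            os.fsync(f.fileno())  # push each pass to the storage device
    os.remove(path)
```

Note that on journaling file systems and flash media with wear leveling, in-place overwrites may not reach the original physical blocks, which is precisely the kind of complication such surveys examine.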

    A generic database forensic investigation process model

    Database forensic investigation is a domain that deals with database contents and their metadata to reveal malicious activities on database systems. Although still new, the overwhelming challenges and issues in the domain have made database forensics a fast-growing and much sought-after research area. Based on our observations, database forensics suffers from the lack of a common standard that could unify knowledge of the domain. Therefore, through this paper, we present the use of Design Science Research (DSR) as a research methodology to develop a Generic Database Forensic Investigation Process Model (DBFIPM). From the creation of the DBFIPM, five common forensic investigation processes have been proposed, namely the i) identification, ii) collection, iii) preservation, iv) analysis and v) presentation processes. The DBFIPM allows the reconciliation of concepts and terminologies of all common database forensic investigation processes. This will potentially facilitate the sharing of knowledge on database forensic investigation among domain stakeholders.
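The five ordered processes named in the abstract can be sketched as a fixed pipeline; only the phase names come from the abstract, and the handler mechanism is a hypothetical illustration of how such a process model might be operationalized.

```python
from typing import Any, Callable

# The five common processes proposed in the DBFIPM, in their fixed order.
PHASES: list[str] = [
    "identification",
    "collection",
    "preservation",
    "analysis",
    "presentation",
]

def run_investigation(handlers: dict[str, Callable[[Any], Any]], case: Any) -> Any:
    """Run each phase handler in DBFIPM order, threading the case state through."""
    for phase in PHASES:
        case = handlers[phase](case)
    return case
```

The fixed ordering is the point of a process model: each phase's output (e.g. preserved evidence) is the precondition of the next.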

    Evaluation and Identification of Authentic Smartphone Data

    Mobile technology continues to evolve in the 21st century, providing end-users with mobile devices that support improved capabilities and advanced functionality. This ever-improving technology allows smartphone platforms, such as Google Android and Apple iOS, to become prominent and popular among end-users. The reliance on and ubiquitous use of smartphones render these devices rich sources of digital data. This data becomes increasingly important when smartphones form part of regulatory matters, security incidents, or criminal or civil cases. Digital data is, however, susceptible to change and can be altered intentionally or accidentally by end-users or installed applications. It is therefore essential to evaluate the authenticity of data residing on smartphones before submitting the data as potential digital evidence. This thesis focuses on digital data found on smartphones that has been created by smartphone applications and the techniques that can be used to evaluate and identify authentic data. Identification of authentic smartphone data necessitates a better understanding of the smartphone, the related smartphone applications and the environment in which the smartphone operates. Derived from the conducted research and gathered knowledge are the requirements for authentic smartphone data. These requirements are captured in the smartphone data evaluation model to assist digital forensic professionals with the assessment of smartphone data. The smartphone data evaluation model, however, only stipulates how to evaluate the smartphone data and not what the outcome of the evaluation is. Therefore, a classification model is constructed using the identified requirements and the smartphone data evaluation model. The classification model presents a formal classification of the evaluated smartphone data, which is an ordered pair of values. The first value represents the grade of the authenticity of the data and the second value describes the completeness of the evaluation. Collectively, these models form the basis for the developed SADAC tool, a proof-of-concept digital forensic tool that assists with the evaluation and classification of smartphone data. To conclude, the evaluation and classification models are assessed to determine the effectiveness and efficiency of the models to evaluate and identify authentic smartphone data. The assessment involved two attack scenarios to manipulate smartphone data and the subsequent evaluation of the effects of these attack scenarios using the SADAC tool. The results produced by evaluating the smartphone data associated with each attack scenario confirmed that classification of the authenticity of smartphone data is feasible. Digital forensic professionals can use the provided models and developed SADAC tool to evaluate and identify authentic smartphone data. The outcome of this thesis provides a scientific and strategic approach for evaluating and identifying authentic smartphone data, offering needed assistance to digital forensic professionals. This research also adds to the field of digital forensics by providing insights into smartphone forensics, architectural components of smartphone applications and the nature of authentic smartphone data. Thesis (PhD), University of Pretoria, 2019. Computer Science. Unrestricted.
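The "ordered pair" classification described in the abstract can be sketched as a small value type. The pair structure (authenticity grade, evaluation completeness) is taken from the abstract; the concrete scales and field names below are illustrative assumptions, not the thesis's definitions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SmartphoneDataClassification:
    """An ordered pair classifying evaluated smartphone data.

    The scales are hypothetical: an integer authenticity grade and a
    completeness fraction in [0, 1] for how much of the evaluation
    could actually be performed.
    """
    authenticity_grade: int  # e.g. 0 (manipulated) .. 4 (fully authentic)
    completeness: float      # fraction of evaluation criteria that were checkable

    def as_pair(self) -> tuple[int, float]:
        """Return the formal ordered-pair representation."""
        return (self.authenticity_grade, self.completeness)
```

Keeping completeness separate from the grade lets a tool distinguish "looks authentic but only half the checks ran" from "authentic with a full evaluation".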

    Integrated examination and analysis model for improving mobile cloud forensic investigation

    Advanced forensic techniques become inevitable to investigate malicious activities in Cloud-based Mobile Applications (CMA). It is challenging to analyse case-specific evidential artifacts from the Mobile Cloud Computing (MCC) environment under forensically sound conditions. Mobile Cloud Investigation (MCI) encounters many research issues in tracing and fine-tuning the relevant evidential artifacts from the MCC environment. This research proposes an integrated Examination and Analysis (EA) model for a generalised application architecture of CMA deployable on the public cloud to trace the case-specific evidential artifacts. The proposed model effectively validates MCI and enhances the accuracy and speed of the investigation. In this context, the Forensic Examination and analysis methodology using Data mining (FED) and the Forensic Examination and analysis methodology using Data mining and Optimization (FEDO) models are proposed to address these issues. The FED incorporates key sub-phases such as timeline analysis, hash filtering, data carving, and data transformation to filter out case-specific artifacts. The Long Short-Term Memory (LSTM) assisted forensic methodology decides the amount of potential information to be retained for further investigation and categorizes the forensic evidential artifacts by their relevance to the crime event. Finally, the FED model constructs the forensic evidence taxonomy and maintains precision and recall above 85% for effective decision-making. FEDO facilitates cloud evidence handling by examining key features and indexing the evidence. The FEDO incorporates several sub-phases to precisely handle the evidence, such as evidence indexing, cross-referencing, and keyword searching. It analyses temporal and geographic information and performs cross-referencing to fine-tune the evidence towards the case-specific evidence. 
    FEDO models the Linearly Decreasing Weight (LDW) strategy based Particle Swarm Optimization (PSO) algorithm on the case-specific evidence to improve the searching capability of the investigation across the massive MCC environment. FEDO delivers an evidence tracing rate of 90%, and thus the integrated EA model ensures improved MCI performance.
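The LDW-PSO strategy named in the abstract is a standard variant of particle swarm optimization in which the inertia weight shrinks linearly over the run, trading early exploration for late exploitation. The sketch below shows that schedule on a generic objective function; it is not the FEDO implementation, and all parameter values are conventional defaults assumed for illustration.

```python
import random

def ldw_pso(f, dim=2, n_particles=20, iters=100,
            w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, bounds=(-5.0, 5.0)):
    """Minimize f with PSO using a Linearly Decreasing inertia Weight (LDW)."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best so far

    for t in range(iters):
        # LDW schedule: inertia shrinks linearly from w_max to w_min.
        w = w_max - (w_max - w_min) * t / (iters - 1)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In a search context like FEDO's, `f` would score how well a candidate matches the case-specific evidence; here any numeric objective (e.g. the sphere function) demonstrates the mechanics.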

    Introductory Computer Forensics

    INTERPOL (International Police) built cybercrime programs to keep up with emerging cyber threats, and aims to coordinate and assist international operations for fighting crimes involving computers. Although significant international efforts are being made in dealing with cybercrime and cyber-terrorism, finding effective, cooperative, and collaborative ways to deal with complicated cases that span multiple jurisdictions has proven difficult in practice.

    A comparison of open source and proprietary digital forensic software

    Scrutiny of the capabilities and accuracy of computer forensic tools is increasing as the number of incidents relying on digital evidence, and the weight of that evidence, increase. This thesis describes the capabilities of the leading proprietary and open source digital forensic tools. The capabilities of the tools were tested separately on digital media that had been formatted using Windows and Linux. Experiments were carried out with the intention of establishing whether the capabilities of open source computer forensic tools are similar to those of proprietary computer forensic tools, and whether these tools could complement one another. The tools were tested with regard to their capabilities to make and analyse digital forensic images in a forensically sound manner. The tests were carried out on each media type after deleting data from the media, and then repeated after formatting the media. The results of the experiments demonstrate that both proprietary and open source computer forensic tools have superior capabilities in different scenarios, and that the toolsets can be used to validate and complement one another. The implication of these findings is that investigators have an affordable means of validating their findings and are able to investigate digital media more effectively.
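The forensically sound imaging that both tool families were tested on reduces to one verifiable invariant: the image's cryptographic hash must match the hash computed at acquisition. A minimal sketch of that acquire-and-verify loop, not taken from any of the compared tools:

```python
import hashlib

def image_and_hash(src_path: str, dst_path: str, chunk_size: int = 1 << 20) -> str:
    """Copy a source device/file to an image file, hashing the stream as it is read.

    Returns the SHA-256 hex digest of the acquired data, recorded so the
    image can later be re-hashed and compared.
    """
    h = hashlib.sha256()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while chunk := src.read(chunk_size):
            h.update(chunk)
            dst.write(chunk)
    return h.hexdigest()

def verify_image(image_path: str, expected_digest: str, chunk_size: int = 1 << 20) -> bool:
    """Re-hash an image and compare it against the digest recorded at acquisition."""
    h = hashlib.sha256()
    with open(image_path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest() == expected_digest
```

Because both an open source and a proprietary tool computing the same digest over the same media must agree, this invariant is also what lets the two toolsets validate one another, as the thesis concludes.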