
    A framework for the forensic investigation of unstructured email relationship data

    Our continued reliance on email communications ensures that it remains a major source of evidence during a digital investigation. Emails comprise both structured and unstructured data. Structured data provides qualitative information to the forensics examiner and is typically viewed through existing tools. Unstructured data is more complex, as it comprises information associated with social networks, such as relationships within the network, identification of key actors and power relations, and there are currently no standardised tools for its forensic analysis. Moreover, email investigations may involve many hundreds of actors and thousands of messages. This paper posits a framework for the forensic investigation of email data. In particular, it focuses on the triage and analysis of unstructured data to identify key actors and relationships within an email network. The paper demonstrates the applicability of the approach by applying relevant stages of the framework to the Enron email corpus. It illustrates the advantage of triaging this data to identify (and discount) actors and potential sources of further evidence, and then applies social network analysis techniques to key actors within the data set. Finally, the paper argues that visualisation of unstructured data can greatly aid the examiner in their analysis of evidence discovered during an investigation.

    Lines-of-inquiry and sources of evidence in work-based research

    There is synergy between the investigative practices of police detectives and social scientists, including work-based researchers. They both develop lines-of-inquiry and draw on multiple sources of evidence in order to make inferences about people, trends and phenomena. However, the principles associated with lines-of-inquiry and sources of evidence have not so far been examined in relation to work-based research methods, which are often unexplored or ill-defined in the published literature. We explore this gap by examining the various direct and indirect lines-of-inquiry and the main sources of primary and secondary evidence used in work-based research, which is especially relevant because some work-based researchers are also police detectives. Clearer understanding of these intersections will be useful in emerging professional contexts where the work-based researcher, the detective, and the social scientist cohere in the one person and their research project. The case we examined was a Professional Studies programme at a university in Australia, which has many police detectives doing work-based research, and from their experience we conclude there is synergy between work-based research and lines-of-inquiry. Specifically, in the context of research methods, we identify seven sources of evidence: 1) creative, unstructured, and semi-structured interviews; 2) structured interviews; 3) consensus group methods; 4) surveys; 5) documentation and archives; 6) direct observations and participant observations; and 7) physical or cultural artefacts, and show their methodological features related to data and method type, reliability, validity, and types of analysis, along with their respective advantages and disadvantages. This study thereby unpacks and isolates those characteristics of work-based research which are relevant to a growing body of literature related to the messy, co-produced and wicked problems of private companies, government agencies, and non-government organisations and the research methods used to investigate them.

    Procedures and tools for acquisition and analysis of volatile memory on android smartphones

    Mobile phone forensics has become more prominent since mobile phones have become ubiquitous in both personal and business practice. Android smartphones show tremendous growth in global market share. Many researchers and works describe procedures and techniques for the acquisition and analysis of the non-volatile memory in mobile phones. On the other hand, the physical memory (RAM) on a smartphone might retain incriminating evidence that could be acquired and analysed by the examiner. This study reveals the proper procedure for acquiring the volatile memory of an Android smartphone and discusses the use of Linux Memory Extractor (LiME) for dumping the volatile memory. The study also discusses the analysis process of the memory image with Volatility 2.3, in particular the analysis capabilities the tool provides. Despite their advancement, there are two major concerns with both tools. First, the examiner has to gain root privileges before executing LiME. Second, neither tool offers a generic solution or approach. On the other hand, there is currently no other tool or option that might give the same result as LiME and Volatility 2.3.
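    To make the acquisition-and-analysis pipeline above concrete: when LiME is loaded with `format=lime`, it prefixes each captured RAM segment with a fixed 32-byte little-endian header (magic, version, start address, end address, reserved bytes), which analysis tools such as Volatility read to locate the segments. The sketch below parses such a header from a synthetic byte string; the addresses are invented, not from a real dump, and the field layout is taken from the LiME format as I understand it rather than from this study.

```python
import struct

# LiME "lime" format: each RAM segment carries a 32-byte little-endian
# header of magic, version, start address, end address, and 8 reserved bytes.
LIME_MAGIC = 0x4C694D45  # the bytes "EMiL" when stored little-endian
HEADER = struct.Struct("<IIQQ8s")

def parse_segment_header(buf):
    magic, version, start, end, _reserved = HEADER.unpack_from(buf)
    if magic != LIME_MAGIC:
        raise ValueError("not a LiME segment header")
    # The end address is inclusive, so the segment size is end - start + 1.
    return {"version": version, "start": start, "end": end,
            "size": end - start + 1}

# Synthetic header for a segment spanning 0x1000-0x1fff (not a real capture).
raw = HEADER.pack(LIME_MAGIC, 1, 0x1000, 0x1FFF, b"\x00" * 8)
print(parse_segment_header(raw)["size"])  # 4096
```

    Parsing headers this way is also a quick sanity check that an acquired image is a well-formed LiME dump before handing it to Volatility.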

    A platform for discovering and sharing confidential ballistic crime data.

    Criminal investigations generate large volumes of complex data that detectives have to analyse and understand. This data tends to be "siloed" within individual jurisdictions, and re-using it in other investigations can be difficult. Investigations into trans-national crimes are hampered by the problem of discovering relevant data held by agencies in other countries and of sharing those data. Gun-crimes are one major type of incident that showcases this: guns are easily moved across borders and used in multiple crimes, but finding that a weapon was used elsewhere in Europe is difficult. In this paper we report on the Odyssey Project, an EU-funded initiative to mine, manipulate and share data about weapons and crimes. The project demonstrates the automatic combining of data from disparate repositories for cross-correlation and automated analysis. The data arrive from different cultural domains with multiple reference models, using real-time data feeds and historical databases.
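    The cross-correlation step described above — linking a weapon seen in one jurisdiction to incidents recorded in another — can be sketched as a join over normalised identifiers. The databases, field names and serial numbers below are entirely hypothetical, not the Odyssey Project's actual schema; the point is only the normalise-then-match pattern that lets records formatted differently by different agencies align.

```python
# Hypothetical incident records from two national repositories.
uk_incidents = [
    {"serial": "AB-1234", "city": "London"},
    {"serial": "CD-5678", "city": "Manchester"},
]
it_incidents = [
    {"serial": "ab1234", "city": "Milan"},   # same weapon, local formatting
    {"serial": "EF-9012", "city": "Rome"},
]

def normalise(serial):
    # Strip punctuation and case so entries from different agencies align.
    return "".join(ch for ch in serial.upper() if ch.isalnum())

# Index one repository, then probe it with the other: a weapon appearing
# in both countries is a candidate cross-border link for investigators.
seen = {normalise(rec["serial"]): rec for rec in uk_incidents}
matches = []
for rec in it_incidents:
    key = normalise(rec["serial"])
    if key in seen:
        matches.append((seen[key]["city"], rec["city"]))
print(matches)  # [('London', 'Milan')]
```

    Real ballistic matching relies on far richer evidence than serial numbers (e.g. cartridge-case toolmarks), but the discovery problem — knowing that a potentially matching record exists abroad at all — reduces to this kind of automated cross-repository join.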

    Catching the Banksters: The Use of Big Data Analytics in Billion Dollar Regulatory Investigations

    Following the financial crisis, emboldened regulators have increased the magnitude of fines levied for financial malfeasance. The automation of the data discovery process underpins the rise in internal investigations, which financial organizations are obliged to conduct at the behest of regulators keen to reduce information asymmetries and bolster transparency. Yet little research exists into the technologies which underpin post-crisis regulatory agendas. Our study focuses on big data technologies (eDiscovery tools) which facilitate investigations where rare yet serious breaches have occurred. We focus on the micro/data level (volume, veracity, variety and velocity) to understand how these tools are influencing regulatory outcomes. The findings illustrate the need for financial organizations to adopt robust information governance policies to ease future investigatory efforts. We identify various practices which may help compliance managers respond to regulatory investigations faster and more easily, easing the burden of post-crisis regulation.

    A systematic survey of online data mining technology intended for law enforcement

    As an increasing amount of crime takes on a digital aspect, law enforcement bodies must tackle an online environment generating huge volumes of data. With manual inspections becoming increasingly infeasible, law enforcement bodies are optimising online investigations through data-mining technologies. Such technologies must be well designed and rigorously grounded, yet no survey of the online data-mining literature exists which examines their techniques, applications and rigour. This article remedies this gap through a systematic mapping study describing online data-mining literature which visibly targets law enforcement applications, using evidence-based practices in survey-making to produce a replicable analysis which can be methodologically examined for deficiencies.

    Forensic triage of email network narratives through visualisation

    Purpose – The purpose of this paper is to propose a novel approach that automates the visualisation of both quantitative data (the network) and qualitative data (the content) within emails to aid the triage of evidence during a forensics investigation. Email remains a key source of evidence during a digital investigation, and a forensics examiner may be required to triage and analyse large email data sets for evidence. Current practice utilises tools and techniques that require a manual trawl through such data, which is a time-consuming process.
    Design/methodology/approach – This paper applies the methodology to the Enron email corpus, and in particular one key suspect, to demonstrate the applicability of the approach. Resulting visualisations of network narratives are discussed to show how network narratives may be used to triage large evidence data sets.
    Findings – Using the network narrative approach enables a forensics examiner to quickly identify relevant evidence within large email data sets. Within the case study presented in this paper, the results identify key witnesses, other actors of interest to the investigation and potential sources of further evidence.
    Practical implications – The implications are for digital forensics examiners or for security investigations that involve email data. The approach posited in this paper demonstrates the triage and visualisation of email network narratives to aid an investigation and identify potential sources of electronic evidence.
    Originality/value – There are a number of network visualisation applications in use. However, none of these enable the combined visualisation of quantitative and qualitative data to provide a view of what the actors are discussing and how this shapes the network in email data sets.
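    The combination the abstract describes — a quantitative layer (who emails whom, how often) annotated with a qualitative layer (what each pair discusses) — can be sketched by attaching a dominant topic word to each edge of the email graph. The messages and stop list below are invented for illustration and are not Enron data or the paper's actual narrative-extraction method.

```python
from collections import Counter

# Invented (sender, recipient, subject) triples standing in for parsed mail.
messages = [
    ("alice", "bob", "budget review meeting"),
    ("alice", "bob", "budget approval"),
    ("alice", "carol", "party invitation"),
]

STOPWORDS = {"the", "a", "meeting"}  # tiny illustrative stop list

# Quantitative layer: message volume per directed edge.
# Qualitative layer: word frequencies of the content flowing along that edge.
volume, words = Counter(), {}
for sender, recipient, subject in messages:
    edge = (sender, recipient)
    volume[edge] += 1
    words.setdefault(edge, Counter()).update(
        w for w in subject.split() if w not in STOPWORDS)

# Each edge can now be rendered with both a weight and a topic label,
# giving a view of what the actors discuss and how it shapes the network.
for edge, count in volume.most_common():
    topic = words[edge].most_common(1)[0][0]
    print(f"{edge[0]} -> {edge[1]}: {count} msgs, topic '{topic}'")
```

    In a visualisation, the volume would typically drive edge thickness and the topic word would become the edge label, so the examiner sees network structure and conversational content in a single picture.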

    The value of modus operandi in fraud investigation : a short-term insurance industry perspective

    This study sought to examine the value of modus operandi (MO) information in the investigation of short-term insurance fraud. A comprehensive literature study was conducted concerning the dynamics of MO information in forensic investigation and short-term insurance fraud in South Africa and internationally, and individual semi-structured interviews were conducted with forensic investigators at Santam and MiWay to promote knowledge and understanding of the importance of MO information in short-term insurance fraud investigations. Results of this research indicate that participants did grasp the significance of MO information in the investigation of short-term insurance fraud. It is, however, apparent that they did not optimally exploit MO information regarding insurance fraud as a result of limited experience, ineffective databases and the inaccessibility of available data – all of which hinder better utilisation of MO data pertaining to short-term insurance fraud. Forensic investigators in the short-term insurance industry isolate themselves from one another and fail to share available MO information, resulting in a non-systematic, fragmented approach to short-term insurance fraud investigation. The study identifies the challenges and shortcomings experienced by forensic investigators at Santam and MiWay that prevent the optimal utilisation of MO information in the investigation of short-term insurance fraud. The study then suggests a set of recommendations that could assist forensic investigators and other role-players in enhancing the utilisation of such information.
    Criminology and Security Science
    M. Tech. (Forensic Investigation)