
    An exploration of virtual criminal investigations in Ghana: legal issues and challenges

    The spread of cybercrime has brought about a need for new investigative skills, laws, and enforcement procedures. Because technological crimes committed over the internet evolve very rapidly, effective enforcement against cybercrime is becoming extremely challenging. Cybercrime is both a national and an international issue, and local legislation alone cannot combat the menace. Digital evidence permeates every aspect of the average person's life in today's society: no matter what you are doing, a digital footprint is probably being created that contains some type of digital evidence recoverable through digital forensic investigation. Combating cybercrime therefore requires stringent laws, skilled personnel, well-established institutions, and a transnational response. To combat cybercrime effectively, countries, states, or governments must establish independent anti-cybercrime units and design national guidelines for digital evidence collection. This thesis presents an examination of the challenges and legal issues surrounding virtual crime (cybercrime) investigation and electronic evidence in Ghana. The study examines the existing cybercrime laws and practices in Ghana and compares them with those of other jurisdictions. It also surveys the international legal framework on cybercrime and electronic evidence for methods and procedures that can be used to conduct digital forensic search and seizure of electronic evidence and to investigate cybercrimes when they occur. Recommendations include formulating stringent laws; establishing a national cybercrime investigation strategy and policies; establishing national guidelines for digital evidence collection; developing an anti-cybercrime toolkit for the collection of digital evidence; and establishing digital forensic training institutions in all regions of Ghana to provide hands-on, skills-based training for law enforcement officers and judges, thereby ensuring efficiency in digital forensic investigation and the prosecution of cybercrimes in Ghana. Police Practice. D. Phil. (Criminal Justice)

    Mobile Forensics – The File Format Handbook

    This open access book summarizes knowledge about several file systems and file formats commonly used in mobile devices. In addition to the fundamental description of the formats, it gives hints about the forensic value of possible artefacts, along with an outline of tools that can decode the relevant data. The book is organized into two distinct parts.
    Part I describes several file systems that are commonly used in mobile devices:
    · APFS is the file system used in all modern Apple devices, including iPhones, iPads, and Apple computers such as the MacBook series.
    · Ext4 is very common in Android devices and is the successor of the Ext2 and Ext3 file systems that were commonly used on Linux-based computers.
    · The Flash-Friendly File System (F2FS), which Samsung Electronics developed in 2012, is a Linux file system designed explicitly for NAND flash memory, common in removable storage and mobile devices.
    · The QNX6 file system is present in smartphones delivered by Blackberry (e.g. devices running Blackberry 10) and in modern vehicle infotainment systems that use QNX as their operating system.
    Part II describes five file formats that are commonly used on mobile devices:
    · SQLite is nearly omnipresent in mobile devices, with an overwhelming majority of all mobile applications storing their data in such databases.
    · The second leading file format in the mobile world is the Property List, predominantly found on Apple devices.
    · Java Serialization is a popular technique for storing object states in the Java programming language; mobile application (app) developers very often resort to it to make their application state persistent.
    · The Realm database format has emerged over recent years as a possible successor to the now ageing SQLite format and has begun to appear as part of some modern applications on mobile devices.
    · Protocol Buffers provide a format for taking compiled data and serializing it into bytes, a technique commonly used in mobile devices.
    The aim of this book is to act as a knowledge base and reference guide for digital forensic practitioners who need knowledge about a specific file system or file format. It is also hoped to provide useful insight for students or other aspiring professionals who want to work within the field of digital forensics. The book is written with the assumption that the reader has some existing knowledge and understanding of computers, mobile devices, file systems and file formats.
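    As a small, hedged illustration of the kind of artefact decoding the book covers, the following Python sketch uses only the standard-library sqlite3 module to open a recovered application database read-only and list its tables with their row counts. The file name app_data.db is a hypothetical placeholder, not a path from the book.

        import sqlite3

        # Hypothetical path to a database file recovered from a mobile device.
        DB_PATH = "app_data.db"

        def list_tables(db_path):
            """Enumerate user tables and their row counts in an SQLite database."""
            # Open the database read-only so the evidence file is not modified.
            uri = f"file:{db_path}?mode=ro"
            with sqlite3.connect(uri, uri=True) as conn:
                cursor = conn.execute(
                    "SELECT name FROM sqlite_master WHERE type = 'table' "
                    "AND name NOT LIKE 'sqlite_%'"
                )
                for (name,) in cursor.fetchall():
                    count = conn.execute(f'SELECT COUNT(*) FROM "{name}"').fetchone()[0]
                    print(f"{name}: {count} rows")

        if __name__ == "__main__":
            list_tables(DB_PATH)

    In practice a forensic workflow would operate on a verified working copy of the database rather than the seized original; the read-only URI here is only a minimal safeguard.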

    Defining Atomicity (and Integrity) for Snapshots of Storage in Forensic Computing

    The acquisition of data from main memory or from hard disk storage is usually one of the first steps in a forensic investigation. We revisit the discussion on quality criteria for “forensically sound” acquisition of such storage and propose a new way to capture the intent to acquire an instantaneous snapshot from a single target system. The idea of our definition is to allow a certain flexibility in when individual portions of memory are acquired, while at the same time requiring consistency with causality (i.e., cause/effect relations). Our concept is much stronger than the original notion of atomicity defined by Vömel and Freiling (2012), but is still attainable using copy-on-write mechanisms. As a minor result, we also fix a conceptual problem in the original definition of integrity.
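    As a purely illustrative, hedged sketch (not the authors' formal definition), the following Python fragment shows the copy-on-write idea the abstract refers to: when the live system writes to a page that has not yet been copied, the original contents are preserved first, so the assembled snapshot still corresponds to a single instant. Page numbers and contents are invented for the example.

        class CowSnapshot:
            """Illustrative copy-on-write snapshot of a paged memory (dict: page -> bytes)."""

            def __init__(self, memory):
                self.memory = memory       # live memory, still being written to
                self.preserved = {}        # pre-write contents of pages changed after the snapshot started
                self.acquired = {}         # pages copied into the snapshot so far

            def write(self, page, data):
                # Intercept every live write: keep the old contents if this page has not
                # been copied yet, so the snapshot stays consistent with the start instant.
                if page not in self.acquired and page not in self.preserved:
                    self.preserved[page] = self.memory[page]
                self.memory[page] = data

            def acquire_page(self, page):
                # Copy one page into the snapshot, preferring preserved (pre-write) contents.
                self.acquired[page] = self.preserved.pop(page, self.memory[page])

        # Example: page 1 is overwritten while the acquisition is still running.
        memory = {0: b"AAAA", 1: b"BBBB"}
        snap = CowSnapshot(memory)
        snap.acquire_page(0)
        snap.write(1, b"CCCC")      # live write during acquisition
        snap.acquire_page(1)
        print(snap.acquired)        # {0: b'AAAA', 1: b'BBBB'} - consistent with the start time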

    Drone Delivery of CBNRECy – DEW Weapons: Emerging Threats of Mini-Weapons of Mass Destruction and Disruption (WMDD)

    Drone Delivery of CBNRECy – DEW Weapons: Emerging Threats of Mini-Weapons of Mass Destruction and Disruption (WMDD) is our sixth textbook in a series covering the world of UASs and UUVs. This textbook takes on a whole new purview for UAS / CUAS / UUV (drones): how they can be used to deploy weapons of mass destruction and disruption against CBRNE and civilian targets of opportunity. We are concerned with the future use of these inexpensive devices and their availability to maleficent actors. Our work suggests that airborne UASs and underwater UUVs will be the future of military and civilian terrorist operations. UAS / UUVs can deliver a huge punch for a low investment while minimizing human casualties.

    Software supply chain monitoring in containerised open-source digital forensics and incident response tools

    Abstract. The legal context makes software development challenging for the tool-oriented Digital Forensics and Incident Response (DFIR) field. Digital evidence must be complete, accurate, reliable, and acquired through reproducible methods in order to be used in court. However, the lack of sufficient software quality is a well-known problem in this context. The popularity of Open-source Software (OSS) based development has increased tool availability across different channels, highlighting their varying quality. The lengthened software supply chain has introduced additional factors affecting tool quality and control over the exact software version in use. Prior research on quality has primarily targeted the fundamental codebase of a tool, not its underlying dependencies, and there is no research on the role of the software supply chain in quality factors in the DFIR context. The research in this work focuses on a container-based package ecosystem, with a case study covering 51 maintained open-source DFIR tools published as Open Container Initiative (OCI) containers. The package ecosystem was improved, and an experimental system was implemented to monitor upstream release version information and provide it to both package maintainers and end-users. The system guarantees that the described tool version matches the actual version of the tool package and that all information about tool versions is available. The primary purpose is to bring more control over the packages and to support the reproducibility and documentation requirements of investigations, while also helping with maintenance work. The tools were also monitored and maintained for six months to observe software dependency-related factors affecting tool functionality between different versions. Maintenance was then halted for a further six months, after which the current package version of each tool was rebuilt so that the gathered information was limited to the changed dependencies. A significant number of build-time and runtime failures were discovered, which either prevented or hindered tool installation or significantly affected the tool's use in the investigation process. Undocumented, changed, or overly new environment-related dependencies were the major factors leading to tool failures. These findings support known software dependency-related problems. The nature of the failures suggests that tool package maintainers must possess a broad range of skills to produce working tool packages, and that maintenance is an effort-intensive job. If the investigator does not have similar skills and a dependency-related failure is present in the software, the software may not be usable.
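    As a hedged illustration of the upstream version monitoring idea (not the experimental system built in the thesis), the following Python sketch compares a packaged tool's recorded version against the latest release tag published on GitHub, using only the standard library. The repository and version strings are placeholder examples.

        import json
        import urllib.request

        def latest_upstream_version(owner, repo):
            """Return the tag name of the latest GitHub release for owner/repo."""
            url = f"https://api.github.com/repos/{owner}/{repo}/releases/latest"
            with urllib.request.urlopen(url, timeout=10) as response:
                return json.load(response)["tag_name"]

        def check_drift(packaged_version, owner, repo):
            # Report drift between the version recorded for the container package
            # and the newest upstream release.
            upstream = latest_upstream_version(owner, repo)
            if packaged_version.lstrip("v") != upstream.lstrip("v"):
                print(f"drift: packaged {packaged_version}, upstream {upstream}")
            else:
                print(f"up to date: {packaged_version}")

        if __name__ == "__main__":
            # Placeholder values for illustration only.
            check_drift("v4.4.0", "sleuthkit", "sleuthkit")

    A production monitoring system would also record where each version came from and surface the comparison to both maintainers and end-users, as described above; this sketch only shows the core comparison step.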

    An Integrative Analytical Framework for Internet of Things Security, Forensics and Intelligence

    The Internet of Things (IoT) has recently become an important research topic because it revolutionises our everyday life by integrating various sensors and objects that communicate directly without human intervention. IoT technology is expected to offer very promising solutions in many areas. In this thesis we focus on crime investigation and crime prevention, which may significantly contribute to human well-being and safety. Our primary goals are to reduce the time of crime investigation, minimise the time of incident response, and prevent future crimes using data collected from smart devices. This PhD thesis consists of three distinct but related projects that work towards the research goal. The main contributions can be summarised as:
    • A multi-level access control framework, presented in Chapter 3, which could be used to secure any collected and shared data. We made this our first contribution because it is not realistic to use data that could have been altered in our prediction model or as evidence. We chose healthcare data collected from ambient sensors and uploaded to cloud storage as an example for our framework, as this data is collected from multiple sources and used by different parties. The access control system regulates access to data by defining policy attributes over healthcare professional groups and data class classifications. The proposed access control system comprises a policy model, an architecture model, and a methodology for classifying data classes and healthcare professional groups (a minimal illustrative policy-check sketch follows this abstract).
    • An investigative framework, discussed in Chapter 4, which contains a multi-phased process flow that coordinates different roles and tasks in IoT-related crime investigation. The framework identifies digital information sources and captures all potential evidence from smart devices in a way that guarantees the potential evidence is not altered, so that it can be admissible in a court of law.
    • A deep learning multi-view model, demonstrated in Chapter 5, that explores the relationship between tweets, weather (a type of sensory data) and crime rate for effective crime prediction. This contribution is motivated by the need to deploy police forces correctly so that they are present at the right times.
    Both the proposed investigative framework and the predictive model were evaluated and tested, and the results of these evaluations are presented in the thesis. The proposed framework and model contribute significantly to the fields of crime investigation and crime prediction. We believe their application would provide more admissible evidence, more efficient investigations, and optimal ways to deploy law enforcement based on crime rate prediction using collected sensory data.
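    The access control framework of Chapter 3 is described only at a high level here. As a hedged illustration of the general idea (policy attributes mapping professional groups to permitted data classes), the following Python sketch checks requests against a small policy table. The group names, data classes, and policy entries are invented placeholders, not the thesis's actual policy model.

        # Illustrative placeholder policy: which professional groups may access which data classes.
        POLICY = {
            "physician":  {"vital_signs", "medication", "diagnosis"},
            "nurse":      {"vital_signs", "medication"},
            "researcher": {"vital_signs"},   # e.g. anonymised sensor data only
        }

        def is_permitted(group, data_class):
            """Return True if the policy grants the group access to the data class."""
            return data_class in POLICY.get(group, set())

        for group, data_class in [("nurse", "diagnosis"), ("physician", "diagnosis")]:
            verdict = "granted" if is_permitted(group, data_class) else "denied"
            print(f"{group} -> {data_class}: {verdict}")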

    Digital Forensics AI: on Practicality, Optimality, and Interpretability of Digital Evidence Mining Techniques

    Digital forensics as a field has progressed alongside technological advancements over the years, just as digital devices have become more robust and sophisticated. However, criminals and attackers have devised means of exploiting the vulnerabilities or sophistication of these devices to carry out malicious activities in unprecedented ways, believing that electronic crimes can be committed without identities being revealed or trails being established. Several applications of artificial intelligence (AI) have demonstrated interesting and promising solutions to seemingly intractable societal challenges. This thesis aims to advance the concept of applying AI techniques in digital forensic investigation. Our approach involves experimenting with a complex case scenario in which suspects corresponded by e-mail and suspiciously deleted certain communications, presumably to conceal evidence. The purpose is to demonstrate the efficacy of Artificial Neural Networks (ANN) in learning and detecting communication patterns over time, and then predicting the possibility of missing communication(s) along with potential topics of discussion. To do this, we developed a novel approach and included existing models for comparison. The accuracy of our results is evaluated, and their performance on previously unseen data is measured. Second, we propose conceptualizing the term “Digital Forensics AI” (DFAI) to formalize the application of AI in digital forensics. The objective is to highlight the instruments that facilitate the best evidential outcomes, along with presentation mechanisms that are adaptable to the probabilistic output of AI models. Finally, we strengthen the case for applying AI in digital forensics by recommending methodologies and approaches for bridging trust gaps through the development of interpretable models that facilitate the admissibility of digital evidence in legal proceedings.
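    As a hedged illustration of the kind of pattern learning described above (not the thesis's actual model or data, and assuming NumPy and scikit-learn are available), the following Python sketch trains a small feed-forward network on a synthetic daily e-mail activity series and flags windows where a message was expected but not observed, i.e. candidate missing communications.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        # Synthetic series: 1 = a message was exchanged that day, 0 = no message.
        # Pattern: messages on weekdays, silence at weekends, plus a little noise.
        days = np.arange(364)
        traffic = ((days % 7) < 5).astype(int)
        traffic ^= (rng.random(364) < 0.05).astype(int)   # 5 % noise

        # Features: the previous seven days of activity; label: activity today.
        window = 7
        X = np.array([traffic[i - window:i] for i in range(window, len(traffic))])
        y = traffic[window:]

        model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
        model.fit(X[:300], y[:300])

        # Flag days where the model strongly expected a message but none was seen.
        probs = model.predict_proba(X[300:])[:, 1]
        for offset, (p, seen) in enumerate(zip(probs, y[300:])):
            if p > 0.9 and seen == 0:
                print(f"day {300 + window + offset}: expected message missing (p={p:.2f})")

    The thesis also predicts potential topics of the missing messages, which would require text features rather than the simple activity counts used in this sketch.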