10,301 research outputs found
IoT Forensics: Challenges For The IoA Era
Challenges for IoT-based forensic investigations include the growing number of objects of forensic interest, the relevance of identified and collected devices, blurry network boundaries, and edgeless networks. As we look ahead to a world of expanding ubiquitous computing, forensic processes such as data acquisition (logical and physical) and the extraction and analysis of data become harder in this space. Containing an IoT breach is increasingly challenging: evidence is no longer restricted to a PC or mobile device, but can be found in vehicles, RFID cards, and smart devices. By combining cloud-native forensics with client-side forensics (forensics for companion devices), we can study and develop the connection between the two to support practical digital investigations and tackle emerging challenges in digital forensics. The investigative complexity that the IoT brings is compounded in the Internet of Anything (IoA) era. The IoA brings anything and everything “online” in a connectedness that generates an explosion of connected devices, from fridges, cars and drones to smart swarms, smart grids and intelligent buildings. Research to identify methods for performing IoT-based digital forensic analysis is essential. The long-term goal is the development of digital forensic standards that can be used as part of overall IoT and IoA security and aid IoT-based investigations.
Using a Goal-Driven Approach in the Investigation of a Questioned Contract
Part 3: FORENSIC TECHNIQUES
This paper presents a systematic process for describing digital forensic investigations. It focuses on forensic goals and anti-forensic obstacles and their operationalization in terms of human and software actions. The paper also demonstrates how the process can be used to capture the various forensic and anti-forensic aspects of a real-world case involving document forgery.
Multinational perspectives on information technology from academia and industry
As the term 'information technology' has many meanings for various stakeholders and continues to evolve, this work presents a comprehensive approach for developing curriculum guidelines for rigorous, high-quality bachelor's degree programs in information technology (IT) to prepare successful graduates for a future global technological society. The aim is to address three research questions in the context of IT concerning (1) the educational frameworks relevant for academics and students of IT, (2) the pathways into IT programs, and (3) graduates' preparation for meeting future technologies. The analysis of current trends comes from survey data of IT faculty members and professional IT industry leaders. With these analyses, the IT Model Curricula of CC2005, IT2008, and IT2017, an extensive literature review, and the multinational insights of the authors into the status of IT, this paper presents a comprehensive overview and discussion of future directions of global IT education toward 2025.
A Case-Based Reasoning Method for Locating Evidence During Digital Forensic Device Triage
The role of triage in digital forensics is disputed, with some practitioners questioning its reliability for identifying evidential data. Although successfully implemented in the field of medicine, triage has not established itself to the same degree in digital forensics. This article presents a novel approach to triage for digital forensics. Case-Based Reasoning Forensic Triager (CBR-FT) is a method for collecting and reusing past digital forensic investigation information in order to highlight likely evidential areas on a suspect operating system, thereby helping an investigator decide where to search for evidence. The CBR-FT framework is discussed and the results of twenty test triage examinations are presented. CBR-FT has been shown to be a more effective method of triage when compared to a practitioner using a leading commercial application.
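The abstract does not disclose CBR-FT's internals, but the core idea it describes (reusing records of past investigations to rank likely evidential locations on a new system) can be sketched in a few lines. This is a hedged illustration only: the function and data names (`rank_locations`, `cases`) are hypothetical and the scoring is a toy stand-in for the paper's actual case-based reasoning method.

```python
from collections import Counter

def rank_locations(past_cases):
    """Rank filesystem locations by how often they held evidence in
    past cases (a toy stand-in for a case-based-reasoning case base)."""
    counts = Counter()
    for case in past_cases:          # each case: set of evidential paths
        counts.update(case)
    total = len(past_cases)
    # Score = fraction of past cases in which the location was evidential.
    return sorted(((path, n / total) for path, n in counts.items()),
                  key=lambda t: t[1], reverse=True)

# Toy case base: paths where evidence was found in three past cases.
cases = [
    {r"C:\Users\Public\AppData\Roaming", r"C:\Windows\Prefetch"},
    {r"C:\Windows\Prefetch", r"C:\$Recycle.Bin"},
    {r"C:\Windows\Prefetch"},
]
for path, score in rank_locations(cases):
    print(f"{score:.2f}  {path}")
```

A triage tool built on this idea would examine the highest-scoring locations first, which is the time-saving behaviour the abstract attributes to CBR-FT.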
The Need to Support of Data Flow Graph Visualization of Forensic Lucid Programs, Forensic Evidence, and their Evaluation by GIPSY
Lucid programs are data-flow programs and can be visually represented as data-flow graphs (DFGs) and composed visually. Forensic Lucid, a Lucid dialect, is a language to specify and reason about cyberforensic cases. It includes the encoding of the evidence (representing the context of evaluation) and the modeling of the crime scene in order to validate claims against the model and perform event reconstruction, potentially within large swaths of digital evidence. To aid investigators in modeling the scene and evaluating it, instead of typing a Forensic Lucid program, we propose to expand the design and implementation of Lucid DFG programming onto Forensic Lucid case modeling and specification to enhance the usability of the language, the system, and its behavior. We briefly discuss the related work on visual programming and DFG modeling in an attempt to define and select one approach, or a composition of approaches, for Forensic Lucid based on various criteria such as previous implementation, wide use, and formal backing in terms of semantics and translation. In the end, we solicit the readers' constructive opinions, feedback, comments, and recommendations within the context of this short discussion.
Comment: 11 pages, 7 figures, index; extended abstract presented at VizSec'10 at http://www.vizsec2010.org/posters ; short paper accepted at PST'1
PRECEPT: A Framework for Ethical Digital Forensics Investigations.
Cyber-enabled crimes are on the increase, and law enforcement has had to expand many of its detection activities into the digital domain. As such, the field of digital forensics has become far more sophisticated over the years and is now able to uncover even more evidence that can be used to support the prosecution of cyber criminals in a court of law. Governments, too, have embraced the ability to track suspicious individuals in the online world. Forensic investigators are driven to gather data exhaustively, being under pressure to provide law enforcement with sufficient evidence to secure a conviction.
Yet, there are concerns about the ethics and justice of untrammeled investigations on a number of levels. On an organizational level, unconstrained investigations could interfere with, and damage, the organization’s right to control the disclosure of their intellectual capital. On an individual level, those being investigated could easily have their legal privacy rights violated by forensics investigations. On a societal level, there might be a sense of injustice at the perceived inequality of current practice in this domain.
This paper argues the need for a practical, ethically-grounded approach to digital forensic investigations, one that acknowledges and respects the privacy rights of individuals and the intellectual capital disclosure rights of organisations, as well as acknowledging the needs of law enforcement. We derive a set of ethical guidelines, then map these onto a forensics investigation framework. We subjected the framework to expert review in two stages, refining the framework after each stage. We conclude by proposing the refined ethically-grounded digital forensics investigation framework. Our treatise is primarily UK based, but the concepts presented here have international relevance and applicability.
In this paper, the lens of justice theory is used to explore the tension that exists between the needs of digital forensic investigations into cybercrimes on the one hand, and, on the other, individuals’ rights to privacy and organizations’ rights to control intellectual capital disclosure.
The investigation revealed a potential inequality between the practices of digital forensics investigators and the rights of other stakeholders. That being so, the need for a more ethically-informed approach to digital forensics investigations, as a remedy, is highlighted, and a framework proposed to provide this.
Our proposed ethically-informed framework for guiding digital forensics investigations suggests a way of re-establishing the equality of the stakeholders in this arena and ensuring that the potential for a sense of injustice is reduced.
Justice theory is used to highlight the difficulties in squaring the circle between the rights and expectations of all stakeholders in the digital forensics arena. The outcome is the forensics investigation guideline PRECEPT: Privacy-Respecting EthiCal framEwork, which provides the basis for re-aligning the balance between the requirements and expectations of digital forensic investigators on the one hand, and individual and organizational expectations and rights on the other.
Multi-aspect, robust, and memory-exclusive guest OS fingerprinting
Precise fingerprinting of an operating system (OS) is critical to many security and forensics applications in the cloud, such as virtual machine (VM) introspection, penetration testing, guest OS administration, kernel dump analysis, and memory forensics. The existing OS fingerprinting techniques primarily inspect network packets or CPU states, and they all fall short in precision and usability. As the physical memory of a VM always exists in all these applications, in this article we present OS-Sommelier+, a multi-aspect, memory-exclusive approach for precise and robust guest OS fingerprinting in the cloud. It works as follows: given a physical memory dump of a guest OS, OS-Sommelier+ first uses a code-hash-based approach, from the kernel code aspect, to determine the guest OS version. If the code hash approach fails, OS-Sommelier+ then uses a kernel-data-signature-based approach, from the kernel data aspect, to determine the version. We have implemented a prototype system and tested it with a number of Linux kernels. Our evaluation results show that the code hash approach is faster but can only fingerprint known kernels, while the data signature approach complements it and can fingerprint even unknown kernels.
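The two-stage pipeline the abstract describes (exact code hash first, kernel data signature as a fallback) can be sketched as follows. This is only an illustration of the control flow, not OS-Sommelier+'s implementation: the names (`KNOWN_CODE_HASHES`, `fingerprint`) are hypothetical, and the "data signature" here is simplified to scanning for a Linux version banner, whereas the paper uses proper kernel data signatures.

```python
import hashlib

# Hypothetical known-kernel database mapping a hash of the kernel code
# region to a version string; a real tool would populate this offline.
KNOWN_CODE_HASHES: dict[str, str] = {}

def data_signature_match(dump: bytes):
    """Fallback: scan the memory dump for a kernel version banner
    (a simplified stand-in for kernel-data signatures)."""
    marker = b"Linux version "
    i = dump.find(marker)
    if i < 0:
        return None
    end = dump.find(b" ", i + len(marker))
    return dump[i:end].decode(errors="replace")

def fingerprint(code_region: bytes, dump: bytes):
    """Two-stage fingerprint: exact code hash first, data signature second."""
    h = hashlib.sha256(code_region).hexdigest()
    if h in KNOWN_CODE_HASHES:
        return KNOWN_CODE_HASHES[h]    # fast path: known kernel
    return data_signature_match(dump)  # fallback: unknown kernel
```

The design mirrors the trade-off reported in the abstract: the hash lookup is cheap but only matches kernels seen before, while the signature scan is slower yet still recovers a version from an unseen kernel.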
A Novel Latin Square Image Cipher
In this paper, we introduce a symmetric-key Latin square image cipher (LSIC) for grayscale and color images. Our contributions to the image encryption community include: 1) we develop new Latin square image encryption primitives, including Latin Square Whitening, the Latin Square S-box, and the Latin Square P-box; 2) we provide a new way of integrating probabilistic encryption into image encryption by embedding random noise in the least significant image bit-plane; and 3) we construct the LSIC from these Latin square image encryption primitives, all on one keyed Latin square, in a new loom-like substitution-permutation network. Consequently, the proposed LSIC achieves many desired properties of a secure cipher, including a large key space, high key sensitivity, uniformly distributed ciphertext, excellent confusion and diffusion properties, semantic security, and robustness against channel noise. Theoretical analysis shows that the LSIC has good resistance to many attack models, including brute-force attacks, ciphertext-only attacks, known-plaintext attacks, and chosen-plaintext attacks. Experimental analysis under extensive simulations using the complete USC-SIPI Miscellaneous image dataset demonstrates that the LSIC outperforms or reaches the state of the art suggested by many peer algorithms. All these analyses and results demonstrate that the LSIC is very suitable for digital image encryption. Finally, we open-source the LSIC MATLAB code at https://sites.google.com/site/tuftsyuewu/source-code.
Comment: 26 pages, 17 figures, and 7 tables
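To make the whitening primitive concrete: a Latin square of order n is an n x n grid in which every symbol appears exactly once per row and column, so any single row acts as an invertible substitution over pixel values. The sketch below builds a keyed Latin square by permuting the addition-mod-n table and uses one row for whitening; it is a toy stand-in under stated assumptions, not the paper's construction, and all names (`keyed_latin_square`, `whiten`) are hypothetical.

```python
import random

def keyed_latin_square(n: int, key: int):
    """Build an n x n Latin square by key-driven row/column permutation
    of the addition-mod-n table (itself always a Latin square)."""
    rng = random.Random(key)               # key seeds the permutations
    rows, cols = list(range(n)), list(range(n))
    rng.shuffle(rows)
    rng.shuffle(cols)
    return [[(rows[i] + cols[j]) % n for j in range(n)] for i in range(n)]

def whiten(pixels, square, key_row):
    """Latin-square whitening: substitute each pixel via one keyed row."""
    row = square[key_row]
    return [row[p] for p in pixels]

def unwhiten(pixels, square, key_row):
    """Invert whitening with the inverse permutation of the same row."""
    inv = {v: i for i, v in enumerate(square[key_row])}
    return [inv[c] for c in pixels]
```

Because each row is a permutation of 0..n-1, whitening is lossless and key-dependent; the full LSIC combines this with S-box, P-box, and least-significant-bit-plane noise stages that are not reproduced here.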