165 research outputs found

    On Efficiency of Artifact Lookup Strategies in Digital Forensics

    Get PDF
    In recent years, different strategies have been proposed to handle the problem of ever-growing digital forensic databases. One concept for dealing with this data overload is data reduction, which essentially means separating the wheat from the chaff, e.g., filtering for forensically relevant data. A prominent class of techniques in the context of data reduction is hash-based solutions. Data reduction is achieved because hash values (of possibly large data inputs) are much smaller than the original input. Today's approaches to storing hash-based data fragments range from large-scale multithreaded databases to simple Bloom filter representations. Much attention has been devoted to approximate matching, where sorting is a problem due to the fuzzy nature of the approximate hashes. A crucial requirement of digital forensic analysis is fast query times during lookup (e.g., against a blacklist), especially when only small or ordinary resources are available. However, comparing different database and lookup approaches is considerably hard, as most techniques differ in their intended use cases and integrated features. In this work we discuss, reassess, and extend three widespread lookup strategies suitable for storing hash-based fragments: (1) hash databases for hash-based carving (hashdb), (2) hierarchical Bloom filter trees (hbft), and (3) flat hash maps (fhmap). We outline the capabilities of the different approaches, integrate new extensions, discuss possible features, and perform a detailed evaluation with a special focus on runtime efficiency. Our results reveal major advantages for fhmap in terms of runtime performance and applicability. Hbft showed comparable runtime efficiency for lookups, but it suffers from pitfalls with respect to extensibility and maintenance. Finally, hashdb performs worst in a single-core environment in all evaluation scenarios. However, hashdb is the only candidate which offers full parallelization capabilities, transactional features, and single-level storage.
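
    To make the contrast concrete, here is a minimal Python sketch (not the paper's implementation; all names and parameters are illustrative) of the two simplest strategies discussed above: a Bloom filter, which stores hash-based fragments in a fixed-size bit array at the cost of occasional false positives, and a flat hash map, which keeps exact digests at a higher memory cost.

    import hashlib

    class BloomFilter:
        """Minimal Bloom filter; k positions derived from SHA-256 slices."""

        def __init__(self, num_bits=1 << 20, num_hashes=4):
            self.num_bits = num_bits
            self.num_hashes = num_hashes
            self.bits = bytearray(num_bits // 8)

        def _positions(self, item):
            digest = hashlib.sha256(item).digest()
            for i in range(self.num_hashes):
                # Derive independent bit positions from 4-byte digest slices.
                yield int.from_bytes(digest[4 * i:4 * i + 4], "big") % self.num_bits

        def add(self, item):
            for pos in self._positions(item):
                self.bits[pos // 8] |= 1 << (pos % 8)

        def __contains__(self, item):
            # May report false positives, never false negatives.
            return all(self.bits[pos // 8] & (1 << (pos % 8))
                       for pos in self._positions(item))

    # Flat hash map alternative: exact answers, more memory per entry.
    known_fragments = set()

    bloom = BloomFilter()
    digest = hashlib.sha256(b"example file fragment").digest()
    bloom.add(digest)
    known_fragments.add(digest)
    print(digest in bloom, digest in known_fragments)  # True True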

    Digital Forensics and Its Role in Promoting Criminal Prosecution

    Get PDF
    Digital forensics is essentially synonymous with computer forensics, but the term "digital forensics" is generally used for the technical examination of all devices that have the ability to store data. Today, digital criminology is challenged by cloud computing. The first problem is understanding why and how criminal and social actions are so unique and complex. The second problem is the lack of accurate scientific tools for forensic work in cyberspace. So far, no complete tools or explanations for criminology in virtual infrastructure have been provided, and security researchers have not been trained in detail. The present descriptive-analytical research is therefore based on library resources and note-taking tools. To investigate suspicious cases related to cyberspace, criminologists must be well versed in the technical and legal issues they will have to deal with. In this article, we analyze digital criminology and its role in judicial law. Knowledge of computer forensics is not only an indispensable necessity for security and judicial institutions; professional users and owners of computer systems and networks must also be fully aware of its legal and technical requirements and comply with them properly.

    Automated Digital Forensic Triage: Rapid Detection of Anti-Forensic Tools

    Get PDF
    We live in the information age. Our world is interconnected by digital devices and electronic communication. As such, criminals are finding opportunities to exploit our information-rich electronic data. In 2014, the estimated annual cost of computer-related crime was more than 800 billion dollars. Examples include the theft of intellectual property, electronic fraud, identity theft, and the distribution of illicit material. Digital forensics grew out of the necessity to combat computer crime and involves the investigation and analysis of electronic data after a suspected criminal act. Challenges in digital forensics exist due to constant changes in technology. Investigation challenges include exponential growth in the number of cases and the size of targets; for example, forensic practitioners must analyse multi-terabyte cases comprising numerous digital devices. A variety of applied challenges also exist due to continual technological advancement; for example, anti-forensic tools, including the malicious use of encryption or data wiping tools, hinder digital investigations by hiding evidence or removing it altogether. In response, the objective of the research reported here was to automate the effective and efficient detection of anti-forensic tools. A design science research methodology was selected, as it provides an applied research method to design, implement, and evaluate an innovative Information Technology (IT) artifact to solve a specified problem. The research objective required that a system be designed and implemented to perform automated detection of digital artifacts (e.g., data files and Windows Registry entries) on a target data set. The goal of the system is to automatically determine whether an anti-forensic tool is present or absent, in order to prioritise additional in-depth investigation. The system performs rapid forensic triage, suitable for execution against multiple investigation targets, providing an analyst with high-level information regarding potential malicious anti-forensic tool usage. The system is divided into two main stages: 1) design and implementation of a solution to automate creation of an application profile (application software reference set) of known unique digital artifacts; and 2) digital artifact matching between the created reference set and a target data set. Two tools were designed and implemented: 1) a live differential analysis tool, named LiveDiff, to reverse engineer application software with a specific emphasis on digital forensic requirements; and 2) a digital artifact matching framework, named Vestigium, to correlate digital artifact metadata and detect anti-forensic tool presence. In addition, a forensic data abstraction, named Application Profile XML (APXML), was designed to store and distribute digital artifact metadata. An associated Application Programming Interface (API), named apxml.py, was authored to provide automated processing of APXML documents. Together, the tools provide an automated triage system to detect anti-forensic tool presence on an investigation target. A two-phase approach was employed to assess the research products. The first phase of experimental testing involved demonstration in a controlled laboratory environment. First, the LiveDiff tool was used to create application profiles for three anti-forensic tools. The automated data collection and comparison procedure was more effective and efficient than previous approaches.
    Two data reduction techniques were tested to remove irrelevant operating system noise: application profile intersection and dynamic blacklisting were found to be effective in this regard. Second, the profiles were used as input to Vestigium, and automated digital artifact matching was performed against authored known data sets. The results established the desired system functionality, and the demonstration then led to refinements of the system, in keeping with the cyclical nature of design science. The second phase of experimental testing involved evaluation using two additional data sets to establish effectiveness and efficiency in a real-world investigation scenario. First, a public data set was subjected to testing to provide research reproducibility, as well as to evaluate system effectiveness in a variety of complex detection scenarios. Results showed the ability to detect anti-forensic tools using a different version than that included in the application profile and on a different Windows operating system version; both are scenarios where traditional hash set analysis fails. Furthermore, Vestigium was able to detect residual and deleted information, even after a tool had been uninstalled by the user. The efficiency of the system was determined and refinements made, resulting in an implementation that can meet forensic triage requirements. Second, a real-world data set was constructed using a collection of second-hand hard drives. The goal was to test the system using unpredictable and diverse data to provide more robust findings in an uncontrolled environment. The system detected one anti-forensic tool on the data set and processed all input data successfully without error, further validating the system design and implementation. The key outcome of this research is the design and implementation of an automated system to detect anti-forensic tool presence on a target data set. Evaluation suggested the solution was both effective and efficient, adhering to forensic triage requirements. Furthermore, techniques not previously utilised in forensic analysis were designed and applied throughout the research: dynamic blacklisting and profile intersection removed irrelevant operating system noise from application profiles; metadata matching methods resulted in efficient digital artifact detection; and path normalisation aided full path correlation in complex matching scenarios. The system was subjected to rigorous experimental testing on three data sets comprising more than 10 terabytes of data. The ultimate outcome is a practically implemented solution that has been executed on hundreds of forensic disk images, thousands of Windows Registry hives, more than 10 million data files, and approximately 50 million Registry entries. The research has resulted in the design of a scalable triage system implemented as a set of computer forensic tools.
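
    As a rough illustration of the digital artifact matching idea, the sketch below (hypothetical names and schema throughout; it is not the Vestigium implementation) models an application profile as a set of expected artifact paths and hashes, applies toy path normalisation so that profiles match across user accounts, and scores a target by the fraction of profile artifacts found.

    import re
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Artifact:
        path: str   # file or Registry path from the application profile
        sha1: str   # expected content hash; empty string if unknown

    def normalise(path):
        # Toy normalisation: lower-case, forward slashes, generic user folder.
        path = path.replace("\\", "/").lower()
        return re.sub(r"^c:/users/[^/]+/", "c:/users/%user%/", path)

    def match(profile, target):
        """Return the fraction of profile artifacts found on the target.

        `target` maps normalised paths to content hashes."""
        hits = 0
        for art in profile:
            found = target.get(normalise(art.path))
            if found is not None and (not art.sha1 or found == art.sha1):
                hits += 1
        return hits / len(profile) if profile else 0.0

    profile = {
        Artifact(r"C:\Users\alice\AppData\Roaming\wiper\wiper.exe", "ab12"),
        Artifact(r"C:\Users\alice\AppData\Roaming\wiper\config.ini", ""),
    }
    target = {"c:/users/%user%/appdata/roaming/wiper/wiper.exe": "ab12"}
    print(match(profile, target))  # 0.5 -> partial evidence of tool presence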

    Analysis of File Carving Approaches : A Literature Review

    Get PDF
    Digital forensics is a crucial process of identifying, preserving, retrieving, evaluating, and documenting digital evidence obtained from computers and other electronic devices. Data restoration and analysis on file systems is one of digital forensic science's most fundamental practices. A great deal of research has gone into developing file carving approaches, with different studies focusing on different aspects. With the growing volume of literature covering this research area, there is a need to review it for further reference. We review works covering various aspects of carving approaches drawn from multiple digital sources, including IEEE Xplore, Google Scholar, and Web of Science. The analysis considers several perspectives: the current research directions of file carving, a classification of file carving approaches, and the challenges to be highlighted. Based on the analysis, we are able to state the current state of the art of file carving. We classify carving approaches into five categories: general carving, carving by specific file type, carving by structure, carving by file system, and carving by fragmentation. We also highlight several challenges for file carving mentioned in past research. This study will serve as a reference for researchers to evaluate different carving strategies and obstacles, so that they may choose suitable carving approaches for their own studies and future developments.
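
    For reference, the simplest of these categories, general carving by header/footer signature, can be sketched in a few lines of Python. This is an illustrative toy, not one of the surveyed tools; it assumes files are stored contiguously (unfragmented), and the input filename is hypothetical.

    JPEG_HEADER = b"\xff\xd8\xff"
    JPEG_FOOTER = b"\xff\xd9"

    def carve_jpegs(image, max_size=10 * 1024 * 1024):
        """Yield candidate JPEG byte ranges from raw image bytes.

        Header/footer carving only recovers contiguous files; fragmented
        files need the smarter approaches classified above."""
        start = image.find(JPEG_HEADER)
        while start != -1:
            end = image.find(JPEG_FOOTER, start + len(JPEG_HEADER))
            if end == -1:
                break
            end += len(JPEG_FOOTER)
            if end - start <= max_size:
                yield image[start:end]
            start = image.find(JPEG_HEADER, end)

    with open("disk.img", "rb") as f:          # hypothetical raw disk image
        for i, blob in enumerate(carve_jpegs(f.read())):
            with open(f"carved_{i}.jpg", "wb") as out:
                out.write(blob)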

    Cyber indicators of compromise: a domain ontology for security information and event management

    Get PDF
    It has been said that cyber attackers are attacking at wire speed (very fast), while cyber defenders are defending at human speed (very slow). Researchers have been working to improve this asymmetry by automating a greater portion of what has traditionally been very labor-intensive work. This work involves both the monitoring of live system events (to detect attacks) and the review of historical system events (to investigate attacks). One technology that is helping to automate this work is Security Information and Event Management (SIEM). In short, SIEM technology works by aggregating log information and then sifting through it looking for event correlations that are highly indicative of attack activity, for example: Administrator successful local logon and (concurrently) Administrator successful remote logon. Such correlations are sometimes referred to as indicators of compromise (IOCs). Though IOCs for network-based data (i.e., packet headers and payload) are fairly mature (e.g., Snort's large rule base), the field of end-device IOCs is still evolving and lacks any well-defined, widely accepted standard. This report addresses ontological issues pertaining to end-device IOC development, including what they are, how they are defined, and what dominant early standards already exist. (http://archive.org/details/cyberindicatorso1094553041)
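
    The logon example above can be phrased as a tiny correlation rule. The following Python sketch uses an invented, normalised event schema and is not tied to any particular SIEM product.

    from datetime import datetime, timedelta

    # Invented normalised log events: (timestamp, account, logon_type).
    events = [
        (datetime(2024, 1, 5, 9, 0), "Administrator", "local"),
        (datetime(2024, 1, 5, 9, 3), "Administrator", "remote"),
        (datetime(2024, 1, 5, 11, 0), "alice", "local"),
    ]

    def concurrent_logon_ioc(events, window=timedelta(minutes=10)):
        """Yield accounts with a local and a remote logon within `window`."""
        for ts_a, acct_a, kind_a in events:
            for ts_b, acct_b, kind_b in events:
                if (acct_a == acct_b and kind_a == "local"
                        and kind_b == "remote" and abs(ts_a - ts_b) <= window):
                    yield acct_a, ts_a, ts_b

    for account, local_ts, remote_ts in concurrent_logon_ioc(events):
        print("IOC match:", account, local_ts, remote_ts)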

    Advanced Techniques for Improving the Efficacy of Digital Forensics Investigations

    Get PDF
    Digital forensics is the science concerned with discovering, preserving, and analyzing evidence on digital devices. The intent is to be able to determine what events have taken place, when they occurred, who performed them, and how they were performed. For an investigation to be effective, it must exhibit several characteristics. The results produced must be reliable, or else the theory of events based on the results will be flawed. The investigation must be comprehensive, meaning that it must analyze all targets which may contain evidence of forensic interest. Since any investigation must be performed within the constraints of available time, storage, manpower, and computation, investigative techniques must be efficient. Finally, an investigation must provide a coherent view of the events under question using the evidence gathered. Unfortunately, the set of currently available tools and techniques used in digital forensic investigations does a poor job of supporting these characteristics. Many tools in use contain bugs which generate inaccurate results; there are many types of devices and data for which no analysis techniques exist; most existing tools are woefully inefficient, failing to take advantage of modern hardware; and the task of aggregating data into a coherent picture of events is largely left to the investigator to perform manually. To remedy this situation, we developed a set of techniques to facilitate more effective investigations. To improve reliability, we developed the Forensic Discovery Auditing Module, a mechanism for auditing and enforcing controls on accesses to evidence. To improve comprehensiveness, we developed ramparser, a tool for deep parsing of Linux RAM images, which provides previously inaccessible data on the live state of a machine. To improve efficiency, we developed a set of performance optimizations and applied them to the Scalpel file carver, yielding order-of-magnitude improvements in processing speed and storage requirements. Finally, to facilitate more coherent investigations, we developed the Forensic Automated Coherence Engine, which generates a high-level view of a system from the data generated by low-level forensics tools. Together, these techniques significantly improve the effectiveness of digital forensic investigations conducted using them.
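
    To give a flavour of the auditing idea, the following is a minimal Python sketch under invented names; it is not the Forensic Discovery Auditing Module itself. Every read of an evidence file passes through a wrapper that appends an audit record and verifies that the evidence hash has not drifted.

    import hashlib
    import json
    import time
    from contextlib import contextmanager

    AUDIT_LOG = "audit_log.jsonl"   # hypothetical append-only audit trail

    def sha256_of(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    @contextmanager
    def audited_evidence(path, examiner, expected_sha256):
        """Open evidence read-only, logging the access and checking integrity."""
        actual = sha256_of(path)
        record = {"time": time.time(), "examiner": examiner,
                  "evidence": path, "sha256": actual,
                  "intact": actual == expected_sha256}
        with open(AUDIT_LOG, "a") as log:
            log.write(json.dumps(record) + "\n")
        if not record["intact"]:
            raise ValueError(f"hash mismatch for {path}: evidence may be altered")
        with open(path, "rb") as f:
            yield f

    # Usage (hypothetical image and hash):
    # with audited_evidence("image.dd", "examiner1", known_sha256) as f:
    #     header = f.read(512)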

    A comprehensive survey on Pose-Invariant Face Recognition

    Full text link
    The capacity to recognize faces under varied poses is a fundamental human ability that presents a unique challenge for computer vision systems. Compared to frontal face recognition, which has been intensively studied and has gradually matured in the past few decades, Pose-Invariant Face Recognition (PIFR) remains a largely unsolved problem. However, PIFR is crucial to realizing the full potential of face recognition for real-world applications, since face recognition is intrinsically a passive biometric technology for recognizing uncooperative subjects. In this article, we discuss the inherent difficulties in PIFR and present a comprehensive review of established techniques. Existing PIFR methods can be grouped into four categories, that is, pose-robust feature extraction approaches, multiview subspace learning approaches, face synthesis approaches, and hybrid approaches. The motivations, strategies, pros/cons, and performance of representative approaches are described and compared. Moreover, promising directions for future research are discussed.

    An Empirical Investigation of Factors Influencing Knowledge Management System Success

    Get PDF
    Knowledge has been viewed as a critical component for organizations. Consequently, organizations implement Knowledge Management Systems (KMSs) to seek competitive advantages, but they may encounter mixed results. Drawing on previous information system and knowledge management system success literature, this research selects eight factors believed to be critical for the successful implementation of a KMS. The purpose of this study is to identify factors that have a clear influence on the development and implementation of KMSs. The study presents an empirical examination of a theoretical model of KMS success for predicting system use by law enforcement officers. The research findings were obtained through a validated questionnaire administered to 10 law enforcement officers from various agencies. These results contribute to the literature by empirically supporting the hypothesized relationships between the identified success factors and KMS success. Though limited in sample size, this research can serve as a foundation for future studies, which can help identify other factors critical for KMS success. The comprehensive model can be used to undertake further research and thus add value to the knowledge management system literature. In addition to its theoretical contributions, the study also presents important practical implications through the identification of specific infrastructure capabilities leading to KMS success.

    A toolbox for Artificial Intelligence Algorithms in Cyber Attacks Prevention and Detection

    Get PDF
    Dissertation presented as a partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. This thesis provides a qualitative view of the use of AI technology in the cybersecurity strategy of businesses. It explores the field of AI technology today and why it is well suited to implementation in cyber security. The Internet and information technology have transformed the world of today. There is no doubt that they have created huge opportunities for the global economy and humanity. The fact that today's businesses are thoroughly dependent on the Internet and information systems has also exposed new vulnerabilities, in the form of cybercrimes performed by a diversity of hackers, criminals, terrorists, and state and non-state actors. Public companies, private companies, and government agencies are all vulnerable to cybercrime; none is left fully protected. In recent years, AI and machine learning technology have become essential to information security, since they can swiftly analyze millions of datasets and track down a wide range of cyber threats. With the increasing growth of automation in businesses, is it realistic that cybersecurity could be removed from human interaction and entrusted to fully independent AI applications covering a business's information system architecture in the future? This is a field that deserves deep study in order to take full advantage of the potential of AI technology in cybersecurity. This thesis explores the use of AI algorithms in the prevention and detection of cyberattacks in businesses and how to optimize their use. This knowledge is used to implement a framework and a corresponding hybrid toolbox application whose purpose is to be useful to any business seeking to strengthen its cybersecurity environment.
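
    As one small, concrete instance of the kind of algorithm such a toolbox might wrap, the following Python sketch trains an unsupervised anomaly detector on invented per-flow network features and flags unusual flows for analyst review. It is illustrative only, not the thesis's actual toolbox.

    # pip install numpy scikit-learn
    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Invented per-flow features: [bytes_sent, bytes_received, duration_s, port].
    normal_flows = np.array([
        [1500, 40000, 12.0, 443],
        [900, 25000, 8.0, 443],
        [1200, 30000, 10.0, 80],
    ] * 50)   # repeated baseline traffic to give the model data to fit

    model = IsolationForest(contamination=0.01, random_state=0)
    model.fit(normal_flows)

    suspect = np.array([[9_000_000, 200, 0.5, 4444]])   # huge upload, odd port
    print(model.predict(suspect))   # typically [-1], i.e. flagged as anomalous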
    • …