12 research outputs found

    Research Toward a Partially-Automated, and Crime Specific Digital Triage Process Model

    The digital forensic process as traditionally laid out begins with the collection, duplication, and authentication of every piece of digital media prior to examination. These first three phases are by far the most costly, yet complete forensic duplication remains standard practice among digital forensic laboratories, and the time required to complete these stages is quickly becoming a serious problem. With current methodologies, digital forensic laboratories lack the time and resources to keep up with the growing demand for examinations. One solution to this problem is the use of pre-examination techniques, commonly referred to as digital triage, which can provide the examiner with intelligence for prioritizing and guiding the examination process. This work discusses a proposed model for digital triage that is currently under development at Mississippi State University.

    The Proteogenomic Mapping Tool

    Background: High-throughput mass spectrometry (MS) proteomics data is increasingly being used to complement traditional structural genome annotation methods. To keep pace with the high speed of experimental data generation and to aid in structural genome annotation, experimentally observed peptides need to be mapped back to their source genome location quickly and exactly. Previously, the tools to do this have been limited to custom scripts designed by individual research groups to analyze their own data; these are generally not widely available and do not scale well to large eukaryotic genomes.

    Results: The Proteogenomic Mapping Tool includes a Java implementation of the Aho-Corasick string-searching algorithm, which takes standardized file types as input and rapidly searches experimentally observed peptides for exact matches against a given genome translated in all six reading frames. The Java implementation allows the application to scale well to larger eukaryotic genomes while providing cross-platform functionality.

    Conclusions: The Proteogenomic Mapping Tool is a standalone application that maps peptides back to their source genome on a number of operating system platforms using standard desktop hardware, and it executes very rapidly on a variety of datasets. Support for selecting different genetic codes lets researchers easily adapt the tool to their own organisms of interest; it is recommended for anyone working to structurally annotate genomes using MS-derived proteomics data.
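
    The abstract's core idea lends itself to a compact illustration: translate the genome in all six reading frames, then run a single Aho-Corasick pass over each frame to find every peptide exactly. The Java sketch below is only a minimal model of that pipeline under stated assumptions, not the published tool's code; the class and method names, the toy genome, and the in-memory input handling are invented for illustration, and it hard-codes the standard genetic code.

    import java.util.*;

    /** Minimal sketch: map peptides to a genome via six-frame translation
     *  plus Aho-Corasick multi-pattern search. Illustrative only. */
    public class PeptideMapper {
        // Standard genetic code, indexed by 16*b1 + 4*b2 + b3 with T=0, C=1, A=2, G=3.
        static final String AA =
            "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG";

        static int base(char c) { return "TCAG".indexOf(Character.toUpperCase(c)); }

        static String translate(String dna, int offset) {
            StringBuilder sb = new StringBuilder();
            for (int i = offset; i + 2 < dna.length(); i += 3) {
                int b1 = base(dna.charAt(i)), b2 = base(dna.charAt(i + 1)), b3 = base(dna.charAt(i + 2));
                sb.append(b1 < 0 || b2 < 0 || b3 < 0 ? 'X' : AA.charAt(16 * b1 + 4 * b2 + b3));
            }
            return sb.toString();
        }

        static String reverseComplement(String dna) {
            StringBuilder sb = new StringBuilder();
            for (int i = dna.length() - 1; i >= 0; i--) {
                switch (Character.toUpperCase(dna.charAt(i))) {
                    case 'A': sb.append('T'); break;
                    case 'T': sb.append('A'); break;
                    case 'C': sb.append('G'); break;
                    case 'G': sb.append('C'); break;
                    default:  sb.append('N');   // unknown base -> untranslatable codon
                }
            }
            return sb.toString();
        }

        // Aho-Corasick trie node with failure link and accumulated matches.
        static class Node {
            Map<Character, Node> next = new HashMap<>();
            Node fail;
            List<String> hits = new ArrayList<>();
        }

        static Node build(List<String> peptides) {
            Node root = new Node();
            for (String p : peptides) {
                Node cur = root;
                for (char c : p.toCharArray())
                    cur = cur.next.computeIfAbsent(c, k -> new Node());
                cur.hits.add(p);
            }
            Deque<Node> queue = new ArrayDeque<>();   // BFS to set failure links
            for (Node child : root.next.values()) { child.fail = root; queue.add(child); }
            while (!queue.isEmpty()) {
                Node u = queue.poll();
                for (Map.Entry<Character, Node> e : u.next.entrySet()) {
                    Node f = u.fail;
                    while (f != null && !f.next.containsKey(e.getKey())) f = f.fail;
                    Node v = e.getValue();
                    v.fail = (f == null) ? root : f.next.get(e.getKey());
                    v.hits.addAll(v.fail.hits);       // inherit suffix matches
                    queue.add(v);
                }
            }
            return root;
        }

        static void scan(Node root, String frame, String name) {
            Node cur = root;
            for (int i = 0; i < frame.length(); i++) {
                char c = frame.charAt(i);
                while (cur != root && !cur.next.containsKey(c)) cur = cur.fail;
                cur = cur.next.getOrDefault(c, root);
                for (String hit : cur.hits)
                    System.out.println(hit + " found in " + name + ", ending at residue " + i);
            }
        }

        public static void main(String[] args) {
            String genome = "ATGGCTTCTAAAGGTGAA";          // toy sequence (real input: FASTA)
            Node root = build(Arrays.asList("ASK", "KGE")); // toy peptide set
            String rc = reverseComplement(genome);
            for (int f = 0; f < 3; f++) {
                scan(root, translate(genome, f), "frame +" + (f + 1));
                scan(root, translate(rc, f), "frame -" + (f + 1));
            }
        }
    }

    On the toy input the automaton reports ASK and KGE in frame +1; a real run would stream each translated frame through the same prebuilt automaton, which is what lets the approach scale linearly with genome size regardless of how many peptides are searched.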

    Accelerating String Set Matching in FPGA Hardware for Bioinformatics Research

    Background: This paper describes techniques for accelerating the performance of the string set matching problem, with particular emphasis on applications in computational proteomics. Matching peptide sequences against a genome translated in six reading frames, part of a proteogenomic mapping pipeline, is used as a case study. The Aho-Corasick algorithm is adapted for execution in field programmable gate array (FPGA) devices in a manner that optimizes space and performance. In this approach, the traditional Aho-Corasick finite state machine (FSM) is split into smaller FSMs operating in parallel, each of which matches up to 20 peptides in the input translated genome. Each of the smaller FSMs is further divided into five simpler FSMs, each operating on a single bit position of the input (five bits suffice to represent all amino acids and the special symbols in protein sequences).

    Results: This bit-split organization of the Aho-Corasick implementation enables efficient utilization of the limited random access memory (RAM) resources available in typical FPGAs. Using on-chip RAM rather than FPGA logic resources for the FSMs also enables rapid reconfiguration of the FPGA without the place-and-route delays associated with complex digital designs.

    Conclusion: Experimental results show storage efficiencies of over 80% for several data sets. Furthermore, the FPGA implementation executing at 100 MHz is nearly 20 times faster than an implementation of the traditional Aho-Corasick algorithm executing on a 2.67 GHz workstation.
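
    To make the bit-split organization concrete, the Java sketch below models it in software under stated assumptions: amino acids get 5-bit codes (an assumed encoding), the full Aho-Corasick automaton is subset-constructed into five binary FSMs, one per bit position, each state carrying a partial match vector (PMV), and a peptide is reported only where the bitwise AND of the five PMVs is non-zero. All names and the toy data are invented for illustration; the paper's actual engines run these FSMs in parallel FPGA block RAM, not in software.

    import java.util.*;

    /** Software model of one bit-split Aho-Corasick group (illustrative only). */
    public class BitSplitAC {
        static final String ALPHA = "ACDEFGHIKLMNPQRSTVWY"; // assumed 5-bit residue codes 0..19
        static int[][] delta;  // fully resolved Aho-Corasick transition table
        static int[] match;    // per-state bitmask: which peptides end here

        static void buildAC(String[] pats) {
            int maxStates = 1;
            for (String p : pats) maxStates += p.length();
            int[][] go = new int[maxStates][ALPHA.length()];
            for (int[] row : go) Arrays.fill(row, -1);
            match = new int[maxStates];
            int states = 1;
            for (int i = 0; i < pats.length; i++) {          // build the trie
                int cur = 0;
                for (char c : pats[i].toCharArray()) {
                    int s = ALPHA.indexOf(c);
                    if (go[cur][s] == -1) go[cur][s] = states++;
                    cur = go[cur][s];
                }
                match[cur] |= 1 << i;
            }
            int[] fail = new int[states];
            delta = new int[states][ALPHA.length()];
            Deque<Integer> q = new ArrayDeque<>();           // BFS failure closure
            for (int s = 0; s < ALPHA.length(); s++) {
                delta[0][s] = Math.max(go[0][s], 0);
                if (go[0][s] > 0) q.add(go[0][s]);
            }
            while (!q.isEmpty()) {
                int u = q.poll();
                match[u] |= match[fail[u]];                  // inherit suffix matches
                for (int s = 0; s < ALPHA.length(); s++) {
                    int v = go[u][s];
                    if (v == -1) delta[u][s] = delta[fail[u]][s];
                    else { fail[v] = delta[fail[u]][s]; delta[u][s] = v; q.add(v); }
                }
            }
        }

        record BitFsm(int[][] next, int[] pmv) {}

        // Subset-construct the binary FSM that observes only bit `bit` of each symbol.
        static BitFsm splitBit(int bit) {
            Map<Set<Integer>, Integer> id = new HashMap<>();
            List<Set<Integer>> sets = new ArrayList<>(List.of(new TreeSet<>(Set.of(0))));
            List<int[]> trans = new ArrayList<>(List.of(new int[2]));
            List<Integer> pmvs = new ArrayList<>(List.of(match[0]));
            id.put(sets.get(0), 0);
            for (int s = 0; s < sets.size(); s++) {          // worklist over discovered state sets
                for (int v = 0; v <= 1; v++) {
                    Set<Integer> dst = new TreeSet<>();
                    for (int q : sets.get(s))
                        for (int sym = 0; sym < ALPHA.length(); sym++)
                            if (((sym >> bit) & 1) == v) dst.add(delta[q][sym]);
                    Integer t = id.get(dst);
                    if (t == null) {
                        t = sets.size();
                        id.put(dst, t); sets.add(dst); trans.add(new int[2]);
                        int pmv = 0;
                        for (int q : dst) pmv |= match[q];   // PMV = union of member matches
                        pmvs.add(pmv);
                    }
                    trans.get(s)[v] = t;
                }
            }
            return new BitFsm(trans.toArray(new int[0][]),
                              pmvs.stream().mapToInt(Integer::intValue).toArray());
        }

        public static void main(String[] args) {
            String[] peptides = { "ACDE", "CDEF", "DEFG" };  // one hardware group holds up to 20
            String text = "MACDEFGH";                        // toy translated-genome fragment
            buildAC(peptides);
            BitFsm[] fsm = new BitFsm[5];
            int[] cur = new int[5];
            for (int b = 0; b < 5; b++) fsm[b] = splitBit(b);
            for (int i = 0; i < text.length(); i++) {
                int sym = ALPHA.indexOf(text.charAt(i));
                int agreed = ~0;                             // AND the five PMVs
                for (int b = 0; b < 5; b++) {
                    cur[b] = fsm[b].next()[cur[b]][(sym >> b) & 1];
                    agreed &= fsm[b].pmv()[cur[b]];
                }
                for (int p = 0; p < peptides.length; p++)
                    if ((agreed & (1 << p)) != 0)
                        System.out.println(peptides[p] + " ends at position " + i);
            }
        }
    }

    Each binary FSM needs only a two-entry transition row per state, which is what lets the tables fit in narrow on-chip RAMs; in the published bit-split construction the AND of the partial match vectors is provably exact, so the five small FSMs together report the same matches as the original automaton.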

    An Approach to Model Network Exploitations Using Exploitation Graphs

    In this article, a modeling process is defined to address the challenges of analyzing attack scenarios and mitigating vulnerabilities in networked environments. Known system vulnerability data, system configuration data, and vulnerability scanner results are combined to create exploitation graphs (e-graphs) that represent attack scenarios. Experiments carried out in a cluster computing environment showed the usefulness of the proposed techniques in providing in-depth attack scenario analyses for security engineering. Critical vulnerabilities can be identified by employing graph algorithms. Several factors were used to measure the difficulty of executing an attack, and a cost/benefit analysis enabled more accurate quantitative analysis of attack scenarios. The authors also show how these attack scenario analyses can guide the deployment of security products and the design of network topologies.
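
    As a rough illustration of the kind of graph computation involved, the sketch below builds a small exploitation graph whose edges are exploits weighted by an assumed attack-difficulty cost, then runs Dijkstra's algorithm to find the attacker's cheapest path to a target privilege; the exploits on that path mark candidate critical vulnerabilities. The graph, cost values, exploit identifiers, and names are invented for the example and are not taken from the article.

    import java.util.*;

    /** Toy e-graph analysis: cheapest attack path via Dijkstra (illustrative only). */
    public class EGraphSketch {
        record Exploit(String from, String to, String id, double cost) {}
        record Visit(String node, double dist) {}

        public static void main(String[] args) {
            // Nodes are attacker privilege states; edge weights model attack difficulty.
            List<Exploit> exploits = List.of(
                new Exploit("internet",     "web-dmz:user", "EXP-A", 2.0),
                new Exploit("web-dmz:user", "web-dmz:root", "EXP-B", 4.0),
                new Exploit("web-dmz:user", "db:user",      "EXP-C", 3.0),
                new Exploit("web-dmz:root", "db:root",      "EXP-D", 1.0),
                new Exploit("db:user",      "db:root",      "EXP-E", 5.0));
            Map<String, List<Exploit>> adj = new HashMap<>();
            for (Exploit e : exploits) adj.computeIfAbsent(e.from(), k -> new ArrayList<>()).add(e);

            String source = "internet", target = "db:root";
            Map<String, Double> dist = new HashMap<>(Map.of(source, 0.0));
            Map<String, Exploit> via = new HashMap<>();   // best incoming exploit per node
            PriorityQueue<Visit> pq = new PriorityQueue<>(Comparator.comparingDouble(Visit::dist));
            pq.add(new Visit(source, 0.0));
            while (!pq.isEmpty()) {
                Visit cur = pq.poll();
                if (cur.dist() > dist.getOrDefault(cur.node(), Double.MAX_VALUE)) continue; // stale
                for (Exploit e : adj.getOrDefault(cur.node(), List.of())) {
                    double nd = cur.dist() + e.cost();
                    if (nd < dist.getOrDefault(e.to(), Double.MAX_VALUE)) {
                        dist.put(e.to(), nd);
                        via.put(e.to(), e);
                        pq.add(new Visit(e.to(), nd));
                    }
                }
            }
            System.out.printf("cheapest attack path to %s has cost %.1f%n", target, dist.get(target));
            for (String n = target; via.containsKey(n); n = via.get(n).from())  // walk path backward
                System.out.println("critical exploit: " + via.get(n).id()
                        + " (" + via.get(n).from() + " -> " + via.get(n).to() + ")");
        }
    }

    Real e-graphs as described in the article would be generated from vulnerability-scanner and configuration data rather than written by hand, and the single scalar weight would be replaced by the richer attack-difficulty and cost/benefit factors the authors describe.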
