
    On the genesis of computer forensis

    This thesis presents a coherent set of research contributions to the new discipline of computer forensis. It analyses the emergence of computer forensis, defines the challenges facing the discipline, carries forward research advances in conventional methodology, introduces a novel approach to using virtual environments in forensis, and systemises the computer forensis body of knowledge, leading to the establishment of a tertiary curriculum.
    The emergence of computer forensis as a separate discipline of science was triggered by the evolution and growth of computer crime. Computer technology has reached a stage at which a conventional, mechanistic approach to collecting and analysing data is insufficient: the existing methodology must be formalised, and it must embrace technologies and methods that enable the inclusion of transient data and the analysis of live systems. Further work is crucial to incorporate advances in related disciplines such as computer security and information systems audit, as well as developments in operating systems that make computer forensics concerns inherent in their design. For example, it is proposed that some of the features offered by persistent systems could be built into conventional operating systems to make illicit activities easier to identify and analyse.
    The analysis of permanent data storage is fundamental to computer forensics practice. Very little of the conventional computer forensics methodology is finalised, and much remains to be discovered. This thesis contributes to the formalisation and improved integrity of forensic handling of data storage by: formalising methods for data collection and analysis in the NTFS (Microsoft file system) environment; presenting a safe methodology for handling data backups that avoids information loss where Alternate Data Streams (ADS) are present; and formalising methods for hiding data and for extracting hidden and encrypted data.
    A significant contribution of this thesis lies in the application of virtualisation, the simulation of a computer in a virtual environment created by the underlying hardware and software, to computer forensics practice. Computer systems are not easily analysed for forensic purposes, and it is demonstrated that virtualisation applied in computer forensics allows more efficient and accurate identification and analysis of the evidence. A new method is proposed in which two environments used in parallel can deliver faster, verifiable results that do not depend on proprietary, closed-source tools, and which may lead to a gradual shift from commercial Windows software to open source software (OSS).
    The final contribution of this thesis is systemising the body of knowledge in computer forensics, a necessary condition for it to become an established discipline of science. This systemisation led to the design and development of a tertiary curriculum in computer forensics, illustrated here with a case study of the computer forensics major in the Bachelor of Computer Science at the University of Western Sydney.
    All genesis starts as an idea. A natural part of the scientific research process is replacing previous assumptions, concepts, and practices with new ones that better approximate the truth. This thesis advances the computer forensis body of knowledge in areas that are crucial to the further development of the discipline. Please note that the appendices to this thesis consist of separately published items which cannot be made available due to copyright restrictions; these items are listed in the PDF attachment for reference purposes.
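    The thesis's own ADS-safe backup methodology is not reproduced in this abstract, so the following is only a minimal sketch of the underlying problem it addresses: enumerating NTFS Alternate Data Streams before a file is copied, so that hidden streams are not silently dropped. It assumes a Windows host and uses the documented kernel32 FindFirstStreamW/FindNextStreamW API; the evidence path is hypothetical.

```python
# Minimal sketch (not the thesis's method): enumerate NTFS Alternate Data
# Streams (ADS) for a file before backing it up, so hidden streams are not
# silently lost.  Windows-only; uses kernel32 FindFirstStreamW/FindNextStreamW.
import ctypes
from ctypes import wintypes

FIND_STREAM_INFO_STANDARD = 0          # STREAM_INFO_LEVELS: FindStreamInfoStandard
INVALID_HANDLE_VALUE = ctypes.c_void_p(-1).value
MAX_PATH = 260

class WIN32_FIND_STREAM_DATA(ctypes.Structure):
    _fields_ = [("StreamSize", wintypes.LARGE_INTEGER),
                ("cStreamName", wintypes.WCHAR * (MAX_PATH + 36))]

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.FindFirstStreamW.restype = wintypes.HANDLE
kernel32.FindNextStreamW.restype = wintypes.BOOL
kernel32.FindClose.restype = wintypes.BOOL

def list_streams(path: str) -> list[str]:
    """Return the names of every data stream attached to `path`,
    e.g. '::$DATA' for the main stream or ':secret:$DATA' for an ADS."""
    data = WIN32_FIND_STREAM_DATA()
    handle = kernel32.FindFirstStreamW(path, FIND_STREAM_INFO_STANDARD,
                                       ctypes.byref(data), 0)
    if handle == INVALID_HANDLE_VALUE:
        raise ctypes.WinError(ctypes.get_last_error())
    streams = []
    try:
        while True:
            streams.append(data.cStreamName)
            if not kernel32.FindNextStreamW(handle, ctypes.byref(data)):
                break
    finally:
        kernel32.FindClose(handle)
    return streams

if __name__ == "__main__":
    # Hypothetical evidence file; any stream other than '::$DATA' is an ADS
    # that a naive copy or backup could drop.
    for name in list_streams(r"C:\evidence\report.docx"):
        print(name)
```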

    CuFA: A More Formal Definition for Digital Forensic Artifacts

    The term “artifact” currently does not have a formal definition within the domain of cyber/digital forensics, resulting in a lack of standardized reporting, linguistic understanding between professionals, and efficiency. In this paper we propose a new definition based on a survey we conducted, usage in the literature, prior definitions of the word itself, and similarities with archival science. This definition includes required fields that all artifacts must have and encompasses the notion of curation. Thus, we propose a new term, curated forensic artifact (CuFA), to address items which have been cleared for entry into a CuFA database (one implementation, the Artifact Genome Project, abbreviated as AGP, is under development and briefly outlined). An ontological model encapsulates these required fields while utilizing a lower-level taxonomic schema. We use the Cyber Observable eXpression (CybOX) project due to its rising popularity and rigorous classification of forensic objects. Additionally, we suggest some improvements to its integration into our model and identify higher-level location categories to illustrate tracing an object from creation through investigative leads. Finally, a step-wise procedure for researching and logging CuFAs is devised to accompany the model.
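    The paper's required fields and ontology are not enumerated in this abstract, so the sketch below is purely hypothetical: it shows what a curated-artifact record destined for a database such as AGP might look like. Every field name is an illustrative assumption, not the schema defined in the paper.

```python
# Hypothetical sketch only: the fields below are illustrative assumptions, not
# the CuFA schema from the paper.  It shows the general idea of a curated
# record with provenance/curation fields plus a lower-level typed observable.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CuratedForensicArtifact:
    name: str                 # human-readable artifact name
    cybox_object_type: str    # lower-level taxonomic type, e.g. "FileObject" (assumed)
    location_category: str    # higher-level location, e.g. "filesystem" (assumed)
    source: str               # tool or investigation that reported the item
    description: str
    curated: bool = False     # cleared for entry into the artifact database?
    curated_by: str | None = None
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def approve(self, curator: str) -> None:
        """Mark the record as curated, i.e. cleared for entry into the database."""
        self.curated = True
        self.curated_by = curator

# Usage sketch with hypothetical values
record = CuratedForensicArtifact(
    name="Chrome download history entry",
    cybox_object_type="FileObject",
    location_category="filesystem",
    source="disk image triage",
    description="SQLite row describing a downloaded file",
)
record.approve(curator="analyst-01")
print(record.curated, record.curated_by)
```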

    Forensic Breach Response in Compliance with GDPR

    Modifications and new approaches to breach response and forensic investigations are to be expected with the General Data Protection Regulation, GDPR, entering into application in May 2018. This paper brings forth the conclusion that engagement from top management is crucial in order to comply with the GDPR requirements. The importance of having a vision and a strategy addressing breach response, so that resources can enable procedures for an investigation, is articulated. To enable appropriate countermeasures, a clear understanding of the regulation is essential; it is presented here in terms of the severity of risk to the rights and freedoms of individuals, including the actions required upon a breach and the time frame of each obligation. Furthermore, the report discusses an approach to approximating the number of individuals affected by a breach by looking at the intrusion point. This is an essential step, since every incident report communicated to Datainspektionen needs to state the approximate number of individuals affected. Assessing the effects of an incident through the intrusion-point approach is an initial step before the forensic analyst can determine the exact number of affected individuals.
    Some of the greatest challenges organizations face today are the information security threats, vulnerabilities and risks that all too often escalate into incidents. Some may argue, the less detected the better. Reporting incidents in the era of the General Data Protection Regulation, GDPR, appears not to be in organizations' favor. They may liken the incident notification process to raising their hands on the highway, announcing that they are driving too fast and would like a speeding ticket. Will the sanctions applied foster an absence of speed indicators, in other words, weak detection systems? Absence of evidence is not evidence of absence. If breaches are not reported, sanctions will be higher and individuals may be put at risk. Breaches are becoming unavoidable, and information that is kept might actually cause damage and personally detrimental impact if leaked; organizations may face severe reputational and financial consequences. The GDPR, applicable from 25 May 2018 when PUL, the current Swedish privacy protection law, is abolished, addresses this matter through regulatory requirements. Well-managed breach response could save a company from losing both its customers' trust and money. Breach notifications should be made to the national supervisory authority, Datainspektionen, and, when necessary, to affected individuals. However, the process of identifying which individuals should be notified and exactly which records have been compromised is commonly underestimated. No matter how good the forensic analyst is, if there are no logs to analyze or if the investigation starts too late, it will be difficult to obtain the requested information. The organization itself should give the analyst the best feasible environment for performing an investigation by providing relevant contacts and information and granting access to searchable and relevant logs. It is essential to discover the breach in time, in order to contain it and narrow down the number of affected individuals. This paper investigates the adoption of new and altered obligations in incident response and establishes guidance, in accordance with the GDPR, on how to conduct the procedures for breach notification.
    The paper brings forth the conclusion that engagement from top management is crucial. Having an information security vision and a strategy that enables a proactive culture is the first fundamental step towards giving the forensic analyst the best feasible environment for identifying which records have been compromised.
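    The intrusion-point approach itself is only summarised above. As a rough illustration (not the paper's procedure), the sketch below assumes the organisation keeps searchable access logs and estimates an upper bound on the number of affected data subjects by counting the distinct subject identifiers touched by the compromised account during the breach window; the column names and example values are assumptions.

```python
# Rough illustration only (not the paper's procedure): given searchable access
# logs, estimate an upper bound on the number of data subjects affected by a
# breach by counting the distinct subject IDs that the compromised account
# touched during the breach window.  All column names are assumptions.
import csv
from datetime import datetime

def estimate_affected_subjects(log_path: str, compromised_account: str,
                               breach_start: datetime, breach_end: datetime) -> int:
    subjects = set()
    with open(log_path, newline="") as f:
        # assumed columns: timestamp (ISO 8601), account, subject_id
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            if row["account"] == compromised_account and breach_start <= ts <= breach_end:
                subjects.add(row["subject_id"])
    return len(subjects)

if __name__ == "__main__":
    # Hypothetical log file, account name, and breach window.
    count = estimate_affected_subjects(
        "access_log.csv", "svc-webshop",
        datetime(2018, 5, 25, 0, 0), datetime(2018, 5, 27, 12, 0),
    )
    print(f"Approximately {count} data subjects potentially affected")
```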

    Essentials of forensic accounting


    Using random projections for dimensionality reduction in identifying rogue applications

    In general, the consumer must depend on others to provide their software solutions. However, this outsourcing of software development has made it increasingly unclear where software is actually developed and by whom, and it poses a potentially large security problem for the consumer, as it opens up the possibility of rogue functionality being injected into an application without the consumer’s knowledge or consent. This raises the questions ‘How do we know that the software we use can be trusted?’ and ‘How can we have assurance that the software we use is doing only the tasks that we ask it to do?’ Traditional methods for thwarting such activities, such as virus detection engines, are far too antiquated for today’s adversary, and more sophisticated research is needed to combat these more technically advanced enemies. To combat the ever increasing problem of rogue applications, this dissertation has successfully applied and extended the information retrieval techniques of n-gram analysis and document similarity and the data mining techniques of dimensionality reduction and attribute extraction. This combination of techniques has produced a more effective tool suite for detecting rogue applications, including Trojan horses, capable of identifying not only standalone rogue applications but also those embedded within other applications. This research provides several major contributions to the field, including a unique combination of techniques that gives the administrator a new tool in a multi-pronged defense against the infestation of rogue applications. Another contribution is a unique method of slicing potential rogue applications that has proven to yield a more robust rogue application classifier. Through experimental research, this effort has shown that a viable and worthy rogue application detection tool suite can be developed. Experimental results have shown that, in some cases, as much as a 28% increase in overall accuracy can be achieved when the feature extraction method presented in this effort, randomized projection, is compared with the accepted feature selection practice of mutual information.
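    The dissertation's full pipeline (slicing, attribute extraction, classification) is not reproduced here. The sketch below only illustrates the core dimensionality-reduction idea it builds on: hashing byte n-grams of an executable into a count vector and projecting it into a much smaller space with a random Gaussian matrix. The dimensions and n-gram size are illustrative assumptions.

```python
# Illustrative sketch of the core idea, not the dissertation's tool suite:
# represent a binary by hashed byte n-gram counts, then reduce dimensionality
# with a random Gaussian projection.  Sizes and n-gram length are assumptions.
import zlib
import numpy as np

NGRAM = 4            # byte n-gram length (assumed)
FEATURE_DIM = 2**14  # hashed n-gram feature space (assumed)
PROJECTED_DIM = 256  # target dimensionality after projection (assumed)

def ngram_counts(data: bytes, n: int = NGRAM, dim: int = FEATURE_DIM) -> np.ndarray:
    """Hash overlapping byte n-grams into a fixed-size count vector."""
    counts = np.zeros(dim, dtype=np.float64)
    for i in range(len(data) - n + 1):
        counts[zlib.crc32(data[i:i + n]) % dim] += 1   # deterministic hash
    return counts

def random_projection_matrix(in_dim: int, out_dim: int, seed: int = 0) -> np.ndarray:
    """Gaussian random projection, scaled so pairwise distances are roughly
    preserved (Johnson-Lindenstrauss style)."""
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 1.0 / np.sqrt(out_dim), size=(in_dim, out_dim))

def project(data: bytes, R: np.ndarray) -> np.ndarray:
    """Compact feature vector suitable for a downstream classifier."""
    return ngram_counts(data) @ R

if __name__ == "__main__":
    R = random_projection_matrix(FEATURE_DIM, PROJECTED_DIM)
    with open("suspect_binary.exe", "rb") as f:   # hypothetical input file
        vec = project(f.read(), R)
    print(vec.shape)   # (256,)
```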

    Designing Monitoring Systems for Continuous Certification of Cloud Services: Deriving Meta-requirements and Design Guidelines

    Continuous service certification (CSC) involves consistently gathering and assessing certification-relevant information about cloud service operations to validate whether they continue to adhere to certification criteria. Previous research has proposed test-based CSC methodologies that directly assess the components of cloud service infrastructures. However, test-based certification requires that certification authorities can access the cloud infrastructure, which various issues may limit. To address these challenges, cloud service providers need to conduct monitoring-based CSC; that is, to monitor their cloud service infrastructure themselves to gather certification-relevant data and then provide these data to certification authorities. Nevertheless, we need a better understanding of how to design monitoring systems that enable cloud service providers to perform such monitoring. Taking a design science perspective, we derive universal meta-requirements and design guidelines for CSC monitoring systems based on findings from five expert focus group interviews with 33 cloud experts and 10 one-to-one interviews with cloud customers. With this study, we expand the current knowledge base on CSC and monitoring-based CSC. Our design guidelines contribute to the development of CSC monitoring systems and enable monitoring-based CSC that overcomes the issues of prior test-based approaches.
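    The paper derives meta-requirements and guidelines rather than an implementation. Purely as an illustration of what monitoring-based CSC could involve (not the authors' design), the sketch below shows a provider-side agent that runs a few certification-relevant checks and packages the results, with an integrity hash, as evidence for a certification authority; the check names and report format are assumptions.

```python
# Illustration only, not the paper's design: a provider-side monitoring agent
# that runs certification-relevant checks and emits an integrity-protected
# evidence report for a certification authority.  Checks and format assumed.
import hashlib
import json
from datetime import datetime, timezone

def check_backups_recent() -> bool:
    # Placeholder: e.g. verify the last backup completed within 24 hours.
    return True

def check_tls_enforced() -> bool:
    # Placeholder: e.g. verify endpoints reject plaintext connections.
    return True

CHECKS = {
    "backups_recent": check_backups_recent,
    "tls_enforced": check_tls_enforced,
}

def build_evidence_report(provider_id: str) -> dict:
    results = {name: fn() for name, fn in CHECKS.items()}
    report = {
        "provider": provider_id,
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "results": results,
    }
    # Hash over the canonical JSON so the authority can detect tampering.
    payload = json.dumps(report, sort_keys=True).encode()
    report["sha256"] = hashlib.sha256(payload).hexdigest()
    return report

if __name__ == "__main__":
    print(json.dumps(build_evidence_report("cloud-provider-example"), indent=2))
```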

    E-mail forensic authorship attribution

    E-mail has become the standard for business as well as personal communication. The inherent security risks of e-mail communication present the problem of anonymity: if the author of an e-mail is not known, the digital forensic investigator needs to determine the authorship of the e-mail using a process that has not been standardised in the e-mail forensic field. This research project examines many problems associated with e-mail communication and the digital forensic domain, more specifically e-mail forensic investigations and the recovery of legally admissible evidence to be presented in a court of law. The research methodology combined a comprehensive literature review with Design Science, resulting in the development of an artifact through intensive research. The Proposed E-Mail Forensic Methodology is based on the most current digital forensic investigation process, and the process was further validated through expert reviews. The opinions of the digital forensic experts, gathered with the aid of the Delphi technique, were an integral part of the validation process and add to the credibility of the study. The Proposed E-Mail Forensic Methodology applies a standardised investigation process to an e-mail investigation and takes the South African perspective into account by incorporating various checks against the relevant laws and legislation. By following the Proposed E-Mail Forensic Methodology, e-mail forensic investigators can produce evidence that is legally admissible in a court of law.
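    The thesis contributes an investigation methodology rather than an attribution algorithm. Purely to illustrate the underlying attribution task (not the Proposed E-Mail Forensic Methodology), the sketch below trains a character n-gram stylometric classifier on e-mails from known candidate authors with scikit-learn and predicts the most likely author of a disputed message; the corpus shown is hypothetical.

```python
# Illustration of the attribution task only, not the thesis's methodology:
# a character n-gram stylometric classifier over e-mails from known candidate
# authors, applied to a disputed message.  Training data is hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical labelled corpus: e-mails whose authors are already known.
emails = [
    "Hi team, please find the quarterly figures attached. Regards, A.",
    "Team - figures attached as promised, shout if anything looks off.",
    "Dear all, kindly note the meeting has been moved to Thursday.",
    "Dear colleagues, kindly be informed that the review is postponed.",
]
authors = ["alice", "alice", "bob", "bob"]

# Character n-grams capture punctuation, spacing and spelling habits that
# tend to survive topic changes better than word-level features.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LinearSVC(),
)
model.fit(emails, authors)

disputed = "Kindly note that the figures will be delayed until Thursday."
print(model.predict([disputed])[0])   # most likely candidate author
```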

    Computer-assisted auditing for tax collection and administration

    Safeguarding the quality of audits of tax collection and administration is of great importance for revealing existing problems in tax collection and in the management of the tax authorities, and for promoting fiscal governance in accordance with the law. The paper analyses operations based on a computer-assisted audit platform for tax collection and administration and points out several problems in the process. It then presents some suggestions for improving computer-assisted auditing in tax collection and tax administration. The findings of the research are that auditors should pay close attention to the leads uncovered by computer-assisted audits, study the tax collection and management files of the related enterprises more thoroughly, and go on site to verify the realisation, declaration, entry, and arrears of suspicious taxes. Compliance testing in computer-assisted auditing is a test of the soundness and effectiveness of the tax authorities' internal controls in the context of today's computerised systems.
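    The paper does not include an implementation. As a generic illustration of a computer-assisted audit test (not the platform discussed in the paper), the sketch below cross-checks declared tax against recorded payments and flags suspected arrears for on-site verification; the column names, file names and tolerance are assumptions.

```python
# Generic computer-assisted audit illustration, not the platform described in
# the paper: cross-check declared tax against recorded payments and flag
# suspected arrears for on-site verification.  Columns/threshold are assumed.
import csv

TOLERANCE = 1.00  # currency units of rounding tolerance (assumed)

def flag_arrears(declarations_csv: str, payments_csv: str) -> list[dict]:
    paid: dict[str, float] = {}
    with open(payments_csv, newline="") as f:
        for row in csv.DictReader(f):          # assumed columns: taxpayer_id, amount
            paid[row["taxpayer_id"]] = paid.get(row["taxpayer_id"], 0.0) + float(row["amount"])

    flagged = []
    with open(declarations_csv, newline="") as f:
        for row in csv.DictReader(f):          # assumed columns: taxpayer_id, declared_tax
            shortfall = float(row["declared_tax"]) - paid.get(row["taxpayer_id"], 0.0)
            if shortfall > TOLERANCE:
                flagged.append({"taxpayer_id": row["taxpayer_id"],
                                "shortfall": round(shortfall, 2)})
    return flagged

if __name__ == "__main__":
    # Hypothetical input files exported from the tax collection system.
    for case in flag_arrears("declarations.csv", "payments.csv"):
        print(case)
```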

    Digital Preservation Services : State of the Art Analysis

    Research report funded by the DC-NET project (European Commission, FP7); peer reviewed. The report provides an overview of the state of the art in service provision for digital preservation and curation, focusing on the areas where gaps need to be bridged between e-Infrastructures and efficient, forward-looking digital preservation services. Based on a desktop study and a rapid analysis of some 190 currently available tools and services for digital preservation, the deliverable provides a high-level view of the range of instruments currently on offer to support various functions within a preservation system.