    Auditing file system permissions using Association Rule Mining

    Identifying irregular file system permissions in large, multi-user systems is challenging due to the complexity of gaining a structural understanding of large volumes of permission information. This challenge is exacerbated when file system permissions are allocated in an ad-hoc manner as new access rights are required, and when access rights become redundant as users change job roles or terminate employment. These factors make it difficult to identify what can be classed as an irregular file system permission, and whether an irregular permission exposes a vulnerability. The current way of finding such irregularities is to perform an exhaustive audit of the permission distribution; however, this requires expert knowledge and a significant amount of time. In this paper a novel method of modelling file system permissions, which can be used by association rule mining techniques to identify irregular permissions, is presented. This results in the creation of an object-centric model as a by-product. The technique is then implemented and tested on Microsoft's New Technology File System (NTFS) permissions. Empirical observations are derived by making comparisons with expert knowledge to determine the effectiveness of the proposed technique on five diverse real-world directory structures extracted from different organisations. The results demonstrate that the technique is able to correctly identify irregularities with an average accuracy rate of 91%, minimising the reliance on expert knowledge. Experiments are also performed on synthetic directory structures, which demonstrate an accuracy rate of 95% when irregular permissions constitute 1% of the total. This is a significant contribution as it creates the possibility of identifying vulnerabilities without prior knowledge of how file system permissions are implemented within a directory structure.
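
    As a hedged illustration of this style of approach (not the paper's implementation: the permission facts, thresholds, and library choice below are assumptions), association rule mining can extract strong rules from permission "transactions", after which allocations that violate high-confidence rules become candidate irregularities:

        import pandas as pd
        from mlxtend.preprocessing import TransactionEncoder
        from mlxtend.frequent_patterns import apriori, association_rules

        # Each "transaction" is the set of facts describing one access control entry.
        transactions = [
            ["grp:Finance", "right:Read", "dir:/finance"],
            ["grp:Finance", "right:Read", "dir:/finance"],
            ["grp:Finance", "right:Read", "dir:/finance"],
            ["grp:Finance", "right:FullControl", "dir:/finance"],  # candidate irregularity
        ]

        encoder = TransactionEncoder()
        frame = pd.DataFrame(encoder.fit(transactions).transform(transactions),
                             columns=encoder.columns_)
        itemsets = apriori(frame, min_support=0.5, use_colnames=True)
        rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)

        # Rules such as {grp:Finance, dir:/finance} -> {right:Read} hold with high
        # confidence; the FullControl entry violates them and would be flagged.
        print(rules[["antecedents", "consequents", "confidence"]])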

    Identification of irregularities and allocation suggestion of relative file system permissions

    It is well established that file system permissions in large, multi-user environments can be audited to identify vulnerabilities with respect to what is regarded as standard practice; for example, identifying that a user has an unnecessarily elevated level of access to a system directory, which introduces a vulnerability. Similarly, the allocation of new file system permissions can be assigned following the same standard practices. Less well established, by contrast, is the identification of potential vulnerabilities, as well as the implementation of new permissions, with respect to a system's current access control implementation. Such tasks are heavily reliant on expert interpretation: the assigned relationships between users and groups, between directories and their parents, and the allocation of permissions on file system resources all need to be carefully considered. This paper presents the novel use of statistical analysis to establish independence and homogeneity in allocated file system permissions. This independence can be interpreted as potential anomalies in a system's implementation of access control. The paper then presents the use of instance-based learning to suggest the allocation of new permissions conforming to a system's current implementation structure. Both of the presented techniques are then included in a tool for interacting with Microsoft's New Technology File System (NTFS) permissions, and experimental analysis is performed on six different NTFS directory structures from different organisations. The effectiveness of the developed techniques is then established through analysing the true positive and true negative rates. The presented results demonstrate the potential of the proposed techniques for overcoming complexities in real-world file system administration.
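
    A minimal sketch of the statistical step, under assumed data (the contingency counts below are invented, and this is not the paper's exact model): a chi-square test of independence between group membership and permission level, with Pearson residuals pointing at cells that deviate most from the expected allocation pattern:

        import numpy as np
        from scipy.stats import chi2_contingency

        # Rows: groups; columns: counts of Read / Modify / FullControl allocations.
        observed = np.array([
            [40,  5, 1],   # Staff
            [10, 30, 2],   # Developers
            [ 1,  2, 9],   # Admins
        ])
        chi2, p, dof, expected = chi2_contingency(observed)
        residuals = (observed - expected) / np.sqrt(expected)  # Pearson residuals
        print(f"p = {p:.3g}")
        print(residuals.round(2))  # cells with large |residuals| are candidate anomalies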

    Access control in semantic information systems

    Access control has evolved in file systems. Early access control was limited and did not handle identities. Access control then shifted to develop concepts such as identities. The next progression was the ability to take these identities and use lists to control what those identities can do. At this point we start to see more areas implementing access control, such as web information systems. Web information systems have themselves started to raise the profile of semantic information. As semantic information systems start to expand, new opportunities in access control become available to be explored. This dissertation introduces an experimental file system. The file system explores the concept of utilising metadata in a file system. The metadata is supported through the use of a database system. The introduction of the database enables the use of features such as views within the file system. Databases also provide a rich query language to utilise when finding information. The database aids the development of semantic meaning for the metadata stored. This provides greater meaning to the metadata and enables a platform for rethinking access control.
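
    A small sketch of the underlying idea (the schema and the "semantic folder" below are assumptions for illustration, not the dissertation's design): holding file metadata in a relational database lets a view behave like a dynamically computed directory:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE files (
                path    TEXT PRIMARY KEY,
                owner   TEXT NOT NULL,
                mime    TEXT,
                tags    TEXT          -- free-form semantic metadata
            );
            -- A view exposes a 'semantic folder': every image owned by alice.
            CREATE VIEW alice_images AS
                SELECT path FROM files
                WHERE owner = 'alice' AND mime LIKE 'image/%';
        """)
        conn.execute("INSERT INTO files VALUES ('/pics/cat.jpg','alice','image/jpeg','pets')")
        conn.execute("INSERT INTO files VALUES ('/docs/a.txt','alice','text/plain',NULL)")

        print(conn.execute("SELECT * FROM alice_images").fetchall())  # [('/pics/cat.jpg',)]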

    A Novel Software Tool for Analysing NT® File System Permissions

    FinalGen revisited: new discoveries

    Romero’s FINALGEN of 2012 creates designer endgame tables for specific chess positions that feature no more than one non-pawn piece per side. Larger hard disks and faster solid-state drives have extended the reach of this software and encouraged its greater use. Some new discoveries illustrate here what is now feasible and how FINALGEN may be combined with other tools to reach definitive and likely truths.
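
    By way of illustration of combining such tools (the tablebase path is a placeholder, and this is not FINALGEN itself): a FinalGen-eligible pawn endgame can be cross-checked against Syzygy endgame tables with python-chess:

        import chess
        import chess.syzygy

        # King and pawn versus king: a position within FinalGen's remit.
        board = chess.Board("8/8/8/4k3/8/8/4P3/4K3 w - - 0 1")

        with chess.syzygy.open_tablebase("./syzygy") as tables:  # placeholder path
            wdl = tables.probe_wdl(board)   # 2 = win, 0 = draw, -2 = loss (side to move)
            dtz = tables.probe_dtz(board)   # distance to a zeroing move (50-move rule)
        print(wdl, dtz)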

    An Application of Automated Theorem Provers to Computer System Security: The Schematic Protection Model

    The Schematic Protection Model (SPM) is specified in SAL (the Symbolic Analysis Laboratory), and theorems about Take-Grant and New Technology File System schemes are proven. Arbitrary systems can be specified in SPM and analyzed. This is the first known automated analysis of SPM specifications in a theorem prover. The SPM specification was created in such a way that new specifications share the underlying framework and are configurable within the specification file alone. This allows new specifications to be created with ease, as demonstrated by the four unique models included within this document, and allows future users to specify models more easily without recreating the framework. The built-in modules of SAL provided the support needed to make the model flexible and its entities asynchronous. This flexibility allows the number of entities to be dynamic and to meet the needs of different specifications. The models analyzed in this research demonstrate the validity of the specification and its application to real-world systems.
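
    A flavour of the kind of property such a specification lets a prover check, sketched in Python rather than SAL (the entities and rights below are invented; this is not the thesis's specification): compute the closure of a rights graph under the Take-Grant de jure rules and ask whether a subject can ever acquire a given right.

        from itertools import product

        def closure(rights: set[tuple[str, str, str]]) -> set[tuple[str, str, str]]:
            """rights holds (subject, right, object) triples; 't'/'g' are take/grant."""
            derived = set(rights)
            changed = True
            while changed:
                changed = False
                new = set()
                for (x, r1, y), (a, r2, z) in product(derived, derived):
                    # take: if x holds 't' over y, x can take any right y holds
                    if r1 == "t" and a == y and (x, r2, z) not in derived:
                        new.add((x, r2, z))
                    # grant: if x holds 'g' over y, y receives any right x holds
                    if r1 == "g" and a == x and (y, r2, z) not in derived:
                        new.add((y, r2, z))
                if new:
                    derived |= new
                    changed = True
            return derived

        # Example safety question: can u acquire read access to f via intermediary v?
        initial = {("u", "t", "v"), ("v", "r", "f")}
        assert ("u", "r", "f") in closure(initial)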

    On the genesis of computer forensis

    This thesis presents a coherent set of research contributions to the new discipline of computer forensis. It analyses the emergence of computer forensis and defines the challenges facing the discipline, carries forward research advances in conventional methodology, introduces a novel approach to using virtual environments in forensis, and systemises the computer forensis body of knowledge, leading to the establishment of a tertiary curriculum. The emergence of computer forensis as a separate discipline of science was triggered by the evolution and growth of computer crime. Computer technology has reached a stage where a conventional, mechanistic approach to collecting and analysing data is insufficient: the existing methodology must be formalised, and must embrace technologies and methods that enable the inclusion of transient data and live systems analysis. Further work is crucial to incorporate advances in related disciplines such as computer security and information systems audit, as well as developments in operating systems, to make computer forensics issues inherent in their design. For example, it is proposed that some of the features offered by persistent systems could be built into conventional operating systems to make illicit activities easier to identify and analyse. The analysis of permanent data storage is fundamental to computer forensics practice. Very little is finalised, and a lot remains to be discovered, in conventional computer forensics methodology. This thesis contributes to the formalisation and improved integrity of forensic handling of data storage by: formalising methods for data collection and analysis in the NTFS (Microsoft file system) environment; presenting a safe methodology for handling data backups in order to avoid information loss where Alternate Data Streams (ADS) are present; and formalising methods of hiding and extracting hidden and encrypted data. A significant contribution of this thesis is in the field of applying virtualisation, the simulation of a computer in a virtual environment created by the underlying hardware and software, to computer forensics practice. Computer systems are not easily analysed for forensic purposes, and it is demonstrated that virtualisation applied in computer forensics allows for more efficient and accurate identification and analysis of the evidence. A new method is proposed where two environments used in parallel can bring faster and verifiable results that are not dependent on proprietary, closed-source tools, and may lead to a gradual shift from commercial Windows software to open source software (OSS). The final contribution of this thesis is systemising the body of knowledge in computer forensics, which is a necessary condition for it to become an established discipline of science. This systemisation led to the design and development of a tertiary curriculum in computer forensics, illustrated here with a case study of the computer forensics major for the Bachelor of Computer Science at the University of Western Sydney. All genesis starts as an idea. A natural part of the scientific research process is replacing previous assumptions, concepts, and practices with new ones which better approximate the truth. This thesis advances the computer forensis body of knowledge in the areas which are crucial to further development of this discipline. Please note that the appendices to this thesis consist of separately published items which cannot be made available due to copyright restrictions. These items are listed in the PDF attachment for reference purposes.
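
    On the ADS point specifically, a hedged sketch (Windows-only; the file path is a placeholder and error handling is minimal): the Win32 stream-enumeration API exposes the alternate data streams that a naive backup or copy can silently drop:

        import ctypes
        from ctypes import wintypes

        class WIN32_FIND_STREAM_DATA(ctypes.Structure):
            _fields_ = [("StreamSize", wintypes.LARGE_INTEGER),
                        ("cStreamName", ctypes.c_wchar * (260 + 36))]  # MAX_PATH + 36

        kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
        kernel32.FindFirstStreamW.restype = wintypes.HANDLE
        kernel32.FindFirstStreamW.argtypes = [wintypes.LPCWSTR, ctypes.c_int,
                                              ctypes.c_void_p, wintypes.DWORD]
        kernel32.FindNextStreamW.argtypes = [wintypes.HANDLE, ctypes.c_void_p]
        kernel32.FindClose.argtypes = [wintypes.HANDLE]
        INVALID_HANDLE_VALUE = ctypes.c_void_p(-1).value

        def list_streams(path):
            """Yield (stream name, size) for every stream of one file, ADS included."""
            data = WIN32_FIND_STREAM_DATA()
            handle = kernel32.FindFirstStreamW(path, 0, ctypes.byref(data), 0)
            if handle == INVALID_HANDLE_VALUE:
                raise ctypes.WinError(ctypes.get_last_error())
            try:
                while True:
                    yield data.cStreamName, data.StreamSize
                    if not kernel32.FindNextStreamW(handle, ctypes.byref(data)):
                        break
            finally:
                kernel32.FindClose(handle)

        # A file with only the default stream reports '::$DATA'; extra entries are ADS.
        for name, size in list_streams(r"C:\evidence\report.txt"):  # placeholder path
            print(name, size)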

    Analyzing storage media of digital camera

    Digital photography has become popular in recent years, and photographs have become common tools for people to record every tiny part of their daily life. By analyzing the storage media of a digital camera, crime investigators may extract a lot of useful information with which to reconstruct events. In this work, we discuss a few approaches to analyzing these kinds of storage media of digital cameras. A hypothetical crime case is used as a case study to demonstrate the concepts. © 2009 IEEE. The 2009 International Workshop on Forensics for Future Generation Communication Environments (F2GC-09), in conjunction with CSA 2009, Jeju Island, Korea, 10-12 December 2009. In Proceedings of CSA, 2009, p. 1-
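
    As one concrete illustration of this kind of analysis (the file names are placeholders, and this is not the paper's tool): EXIF timestamps pulled from recovered photographs with Pillow give investigators anchor points for an event timeline:

        from PIL import Image, ExifTags

        def exif_datetime(path):
            """Return the EXIF DateTime tag of an image, if present."""
            exif = Image.open(path).getexif()
            for tag_id, value in exif.items():
                if ExifTags.TAGS.get(tag_id) == "DateTime":
                    return value
            return None

        # Photographs assumed to have been recovered from the camera's storage media.
        for photo in ["IMG_0001.JPG", "IMG_0002.JPG"]:
            print(photo, exif_datetime(photo))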

    A Survey On Various Methods To Detect Forgery And Computer Crime In Transaction Database

    Abstract: Computer forensic methods can be used for detecting different types of forgery and computer crime. Forgery and computer crime are major concerns of the digital world, and many techniques and methods have been applied in search of a proper solution to these problems. Nowadays, digital forensics is an important topic for research. In this paper a general survey is carried out of the different methods used in computer forensics to track the evidence that can be useful for detecting computer crime and forgery. Forensic tools can be used for detecting any changes to, or tampering with, data. Different rule sets or methods are defined to detect the various errors regarding changes to and tampering with data in different Windows file systems. Digital evidence can also be used to detect forgery or computer crime.
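
    One example of the kind of rule such methods encode, sketched under assumptions (Windows semantics, where st_ctime is the creation time; the path is a placeholder, and a real tool would read NTFS metadata directly): a file modified "before" it was created is a classic sign of timestamp tampering:

        import os

        def flag_timestamp_anomalies(root):
            """Yield files whose last-modified time precedes their creation time."""
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    st = os.stat(path)
                    if st.st_mtime < st.st_ctime:  # st_ctime: creation time on Windows
                        yield path

        for suspect in flag_timestamp_anomalies(r"C:\cases\disk_copy"):  # placeholder
            print(suspect)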