    Make Research Data Public? -- Not Always so Simple: A Dialogue for Statisticians and Science Editors

    Putting data into the public domain is not the same thing as making those data accessible for intelligent analysis. A distinguished group of editors and experts who were already engaged in one way or another with the issues inherent in making research data public came together with statisticians to initiate a dialogue about the policies and practicalities of requiring published research to be accompanied by publication of the research data. The dialogue moved beyond the broad issues of advisability, intellectual integrity, and scientific exigency to the relevance of these issues to statistics as a discipline, and the relevance of statistics, from inference to modeling to data exploration, to science and social science policies on these issues. Comment: Published at http://dx.doi.org/10.1214/10-STS320 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Database forensic investigation process models: a review

    Database Forensic Investigation (DBFI) involves the identification, collection, preservation, reconstruction, analysis, and reporting of database incidents. However, it is a heterogeneous, complex, and ambiguous field due to the variety and multidimensional nature of database systems. A small number of DBFI process models have been proposed to solve specific database scenarios using different investigation processes, concepts, activities, and tasks, as surveyed in this paper. Specifically, we reviewed 40 proposed DBFI process models for RDBMSs in the literature to offer up-to-date and comprehensive background knowledge on existing DBFI process model research, the associated challenges, issues for newcomers, and potential solutions for addressing such issues. This paper highlights three common limitations of the DBFI domain: 1) redundant and irrelevant investigation processes; 2) redundant and irrelevant investigation concepts and terminologies; and 3) a lack of unified models to manage, share, and reuse DBFI knowledge. The paper also suggests three solutions for these limitations: 1) proposing a generic process model for the DBFI field; 2) developing a semantic metamodeling language to structure, manage, organize, share, and reuse DBFI knowledge; and 3) developing a repository to store and retrieve DBFI field knowledge.

    A generic database forensic investigation process model

    Database forensic investigation is a domain that deals with database contents and their metadata to reveal malicious activities on database systems. Although the domain is still new, its overwhelming challenges and issues have made database forensics a fast-growing and much-sought-after research area. Based on our observations, we found that database forensics suffers from the lack of a common standard that could unify knowledge of the domain. Therefore, in this paper we present the use of Design Science Research (DSR) as a research methodology to develop a Generic Database Forensic Investigation Process Model (DBFIPM). From the creation of the DBFIPM, five common forensic investigation processes are proposed, namely: i) identification, ii) collection, iii) preservation, iv) analysis, and v) presentation. The DBFIPM allows the reconciliation of concepts and terminologies across all common database forensic investigation processes, which will potentially facilitate the sharing of knowledge on database forensic investigation among domain stakeholders.
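    The five processes form an ordered pipeline. As a minimal illustrative sketch (our own, not the paper's formal DSR artifact), the ordering of the DBFIPM phases can be expressed in Python, with each phase permitted to run only after its predecessors; all names here are hypothetical:

    from dataclasses import dataclass, field
    from enum import Enum, auto

    # The five common processes proposed by the DBFIPM, in order.
    class Phase(Enum):
        IDENTIFICATION = auto()
        COLLECTION = auto()
        PRESERVATION = auto()
        ANALYSIS = auto()
        PRESENTATION = auto()

    @dataclass
    class Investigation:
        case_id: str
        completed: list[Phase] = field(default_factory=list)

        def run_phase(self, phase: Phase) -> None:
            # Enforce the model's ordering: a phase may run only once all
            # earlier phases have completed.
            expected = list(Phase)[len(self.completed)]
            if phase is not expected:
                raise ValueError(f"{phase.name} out of order; expected {expected.name}")
            self.completed.append(phase)

    inv = Investigation("case-001")
    for p in Phase:
        inv.run_phase(p)   # runs identification through presentation in order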

    A Method of Mining Key Accounts from Internet Pyramid Selling Data

    Internet pyramid selling causes great harm, and evidence against it is difficult to obtain. Because internet pyramid-selling schemes often use virtual coins to complete cash settlement, criminals can exchange virtual coins through controlled virtual accounts to transfer funds illegally. The purpose of this work is to mine the key accounts and build an evidence chain of personnel and funds. To deal with the large volume of transaction data, we characterize the virtual currency trading data in terms of seven transaction features. The key accounts are the outliers whose trading behaviour is unusually prominent, so we use positive samples to train a One-Class Support Vector Machine (OCSVM) classifier to learn the boundary of normal trading behaviour and detect outliers. The correlation between members' information and the pyramid-selling hierarchy can then be obtained by further linking the trading behaviour of the key accounts. Finally, we can screen out the virtual accounts directly controlled by the pyramid-selling organization, analyse the flow of funds, and investigate and obtain evidence on the amount of money involved. The experimental results show that the classification model can not only establish a normal model of account transfer behaviour but also effectively identify abnormal transfer behaviour, thus improving the efficiency of investigation and evidence collection for economic investigation departments.
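    As a rough sketch of the outlier-detection step, the following trains scikit-learn's OneClassSVM on positive (normal) samples only and flags accounts whose behaviour falls outside the learned boundary. The seven feature names, the nu value, and the synthetic data are illustrative assumptions; the paper does not publish its exact feature set or parameters:

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import OneClassSVM

    # Hypothetical stand-ins for the paper's seven transaction features.
    FEATURES = ["tx_count", "mean_amount", "amount_std", "in_out_ratio",
                "counterparty_count", "burst_rate", "night_trade_ratio"]

    def fit_detector(normal_tx: np.ndarray):
        """Train on normal (positive) transaction samples only."""
        scaler = StandardScaler().fit(normal_tx)
        ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
        ocsvm.fit(scaler.transform(normal_tx))
        return scaler, ocsvm

    def find_key_accounts(scaler, ocsvm, accounts: np.ndarray) -> np.ndarray:
        """Return row indices of accounts flagged as outliers (label -1)."""
        return np.where(ocsvm.predict(scaler.transform(accounts)) == -1)[0]

    # Synthetic demonstration: five accounts with prominent behaviour stand out.
    rng = np.random.default_rng(0)
    normal = rng.normal(0.0, 1.0, size=(500, len(FEATURES)))
    mixed = np.vstack([normal[:50], rng.normal(5.0, 1.0, size=(5, len(FEATURES)))])
    scaler, model = fit_detector(normal)
    print(find_key_accounts(scaler, model, mixed))   # expected: rows 50-54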

    CDBFIP: Common Database Forensic Investigation Processes for Internet of Things

    Database forensics is a domain that uses database content and metadata to reveal malicious activities on database systems in an Internet of Things environment. Although the concept of database forensics has been around for a while, the investigation of cybercrime activities and cyber breaches in an Internet of Things environment would benefit from the development of a common investigative standard that unifies the knowledge in the domain. Therefore, this paper proposes common database forensic investigation processes using a design science research approach. The proposed process comprises four phases, namely: 1) identification; 2) artefact collection; 3) artefact analysis; and 4) documentation and presentation. It allows the reconciliation of the concepts and terminologies of all common database forensic investigation processes and hence facilitates the sharing of knowledge on database forensic investigation among domain newcomers, users, and practitioners.

    Anonymity, hacking and cloud computing forensic challenges

    Cloud computing continues to grow and becomes more complex with the daily addition of new technologies. Huge amounts of data transit through cloud networks, and in the case of a cyber-attack it can be difficult to analyze every single aspect of the cloud. Legal challenges also exist because of where cloud servers are physically located. This paper aims to alleviate the challenges of cloud computing forensics and to sensitize businesses and governments to several solutions. The results are relevant to cyber forensic analysts as well as network administrators and can be used during the preliminary stages of creating a cloud computing environment. A complete test was created using ethical hacking tools and cyber forensics to understand the steps of an investigation of a single service that could be implemented in a cloud. The paper goes on to present frameworks developed to maintain integrity and repeatability. In the end, it is the legal aspects and the shortcomings in technical structure implementation that represent the main challenges of cloud computing forensics.

    Significance of Semantic Reconciliation in Digital Forensics

    Digital forensics (DF) is a growing field that is gaining popularity among many computer professionals, law enforcement agencies and other stakeholders, who must always cooperate in this profession. Unfortunately, this has created an environment replete with semantic disparities within the domain that need to be resolved and/or eliminated. For the purpose of this study, semantic disparity refers to disagreements about the meaning, interpretation, descriptions and intended use of the same or related data and terminologies. If semantic disparity is not detected and resolved, it may lead to misunderstandings. Even worse, since the people involved may not come from the same background, they may not be aware that semantic disparities exist and might not easily recognize them. The aim of this paper, therefore, is to discuss semantic disparity in DF and to elaborate on how to manage it. In addition, the paper presents the significance of semantic reconciliation in DF. Semantic reconciliation refers to reconciling the meaning (including the interpretations and descriptions) of terminologies and data used in digital forensics. Managing semantic disparities and the significance of semantic reconciliation in digital forensics constitute the main contributions of this paper.
    Keywords: digital forensics, semantic disparity, managing semantic disparity, semantic reconciliation, significance of semantic reconciliation
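    To make "reconciling terminologies" concrete, here is a minimal illustrative sketch (our own, not from the paper) that maps the divergent terms different DF stakeholders use onto one canonical concept and surfaces unmapped terms as detected disparities; all terms shown are hypothetical examples:

    # Divergent stakeholder terms mapped to a canonical DF concept.
    CANONICAL = {
        "disk image": "forensic_duplicate",
        "bit-stream copy": "forensic_duplicate",
        "forensic duplicate": "forensic_duplicate",
        "acquisition": "collection",
        "seizure": "collection",
    }

    def reconcile(term: str) -> str:
        """Return the canonical concept, or flag an unresolved disparity."""
        return CANONICAL.get(term.lower(), f"UNRECONCILED({term})")

    for t in ["Bit-stream copy", "Seizure", "carving"]:
        print(t, "->", reconcile(t))   # the last term surfaces a disparity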

    Preprocessing reference sensor pattern noise via spectrum equalization

    Although sensor pattern noise (SPN) has been proven to be an effective means of uniquely identifying digital cameras, some non-unique artifacts, shared amongst cameras that undergo the same or similar in-camera processing procedures, often give rise to false identifications. It is therefore desirable to suppress these unwanted artifacts so as to improve identification accuracy and reliability. In this work, we propose a novel preprocessing approach for attenuating the influence of the non-unique artifacts on the reference SPN to reduce the false identification rate. Specifically, we equalize the magnitude spectrum of the reference SPN by detecting and suppressing peaks according to the local characteristics, aiming to remove the interfering periodic artifacts. Combined with six SPN extraction or enhancement methods, our proposed Spectrum Equalization Algorithm (SEA) is evaluated on the Dresden image database as well as our own database and compared with state-of-the-art preprocessing schemes. Experimental results indicate that the proposed procedure outperforms, or at least performs comparably to, existing methods in terms of the overall ROC curve and the kappa statistic computed from a confusion matrix, and tends to be more resistant to JPEG compression for medium and small image blocks.
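    A minimal sketch in the spirit of the described equalization, assuming median-based peak detection (the paper's exact detection rule, window size, and threshold are not reproduced here): peaks in the magnitude spectrum of the reference SPN, which correspond to periodic non-unique artifacts, are clipped down to the local median before the SPN is reconstructed with its original phase:

    import numpy as np
    from scipy.ndimage import median_filter

    def equalize_spectrum(ref_spn: np.ndarray, win: int = 15,
                          thresh: float = 2.0) -> np.ndarray:
        """Attenuate periodic artifacts by flattening magnitude-spectrum peaks."""
        spectrum = np.fft.fft2(ref_spn)
        mag, phase = np.abs(spectrum), np.angle(spectrum)
        # Local characteristic: median magnitude over a win x win neighbourhood.
        local_med = median_filter(mag, size=win)
        # Entries far above their neighbourhood are treated as artifact peaks.
        peaks = mag > thresh * local_med
        mag[peaks] = local_med[peaks]
        return np.real(np.fft.ifft2(mag * np.exp(1j * phase)))

    # Demonstration: inject a horizontal periodic artifact and attenuate it.
    rng = np.random.default_rng(1)
    spn = rng.normal(size=(256, 256))
    spn += 0.5 * np.sin(2 * np.pi * np.arange(256) / 8)[None, :]
    cleaned = equalize_spectrum(spn)
    print("std before/after:", float(spn.std()), float(cleaned.std()))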