991 research outputs found

    Introduction to IntelliSIM 1.0

    IntelliSIM is a prototype of a new generation of knowledge-based simulation tools developed by the Systems Simulation Laboratory at Arizona State University. It is a computer environment that allows modelers without simulation training to predict the performance of a manufacturing system for which the necessary data are available. The system provides predictive data on items such as throughput time, queue levels, equipment utilization, and reactions to machine failures. With IntelliSIM, the benefits of discrete-event simulation can be exploited without the high level of expertise normally required to conduct a sound simulation study, an approach that offers substantial savings over currently available simulation tools. This document is Version 1 (1992) of the user manual for the IntelliSIM software.
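
    The abstract describes discrete-event prediction of throughput time, queue levels, and equipment utilization. As a rough illustration of what such a prediction involves (not IntelliSIM's actual implementation), the sketch below simulates a single machine with a FIFO queue; all parameter values are assumptions for the example.

```python
import random

def simulate_single_machine(n_jobs=10_000, mean_interarrival=6.0, mean_service=5.0, seed=42):
    """Minimal single-machine queue simulation (Lindley recursion).

    Reports two of the figures a tool like IntelliSIM predicts:
    average throughput (flow) time and machine utilization.
    All parameter values here are illustrative assumptions.
    """
    random.seed(seed)
    arrival = 0.0        # arrival time of the current job
    machine_free = 0.0   # time at which the machine next becomes idle
    total_flow = 0.0     # sum of (finish - arrival) over all jobs
    total_service = 0.0  # total time the machine spends working

    for _ in range(n_jobs):
        arrival += random.expovariate(1.0 / mean_interarrival)
        service = random.expovariate(1.0 / mean_service)
        start = max(arrival, machine_free)  # wait if the machine is busy
        finish = start + service
        total_flow += finish - arrival
        total_service += service
        machine_free = finish

    return {
        "avg_throughput_time": total_flow / n_jobs,
        "utilization": total_service / machine_free,
    }

print(simulate_single_machine())
```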

    Classification Techniques In Blood Donors Sector – A Survey

    This paper focuses on classification and the recent trends associated with it. It presents a survey of classification systems and clarifies how classification and data mining relate to each other. Classification arranges the blood donor dataset into predefined groups and helps predict group membership for new data instances. This makes it easier to find target donors, since blood stocks must be replaced once they expire after a certain period and are needed for emergency demands such as surgery and blood transfusion. The paper also seeks to identify gaps in classification research where further work can be carried out.
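
    The survey covers classification in general rather than one fixed algorithm. As a minimal, hypothetical illustration of the idea it discusses (predicting group membership for donor records), the sketch below trains a decision tree on made-up donor attributes with scikit-learn; the features, labels, and classifier choice are assumptions for the example only.

```python
# A minimal sketch with made-up donor records; the attribute names,
# labels, and choice of a decision tree are illustrative assumptions,
# not a method proposed by the survey.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Toy records: [age, months_since_last_donation, total_past_donations]
X = [
    [25, 2, 8], [40, 18, 1], [31, 3, 12], [52, 30, 2],
    [23, 1, 5], [45, 24, 3], [36, 4, 9],  [60, 36, 1],
]
# 1 = likely to donate again when contacted, 0 = unlikely (illustrative labels)
y = [1, 0, 1, 0, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```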

    Semantic analysis in the automation of ER modelling through natural language processing

    Architectural Layer Recovery for Software System Understanding and Evolution

    This paper presents an approach to identify software layers for the understanding and evolution of software systems implemented in any object-oriented programming language. The approach first identifies relations between the classes of a software system and then uses a link analysis algorithm (the Kleinberg algorithm) to group them into layers. To assess the approach and the underlying techniques, the paper also presents a prototype of a supporting tool and the results of a case study.
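
    The link analysis step names Kleinberg's algorithm (HITS). The sketch below runs a plain HITS power iteration on a small hypothetical class-dependency graph; how the paper's approach builds the graph and converts the resulting scores into layers is not reproduced here.

```python
# Plain HITS (Kleinberg) power iteration on a hypothetical class-dependency
# graph. An edge A -> B means "class A uses class B": high hub scores mark
# classes that depend on many others, high authority scores mark classes
# that many others depend on. The paper's own graph construction and
# layering rules are not reproduced here.
edges = {
    "GUI":        ["Controller", "Logger"],
    "Controller": ["Model", "Repository", "Logger"],
    "Repository": ["Model", "Logger"],
    "Model":      ["Logger"],
    "Logger":     [],
}

nodes = list(edges)
hub = {n: 1.0 for n in nodes}
auth = {n: 1.0 for n in nodes}

for _ in range(50):  # iterate until the scores stabilize
    auth = {n: sum(hub[m] for m in nodes if n in edges[m]) for n in nodes}
    hub = {n: sum(auth[t] for t in edges[n]) for n in nodes}
    for scores in (auth, hub):
        norm = sum(v * v for v in scores.values()) ** 0.5 or 1.0
        for k in scores:
            scores[k] /= norm

for n in nodes:
    print(f"{n:<11} hub={hub[n]:.2f}  authority={auth[n]:.2f}")
```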

    Heuristic-based entity-relationship modelling through natural language processing

    X-IM Framework to Overcome Semantic Heterogeneity Across XBRL Filings

    Semantic heterogeneity in XBRL precludes the full automation of the business reporting pipeline, a key motivation for the SEC’s XBRL mandate. To mitigate this problem, several approaches leveraging Semantic Web technologies have emerged. While some approaches are promising, their mapping accuracy in resolving semantic heterogeneity must be improved to realize the promised benefits of XBRL. Considering this limitation and following the design science research methodology (DSRM), we develop a novel framework, XBRL indexing-based mapping (X-IM), which takes advantage of the representational model of representation theory to map heterogeneous XBRL elements across diverse XBRL filings. The application of representation theory to the design process informs the use of XBRL label linkbases as a repository of regularities constitutive of the relationships between financial item names and the concepts they describe, along a set of equivalent financial terms of interest to investors. The instantiated design artifact is thoroughly evaluated using standard information retrieval metrics. Our experiments show that X-IM significantly outperforms existing methods.
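
    X-IM is only summarized above, so the sketch below does not reproduce its design; it merely illustrates the general idea of using human-readable labels to bridge filer-specific element names, matching hypothetical filing labels to a small set of standard concepts by token overlap. All names and the similarity measure are assumptions for the example.

```python
# A minimal sketch of label-based matching, not the X-IM design: map
# filer-specific XBRL element labels onto standard concepts by token
# overlap. Concept names, labels, and the Jaccard similarity measure
# are illustrative assumptions only.
def tokens(label):
    return set(label.lower().replace(",", " ").split())

# Hypothetical standard concepts an investor queries for.
standard_concepts = {
    "Revenues": "Revenues",
    "NetIncomeLoss": "Net income loss",
    "CostOfRevenue": "Cost of revenue",
}

# Hypothetical labels drawn from two filers' label linkbases.
filer_labels = [
    "Total revenues",
    "Net income attributable to shareholders",
    "Cost of goods sold and services rendered",
]

def best_match(label):
    lt = tokens(label)
    def jaccard(concept):
        st = tokens(standard_concepts[concept])
        return len(lt & st) / len(lt | st)
    return max(standard_concepts, key=jaccard)

for label in filer_labels:
    print(f"{label!r:45} -> {best_match(label)}")
```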

    Leveraging Data Mining and Data Warehouse to Improve Prison Services and Operations in Nigeria

    Crimes are a social nuisance and cost our society dearly in several ways. In Nigeria, any research geared towards helping to solve crimes faster will benefit society at large. The major challenge facing law-enforcement and intelligence-gathering organizations in Nigeria is how to accurately and efficiently analyze the growing volume of crime data. As this volume becomes enormously large, new techniques have to be used to turn the data into valuable information and actionable knowledge so that appropriate actions can be taken. Often, the data that need to be analyzed to produce a report are scattered across different operational states and jurisdictions of Nigeria and must first be carefully integrated. Moreover, extracting the existing data from each operational system demands so much of the system's resources that IT professionals must wait until non-operational hours before running the targeted queries required for operational reports. These delays are not only time-consuming and frustrating for both the IT professionals and the decision-makers; they are also dangerous for a sector whose primary task is to control the spread of crime. When such operational reports are finally produced, they may not be reliable, because the data used in producing them are often inconsistent, inaccurate, or obsolete. This paper therefore highlights the growing need for data integration, data warehousing, and data mining as ways to improve the operations of the principal actors within the prison sector of Nigeria. The paper explains what these data management techniques mean and entail, and suggests ways to leverage them effectively to help detect existing crime patterns and speed up the process of solving crimes.

    Keywords: crime data, data mining, data mining techniques, data warehouse, data integration
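
    The paper treats data integration, warehousing, and mining conceptually rather than prescribing tools. Purely as an illustration of the integration step it argues for, the sketch below consolidates crime records from hypothetical per-state sources into one table and answers an analytical query from it; the schema, data, and use of SQLite are assumptions for the example.

```python
# A minimal sketch of the integration idea only: consolidate crime records
# from hypothetical per-state sources into one warehouse-style table and
# answer an analytical query from it. Schema and data are illustrative.
import sqlite3

state_sources = {
    "Lagos":  [("2021-03-02", "robbery"), ("2021-03-05", "fraud")],
    "Kano":   [("2021-03-03", "robbery"), ("2021-03-09", "assault")],
    "Rivers": [("2021-03-04", "fraud"),   ("2021-03-06", "fraud")],
}

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE crime_fact (state TEXT, reported_on TEXT, offence TEXT)")

# "ETL" step: extract from each scattered source, load into one fact table.
for state, records in state_sources.items():
    con.executemany(
        "INSERT INTO crime_fact VALUES (?, ?, ?)",
        [(state, day, offence) for day, offence in records],
    )

# Analytical query of the kind a decision-maker would run against the warehouse.
for offence, count in con.execute(
    "SELECT offence, COUNT(*) FROM crime_fact GROUP BY offence ORDER BY COUNT(*) DESC"
):
    print(offence, count)
```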