5,310 research outputs found

    A Semantic Reasoning Method Towards Ontological Model for Automated Learning Analysis

    Semantic reasoning can help solve the problem of regulating the evolving and static measures of knowledge at theoretical and technological levels. The technique has been proven to enhance the capability of process models by making inferences, retaining and applying what they have learned, and discovering new processes. The work in this paper proposes a semantic rule-based approach directed towards discovering learners' interaction patterns within a learning knowledge base, which then responds by making decisions based on adaptive rules centred on captured user profiles. The method applies semantic rules and description logic queries to build an ontology model capable of automatically computing the various learning activities within a Learning Knowledge-Base and of checking the consistency of learning object/data types. The approach is grounded in inductive and deductive logic descriptions that allow the use of a Reasoner to check that all definitions within the learning model are consistent and to recognise which concepts fit within each defined class. Inductive reasoning is applied to discover sets of inferred learner categories, while deductive reasoning is used to prove and refine the discovered rules and logic expressions. Thus, this work applies effective reasoning methods to make inferences over a Learning Process Knowledge-Base, leading to the automated discovery of learning patterns and behaviour.
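The core idea of responding to captured user profiles via adaptive rules can be sketched as a small forward-checking rule set. This is only a minimal illustration of the rule-based pattern, far simpler than the paper's ontology and Reasoner; all profile fields, category names, and thresholds below are assumptions, not the paper's actual rules.

```python
# Minimal sketch: infer a learner category from a captured user profile
# by applying ordered if-then rules, analogous to (but much simpler
# than) semantic rules over an ontology. Fields and thresholds are
# illustrative assumptions.

def classify_learner(profile):
    """Apply adaptive rules to a user profile (a dict) in order."""
    rules = [
        # (condition over the profile, inferred learner category)
        (lambda p: p["forum_posts"] >= 10 and p["quiz_attempts"] >= 5, "active"),
        (lambda p: p["video_views"] > p["forum_posts"], "observer"),
        (lambda p: True, "passive"),  # default rule
    ]
    for condition, category in rules:
        if condition(profile):
            return category

profile = {"forum_posts": 12, "quiz_attempts": 7, "video_views": 3}
print(classify_learner(profile))  # active
```

In a real system these rules would live in the ontology (e.g. as SWRL rules) and be evaluated by a Reasoner rather than hard-coded.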

    A review of the enabling methodologies for knowledge discovery from smart grids data

    The large-scale deployment of pervasive sensors and decentralized computing in modern smart grids is expected to exponentially increase the volume of data exchanged by power system applications. In this context, the search for scalable and flexible methodologies that support rapid decisions in a data-rich but information-limited environment is a relevant issue to address. To this aim, this paper investigates the role of Knowledge Discovery from massive Datasets in smart grid computing, exploring its various application fields by considering the data available to power system stakeholders and their knowledge-extraction needs. The aim of this paper is twofold. In the first part, the authors summarize the most recent activities developed in this field by the Task Force on “Enabling Paradigms for High-Performance Computing in Wide Area Monitoring Protective and Control Systems” of the IEEE PSOPE Technologies and Innovation Subcommittee. In the second part, the authors propose a data-driven forecasting methodology modeled on the fundamental principles of the Knowledge Discovery Process data workflow. The methodology is then applied to the load forecasting problem for a complex user case, emphasizing the potential role of knowledge discovery in supporting post-processing analysis in data-rich environments as feedback for improving forecasting performance.
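The knowledge-discovery workflow described above (prepare data, build a model, evaluate so the error feeds back into improvement) can be sketched minimally. The "model" below is a seasonal-naive baseline (repeat the load observed 24 hours earlier), chosen only to keep the example self-contained; it is an assumption, not the paper's methodology.

```python
# Sketch of a Knowledge-Discovery-style load forecasting loop:
# select/clean data, fit a simple model, evaluate, and use the error
# as feedback. The seasonal-naive model is an illustrative stand-in.

def seasonal_naive_forecast(load, season=24):
    """Forecast each hour as the load observed one season (24 h) earlier."""
    return [load[t - season] for t in range(season, len(load))]

def mean_absolute_error(actual, predicted):
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Two days of synthetic hourly load: a repeating daily peak at 18:00
# plus a slow upward drift of 0.1 per hour.
load = [100 + 20 * (t % 24 == 18) + t * 0.1 for t in range(48)]

forecast = seasonal_naive_forecast(load, season=24)
mae = mean_absolute_error(load[24:], forecast)  # feedback for improvement
print(round(mae, 2))  # 2.4 -- exactly the daily drift the naive model misses
```

The feedback step is the point: the error (here, the drift the baseline cannot capture) tells the analyst what the next model iteration must add.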

    A Review of Meta-level Learning in the Context of Multi-component, Multi-level Evolving Prediction Systems.

    The exponential growth of the volume, variety and velocity of data is raising the need to investigate automated or semi-automated ways of extracting useful patterns from data. Finding the most appropriate mapping of learning methods to a given problem requires deep expert knowledge and extensive computational resources, and becomes a challenge in the presence of numerous configurations of learning algorithms applied to massive amounts of data. There is therefore a need for an intelligent recommendation engine that can advise on the best learning algorithm for a given dataset. The techniques commonly used by experts are based on a trial-and-error approach: evaluating and comparing a number of possible solutions against each other, drawing on prior experience in a specific domain, and so on. The trial-and-error approach combined with the expert's prior knowledge, though computationally expensive and time-consuming, has often been shown to work for stationary problems, where the processing is usually performed off-line. However, this approach is not normally feasible for non-stationary problems, where streams of data are continuously arriving. Furthermore, in a non-stationary environment, manually analysing the data and testing various methods every time the underlying data distribution changes would be very difficult or simply infeasible. In that scenario, within an on-line predictive system, there are several tasks where Meta-learning can effectively facilitate recommendations, including: 1) pre-processing steps, 2) learning algorithms or their combination, 3) adaptivity mechanisms and their parameters, 4) recurring concept extraction, and 5) concept drift detection. However, while conceptually very attractive and promising, Meta-learning raises several challenges, with the appropriate representation of the problem at a meta-level being one of the key ones.
The goal of this review and our research is therefore to investigate Meta-learning in general, and the associated challenges in the context of automating the building, deployment and adaptation of multi-level, multi-component predictive systems that evolve over time.
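The idea of an "intelligent recommendation engine" at the meta-level is often realised by describing datasets with meta-features and reusing what worked on similar past datasets. The toy sketch below uses a nearest-neighbour lookup in meta-feature space; the specific meta-features and algorithm names are illustrative assumptions, not a validated meta-learning system.

```python
# Toy meta-level recommender: each past dataset is described by a few
# meta-features and tagged with the algorithm that worked best on it;
# a new dataset gets the recommendation of its nearest neighbour in
# meta-feature space. All entries here are illustrative assumptions.
import math

# Meta-knowledge base: (meta-features, best algorithm found earlier).
# Meta-features: (log10 of sample count, feature count, class balance).
meta_base = [
    ((3.0, 10, 0.5), "random_forest"),
    ((5.0, 500, 0.5), "linear_svm"),
    ((2.0, 5, 0.1), "naive_bayes"),
]

def recommend(meta_features):
    """Return the algorithm of the closest past dataset (Euclidean)."""
    _, algorithm = min(meta_base,
                       key=lambda entry: math.dist(entry[0], meta_features))
    return algorithm

print(recommend((2.9, 12, 0.45)))  # random_forest
```

In a streaming, non-stationary setting this lookup would itself need to adapt, e.g. by recomputing meta-features per window and updating the meta-knowledge base as concepts drift, which is exactly where the representation challenge mentioned above bites.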

    The threat nets approach to information system security risk analysis

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems, in order to serve more patients with the available resources. Though the use of information systems in outpatient case management raises patient data security concerns, it was established that current approaches to information system risk analysis do not provide logical recipes for quantifying threat impact or determining the cost-effectiveness of risk mitigation controls. Quantifying the likelihood of a threat and determining its potential impact is key to deciding whether or not to adopt a given information system. Therefore, this thesis proposes the Threat Nets Approach, organized into four service recipes: threat likelihood assessment, threat impact evaluation, return on investment assessment, and coordination management. The threat likelihood assessment service offers recipes for determining the likelihood of a threat. The threat impact evaluation service offers techniques for computing the impact of a threat on the organization. The return on investment assessment service offers recipes for determining the cost-effectiveness of threat mitigation controls. To support the application of the approach, a ThreNet tool was developed. The approach was evaluated by experts to ascertain its usability and usefulness. The evaluation shows that the approach provides complete, usable and useful recipes for assessing threat likelihood, threat impact and the cost-effectiveness of threat mitigation controls. The results suggest that the Threat Nets Approach is effective in quantifying risks to information systems.
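The three assessment services map onto classical quantitative-risk arithmetic: likelihood times impact gives expected loss, and comparing the expected-loss reduction against the control's cost gives cost-effectiveness. The sketch below uses the standard Annualized Loss Expectancy (ALE) and Return on Security Investment (ROSI) formulas as a stand-in; the formulas and figures are illustrative assumptions, not the thesis's Threat Nets recipes.

```python
# Classical quantitative risk arithmetic as a stand-in for the three
# assessment services: likelihood (ALE inputs), impact (single-event
# loss), and cost-effectiveness (ROSI). Numbers are illustrative.

def ale(annual_rate, single_loss):
    """Annualized Loss Expectancy = threat frequency x loss per event."""
    return annual_rate * single_loss

def rosi(ale_before, ale_after, control_cost):
    """Return on Security Investment of a mitigation control."""
    return (ale_before - ale_after - control_cost) / control_cost

# Example: a breach threat expected twice a year at $50,000 per event;
# a $30,000/year control cuts the expected frequency to 0.5/year.
before = ale(2.0, 50_000)   # 100,000
after = ale(0.5, 50_000)    # 25,000
print(rosi(before, after, 30_000))  # 1.5
```

A positive ROSI (here 1.5, i.e. $1.50 saved per dollar spent) is the usual decision criterion for adopting a control; the thesis's contribution is providing structured recipes for estimating the inputs that this arithmetic takes for granted.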
