
    MEBN-RM: A Mapping between Multi-Entity Bayesian Network and Relational Model

    Multi-Entity Bayesian Network (MEBN) is a knowledge representation formalism combining Bayesian Networks (BN) with First-Order Logic (FOL). MEBN has sufficient expressive power for general-purpose knowledge representation and reasoning. Developing a MEBN model to support a given application is a challenge, requiring definition of entities, relationships, random variables, conditional dependence relationships, and probability distributions. When available, data can be invaluable both for improving performance and for streamlining development. By far the most common format for available data is the relational database (RDB). Relational databases describe and organize data according to the Relational Model (RM). Developing a MEBN model from data stored in an RDB therefore requires mapping between the two formalisms. This paper presents MEBN-RM, a set of mapping rules between key elements of MEBN and RM. We identify links between the two languages and define four levels of mapping from elements of RM to elements of MEBN. These definitions are implemented in the MEBN-RM algorithm, which converts a relational schema in RM to a partial MEBN model. The algorithm has been released as an open-source MEBN-RM software tool. The method is illustrated through two example use cases in which MEBN-RM is used to develop MEBN models: a Critical Infrastructure Defense System and a Smart Manufacturing System.
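    The paper's actual rule set is not reproduced in the abstract, but the flavor of a schema-to-MEBN mapping can be sketched as follows. The schema layout, the `map_schema_to_mebn` helper, and the column-kind labels are illustrative assumptions, not the real MEBN-RM rules.

```python
# Hypothetical sketch of the kind of mapping MEBN-RM performs:
# entity tables become MEBN entity types, foreign-key columns become
# relationship random variables, and attribute columns become resident
# random variables. All names here are invented for illustration.

def map_schema_to_mebn(schema):
    """Map a simplified relational schema to partial MEBN elements."""
    mebn = {"entities": [], "relations": [], "resident_rvs": []}
    for table, cols in schema.items():
        mebn["entities"].append(table)                 # entity table -> entity type
        for col, kind in cols.items():
            if kind == "fk":                           # foreign key -> relationship RV
                mebn["relations"].append(f"{table}.{col}")
            elif kind != "pk":                         # attribute -> resident RV
                mebn["resident_rvs"].append(f"{col}({table})")
    return mebn

schema = {
    "Sensor": {"sensor_id": "pk", "status": "attr"},
    "Reading": {"reading_id": "pk", "sensor_id": "fk", "value": "attr"},
}
partial_model = map_schema_to_mebn(schema)
print(partial_model["entities"])      # ['Sensor', 'Reading']
print(partial_model["resident_rvs"])  # ['status(Sensor)', 'value(Reading)']
```

    The resulting partial model would still need conditional distributions and dependence structure filled in, which is why the paper describes the output as a partial MEBN model.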

    Overview of contextual tracking approaches in information fusion

    Proceedings of: Geospatial InfoFusion III, 2-3 May 2013, Baltimore, Maryland, United States. Many information fusion solutions work well in the intended scenarios, but the applications, supporting data, and capabilities change over varying contexts. One example is weather data for electro-optical target trackers, for which standards have evolved over decades. Operating conditions such as technology changes, sensor/target variations, and the contextual environment can inhibit performance if not included in the initial systems design. In this paper, we seek to define and categorize different types of contextual information. We describe five contextual information categories that support target tracking: (1) domain knowledge from a user to aid the information fusion process through selection, cueing, and analysis; (2) environment-to-hardware processing for sensor management; (3) known distribution of entities for situation/threat assessment; (4) historical traffic behavior for situation awareness patterns of life (POL); and (5) road information for target tracking and identification. Appropriate characterization and representation of contextual information is needed for future high-level information fusion systems design to take advantage of the large data content available for a priori knowledge in target tracking algorithm construction, implementation, and application.

    Context Relevant Prediction Model for COPD Domain Using Bayesian Belief Network

    In the last three decades, researchers have examined extensively how context-aware systems can assist people, specifically those suffering from incurable diseases, in coping with their medical conditions. Over the years, a large number of studies on Chronic Obstructive Pulmonary Disease (COPD) have been published. However, deriving relevant attributes and detecting COPD exacerbations early remain a challenge. In this work, we use an efficient algorithm to select relevant attributes, a task for which no established approach exists in this domain. The algorithm predicts exacerbations with high accuracy by adding a discretization step, and orders the pertinent attributes by their impact to facilitate emergency medical treatment. In this paper, we propose an extension of our existing Helper Context-Aware Engine System (HCES) for COPD. This work uses a Bayesian network to model the dependencies between COPD symptoms (attributes), overcoming the insufficiency of the independence hypothesis of naïve Bayes. The dependency structure of the Bayesian network is learned using the TAN algorithm rather than by consulting pneumologists. These combined algorithms (discretization, selection, dependency learning, and ordering of the relevant attributes) constitute an effective prediction model compared to existing ones. Moreover, different scenarios of these algorithms are investigated and compared to verify which sequence of steps gives the most accurate results. Finally, we designed and validated a computer-aided support application integrating the different steps of this model. The findings of our HCES system have shown promising results, with an Area Under the Receiver Operating Characteristic curve (AUC) of 81.5%.
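    As a loose illustration of the tree-augmented (TAN) idea of learning dependencies rather than assuming independence, one can rank attribute pairs by mutual information and keep a maximum spanning tree as the augmenting structure. The toy symptom data and the use of unconditional rather than class-conditional mutual information are simplifying assumptions; this is a sketch, not the paper's implementation.

```python
# Sketch: greedily keep the highest-mutual-information attribute pairs
# that do not form a cycle, yielding a spanning tree of dependencies.
from collections import Counter
from math import log

def mutual_information(xs, ys):
    """Empirical mutual information between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def tan_tree(columns):
    """Maximum spanning tree over attributes (Kruskal-style union-find)."""
    names = list(columns)
    edges = sorted(((mutual_information(columns[a], columns[b]), a, b)
                    for i, a in enumerate(names) for b in names[i + 1:]),
                   reverse=True)
    parent = {v: v for v in names}
    def find(v):
        while parent[v] != v:
            v = parent[v]
        return v
    tree = []
    for _, a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:               # no cycle -> keep this dependency edge
            parent[ra] = rb
            tree.append((a, b))
    return tree

data = {  # hypothetical discretized COPD symptom indicators
    "cough":   [1, 1, 0, 1, 0, 0, 1, 1],
    "dyspnea": [1, 1, 0, 1, 0, 0, 1, 0],
    "fatigue": [0, 1, 0, 1, 1, 0, 1, 1],
}
print(tan_tree(data))  # two edges connecting the three attributes
```

    In the full TAN algorithm the mutual information is conditioned on the class variable and each attribute additionally keeps the class as a parent; the sketch only shows the spanning-tree step.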

    SoK: Contemporary Issues and Challenges to Enable Cyber Situational Awareness for Network Security

    Cyber situational awareness is an essential part of cyber defense that allows cybersecurity operators to cope with the complexity of today's networks and threat landscape. Perceiving and comprehending the situation allows the operator to project upcoming events and make strategic decisions. In this paper, we recapitulate the fundamentals of cyber situational awareness and highlight its unique characteristics in comparison to the generic situational awareness known from other fields. Subsequently, we provide an overview of existing research and publication trends on the topic, introduce leading research groups, and highlight the impact of cyber situational awareness research. Further, we propose an updated taxonomy and enumeration of the components used for achieving cyber situational awareness. The updated taxonomy conforms to the widely-accepted three-level definition of cyber situational awareness and newly includes the projection level. Finally, we identify and discuss contemporary research and operational challenges, such as the need to cope with the rising volume, velocity, and variety of cybersecurity data, and the need to provide cybersecurity operators with the right data at the right time and to increase its value through visualization.

    Intelligent Data Analytics using Deep Learning for Data Science

    Nowadays, data science stimulates the interest of academics and practitioners because it can assist in the extraction of significant insights from massive amounts of data. From 2018 through 2025, the Global Datasphere is expected to grow from 33 to 175 Zettabytes, according to the International Data Corporation. This dissertation proposes an intelligent data analytics framework that uses deep learning to tackle several difficulties that arise when implementing a data science application. These difficulties include dealing with high inter-class similarity, the availability and quality of hand-labeled data, and designing a feasible approach for modeling significant correlations in features gathered from various data sources. The proposed intelligent data analytics framework employs a novel strategy for improving data representation learning by incorporating supplemental data from various sources and structures. First, the research presents a multi-source fusion approach that utilizes confident learning techniques to improve data quality from many noisy sources. Meta-learning methods based on advanced techniques such as mixture of experts and differential evolution combine the predictive capacity of individual learners with a gating mechanism, ensuring that only the most trustworthy features or predictions are integrated to train the model. Then, a Multi-Level Convolutional Fusion is presented to train a model on the correspondence between local-global deep feature interactions to identify easily confused samples of different classes. The convolutional fusion is further enhanced with the power of Graph Transformers, aggregating the relevant neighboring features in graph-based input data structures and achieving state-of-the-art performance on a large-scale building damage dataset.
Finally, weakly-supervised strategies, noise regularization, and label propagation are proposed to train a model on sparsely labeled input data, ensuring the model's robustness to errors and supporting automatic expansion of the training set. The suggested approaches outperformed competing strategies in effectively training a model on a large-scale dataset of 500k photos, with only about 7% of the images annotated by a human. The proposed framework's capabilities have benefited various data science applications, including fluid dynamics, geometric morphometrics, building damage classification from satellite pictures, disaster scene description, and storm-surge visualization.
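    The label-propagation idea used to expand a sparse training set can be sketched minimally: seed labels spread to unlabeled neighbors over a similarity graph until no assignment changes. The toy graph, seed label, and majority-vote rule below are illustrative assumptions, not the dissertation's actual pipeline.

```python
# Minimal label-propagation sketch: unlabeled nodes adopt the majority
# label among their already-labeled neighbors, iterating to a fixed point.

def propagate(edges, labels, iters=10):
    """edges: adjacency dict node -> neighbor list; labels: partial node -> label map."""
    labels = dict(labels)
    for _ in range(iters):
        updated = {}
        for node, nbrs in edges.items():
            if node in labels:
                continue                       # seed/settled labels stay fixed
            votes = [labels[n] for n in nbrs if n in labels]
            if votes:
                updated[node] = max(set(votes), key=votes.count)
        if not updated:
            break                              # fixed point reached
        labels.update(updated)
    return labels

graph = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
seed = {"a": "damaged"}                        # one human-annotated node
print(propagate(graph, seed))
# the single seed label spreads along the chain to all four nodes
```

    Production variants weight votes by edge similarity and add noise regularization so that a single mislabeled seed cannot dominate; the sketch shows only the propagation mechanism itself.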

    An early-stage decision-support framework for the implementation of intelligent automation

    The constant pressure on manufacturing companies to improve productivity, reduce lead times, and improve quality requires new technological development and adoption. The rapid development of smart technology and robotics and autonomous systems (RAS) technology has a profound impact on manufacturing automation and might determine the winners and losers of the next generation of manufacturing competition. Simultaneously, recent smart technology developments in these areas enable an automation response to new production paradigms such as mass customisation and product-lifecycle considerations in the context of Industry 4.0. New paradigms, such as mass customisation, increase both the complexity of the tasks and the risk associated with smart technology integration. From a manufacturing automation perspective, intelligent automation has been identified as a possible response to these arising demands. The presented research aims to support the industrial uptake of intelligent automation in manufacturing businesses by quantifying risks at the early design stage and during business case development. An early-stage decision-support framework for the implementation of intelligent automation in manufacturing businesses is presented in this thesis. The framework is informed by an extensive literature review, updated and verified with surveys and workshops to extend the knowledge base in light of the rapid development of the associated technologies. A paradigm shift from a cost- to a risk-modelling perspective is proposed to provide a more flexible and generic approach applicable across the current technology landscape.
    The proposed probabilistic decision-support framework consists of three parts:
    • A clustering algorithm to identify the manufacturing functions in manual processes from task analysis, mitigating early-stage design uncertainties.
    • A Bayesian Belief Network (BBN) informed by expert elicitation via the Delphi method, where the identified functions become the unit of analysis.
    • A Markov-Chain Monte-Carlo method modelling the effects of uncertainties on the critical success factors, addressing factor interdependencies after expert elicitation.
    Based on the overall decision framework, a toolbox was developed in Microsoft Excel. Five case studies are used to test and validate the framework. Evaluation of the toolbox results against industrial feedback suggests a positive validation for commercial use. The main contributions to knowledge in this thesis arise from the following four points:
    • An early-stage decision-support framework for business case evaluation of intelligent automation.
    • Translation of manual tasks to automation functions via a novel clustering approach.
    • Application of a Markov-Chain Monte-Carlo method to simulate correlation between decision criteria.
    • Establishment of causal relationships among critical success factors from business and technical perspectives.
    The implications for practice are promising: industrial feedback on the tool was positive, and practical realisation of the decision-support tool appears desirable from an industrial point of view. With respect to further work, the decision-support tool may provide a basis for analysing a human task automatically for automation purposes. The established clustering mechanisms and related attributes could be connected to sensor data to analyse a manufacturing task autonomously, without the subjective input of task-analysis experts. To enable such an autonomous process, however, psychophysiological understanding must be increased in future work.
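    The Monte-Carlo treatment of correlated success factors can be illustrated with a two-factor sketch: mixing two independent normal draws induces a chosen correlation between factors before each simulated scenario is scored. The factor names, tolerance threshold, and correlation value are invented for illustration and are not taken from the thesis.

```python
# Sketch: sample two correlated risk factors and estimate the joint
# probability that both stay within tolerance across many scenarios.
import random
random.seed(0)

def simulate(n=10000, rho=0.6):
    successes = 0
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        tech_risk = z1
        cost_risk = rho * z1 + (1 - rho ** 2) ** 0.5 * z2  # correlated draw
        if tech_risk < 1.0 and cost_risk < 1.0:            # both within tolerance
            successes += 1
    return successes / n

print(round(simulate(), 3))  # joint probability that both factors stay in tolerance
```

    Modelling the correlation explicitly matters because scoring the two factors independently would understate the chance of both going wrong together, which is exactly the interdependency issue the framework addresses after expert elicitation.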