
    Guest Editorial: Hybrid intelligent fusion systems

    This special issue covers topics related to information fusion in the context of hybrid intelligent systems, which are becoming popular due to their capabilities in handling many real-world complex problems involving imprecision, uncertainty, vagueness and high dimensionality. They provide us with the opportunity to use both our knowledge and raw data to solve problems in a more interesting and promising way. This multidisciplinary research field is in continuous expansion in the artificial intelligence research community. One of the most promising areas of classifier systems is that of combined classifiers, which is currently the focus of intense research. Information fusion helps to overcome the limitations of traditional approaches based on single classifiers, thereby opening new areas of research. Accordingly, the current issue presents a survey and five research papers dealing with recent aspects of hybrid systems in which information fusion plays a relevant role. The issue also includes comments on one of the articles and the author's response to those comments. The special issue starts with the survey article by Wozniak et al., in which the authors introduce, along with a short overview of their recent history, the state of the art and some key research areas of hybrid intelligent systems related to pattern recognition and optimization. Three of the five research papers focus on hybrid classifiers, while the other two are related to data fusion for hybrid modeling. More specifically, the first paper, by Lin and Jiang, presents two new hybrid weighted averaging operators for aggregating crisp and fuzzy information, whose desirable formal properties are studied in depth. Additionally, three special types of preferred centroid of triangular fuzzy numbers are defined with respect to the proposed operators, and two novel decision algorithms are developed. Thanks to prior-to-print online access, the paper has been the subject of a fruitful discussion, reflected in the comments by Wang, who argues that the aggregation operators are not monotonic, and in the reply by Lin, which provides additional proofs of their monotonicity. The next paper, by Olatunji et al., also follows a fuzzy approach, proposing a combination of type-2 fuzzy logic systems and extreme learning machines to model the permeability of carbonate reservoirs. The comparative computer experiments show that the hybrid classifier outperforms conventional artificial neural networks and support vector machines for the problem under consideration. Appearing third is the study by Tsai, which describes a novel hybrid financial distress model based on combining clustering techniques and classifier ensembles. The author uses both techniques to develop different types of bankruptcy prediction models; up to 21 different models are produced as combinations of unsupervised and supervised classification techniques. Tests evaluating prediction accuracy show that the self-organizing map (SOM) and the traditional multilayer perceptron (MLP) provide the best results. In the next article, by Hernández et al., an innovative information fusion process for ecological and remote sensing data is proposed. It is based on spatial interpolation methods and provides accurate, high-resolution estimates of the Leaf Area Index (LAI), a critical input variable for dynamical biomass models that describe the interactions between the soil, the atmosphere and the vegetation. The information sources used are in situ field measurements, remote sensing images and altitude data obtained from digital elevation maps. The last research paper, by Kaburlasos and Pachidis, elaborates on the properties of a lattice computing approach based on interval numbers to deal with disparate data types in a unified framework. Fusion of heterogeneous information sources is achieved on the basis of a lattice-theoretic formalization. In addition, the authors propose an ensemble of fuzzy lattice reasoning classifiers, involving information fusion at the classifier output level in a lattice computing framework. The approach is successfully tested in a real industrial application of beverage brewing control. To conclude, we would like to thank Belur V. Dasarathy, Editor-in-Chief of the Information Fusion journal, for giving us the opportunity to prepare this special issue. We would also like to thank the reviewers for contributing to this issue with their work and time, and all the authors who submitted papers to the issue.
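    The specific hybrid operators and preferred centroids of Lin and Jiang are not spelled out in this editorial. As a rough, hedged illustration of the kind of aggregation involved, the sketch below shows a generic component-wise weighted average of triangular fuzzy numbers together with the conventional centroid (a + b + c) / 3; all function and type names are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: a generic weighted average of triangular fuzzy
# numbers and the conventional centroid (a + b + c) / 3. The hybrid operators
# and "preferred centroids" of Lin and Jiang are not reproduced here.
from typing import List, Tuple

TriangularFuzzyNumber = Tuple[float, float, float]  # (lower, modal, upper)

def weighted_average(tfns: List[TriangularFuzzyNumber],
                     weights: List[float]) -> TriangularFuzzyNumber:
    """Component-wise weighted average of triangular fuzzy numbers."""
    total = sum(weights)
    a = sum(w * t[0] for w, t in zip(weights, tfns)) / total
    b = sum(w * t[1] for w, t in zip(weights, tfns)) / total
    c = sum(w * t[2] for w, t in zip(weights, tfns)) / total
    return (a, b, c)

def centroid(tfn: TriangularFuzzyNumber) -> float:
    """Conventional centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# Example: aggregate two fuzzy assessments with weights 0.6 and 0.4,
# then defuzzify the result with the centroid for ranking.
agg = weighted_average([(2.0, 3.0, 4.0), (1.0, 2.0, 5.0)], [0.6, 0.4])
print(agg, centroid(agg))
```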

    Commercial objectives, technology transfer, and systems analysis for fusion power development

    Fusion is an inexhaustible source of energy that has the potential for economic commercial applications with excellent safety and environmental characteristics. The primary focus of the fusion energy development program is the generation of central station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high-energy neutrons suggests potentially unique applications. In addition, fusion R&D will lead to new products and new markets. Each fusion application must meet certain standards of economic, safety and environmental attractiveness. For this reason, economics on the one hand, and safety, environment and licensing on the other, are the two primary criteria for setting long-range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R&D program toward practical applications. The transfer of fusion technology and skills from the national labs and universities to industry is the key to achieving the long-range objective of commercial fusion applications.

    A Survey on IT-Techniques for a Dynamic Emergency Management in Large Infrastructures

    This deliverable is a survey of the IT techniques that are relevant to the three use cases of the EMILI project. It describes the state of the art in four complementary IT areas: data cleansing, supervisory control and data acquisition, wireless sensor networks, and complex event processing. Even though the deliverable's authors have tried to avoid overly technical language and to explain every concept referred to, the deliverable may still seem rather technical to readers who are as yet little familiar with the techniques it describes.

    Uncertainty in Ontologies: Dempster-Shafer Theory for Data Fusion Applications

    Nowadays, ontologies are of growing interest in Data Fusion applications. As a matter of fact, ontologies are seen as a semantic tool for describing and reasoning about sensor data, objects, relations and general domain theories. In addition, uncertainty is perhaps one of the most important characteristics of the data and information handled by Data Fusion. However, the fundamental nature of ontologies implies that they describe only asserted and veracious facts about the world. Different probabilistic, fuzzy and evidential approaches already exist to fill this gap; this paper recaps the most popular tools. However, none of these tools exactly meets our purposes. Therefore, we constructed a Dempster-Shafer ontology that can be imported into any specific domain ontology and that enables us to instantiate it in an uncertain manner. We also developed a Java application that enables reasoning about these uncertain ontological instances. Comment: Workshop on Theory of Belief Functions, Brest, France (2010).
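    The authors' ontology and Java reasoner are not shown in this abstract. The sketch below is only a minimal illustration of the standard evidential calculus their approach builds on, namely Dempster's rule of combination over a small frame of discernment, with focal elements represented as frozensets; the example sensors and mass values are invented for illustration.

```python
# Minimal sketch of Dempster's rule of combination; not the authors' ontology
# or Java tool, only the standard evidential calculus underlying the approach.
from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Combine two mass functions whose focal elements are frozensets."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory intersections
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence cannot be combined.")
    # Normalize by the non-conflicting mass (1 - K).
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two hypothetical sensors reporting on whether an object is a car or a truck.
theta = frozenset({"car", "truck"})  # frame of discernment
m_sensor1 = {frozenset({"car"}): 0.7, theta: 0.3}
m_sensor2 = {frozenset({"car"}): 0.5, frozenset({"truck"}): 0.3, theta: 0.2}
print(combine(m_sensor1, m_sensor2))
```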

    Flow-based reputation with uncertainty: Evidence-Based Subjective Logic

    The concept of reputation is widely used as a measure of trustworthiness based on ratings from members in a community. The adoption of reputation systems, however, relies on their ability to capture the actual trustworthiness of a target. Several reputation models for aggregating trust information have been proposed in the literature. The choice of model has an impact on the reliability of the aggregated trust information as well as on the procedure used to compute reputations. Two prominent models are flow-based reputation (e.g., EigenTrust, PageRank) and Subjective-Logic-based reputation. Flow-based models provide an automated method to aggregate trust information, but they are not able to express the level of uncertainty in the information. In contrast, Subjective Logic extends probabilistic models with an explicit notion of uncertainty, but the calculation of reputation depends on the structure of the trust network and often requires information to be discarded. These are severe drawbacks. In this work, we observe that the 'opinion discounting' operation in Subjective Logic has a number of basic problems. We resolve these problems by providing a new discounting operator that describes the flow of evidence from one party to another. The adoption of our discounting rule results in a consistent Subjective Logic algebra that is entirely based on the handling of evidence. We show that the new algebra enables the construction of an automated reputation assessment procedure for arbitrary trust networks, where the calculation no longer depends on the structure of the network and does not need to throw away any information. Thus, we obtain the best of both worlds: flow-based reputation and consistent handling of uncertainties.
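    The paper's new evidence-based discounting operator is not reproduced here. As background only, the sketch below shows the standard Subjective Logic mapping from evidence counts to a binomial opinion (belief, disbelief, uncertainty) with the usual non-informative prior weight W = 2, plus the probability expectation E = b + a·u; the class and function names are illustrative.

```python
# Background sketch: standard evidence-to-opinion mapping in Subjective Logic.
# The paper's new discounting operator is intentionally not reproduced here.
from dataclasses import dataclass

W = 2.0  # non-informative prior weight commonly used in Subjective Logic

@dataclass
class Opinion:
    belief: float
    disbelief: float
    uncertainty: float
    base_rate: float = 0.5

    def expectation(self) -> float:
        """Probability expectation E = b + a * u."""
        return self.belief + self.base_rate * self.uncertainty

def opinion_from_evidence(r: float, s: float, base_rate: float = 0.5) -> Opinion:
    """Map r positive and s negative observations to a binomial opinion."""
    total = r + s + W
    return Opinion(r / total, s / total, W / total, base_rate)

# Example: 8 positive and 2 negative interactions with a target.
op = opinion_from_evidence(8, 2)
print(op, op.expectation())
```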

    Enabling Explainable Fusion in Deep Learning with Fuzzy Integral Neural Networks

    Information fusion is an essential part of numerous engineering systems and biological functions, e.g., human cognition. Fusion occurs at many levels, ranging from the low-level combination of signals to the high-level aggregation of heterogeneous decision-making processes. While the last decade has witnessed an explosion of research in deep learning, fusion in neural networks has not seen the same revolution. Specifically, most neural fusion approaches are ad hoc, not well understood, distributed rather than localized, and/or offer little explainability (if any at all). Herein, we prove that the fuzzy Choquet integral (ChI), a powerful nonlinear aggregation function, can be represented as a multi-layer network, referred to hereafter as ChIMP. We also put forth an improved ChIMP (iChIMP) that enables stochastic gradient descent-based optimization in light of the exponential number of ChI inequality constraints. An additional benefit of ChIMP/iChIMP is that it enables eXplainable AI (XAI). Synthetic validation experiments are provided, and iChIMP is applied to the fusion of a set of heterogeneous-architecture deep models in remote sensing. We show an improvement in model accuracy, and our previously established XAI indices shed light on the quality of our data, model, and its decisions. Comment: IEEE Transactions on Fuzzy Systems.
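    The ChIMP/iChIMP network parameterization and its training are not described in this abstract. The sketch below is only a minimal illustration of the underlying aggregation function, the discrete (fuzzy) Choquet integral with respect to a given fuzzy measure; the example measure and input values are invented for illustration.

```python
# Minimal sketch of the discrete Choquet integral for a given fuzzy measure;
# the ChIMP/iChIMP network representation and its SGD training are not shown.
from typing import Dict, FrozenSet, Sequence

def choquet_integral(inputs: Sequence[float],
                     measure: Dict[FrozenSet[int], float]) -> float:
    """Aggregate `inputs` with the Choquet integral w.r.t. `measure`.

    `measure` maps subsets of input indices (frozensets) to values in [0, 1],
    with measure[frozenset()] == 0 and measure[full index set] == 1.
    """
    # Sort input indices by descending value: h(x_(1)) >= h(x_(2)) >= ...
    order = sorted(range(len(inputs)), key=lambda i: inputs[i], reverse=True)
    result, prev_g, subset = 0.0, 0.0, set()
    for idx in order:
        subset.add(idx)
        g = measure[frozenset(subset)]
        result += inputs[idx] * (g - prev_g)  # weight by measure increments
        prev_g = g
    return result

# Three sources; the (invented) measure rewards agreement of sources 0 and 1.
g = {frozenset(): 0.0,
     frozenset({0}): 0.4, frozenset({1}): 0.4, frozenset({2}): 0.2,
     frozenset({0, 1}): 0.9, frozenset({0, 2}): 0.5, frozenset({1, 2}): 0.5,
     frozenset({0, 1, 2}): 1.0}
print(choquet_integral([0.8, 0.6, 0.3], g))  # -> 0.65
```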