
    An Overview of Classifier Fusion Methods

    A number of classifier fusion methods have recently been developed, opening an alternative approach that can potentially improve classification performance. As there is little theory of information fusion itself, we are currently faced with different methods designed for different problems and producing different results. This paper gives an overview of classifier fusion methods and attempts to identify new trends that may dominate this area of research in the future. A taxonomy of fusion methods, intended to bring some order into the existing “pudding of diversities”, is also provided.
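    At the decision level, the two simplest fixed fusion rules that the surveyed literature builds on are majority voting over crisp labels and averaging over soft outputs. A minimal sketch (illustrative only, not a specific method from the paper):

```python
import numpy as np

def majority_vote(labels):
    """Crisp-label fusion: each classifier casts one vote per sample.
    `labels` has shape (n_classifiers, n_samples)."""
    labels = np.asarray(labels)
    n_classes = labels.max() + 1
    # Count votes per class down the classifier axis, then pick the winner.
    votes = np.apply_along_axis(np.bincount, 0, labels, minlength=n_classes)
    return votes.argmax(axis=0)

def mean_rule(posteriors):
    """Soft-output fusion: average class-posterior estimates of shape
    (n_classifiers, n_samples, n_classes) and pick the argmax class."""
    return np.mean(posteriors, axis=0).argmax(axis=1)
```

    The mean rule uses more information than voting and often behaves differently on the same ensemble, which is one reason such taxonomies distinguish crisp-label from soft-output fusion.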

    Efficient Data Driven Multi Source Fusion

    Data/information fusion is an integral component of many existing and emerging applications; e.g., remote sensing, smart cars, the Internet of Things (IoT), and Big Data, to name a few. While fusion aims to achieve better results than any one individual input can provide, the challenge is often to determine the underlying mathematics of aggregation suitable for an application. In this dissertation, I focus on the following three aspects of aggregation: (i) efficient data-driven learning and optimization, (ii) extensions and new aggregation methods, and (iii) feature- and decision-level fusion for machine learning, with applications to signal and image processing. The Choquet integral (ChI), a powerful nonlinear aggregation operator, is a parametric way (with respect to the fuzzy measure (FM)) to generate a wealth of aggregation operators. The FM has 2^N variables and N(2^(N−1)) constraints for N inputs. As a result, learning the ChI parameters from data quickly becomes impractical for most applications. Herein, I propose a scalable learning procedure (linear with respect to training sample size) for the ChI that identifies and optimizes only data-supported variables. As such, the computational complexity of the learning algorithm is proportional to the complexity of the solver used. This method also includes an imputation framework to obtain scalar values for data-unsupported (aka missing) variables and a compression algorithm (lossy or lossless) for the learned variables. I also propose a genetic algorithm (GA) to optimize the ChI for non-convex, multi-modal, and/or analytical objective functions. This algorithm introduces two operators that automatically preserve the constraints; therefore, there is no need to explicitly enforce the constraints, as is required by traditional GAs. In addition, this algorithm provides an efficient representation of the search space with the minimal set of vertices.
Furthermore, I study different strategies for extending the fuzzy integral to missing data, and I propose a goal programming framework to aggregate inputs from heterogeneous sources for ChI learning. Last, my work in remote sensing involves visual-clustering-based band group selection and Lp-norm multiple kernel learning for feature-level fusion in hyperspectral image processing to enhance pixel-level classification.
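    The discrete ChI itself is compact to state: sort the N inputs in decreasing order and weight each input by the increment of the FM along the induced chain of subsets. A minimal sketch of that standard definition (not the author's scalable learner; names are illustrative):

```python
def choquet(h, fm):
    """Discrete Choquet integral of inputs h (dict: source -> value in [0, 1])
    with respect to a fuzzy measure fm (dict: frozenset of sources -> measure,
    with fm[empty set] = 0 and fm[all sources] = 1)."""
    order = sorted(h, key=h.get, reverse=True)  # decreasing input values
    total, prev_g, chain = 0.0, 0.0, set()
    for src in order:
        chain.add(src)                  # grow the chain of "best so far" sources
        g = fm[frozenset(chain)]
        total += h[src] * (g - prev_g)  # weight input by the measure increment
        prev_g = g
    return total
```

    The dict over frozensets makes the 2^N parameterization explicit, which is exactly why only optimizing data-supported variables matters at scale.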

    Bibliometric Mapping of the Computational Intelligence Field

    In this paper, a bibliometric study of the computational intelligence field is presented. Bibliometric maps showing the associations between the main concepts in the field are provided for the periods 1996–2000 and 2001–2005. Both the current structure of the field and the evolution of the field over the last decade are analyzed. In addition, a number of emerging areas in the field are identified. It turns out that computational intelligence can best be seen as a field that is structured around four important types of problems, namely control problems, classification problems, regression problems, and optimization problems. Within the computational intelligence field, the neural networks and fuzzy systems subfields are fairly intertwined, whereas the evolutionary computation subfield has a relatively independent position.
    Keywords: neural networks; bibliometric mapping; fuzzy systems; bibliometrics; computational intelligence; evolutionary computation
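    Association maps of this kind are built from concept co-occurrence in documents. A generic sketch of one common normalisation (cosine similarity on a binary document-by-concept incidence matrix; the paper's own normalisation may differ):

```python
import numpy as np

def association_matrix(doc_term):
    """Cosine association strengths between concepts, computed from a binary
    document-by-concept incidence matrix."""
    co = doc_term.T @ doc_term                  # raw co-occurrence counts
    freq = np.sqrt(np.diag(co)).astype(float)   # sqrt of per-concept frequencies
    return co / np.outer(freq, freq)            # normalise to [0, 1]
```

    The resulting matrix is what a mapping tool lays out in two dimensions, placing strongly associated concepts close together.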

    Machine Learning Methods for Fuzzy Pattern Tree Induction

    This thesis elaborates on a novel approach to fuzzy machine learning, that is, the combination of machine learning methods with mathematical tools for modeling and information processing based on fuzzy logic. More specifically, the thesis is devoted to so-called fuzzy pattern trees, a model class that has recently been introduced for representing dependencies between input and output variables in supervised learning tasks, such as classification and regression. Due to its hierarchical, modular structure and the use of different types of (nonlinear) aggregation operators, a fuzzy pattern tree has the ability to represent such dependencies in a very flexible and compact way, thereby offering a reasonable balance between accuracy and model transparency. The focus of the thesis is on novel algorithms for pattern tree induction, i.e., for learning fuzzy pattern trees from observed data. In total, three new algorithms are introduced and compared to an existing method for the data-driven construction of pattern trees. While the first two algorithms are mainly geared toward an improvement of predictive accuracy, the last one focuses on efficiency aspects and seeks to make the learning process faster. The description and discussion of each algorithm is complemented with theoretical analyses and empirical studies in order to show the effectiveness of the proposed solutions.
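    The hierarchical evaluation a fuzzy pattern tree performs can be sketched in a few lines: leaves fuzzify individual features, and inner nodes aggregate their children with a t-norm, t-conorm, or averaging operator (the node encoding here is illustrative, not the thesis's representation):

```python
def eval_tree(node, x):
    """Recursively evaluate a fuzzy pattern tree on a sample x (dict).
    A leaf is ('leaf', feature, membership_fn); an inner node is
    (operator_name, [children])."""
    ops = {
        'min': min,                          # t-norm (conjunctive aggregation)
        'max': max,                          # t-conorm (disjunctive aggregation)
        'avg': lambda *v: sum(v) / len(v),   # averaging operator
    }
    if node[0] == 'leaf':
        _, feat, mu = node
        return mu(x[feat])                   # fuzzified feature value in [0, 1]
    children = [eval_tree(child, x) for child in node[1]]
    return ops[node[0]](*children)
```

    Induction then amounts to searching over such tree structures and operator choices, which is what the algorithms in the thesis address.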

    NIDUS IDEARUM. Scilogs, V: joining the dots

    My lab[oratory] is a virtual facility with non-controlled conditions in which I mostly perform scientific meditation and chats: a nest of ideas (nidus idearum, in Latin). I called the jottings herein scilogs (truncations of the words scientific and gr. Λόγος – appealing rather to its original meanings: ground, opinion, expectation), combining the welly of both science and informal (via internet) talks (in English, French, and Romanian). * In this fifth book of scilogs collected from my nest of ideas, one may find new and old questions and solutions, mostly referring to topics on NEUTROSOPHY – email messages to research colleagues, or replies, notes about authors, articles, or books, and so on. Feel free to budge in or just use the scilogs as open source for your own ideas.

    From approximative to descriptive fuzzy models


    Advances and Applications of Dezert-Smarandache Theory (DSmT) for Information Fusion (Collected works), Vol. 2

    This second volume dedicated to Dezert-Smarandache Theory (DSmT) in information fusion brings in new quantitative fusion rules (such as PCR1–6, where PCR5 for two sources performs the most mathematically exact redistribution of conflicting masses to the non-empty sets in the fusion literature), qualitative fusion rules, and the Belief Conditioning Rule (BCR), which differs from the classical conditioning rule used by the fusion community working with the Mathematical Theory of Evidence. Other fusion rules are constructed based on T-norms and T-conorms (hence using fuzzy logic and fuzzy sets in information fusion), or more general fusion rules based on N-norms and N-conorms (hence using neutrosophic logic and neutrosophic sets in information fusion), along with an attempt to unify the fusion rules and fusion theories. The known fusion rules are extended from the power set to the hyper-power set, and comparisons between rules are made on many examples. One defines the degree of intersection of two sets, the degree of union of two sets, and the degree of inclusion of two sets, which all help in improving all existing fusion rules as well as the credibility, plausibility, and commonality functions. The book chapters are written by Frederic Dambreville, Milan Daniel, Jean Dezert, Pascal Djiknavorian, Dominic Grenier, Xinhan Huang, Pavlina Dimitrova Konstantinova, Xinde Li, Arnaud Martin, Christophe Osswald, Andrew Schumann, Tzvetan Atanasov Semerdjiev, Florentin Smarandache, Albena Tchamova, and Min Wang.
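    For two sources with singleton focal elements, the PCR5 redistribution mentioned above can be sketched directly: each partial conflict m1(X)·m2(Y), X ≠ Y, is given back to X and Y proportionally to the masses that generated it (illustrative sketch restricted to singletons; the book treats general focal elements on the hyper-power set):

```python
def pcr5(m1, m2, frame):
    """PCR5 combination of two basic belief assignments whose focal elements
    are singletons of `frame`. Each partial conflict m1(X)*m2(Y), X != Y, is
    redistributed to X and Y proportionally to the masses that produced it."""
    # Conjunctive consensus on the agreeing pairs.
    m = {x: m1.get(x, 0.0) * m2.get(x, 0.0) for x in frame}
    for x in frame:
        for y in frame:
            if x == y:
                continue
            conflict = m1.get(x, 0.0) * m2.get(y, 0.0)
            if conflict == 0.0:
                continue
            d = m1[x] + m2[y]
            m[x] += m1[x] ** 2 * m2[y] / d   # X's proportional share
            m[y] += m2[y] ** 2 * m1[x] / d   # Y's proportional share
    return m
```

    Because every partial conflict is fully split between the two sets that caused it, the combined masses still sum to one.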

    Fuzzy Operator Trees for Modeling Utility Functions

    In this thesis, we propose a method for modeling utility (rating) functions based on a novel concept called the Fuzzy Operator Tree (FOT for short). As the notion suggests, this method makes use of techniques from fuzzy set theory and implements a fuzzy rating function, that is, a utility function that maps to the unit interval, where 0 corresponds to the lowest and 1 to the highest evaluation. Even though the original motivation comes from quality control, FOTs are completely general and widely applicable. Our approach allows a human expert to specify a model in the form of an FOT in a quite convenient and intuitive way. To this end, the expert simply has to split evaluation criteria into sub-criteria in a recursive manner, and to determine in which way these sub-criteria ought to be combined: conjunctively, disjunctively, or by means of an averaging operator. The result of this process is the qualitative structure of the model. A second step, then, is to parameterize the model. To support or even free the expert from this step, we develop a method for calibrating the model on the basis of exemplary ratings, that is, in a purely data-driven way. This method, which makes use of optimization techniques from the field of evolutionary algorithms, constitutes the second major contribution of the thesis. The third contribution of the thesis is a method for evaluating an FOT in a cost-efficient way. Roughly speaking, an FOT can be seen as an aggregation function that combines the evaluations of a number of basic criteria into an overall rating of an object. Essentially, the cost of computing this rating is hence given by the sum of the evaluation costs of the basic criteria. In practice, however, the precise utility degree is often not needed. Instead, it is enough to know whether it lies above or below an important threshold value.
In such cases, the evaluation process, understood as a sequential evaluation of basic criteria, can be stopped as soon as this question can be answered in a unique way. Of course, the (expected) number of basic criteria and, therefore, the (expected) evaluation cost strongly depend on the order of the evaluations, and this is what is optimized by the methods that we have developed.
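    The early-stopping idea is easiest to see at a single conjunctive (min) node: once any basic criterion falls below the threshold, the aggregate is decided and the remaining, possibly costly, criteria need not be evaluated. A sketch under that restriction (the thesis also optimizes the evaluation order, which is not shown here):

```python
def above_threshold_min(criteria, x, theta):
    """Lazily decide whether a conjunctive (min) node reaches the threshold
    theta, stopping as soon as one basic criterion falls below it. Returns
    (decision, number of criteria actually evaluated)."""
    evaluated = 0
    for mu in criteria:
        evaluated += 1
        if mu(x) < theta:
            return False, evaluated   # the min is already certainly below theta
    return True, evaluated            # all criteria reach theta, so does the min
```

    Evaluating cheap or likely-to-fail criteria first minimizes the expected number of evaluations, which is the quantity the ordering methods optimize.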

    Automatic synthesis of fuzzy systems: An evolutionary overview with a genetic programming perspective

    Studies in Evolutionary Fuzzy Systems (EFSs) began in the 90s and have experienced fast development since then, with applications to areas such as pattern recognition, curve-fitting and regression, forecasting, and control. An EFS results from the combination of a Fuzzy Inference System (FIS) with an Evolutionary Algorithm (EA). This relationship can be established for multiple purposes: fine-tuning of a FIS's parameters, selection of fuzzy rules, learning a rule base or membership functions from scratch, and so forth. Each facet of this relationship creates a strand in the literature, such as membership function fine-tuning or fuzzy rule-based learning, and the purpose here is to outline some of what has been done in each aspect. Special focus is given to Genetic Programming-based EFSs by providing a taxonomy of the main architectures available, as well as by pointing out the gaps that still prevail in the literature. The concluding remarks address some further topics of current research and trends, such as interpretability analysis, multiobjective optimization, and synthesis of a FIS through evolving methods.
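    Fine-tuning membership function parameters with an EA, the first strand above, can be sketched with a toy GA that fits a single triangular membership function to example membership degrees (a hypothetical setup for illustration; real EFSs evolve whole rule bases and many functions at once):

```python
import random

def triangular(a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    def mu(x):
        if x == b:
            return 1.0
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)
    return mu

def evolve(data, generations=200, pop_size=30, seed=0):
    """Toy truncation-selection GA: fit one triangular membership function to
    (x, target_degree) pairs by minimising squared error."""
    rng = random.Random(seed)
    def fitness(ind):
        mu = triangular(*ind)
        return sum((mu(x) - t) ** 2 for x, t in data)
    def mutate(ind):
        jittered = sorted(g + rng.gauss(0, 0.1) for g in ind)
        return tuple(jittered)                # sorting repairs a <= b <= c
    pop = [mutate((0.0, 0.5, 1.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                 # keep the fitter half
        pop = pop[:pop_size // 2]
        pop += [mutate(rng.choice(pop)) for _ in range(pop_size - len(pop))]
    return min(pop, key=fitness)
```

    The sort-based repair inside `mutate` plays the same role as the constraint-preserving operators discussed in the EFS literature: offspring are always valid parameter vectors, so no explicit constraint handling is needed.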