718 research outputs found

    Contrasting singleton type-1 and interval type-2 non-singleton type-1 fuzzy logic systems

    Most applications of both type-1 and type-2 fuzzy logic systems employ singleton fuzzification because of its simplicity and low computational cost. However, singleton fuzzification assumes that the input data (i.e., measurements) are precise, with no uncertainty associated with them. This paper explores the potential of combining the uncertainty-modelling capacity of interval type-2 fuzzy sets with the simplicity of type-1 fuzzy logic systems (FLSs) by using interval type-2 fuzzy sets solely within the non-singleton input fuzzifier. Building on previous work, the paper uses the methodological design of the footprint of uncertainty (FOU) of interval type-2 fuzzy sets for given levels of uncertainty. We provide a detailed investigation into the ability of both types of fuzzy sets (type-1 and interval type-2) to capture and model different levels of uncertainty/noise by varying the size of the FOU of the underlying input fuzzy sets, from type-1 fuzzy sets to very "wide" interval type-2 fuzzy sets, within type-1 non-singleton FLSs that use interval type-2 input fuzzy sets. Applying the study in the context of chaotic time-series prediction, we show how, as uncertainty/noise increases, interval type-2 input fuzzy sets with FOUs of increasing size become more and more viable.
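    As a rough illustration of the core idea, the sketch below (Python, using an invented triangular FOU construction, not the paper's exact design) computes the firing interval that a type-1 triangular antecedent receives from an interval type-2 non-singleton input whose FOU half-width grows with the assumed noise level:

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

    def firing_interval(measurement, fou_halfwidth, antecedent, domain, spread=1.0):
        """Firing interval of a T1 triangular `antecedent` (a, b, c) for an
        interval type-2 non-singleton input centred on `measurement`.

        The input's upper/lower MFs are triangles whose spread is widened /
        narrowed by `fou_halfwidth` (a hypothetical FOU construction used
        only for illustration). Non-singleton firing is computed as
        sup over x of min(input MF, antecedent MF)."""
        a, b, c = antecedent
        ant = tri(domain, a, b, c)
        upper_in = tri(domain, measurement - spread - fou_halfwidth, measurement,
                       measurement + spread + fou_halfwidth)
        lower_in = tri(domain, measurement - spread + fou_halfwidth, measurement,
                       measurement + spread - fou_halfwidth)
        return (float(np.max(np.minimum(lower_in, ant))),
                float(np.max(np.minimum(upper_in, ant))))

    domain = np.linspace(-2.0, 4.0, 6001)
    print(firing_interval(1.5, 0.0, (0.0, 1.0, 2.0), domain))  # zero FOU: degenerates to a point
    print(firing_interval(1.5, 0.4, (0.0, 1.0, 2.0), domain))  # wider FOU: wider firing interval
    ```

    With a zero-width FOU the input reduces to a type-1 non-singleton set and the firing interval collapses to a single degree, which is the "type-1 end" of the spectrum the abstract describes.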

    On transitioning from type-1 to interval type-2 fuzzy logic systems

    Capturing the uncertainty arising from system noise has been a core feature of fuzzy logic systems (FLSs) for many years. This paper builds on previous work and explores the methodological transition from type-1 (T1) to interval type-2 fuzzy sets (IT2 FSs) for given "levels" of uncertainty. Specifically, we propose to transition from T1 to IT2 FLSs by varying the size of the Footprint Of Uncertainty (FOU) of their respective FSs while maintaining the original FS shape (e.g., triangular) and keeping the size of the FOU over the FS as constant as possible. The latter is important as it enables the systematic relating of FOU size to levels of uncertainty and vice versa, while the former enables an intuitive comparison between the T1 and T2 FSs. The effectiveness of the proposed method is demonstrated through a series of experiments using the well-known Mackey-Glass (MG) time series prediction problem. The results are compared with those of the IT2 FS creation method introduced in [1], which follows a similar methodology to the proposed approach but does not maintain the membership function (MF) shape.
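    A minimal numerical sketch of that blurring idea (symmetric widening of a triangle's support; an assumed construction, not necessarily the exact method of the paper or of [1]) shows the FOU area growing linearly with the blurring parameter, which is what lets FOU size stand in for an uncertainty level:

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

    def blur(a, b, c, u):
        """Blur a T1 triangle (a, b, c) into an IT2 FS: the upper MF widens
        the support by u on each side, the lower MF narrows it by u, so
        both bounding MFs stay triangular (illustrative construction;
        requires u < min(b - a, c - b))."""
        return (a + u, b, c - u), (a - u, b, c + u)

    def fou_area(a, b, c, u, n=200001):
        """Numerical area of the footprint of uncertainty (UMF minus LMF)."""
        x = np.linspace(a - u - 1.0, c + u + 1.0, n)
        (la, lb, lc), (ua, ub, uc) = blur(a, b, c, u)
        gap = tri(x, ua, ub, uc) - tri(x, la, lb, lc)
        return float(np.sum((gap[:-1] + gap[1:]) * 0.5 * np.diff(x)))

    # For this construction the FOU area is exactly 2u, i.e. linear in u:
    for u in (0.1, 0.2, 0.3):
        print(u, fou_area(0.0, 1.0, 2.0, u))
    ```

    Keeping the relationship between the blurring parameter and FOU size this simple is what makes it possible to relate FOU size to uncertainty level systematically, as the abstract argues.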

    Classification of EMG Signals using Wavelet Features and Fuzzy Logic Classifiers

    Master's thesis (Master of Engineering)

    Interval Type-2 Fuzzy Logic Systems Made Simple


    Analysis and Applications of the Km Algorithm in Type-2 Fuzzy Logic Control and Decision Making

    Ph.D. thesis (Doctor of Philosophy)

    On non-monotonic Choquet integrals as aggregation functions

    This paper deals with the non-monotonic Choquet integral, a generalization of the regular Choquet integral. The discrete non-monotonic Choquet integral is considered from the viewpoint of aggregation. In particular, we give an axiomatic characterization of the class of non-monotonic Choquet integrals. We show how the Shapley index, in contrast with the monotonic case, can assume positive values if the criterion is on average a benefit, depending on its effect in all possible coalitions, and negative values in the opposite case of a cost criterion.
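    For readers unfamiliar with the discrete form, the standard formula C_mu(x) = sum_k (x_(k) - x_(k-1)) * mu(A_(k)) applies unchanged when mu is non-monotonic; a minimal Python sketch (the example set functions are invented for illustration):

    ```python
    def choquet(x, mu):
        """Discrete Choquet integral of non-negative scores x[0..n-1] with
        respect to a set function mu: frozenset -> float, mu(empty) = 0.
        The formula does not require mu to be monotonic."""
        order = sorted(range(len(x)), key=lambda i: x[i])  # ascending scores
        total, prev = 0.0, 0.0
        for k, i in enumerate(order):
            coalition = frozenset(order[k:])  # criteria scoring at least x[i]
            total += (x[i] - prev) * mu(coalition)
            prev = x[i]
        return total

    # Sanity check: with an additive mu the integral reduces to a weighted mean.
    w = {0: 0.5, 1: 0.3, 2: 0.2}
    additive = lambda S: sum(w[i] for i in S)
    print(choquet([0.2, 0.5, 0.9], additive))  # approx. 0.43, the weighted mean

    # A non-monotonic mu: criterion 2 acts as a cost, lowering any
    # coalition's worth, so mu({2}) is negative.
    mu = lambda S: additive(S) - (0.4 if 2 in S else 0.0)
    print(choquet([0.2, 0.5, 0.9], mu))
    ```

    Because mu may decrease when a "cost" criterion joins a coalition, some increments in the sum carry negative weights, which is exactly what allows the Shapley index of such a criterion to be negative.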

    Fuzzy Logic Is Not Fuzzy: World-renowned Computer Scientist Lotfi A. Zadeh

    In 1965 Lotfi A. Zadeh published "Fuzzy Sets", his pioneering and controversial paper, which now has almost 100,000 citations; all of Zadeh's papers together have been cited over 185,000 times. Starting from the ideas presented in that paper, Zadeh later founded the theory of Fuzzy Logic, which proved to have useful applications in products from consumer to industrial intelligent systems. We present general aspects of Zadeh's contributions to the development of Soft Computing (SC) and Artificial Intelligence (AI), as well as his important early influence around the world and in Romania. Several early contributions to fuzzy set theory were published by Romanian scientists, such as Grigore C. Moisil (1968), Constantin V. Negoita & Dan A. Ralescu (1974), and Dan Butnariu (1978). In this review we refer to the papers published in "From Natural Language to Soft Computing: New Paradigms in Artificial Intelligence" (2008, Eds.: L.A. Zadeh, D. Tufis, F.G. Filip, I. Dzitac), and also to two special issues (SI) of the International Journal of Computers Communications & Control (IJCCC, founded in 2006 by I. Dzitac, F.G. Filip & M.J. Manolescu; L.A. Zadeh joined the editorial board in 2008). These two SI, dedicated to the 90th birthday of Lotfi A. Zadeh (2011) and to the 50th anniversary of "Fuzzy Sets" (2015), include papers authored by scientists from Algeria, Belgium, Canada, Chile, China, Hungary, Greece, Germany, Japan, Lithuania, Mexico, Pakistan, Romania, Saudi Arabia, Serbia, Spain, Taiwan, the UK and the USA.

    Accuracy and complexity evaluation of defuzzification strategies for the discretised interval type-2 fuzzy set.

    Other research group involved: Centre for Computational Intelligence (CCI). The work reported in this paper addresses the challenge of the efficient and accurate defuzzification of discretised interval type-2 fuzzy sets. The exhaustive method of defuzzification for type-2 fuzzy sets is extremely slow, owing to its enormous computational complexity. Several approximate methods have been devised in response to this bottleneck. In this paper we survey four alternative strategies for defuzzifying an interval type-2 fuzzy set: 1. the Karnik-Mendel Iterative Procedure, 2. the Wu-Mendel Approximation, 3. the Greenfield-Chiclana Collapsing Defuzzifier, and 4. the Nie-Tan Method. We evaluated the different methods experimentally for accuracy by means of a comparative study using six representative test sets with varied characteristics, with the exhaustive method as the standard. A preliminary ranking of the methods was achieved using a multi-criteria decision-making methodology based on the assignment of weights according to performance. The ranking produced, in order of decreasing accuracy, is 1. the Collapsing Defuzzifier, 2. the Nie-Tan Method, 3. the Karnik-Mendel Iterative Procedure, and 4. the Wu-Mendel Approximation. Following that, a more rigorous analysis was undertaken by means of the Wilcoxon Nonparametric Test, in order to validate the preliminary conclusions. It was found that there was no evidence of a significant difference between the accuracy of the Collapsing and Nie-Tan Methods, nor between that of the Karnik-Mendel Iterative Procedure and the Wu-Mendel Approximation. However, there was evidence to suggest that the Collapsing and Nie-Tan Methods are more accurate than the Karnik-Mendel Iterative Procedure and the Wu-Mendel Approximation. In relation to efficiency, each method's computational complexity was analysed, resulting in a ranking (from least to most computationally complex) as follows: 1. the Nie-Tan Method, 2. the Karnik-Mendel Iterative Procedure (lowest complexity possible), 3. the Greenfield-Chiclana Collapsing Defuzzifier, 4. the Karnik-Mendel Iterative Procedure (highest complexity possible), and 5. the Wu-Mendel Approximation.
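    Of the four surveyed strategies, the Nie-Tan Method is the simplest to state, which is consistent with its top efficiency ranking: it is a single closed-form pass over the discretised set. A minimal sketch (the sample set below is invented for illustration, not one of the paper's six test sets):

    ```python
    def nie_tan(xs, lmf, umf):
        """Nie-Tan defuzzified value of a discretised interval type-2 fuzzy
        set: the centroid of the average of the lower and upper membership
        functions, computed in one pass with no iteration."""
        num = sum(x * (l + u) for x, l, u in zip(xs, lmf, umf))
        den = sum(l + u for l, u in zip(lmf, umf))
        return num / den

    xs  = [0.0, 1.0, 2.0, 3.0, 4.0]   # discretised domain
    lmf = [0.1, 0.5, 0.8, 0.5, 0.1]   # lower membership grades
    umf = [0.3, 0.7, 1.0, 0.7, 0.3]   # upper membership grades
    print(nie_tan(xs, lmf, umf))  # symmetric set, so the centroid is 2.0
    ```

    The Karnik-Mendel procedure, by contrast, iterates to locate the switch points of the left and right centroid endpoints before averaging them, which is where its higher and input-dependent complexity comes from.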