Optimal Fuzzy Model Construction with Statistical Information using Genetic Algorithm
Fuzzy rule-based models have the capability to approximate any continuous function to any degree of accuracy on a compact domain. Nevertheless, most fuzzy logic controller (FLC) design processes rely on the heuristic knowledge of experienced operators. To make the design process automatic, we present a genetic approach that learns fuzzy rules as well as membership function parameters. Moreover, several statistical information criteria, such as the Akaike information criterion (AIC), the Bhansali-Downham information criterion (BDIC), and the Schwarz-Rissanen information criterion (SRIC), are used to construct optimal fuzzy models by reducing the number of fuzzy rules. A genetic scheme is used to design a Takagi-Sugeno-Kang (TSK) model, identifying both the antecedent rule parameters and the consequent parameters. Computer simulations are presented confirming the performance of the constructed fuzzy logic controller.
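The rule-reduction step described above can be sketched with the standard least-squares forms of these criteria. This is a minimal illustration, not the paper's implementation: the per-rule parameter count, sample size, and residuals below are invented for the example.

```python
import math

def aic(rss: float, n: int, k: int) -> float:
    """Akaike information criterion for a model with k free parameters
    fitted to n samples, given residual sum of squares rss."""
    return n * math.log(rss / n) + 2 * k

def sric(rss: float, n: int, k: int) -> float:
    """Schwarz-Rissanen information criterion (BIC-style penalty,
    which punishes extra rules more heavily for large n)."""
    return n * math.log(rss / n) + k * math.log(n)

# Hypothetical candidate TSK models: (rule count, residual sum of squares).
# Assume each rule contributes p parameters (illustrative value).
p = 4
n = 100
candidates = [(3, 2.1), (5, 1.4), (9, 1.35)]

# Pick the rule count that minimises AIC: adding rules past the point
# where the fit stops improving only increases the penalty term.
best = min(candidates, key=lambda c: aic(c[1], n, c[0] * p))
print(best[0])  # prints 5: the 9-rule model barely improves the fit
```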
Assessment of Sustainable Development
The objective of this paper is to introduce fuzzy set theory and develop fuzzy mathematical models to assess sustainable development based on context-dependent economic, ecological, and societal sustainability indicators. Membership functions are at the core of fuzzy models, and define the degree to which indicators contribute to development. Although a decision-making process regarding sustainable development is subjective, fuzzy set theory links human expectations about development, expressed in linguistic propositions, to numerical data, expressed in measurements of sustainability indicators. In the future, practical implementation of such models will be based on elicitation of expert knowledge to construct a membership function. The fuzzy models developed in this paper provide a novel approach to support decisions regarding sustainable development.
Keywords: agriculture; assessment; fuzzy set theory; sustainable development
Type-2 Fuzzy Logic: Circumventing the Defuzzification Bottleneck
Type-2 fuzzy inferencing for generalised, discretised type-2 fuzzy sets has been impeded by the computational complexity of the defuzzification stage of the fuzzy inferencing system. Indeed this stage is so complex computationally that it has come to be known as the defuzzification bottleneck. The computational complexity derives from the enormous number of embedded sets that have to be individually processed in order to effect defuzzification.
Two new approaches to type-2 defuzzification are presented, the sampling method and the Greenfield-Chiclana Collapsing Defuzzifier. The sampling method and its variant, elite sampling, are techniques for the defuzzification of generalised type-2 fuzzy sets. In these methods a relatively small sample of the totality of embedded sets is randomly selected and processed. The small sample size drastically reduces the computational complexity of the defuzzification process, so that it may be speedily accomplished.
The Greenfield-Chiclana Collapsing Defuzzifier relies upon the concept of the representative embedded set, which is an embedded set having the same defuzzified value as the type-2 fuzzy set that is to be defuzzified. By a process termed collapsing the type-2 fuzzy set is converted into a type-1 fuzzy set which, as an approximation to the representative embedded set, is known as the representative embedded set approximation. This type-1 fuzzy set is easily defuzzified to give the defuzzified value of the original type-2 fuzzy set. By this method the computational complexity of type-2 defuzzification is reduced enormously, since the representative embedded set approximation replaces the entire collection of embedded sets. The strategy was conceived as a generalised method, but so far only the interval version has been derived mathematically.
The grid method of discretisation for type-2 fuzzy sets is also introduced in this thesis.
Work on the defuzzification of type-2 fuzzy sets began around the turn of the millennium. Since that time a number of investigators have contributed methods in this area. These different approaches are surveyed, and the major methods were implemented in code prior to their experimental evaluation. In these comparative experiments the grid method of discretisation is employed. The experimental results show beyond doubt that the collapsing method performs best of the interval alternatives. However, though the sampling method performs well experimentally, the results do not demonstrate it to be the best-performing generalised technique.
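The sampling idea above can be sketched for a discretised generalised type-2 set. The representation below (each domain point maps to a list of (secondary-domain value, secondary grade) pairs) and all names are illustrative assumptions, not the authors' code: the point is only that averaging the centroids of a random sample of embedded sets, weighted by their secondary grades, stands in for processing every embedded set.

```python
import random

def sample_defuzzify(t2set, xs, n_samples=1000, rng=None):
    """Estimate the defuzzified value of a discretised generalised
    type-2 fuzzy set by randomly sampling embedded sets.

    t2set: dict mapping each x in xs to a list of (u, grade) pairs.
    """
    rng = rng or random.Random(0)
    num = den = 0.0
    for _ in range(n_samples):
        # Select one embedded set: pick one (u, grade) at each x.
        picks = [rng.choice(t2set[x]) for x in xs]
        grade = min(g for _, g in picks)   # secondary grade of the embedded set
        us = [u for u, _ in picks]
        s = sum(us)
        if s == 0 or grade == 0:
            continue
        centroid = sum(x * u for x, u in zip(xs, us)) / s  # type-1 centroid
        num += grade * centroid
        den += grade
    return num / den if den else None

# Usage: a symmetric set about x = 2 defuzzifies to 2.0.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
t2set = {0.0: [(0.2, 1.0)], 1.0: [(0.5, 1.0)], 2.0: [(1.0, 1.0)],
         3.0: [(0.5, 1.0)], 4.0: [(0.2, 1.0)]}
print(sample_defuzzify(t2set, xs))  # prints 2.0
```

With many (u, grade) pairs per x the number of embedded sets grows combinatorially, which is exactly the bottleneck the sample size caps.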
Fuzzy geometry, entropy, and image information
Presented here are various uncertainty measures arising from grayness ambiguity and spatial ambiguity in an image, and their possible applications as image information measures. Definitions are given of an image in the light of fuzzy set theory, and of information measures and tools relevant for processing/analysis, e.g., fuzzy geometrical properties, correlation, bound functions, and entropy measures. Also given is a formulation of algorithms, along with management of uncertainties, for segmentation, object extraction, and edge detection. The output obtained here is both fuzzy and nonfuzzy. Ambiguity in the evaluation and assessment of membership functions is also described.
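A grayness-ambiguity measure of the kind surveyed above can be sketched as the De Luca-Termini fuzzy entropy of an image whose pixels are mapped to membership values in [0, 1] (e.g. normalised grey levels). The mapping and names below are illustrative assumptions, not the paper's formulation.

```python
import math

def shannon_term(mu: float) -> float:
    """S(mu) = -mu ln(mu) - (1-mu) ln(1-mu), with S(0) = S(1) = 0."""
    if mu in (0.0, 1.0):
        return 0.0
    return -mu * math.log(mu) - (1 - mu) * math.log(1 - mu)

def fuzzy_entropy(memberships) -> float:
    """Average Shannon-function value over all pixels: 0 for a crisp
    (binary) image, maximal (ln 2) when every mu = 0.5."""
    mus = list(memberships)
    return sum(shannon_term(m) for m in mus) / len(mus)

crisp = [0.0, 1.0, 1.0, 0.0]        # perfectly unambiguous pixels
ambiguous = [0.5, 0.5, 0.5, 0.5]    # maximally ambiguous pixels
print(fuzzy_entropy(crisp))         # prints 0.0
print(fuzzy_entropy(ambiguous))     # ln 2 ≈ 0.693
```

Measures like this are what make threshold selection for segmentation tractable: one searches for the threshold (or membership mapping) that minimises the image's fuzziness.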
Constrained interval type-2 fuzzy sets
In many contexts, type-2 fuzzy sets are obtained from a type-1 fuzzy set to which we wish to add uncertainty. However, in the current type-2 representation there is no restriction on the shape of the footprint of uncertainty and the embedded sets that can be considered acceptable. This usually leads to the loss of the semantic relationship between the type-2 fuzzy set and the concept it models. As a consequence, the interpretability of some of the embedded sets and the explainability of the uncertainty measures obtained from them can decrease. To overcome these issues, constrained type-2 fuzzy sets have been proposed. However, no formal definitions for some of their key components (e.g. acceptable embedded sets) and constrained operations have been given. The goal of this paper is to provide some theoretical underpinning for the definition of constrained type-2 sets, their inferencing and defuzzification method. To conclude, the constrained inference framework is presented, applied to two real-world cases, and briefly compared to the standard interval type-2 inference and defuzzification method.
Perfectly normal type-2 fuzzy interpolation B-spline curve
In this paper, we proposed another new form of type-2 fuzzy data
points(T2FDPs) that is perfectly normal type-2 data points(PNT2FDPs). These
kinds of brand-new data were defined by using the existing type-2 fuzzy set
theory(T2FST) and type-2 fuzzy number(T2FN) concept since we dealt with the
problem of defining complex uncertainty data. Along with this restructuring, we
included the fuzzification(alpha-cut operation), type-reduction and
defuzzification processes against PNT2FDPs. In addition, we used interpolation
B-soline curve function to demonstrate the PNT2FDPs.Comment: arXiv admin note: substantial text overlap with arXiv:1304.786
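The alpha-cut operation mentioned in the fuzzification step can be sketched on the simplest type-1 building block, a triangular fuzzy number; the representation and function name below are illustrative assumptions, not the paper's construction.

```python
def alpha_cut(a: float, b: float, c: float, alpha: float):
    """Closed interval {x : mu(x) >= alpha} of the triangular fuzzy
    number with support [a, c] and peak at b, for 0 < alpha <= 1.
    The left edge rises from a to b; the right edge falls from c to b."""
    return (a + alpha * (b - a), c - alpha * (c - b))

print(alpha_cut(0.0, 1.0, 2.0, 0.5))  # prints (0.5, 1.5)
print(alpha_cut(0.0, 1.0, 2.0, 1.0))  # prints (1.0, 1.0): the cut collapses to the peak
```

Type-reduction then aggregates such cuts across the secondary grades, and defuzzification maps the resulting type-1 set to a single value.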