
    Fuzzy Regression by Fuzzy Number Neural Networks

    In this paper, we describe a method for nonlinear fuzzy regression using neural network models. In earlier work, strong assumptions were made on the form of the fuzzy number parameters: symmetric triangular, asymmetric triangular, quadratic, trapezoidal, and so on. Our goal here is to substantially generalize both linear and nonlinear fuzzy regression using models with general fuzzy number inputs, weights, biases, and outputs. This is accomplished through a special training technique for fuzzy number neural networks. The technique is demonstrated with data from an industrial quality control problem.
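As a rough illustration of how fuzzy quantities can propagate through such a network, the sketch below evaluates a single neuron at one α level using interval arithmetic on α-cuts. The function names and single-neuron setup are illustrative assumptions, not the paper's training algorithm:

```python
# Evaluating one neuron of a fuzzy number neural network at a single alpha
# level. Each fuzzy quantity is reduced to its alpha-cut interval (lo, hi);
# interval arithmetic propagates the uncertainty through the neuron.
import math

def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    products = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(products), max(products))

def fuzzy_neuron(inputs, weights, bias):
    """inputs, weights: lists of alpha-cut intervals; bias: an interval.
    The sigmoid is monotone, so applying it endpoint-wise preserves
    the interval ordering."""
    acc = bias
    for x, w in zip(inputs, weights):
        acc = interval_add(acc, interval_mul(x, w))
    sig = lambda t: 1.0 / (1.0 + math.exp(-t))
    return (sig(acc[0]), sig(acc[1]))

out = fuzzy_neuron(inputs=[(0.9, 1.1)], weights=[(0.4, 0.6)], bias=(-0.1, 0.1))
```

A full network would repeat this neuron calculation layer by layer, once per α level in the chosen discretization.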

    Training Fuzzy Number Neural Networks with Alpha-cut Refinements

    In a fuzzy number neural network, the inputs, weights, and outputs are general fuzzy numbers. The requirement that F̄α(1) ⊂ F̄α(2) whenever α(1) > α(2) imposes an enormous number of constraints on the weight parameterizations during training. This problem can be solved through a careful choice of weight representation. This new representation is unconstrained, so that standard neural network training techniques may be applied. Unfortunately, fuzzy number neural networks still have many parameters to pick during training, since each weight is represented by a vector. Thus moderate to large fuzzy number neural networks suffer from the usual maladies of very large neural networks. In this paper, we discuss a method for effectively reducing the dimensionality of networks during training. Each fuzzy number weight is represented by the endpoints of its α-cuts for some discretization 0 ⩽ α1 < α2 < ... < αn ⩽ 1. To reduce dimensionality, training is first done using only a small subset of the αi. After successful training, linear interpolation is used to estimate additional α-cut endpoints. The network is then retrained to tune these interpolated values. This refinement is repeated as needed until the network is fully trained at the desired discretization in α.
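A minimal sketch of the refinement step, assuming each fuzzy weight is stored as left and right α-cut endpoints (the function and variable names are hypothetical):

```python
import numpy as np

def refine_alpha_cuts(coarse_alphas, lefts, rights, fine_alphas):
    """Estimate additional alpha-cut endpoints by linear interpolation.

    lefts/rights hold the trained left and right endpoints at the coarse
    discretization; the interpolated values seed the retraining pass."""
    return (np.interp(fine_alphas, coarse_alphas, lefts),
            np.interp(fine_alphas, coarse_alphas, rights))

# A fuzzy weight trained at alpha in {0, 0.5, 1}, refined to a 5-point grid
coarse = [0.0, 0.5, 1.0]
lefts, rights = [1.0, 1.5, 2.0], [3.0, 2.5, 2.0]
fine = [0.0, 0.25, 0.5, 0.75, 1.0]
new_lefts, new_rights = refine_alpha_cuts(coarse, lefts, rights, fine)
```

The interpolated endpoints inherit the nesting property, since linear interpolation preserves the monotonicity of the left and right endpoint sequences.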

    Fuzzy Probability for System Reliability

    Fuzzy fault trees provide a powerful and computationally efficient technique for developing fuzzy probabilities based on independent inputs. The probability of any event that can be described in terms of a sequence of independent unions, intersections, and complements may be calculated by a fuzzy fault tree. Unfortunately, fuzzy fault trees do not provide a complete theory: many events of substantial practical interest cannot be described only by independent operations. In this paper, we introduce a new extension of crisp probability theory. Our model is based on n independent inputs, each with a fuzzy probability. The elements of our sample space describe exactly which of the n input events did and did not occur. Our extension is complete, since a fuzzy probability is assigned to every subset of the sample space. Our extension is also consistent with all calculations that can be arranged as a fault tree.
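On a single α-cut, the fault tree calculation reduces to interval arithmetic: each gate formula is monotone in its arguments for independent events, so endpoint-wise evaluation gives the exact output interval. A minimal sketch, with illustrative gate names:

```python
# Fuzzy fault tree gates on one alpha-cut. A fuzzy probability is reduced
# to its alpha-cut interval (lo, hi) in [0, 1].

def and_gate(p, q):
    """Intersection of independent events: probabilities multiply."""
    return (p[0] * q[0], p[1] * q[1])

def or_gate(p, q):
    """Union of independent events: p + q - p*q, applied endpoint-wise
    (monotone in each argument, so endpoints map to endpoints)."""
    return (p[0] + q[0] - p[0] * q[0], p[1] + q[1] - p[1] * q[1])

def not_gate(p):
    """Complement: 1 - p, which swaps the endpoints."""
    return (1.0 - p[1], 1.0 - p[0])

# Top event = (A AND B) OR (NOT C), with A, B, C independent
a, b, c = (0.1, 0.2), (0.3, 0.4), (0.8, 0.9)
top = or_gate(and_gate(a, b), not_gate(c))  # approx. (0.127, 0.264)
```

Note that this endpoint-wise shortcut is exactly what fails when the same input appears more than once in an expression, which is the incompleteness the abstract describes.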

    Fuzzy Number Neural Networks

    This paper presents a practical algorithm for training neural networks with fuzzy number weights, inputs, and outputs. Typically, fuzzy number neural networks are difficult to train because of the many α-cut constraints implied by the fuzzy weights. A transformation is used to eliminate these constraints and allow use of standard unconstrained optimization methods. The algorithm is demonstrated on a three-layer network.
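One way such a constraint-eliminating transformation can look (an illustrative assumption; the paper's exact mapping is not reproduced here): parameterize each widening step of the α-cuts by its exponential, so the nesting constraint holds by construction and any unconstrained optimizer applies.

```python
import numpy as np

def nested_cuts(theta, core):
    """Map an unconstrained vector theta to nested alpha-cut endpoints
    around a core value (the alpha = 1 point). Exponentials make every
    widening step positive, so nesting holds automatically -- no explicit
    constraints are needed during gradient-based training.

    This parameterization is a sketch, not necessarily the paper's
    transformation."""
    n = len(theta) // 2
    left_steps = np.exp(theta[:n])           # > 0 for any real theta
    right_steps = np.exp(theta[n:])          # > 0 for any real theta
    lefts = core - np.cumsum(left_steps)     # endpoints for decreasing alpha
    rights = core + np.cumsum(right_steps)
    return lefts, rights

lefts, rights = nested_cuts(np.zeros(6), core=2.0)
# lefts = [1., 0., -1.], rights = [3., 4., 5.]: each lower-alpha cut
# contains all higher-alpha cuts
```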

    Training Fuzzy Number Neural Networks Using Constrained Backpropagation

    Few training techniques are available for neural networks with fuzzy number weights, inputs, and outputs. Typically, fuzzy number neural networks are difficult to train because of the many α-cut constraints implied by the fuzzy weights. In this paper, we introduce a weight representation that simplifies the constraint equations. A constrained form of backpropagation is then developed for fuzzy number neural networks. Standard backpropagation may be viewed as a constrained optimization of the linearization of the weight function. Our weight representation allows use of the additional α-cut constraints during a weight update.
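To make the role of the α-cut constraints in a weight update concrete, here is a crude sketch that restores nesting after an unconstrained gradient step by projection. This is a stand-in illustration, not the constrained backpropagation developed in the paper:

```python
import numpy as np

def project_nesting(lefts, rights):
    """Repair alpha-cut nesting after an unconstrained weight update.

    Endpoints are listed from alpha = 1 down to alpha = 0, so left
    endpoints must be non-increasing and right endpoints non-decreasing.
    Assumes the update has not pushed any left endpoint past its right
    endpoint."""
    lefts = np.minimum.accumulate(np.asarray(lefts, dtype=float))
    rights = np.maximum.accumulate(np.asarray(rights, dtype=float))
    return lefts, rights

# A gradient step broke the nesting at the middle alpha level; project back
l, r = project_nesting([2.0, 2.1, 1.5], [2.0, 2.4, 2.3])
```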

    A Theory of Independent Fuzzy Probability for System Reliability

    Fuzzy fault trees provide a powerful and computationally efficient technique for developing fuzzy probabilities based on independent inputs. The probability of any event that can be described in terms of a sequence of independent unions, intersections, and complements may be calculated by a fuzzy fault tree. Unfortunately, fuzzy fault trees do not provide a complete theory: many events of substantial practical interest cannot be described only by independent operations. Thus, the standard fuzzy extension (based on fuzzy fault trees) is not complete, since not all events are assigned a fuzzy probability. Other complete extensions have been proposed, but these extensions are not consistent with the calculations from fuzzy fault trees. In this paper, we propose a new extension of crisp probability theory. Our model is based on n independent inputs, each with a fuzzy probability. The elements of our sample space describe exactly which of the n input events did and did not occur. Our extension is complete, since a fuzzy probability is assigned to every subset of the sample space. Our extension is also consistent with all calculations that can be arranged as a fault tree. Our approach allows the reliability analyst to develop complete and consistent fuzzy reliability models from existing crisp reliability models. This allows a comprehensive analysis of the system. Computational algorithms are provided both to extend existing models and develop new models. The technique is demonstrated on a reliability model of a three-stage industrial process.
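The sample-space construction is easiest to see in the crisp case: for n independent inputs there are 2^n atoms, each recording which inputs occurred, with probability given by a product. The sketch below enumerates these atoms; the fuzzy extension, which assigns a fuzzy probability to every subset of these atoms, is not attempted here.

```python
from itertools import product

def atom_probabilities(p):
    """Sample space for n independent events with crisp probabilities p.

    Each atom is a tuple of booleans saying which events occurred; its
    probability is the product of p_i (occurred) or 1 - p_i (did not)."""
    atoms = {}
    for outcome in product([False, True], repeat=len(p)):
        prob = 1.0
        for occurred, p_i in zip(outcome, p):
            prob *= p_i if occurred else (1.0 - p_i)
        atoms[outcome] = prob
    return atoms

# Two independent events: 4 atoms whose probabilities sum to 1
atoms = atom_probabilities([0.5, 0.2])
```

Any event, independent-describable or not, is a subset of these atoms; assigning each subset a fuzzy probability is what makes the extension complete.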