
    A Neural Network Realization of Fuzzy ART

    A neural network realization of the fuzzy Adaptive Resonance Theory (ART) algorithm is described. Fuzzy ART is capable of rapid stable learning of recognition categories in response to arbitrary sequences of analog or binary input patterns. Fuzzy ART incorporates computations from fuzzy set theory into the ART 1 neural network, which learns to categorize only binary input patterns, thus enabling the network to learn both analog and binary input patterns. In the neural network realization of fuzzy ART, signal transduction obeys a path capacity rule. Category choice is determined by a combination of bottom-up signals and learned category biases. Top-down signals impose upper bounds on feature node activations.

    British Petroleum (89-A-1204); Defense Advanced Research Projects Agency (90-0083); National Science Foundation (IRI 90-00530); Office of Naval Research (N00014-91-J-4100); Air Force Office of Scientific Research (90-0175)
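    The category choice, match, and learning steps summarized above follow the standard fuzzy ART equations. The sketch below is a minimal illustration under that assumption, not the paper's own neural network realization; the class name FuzzyART and the parameters alpha (choice), beta (learning rate), and rho (vigilance) are conventional labels rather than quantities taken from the abstract.

        import numpy as np

        # Minimal fuzzy ART sketch (standard choice/match/learning equations).
        # alpha = choice parameter, beta = learning rate, rho = vigilance.
        class FuzzyART:
            def __init__(self, alpha=0.001, beta=1.0, rho=0.75):
                self.alpha, self.beta, self.rho = alpha, beta, rho
                self.weights = []                      # one weight vector per committed category

            def train(self, pattern):
                # Complement coding keeps |I| constant and limits category drift.
                I = np.concatenate([pattern, 1.0 - pattern])
                # Choice function T_j = |I ^ w_j| / (alpha + |w_j|), with ^ = fuzzy AND (min).
                T = [np.minimum(I, w).sum() / (self.alpha + w.sum()) for w in self.weights]
                for j in np.argsort(T)[::-1]:          # search categories in order of choice
                    w = self.weights[j]
                    if np.minimum(I, w).sum() / I.sum() >= self.rho:   # vigilance (match) test
                        self.weights[j] = self.beta * np.minimum(I, w) + (1 - self.beta) * w
                        return j
                self.weights.append(I.copy())          # no category matched: commit a new one
                return len(self.weights) - 1

        net = FuzzyART()
        for x in ([0.1, 0.9], [0.15, 0.85], [0.9, 0.1]):
            print(net.train(np.array(x)))              # prints 0, 0, 1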

    dARTMAP: A Neural Network for Fast Distributed Supervised Learning

    Distributed coding at the hidden layer of a multi-layer perceptron (MLP) endows the network with memory compression and noise tolerance capabilities. However, an MLP typically requires slow off-line learning to avoid catastrophic forgetting in an open input environment. An adaptive resonance theory (ART) model is designed to guarantee stable memories even with fast on-line learning. However, ART stability typically requires winner-take-all coding, which may cause category proliferation in a noisy input environment. Distributed ARTMAP (dARTMAP) seeks to combine the computational advantages of MLP and ART systems in a real-time neural network for supervised learning. An implementation algorithm here describes one class of dARTMAP networks. This system incorporates elements of the unsupervised dART model as well as new features, including a content-addressable memory (CAM) rule for improved contrast control at the coding field. A dARTMAP system reduces to fuzzy ARTMAP when coding is winner-take-all. Simulations show that dARTMAP retains fuzzy ARTMAP accuracy while significantly improving memory compression.

    National Science Foundation (IRI-94-01659); Office of Naval Research (N00014-95-1-0409, N00014-95-0657)
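    The abstract contrasts distributed coding with the winner-take-all limit in which dARTMAP reduces to fuzzy ARTMAP. The toy sketch below is not the paper's CAM rule; it assumes a simple power-law contrast control (exponent p) purely to show how a distributed code field collapses toward winner-take-all as contrast increases.

        import numpy as np

        # Illustrative contrast control: raise bottom-up signals to a power p and
        # normalize. Small p spreads activation across coding nodes; large p
        # approaches winner-take-all, the limit in which dARTMAP reduces to
        # fuzzy ARTMAP. This stand-in is an assumption, not the dARTMAP CAM rule.
        def distributed_code(signals, p):
            s = np.asarray(signals, dtype=float) ** p
            return s / s.sum()

        bottom_up = [0.9, 0.7, 0.2]
        for p in (1, 4, 50):
            print(p, np.round(distributed_code(bottom_up, p), 3))
        # p = 1  -> broadly distributed code
        # p = 50 -> essentially winner-take-all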

    A Symbolic Sensor for an Antilock Brake System of a Commercial Aircraft

    The design of a symbolic sensor that identifies the condition of the runway surface (dry, wet, icy, etc.) during the braking of a commercial aircraft is discussed. The purpose of such a sensor is to generate qualitative, real-time information about the runway surface to be integrated into a future aircraft Antilock Braking System (ABS). This information can be expected to significantly improve ABS performance. For the design of the symbolic sensor, different classification techniques based upon fuzzy set theory and neural networks are proposed. Data recorded from recent braking tests were used to develop and verify these classification algorithms. The results show that the symbolic sensor is able to correctly identify the surface condition. Overall, the application example considered in this paper demonstrates that symbolic information processing using fuzzy logic and neural networks has the potential to provide new functions in control system design. This paper is part of a joint research project between E.N.S.I.C.A. and Aerospatiale in France to study the role of fuzzy set theory in potential applications for future aircraft control systems.
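    As a concrete illustration of the symbolic-sensor idea, the sketch below maps an estimated tire-runway friction coefficient to the linguistic surface labels mentioned in the abstract (dry, wet, icy). The triangular membership functions and their breakpoints are invented for this illustration; the paper's actual fuzzy and neural classifiers are not specified here.

        # Toy fuzzy "symbolic sensor": maps an estimated friction coefficient mu
        # to linguistic runway-surface labels. Membership shapes and breakpoints
        # are assumptions for illustration only.
        def tri(x, a, b, c):
            """Triangular membership function rising from a, peaking at b, falling to c."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def surface_memberships(mu):
            return {
                "icy": tri(mu, -0.1, 0.05, 0.2),
                "wet": tri(mu, 0.1, 0.3, 0.55),
                "dry": tri(mu, 0.45, 0.8, 1.1),
            }

        for mu in (0.08, 0.35, 0.75):
            grades = surface_memberships(mu)
            print(f"mu={mu:.2f} ->", max(grades, key=grades.get), grades)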

    Neuro-Fuzzy Computing System with the Capacity of Implementation on Memristor-Crossbar and Optimization-Free Hardware Training

    In this paper, we first present a new explanation of the relations between logical circuits and artificial neural networks, between logical circuits and fuzzy logic, and between artificial neural networks and fuzzy inference systems. Then, based on these results, we propose a new neuro-fuzzy computing system that can be effectively implemented on the memristor-crossbar structure. One important feature of the proposed system is that its hardware can be trained directly using the Hebbian learning rule, without the need for any optimization. The system also handles very large numbers of input-output training samples without facing problems such as overtraining.

    Comment: 16 pages, 11 images, submitted to IEEE Trans. on Fuzzy Systems
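    The claim that the hardware can be trained directly with the Hebbian learning rule, without any optimization, is illustrated below with a generic one-pass Hebbian (outer-product) associative memory; this is an assumed stand-in, not the paper's memristor-crossbar training procedure, and all variable names are invented.

        import numpy as np

        # Generic optimization-free Hebbian training: weights are set in a single
        # pass by the outer-product rule, with no gradient-descent loop.
        rng = np.random.default_rng(0)
        n_pairs, n_in, n_out = 4, 16, 8
        X = rng.choice([-1.0, 1.0], size=(n_pairs, n_in))     # bipolar input patterns
        Y = rng.choice([-1.0, 1.0], size=(n_pairs, n_out))    # associated output patterns

        W = Y.T @ X                                            # Hebbian outer-product weights

        recall = np.sign(X @ W.T)                              # sign readout of stored pairs
        print("fraction of output bits recovered:", (recall == Y).mean())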

    A Distributed Outstar Network for Spatial Pattern Learning

    The distributed outstar, a generalization of the outstar neural network for spatial pattern learning, is introduced. In the outstar, signals from a source node cause weights to learn and recall arbitrary patterns across a target field of nodes. The distributed outstar replaces the outstar source node with a source field of arbitrarily many nodes, whose activity pattern may be arbitrarily distributed or compressed. Learning proceeds according to a principle of atrophy due to disuse, whereby a path weight decreases in joint proportion to the transmitted path signal and the degree of disuse of the target node. During learning, the total signal to a target node converges toward that node's activity level. Weight changes at a node are apportioned according to the distributed pattern of converging signals. Three synaptic transmission functions, defined by a product rule, a capacity rule, and a threshold rule, are examined for this system. The three rules are computationally equivalent when source field activity is maximally compressed, or winner-take-all. When source field activity is distributed, catastrophic forgetting may occur. Only the threshold rule solves this problem. Analysis of spatial pattern learning by distributed codes thereby leads to the conjecture that the unit of long-term memory in such a system is an adaptive threshold, rather than the multiplicative path weight widely used in neural models.

    British Petroleum (89-A-1204); Advanced Research Projects Agency (ONR N00014-92-J-4015); National Science Foundation (IRI-90-00530); Office of Naval Research (N00014-91-J-4100)
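    The "atrophy due to disuse" principle can be sketched directly from the abstract's verbal statement: each path weight decreases in joint proportion to its transmitted signal and the degree of disuse of the target node, so the total signal to that node relaxes toward the node's activity level. The code below is a minimal numerical illustration of that statement under a simple product-rule transmission; the variable names and constants are assumptions, not the paper's equations.

        import numpy as np

        # Atrophy due to disuse (as described in the abstract): weights decrease in
        # joint proportion to the transmitted signal and the excess of total signal
        # over the target node's activity, so the total signal decays toward it.
        rng = np.random.default_rng(1)
        source = rng.random(5)
        source /= source.sum()                    # distributed source field activity
        weights = rng.random(5)                   # path weights onto one target node
        target_activity, lr = 0.2, 0.2

        for _ in range(2000):
            signals = source * weights            # product-rule transmission
            disuse = max(signals.sum() - target_activity, 0.0)
            weights -= lr * signals * disuse      # joint proportion: signal x disuse

        print("total signal:", round((source * weights).sum(), 3),
              "target activity:", target_activity)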

    Distributed Activation, Search, and Learning by ART and ARTMAP Neural Networks

    Adaptive resonance theory (ART) models have been used for learning and prediction in a wide variety of applications. Winner-take-all coding allows these networks to maintain stable memories, but this type of code representation can cause problems such as category proliferation with fast learning and a noisy training set. A new class of ART models with an arbitrarily distributed code representation is outlined here. With winner-take-all coding, the unsupervised distributed ART model (dART) reduces to fuzzy ART and the supervised distributed ARTMAP model (dARTMAP) reduces to fuzzy ARTMAP. dART automatically apportions learned changes according to the degree of activation of each node, which permits fast as well as slow learning with compressed or distributed codes. Distributed ART models replace the traditional neural network path weight with a dynamic weight equal to the rectified difference between coding node activation and an adaptive threshold. Dynamic weights that project to coding nodes obey a distributed instar learning law, and those that originate from coding nodes obey a distributed outstar learning law. Inputs activate distributed codes through phasic and tonic signal components with dual computational properties, and a parallel distributed match-reset-search process helps stabilize memory.

    National Science Foundation (IRI 94-01659); Office of Naval Research (N00014-95-1-0409, N00014-95-0657)
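    The dynamic weight is defined in the abstract itself as the rectified difference between a coding node activation and an adaptive threshold. The snippet below simply evaluates that definition on made-up values; the distributed instar and outstar laws that adapt the thresholds are not shown.

        import numpy as np

        # Distributed ART dynamic weight: rectified difference [y_j - tau_ij]^+
        # between coding node activation y_j and an adaptive threshold tau_ij.
        def dynamic_weight(y, tau):
            return np.maximum(np.asarray(y) - np.asarray(tau), 0.0)

        y = np.array([0.9, 0.4, 0.1])       # coding node activations (a distributed code)
        tau = np.array([0.3, 0.5, 0.05])    # adaptive thresholds (illustrative values)
        print(dynamic_weight(y, tau))       # -> [0.6  0.   0.05]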

    Adaptive Resonance Theory

    SyNAPSE program of the Defense Advanced Research Projects Agency (Hewlett-Packard Company, subcontract under DARPA prime contract HR0011-09-3-0001, and HRL Laboratories LLC, subcontract #801881-BS under DARPA prime contract HR0011-09-C-0001); CELEST, an NSF Science of Learning Center (SBE-0354378)