
    Learning Two-input Linear and Nonlinear Analog Functions with a Simple Chemical System

    Current biochemical information processing systems behave in a predetermined manner because all of their features are defined during the design phase. To make such unconventional computing systems reusable and programmable for biomedical applications, adaptation, learning, and self-modification based on external stimuli would be highly desirable. However, so far it has been too challenging to implement these in real or simulated chemistries. In this paper we extend the chemical perceptron, a model previously proposed by the authors, to function as an analog instead of a binary system. The new analog asymmetric signal perceptron learns through feedback and supports Michaelis-Menten kinetics. The results show that our perceptron is able to learn linear and nonlinear (quadratic) functions of two inputs. To the best of our knowledge, it is the first simulated chemical system capable of doing so. The small number of species and reactions allows for a mapping to an actual wet implementation using DNA-strand displacement or deoxyribozymes. Our results are an important step toward actual biochemical systems that can learn and adapt.
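    As a rough illustration of the learning scheme described above (and not the authors' actual chemical reaction network), the sketch below trains an analog two-input perceptron by error feedback and passes its weighted sum through a Michaelis-Menten-style saturating readout; the rate constants, learning rate, and target function are illustrative assumptions.

```python
# Minimal sketch (not the paper's reaction network): an analog two-input
# perceptron trained by error feedback, with a Michaelis-Menten-style
# saturation on the readout. All constants and names are illustrative.
import random

def michaelis_menten(s, v_max=2.0, k_m=1.0):
    """Saturating (hyperbolic) readout: v = v_max * s / (k_m + s)."""
    return v_max * s / (k_m + s) if s > 0 else 0.0

def train(target_fn, epochs=5000, lr=0.01):
    """Error-feedback training of a two-input analog perceptron."""
    w = [random.uniform(0.0, 1.0) for _ in range(3)]   # bias + two input weights
    for _ in range(epochs):
        x1, x2 = random.uniform(0.0, 1.0), random.uniform(0.0, 1.0)
        s = w[0] + w[1] * x1 + w[2] * x2      # weighted sum of the two inputs
        y = michaelis_menten(s)               # saturated analog output
        err = target_fn(x1, x2) - y           # external feedback signal
        w[0] += lr * err                      # corrective updates driven by the error
        w[1] += lr * err * x1
        w[2] += lr * err * x2
    return w

# Example: learn a linear function of two inputs, f(x1, x2) = 0.4*x1 + 0.3*x2.
weights = train(lambda a, b: 0.4 * a + 0.3 * b)
```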

    Memristors for the Curious Outsiders

    We present both an overview and a perspective of recent experimental advances and proposed new approaches to performing computation using memristors. A memristor is a 2-terminal passive component with a dynamic resistance that depends on an internal parameter. We provide a brief historical introduction, as well as an overview of the physical mechanisms that lead to memristive behavior. This review is meant to guide non-practitioners in the field of memristive circuits and their connection to machine learning and neural computation. Comment: Perspective paper for MDPI Technologies; 43 pages.
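    For a concrete picture of the "internal parameter" mentioned above, here is a minimal numerical sketch of the idealized linear ion-drift memristor model commonly used as a textbook introduction; it is not taken from the review, and all parameter values are illustrative.

```python
# Sketch of an idealized memristor (linear ion-drift model): the resistance
# depends on an internal state x in [0, 1] that is driven by the current.
# Parameter values are illustrative, not taken from the review.
import math

R_ON, R_OFF = 100.0, 16e3    # bounding resistances (ohms)
MU = 1e4                     # lumped mobility/geometry factor (illustrative)

def simulate(v_of_t, t_end=0.1, dt=1e-5, x=0.5):
    """Integrate dx/dt = MU * i(t) with R(x) = R_ON * x + R_OFF * (1 - x)."""
    t, history = 0.0, []
    while t < t_end:
        v = v_of_t(t)
        r = R_ON * x + R_OFF * (1.0 - x)          # state-dependent resistance
        i = v / r                                  # Ohm's law at this instant
        x = min(1.0, max(0.0, x + MU * i * dt))    # state drift, clipped to [0, 1]
        history.append((t, v, i, r))
        t += dt
    return history

# A sinusoidal drive traces the characteristic pinched hysteresis loop
# in the i-v plane.
trace = simulate(lambda t: 1.5 * math.sin(2 * math.pi * 50.0 * t))
```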

    Principles of Neuromorphic Photonics

    In an age overrun with information, the ability to process reams of data has become crucial. The demand for data will continue to grow as smart gadgets multiply and become increasingly integrated into our daily lives. Next-generation industries in artificial intelligence services and high-performance computing are so far supported by microelectronic platforms. These data-intensive enterprises rely on continual improvements in hardware. Their prospects are running up against a stark reality: conventional one-size-fits-all solutions offered by digital electronics can no longer satisfy this need, as Moore's law (exponential hardware scaling), interconnection density, and the von Neumann architecture reach their limits. With its superior speed and reconfigurability, analog photonics can provide some relief to these problems; however, complex applications of analog photonics have remained largely unexplored due to the absence of a robust photonic integration industry. Recently, the landscape for commercially manufacturable photonic chips has been changing rapidly and now promises to achieve economies of scale previously enjoyed solely by microelectronics. The scientific community has set out to build bridges between the domains of photonic device physics and neural networks, giving rise to the field of neuromorphic photonics. This article reviews recent progress in integrated neuromorphic photonics. We provide an overview of neuromorphic computing, discuss the associated technology (microelectronic and photonic) platforms, and compare their performance metrics. We discuss photonic neural network approaches and challenges for integrated neuromorphic photonic processors while providing an in-depth description of photonic neurons and a candidate interconnection architecture. We conclude with a future outlook of neuro-inspired photonic processing. Comment: 28 pages, 19 figures.
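    One candidate interconnection architecture discussed in this literature is a broadcast-and-weight scheme; the following is a highly abstracted numerical sketch (an assumption for illustration, not a device model from the article) in which wavelength channels are weighted, summed on a detector, and passed through a saturating nonlinearity.

```python
# Highly abstracted sketch of a broadcast-and-weight style photonic neuron:
# per-wavelength powers are weighted, summed, and passed through a saturating
# transfer function. Names and values are illustrative, not device parameters.
import numpy as np

def photonic_neuron(channel_powers, weights, bias=0.0):
    """Weighted sum of per-wavelength optical powers followed by a saturating
    transfer function (an abstraction of balanced detection driving an
    electro-optic modulator)."""
    detected = float(np.dot(weights, channel_powers)) + bias   # summed photocurrent
    return 1.0 / (1.0 + np.exp(-detected))                     # saturating output

# One neuron listening to four WDM channels (all values illustrative).
powers = np.array([0.2, 0.8, 0.1, 0.5])
w = np.array([0.9, -0.4, 0.3, 0.7])
print(photonic_neuron(powers, w))
```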

    A Digital Neuromorphic Architecture Efficiently Facilitating Complex Synaptic Response Functions Applied to Liquid State Machines

    Information in neural networks is represented as weighted connections, or synapses, between neurons. This poses a problem, as the primary computational bottleneck for neural networks is the vector-matrix multiply when inputs are multiplied by the network weights. Conventional processing architectures are not well suited to simulating neural networks, often requiring large amounts of energy and time. Additionally, synapses in biological neural networks are not binary connections, but exhibit a nonlinear response function as neurotransmitters are emitted and diffuse between neurons. Inspired by neuroscience principles, we present a digital neuromorphic architecture, the Spiking Temporal Processing Unit (STPU), capable of modeling arbitrarily complex synaptic response functions without requiring additional hardware components. We consider the paradigm of spiking neurons with temporally coded information, as opposed to the non-spiking rate-coded neurons used in most neural networks. In this paradigm we examine liquid state machines applied to speech recognition and show how a liquid state machine with temporal dynamics maps onto the STPU, demonstrating the flexibility and efficiency of the STPU for instantiating neural algorithms. Comment: 8 pages, 4 figures; preprint of 2017 IJCNN.
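    To make the contrast between binary weights and temporally extended synaptic responses concrete, the sketch below convolves a spike train with a double-exponential synaptic kernel, the kind of overlapping response a liquid state machine's reservoir relies on; the kernel shape, time constants, and spike times are illustrative assumptions, not the STPU's actual implementation.

```python
# Illustration of a temporally extended synaptic response (not the STPU's
# hardware implementation): incoming spikes are convolved with a
# double-exponential kernel instead of contributing an instantaneous weight.
import numpy as np

DT = 1e-4                          # simulation step (s)
TAU_RISE, TAU_DECAY = 1e-3, 8e-3   # illustrative synaptic time constants (s)

def synaptic_response(spike_times, weight, t_end=0.05):
    """Post-synaptic current produced by a spike train via a double-exponential kernel."""
    t = np.arange(0.0, t_end, DT)
    current = np.zeros_like(t)
    for t_spike in spike_times:
        after = t >= t_spike
        elapsed = t[after] - t_spike
        # fast rise, slower decay: a non-binary, temporally extended synapse
        current[after] += weight * (np.exp(-elapsed / TAU_DECAY)
                                    - np.exp(-elapsed / TAU_RISE))
    return t, current

# Three input spikes produce overlapping post-synaptic currents, giving the
# reservoir its temporal mixing of past and present inputs.
t, i_syn = synaptic_response([0.005, 0.012, 0.015], weight=1.0)
```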

    Predicting Health Impacts of the World Trade Center Disaster: 1. Halogenated hydrocarbons, symptom syndromes, secondary victimization, and the burdens of history

    The recent attack on the World Trade Center, in addition to direct injury and psychological trauma, has exposed a vast population to dioxins, dibenzofurans, related endocrine disruptors, and a multitude of other physiologically active chemicals arising from the decomposition of the massive quantities of halogenated hydrocarbons and other plastics within the affected buildings. The impacts of these chemical species have been compounded by exposure to asbestos, fiberglass, crushed glass, concrete, plastic, and other irritating dusts. To address the manifold complexities of this incident we combine recent theoretical perspectives on immune, CNS, and sociocultural cognition with empirical studies on survivors of past large toxic fires, other community-scale chemical exposure incidents, and the aftereffects of war. Our analysis suggests the appearance of complex, but distinct and characteristic, spectra of synergistically linked social, psychosocial, psychological, and physical symptoms among the 100,000 or so persons most directly affected by the WTC attack. These different 'eigenpatterns' should become increasingly comorbid as a function of exposure. The expected outcome greatly transcends a simple 'Post-Traumatic Stress Disorder' model, and may resemble a particularly acute form of Gulf War Syndrome. We explore the role of external social factors in subsequent exacerbation of the syndrome -- secondary victimization -- and study the path-dependent influence of individual and community-level historical patterns of stress. We suggest that workplace and other organizations can act as ameliorating intermediaries. Those without access to such buffering structures appear to face a particularly bleak future.

    ART 2-A: An Adaptive Resonance Algorithm for Rapid Category Learning and Recognition

    This article introduces ART 2-A, an efficient algorithm that emulates the self-organizing pattern recognition and hypothesis testing properties of the ART 2 neural network architecture, but at a speed two to three orders of magnitude faster. Analysis and simulations show how the ART 2-A systems correspond to ART 2 dynamics at both the fast-learn limit and at intermediate learning rates. Intermediate learning rates permit fast commitment of category nodes but slow recoding, analogous to properties of word frequency effects, encoding specificity effects, and episodic memory. Better noise tolerance is hereby achieved without a loss of learning stability. The ART 2 and ART 2-A systems are contrasted with the leader algorithm. The speed of ART 2-A makes practical the use of ART 2 modules in large-scale neural computation. Funding: BP (89-A-1204); Defense Advanced Research Projects Agency (90-0083); National Science Foundation (IRI-90-00530); Air Force Office of Scientific Research (90-0175, 90-0128); Army Research Office (DAAL-03-88-K0088).
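    A minimal sketch of the fast-commitment/slow-recoding idea described above, simplified relative to the published ART 2-A algorithm; the vigilance value, recoding rate, and normalization details here are illustrative assumptions.

```python
# Simplified ART 2-A style category learning: inputs are normalized, a
# category is chosen by dot-product similarity, a vigilance test decides
# commit vs. recode, and committed prototypes are recoded slowly.
import numpy as np

class ART2A:
    def __init__(self, vigilance=0.9, beta=0.1):
        self.rho = vigilance      # match (vigilance) threshold
        self.beta = beta          # slow recoding rate (1.0 = fast-learn limit)
        self.prototypes = []      # one normalized prototype per category

    @staticmethod
    def _norm(x):
        x = np.asarray(x, dtype=float)
        return x / (np.linalg.norm(x) + 1e-12)

    def learn(self, x):
        x = self._norm(x)
        if self.prototypes:
            sims = [float(np.dot(p, x)) for p in self.prototypes]
            j = int(np.argmax(sims))
            if sims[j] >= self.rho:                  # resonance: recode slowly
                p = (1 - self.beta) * self.prototypes[j] + self.beta * x
                self.prototypes[j] = self._norm(p)
                return j
        self.prototypes.append(x)                    # mismatch: fast commitment
        return len(self.prototypes) - 1

net = ART2A()
for pattern in ([1, 0, 0.1], [0.9, 0.1, 0], [0, 1, 1]):
    print(net.learn(pattern))   # prints the category index chosen for each input
```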
