Teaching Memory Circuit Elements via Experiment-Based Learning
The class of memory circuit elements, which comprises memristive,
memcapacitive, and meminductive systems, is gaining considerable attention in a
broad range of disciplines. This is due to the enormous flexibility these
elements provide in solving diverse problems in analog/neuromorphic and
digital/quantum computation: they enable an integrated computing-memory
paradigm, the massively parallel solution of diverse optimization problems,
learning, neural networks, and more. The time is therefore
ripe to introduce these elements to the next generation of physicists and
engineers with appropriate teaching tools that can be easily implemented in
undergraduate teaching laboratories. In this paper, we suggest the use of
easy-to-build emulators to provide a hands-on experience for the students to
learn the fundamental properties and realize several applications of these
memelements. We provide explicit examples of problems that could be tackled
with these emulators that range in difficulty from the demonstration of the
basic properties of memristive, memcapacitive, and meminductive systems to
logic/computation and cross-bar memory. The emulators can be built from
off-the-shelf components, with a total cost of a few tens of dollars, thus
providing a relatively inexpensive platform for the implementation of these
exercises in the classroom. We anticipate that this experiment-based learning
can be easily adopted and expanded by the instructors with many more case
studies.
Comment: IEEE Circuits and Systems Magazine (in press)
Memristors for the Curious Outsiders
We present both an overview and a perspective of recent experimental advances
and proposed new approaches to performing computation using memristors. A
memristor is a 2-terminal passive component with a dynamic resistance depending
on an internal parameter. We provide a brief historical introduction, as well
as an overview of the physical mechanisms that lead to memristive behavior.
This review is meant to guide nonpractitioners in the field of memristive
circuits and their connection to machine learning and neural computation.
Comment: Perspective paper for MDPI Technologies; 43 pages
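The defining behavior described above (a two-terminal element whose resistance depends on an internal state variable) can be sketched with a minimal linear-drift-style model; all parameter values here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Minimal sketch of a memristor: resistance interpolates between two bounds
# according to an internal state x in [0, 1] that drifts with the charge
# passed through the device. Parameter values are arbitrary assumptions.
R_ON, R_OFF = 100.0, 16e3   # bounding resistances, ohms (assumed)
MU = 1e4                    # state drift coefficient, chosen large enough
                            # that the state moves visibly within one cycle

def simulate(v_of_t, t):
    """Integrate the internal state under a driving voltage; return i(t)."""
    x = 0.5                              # initial internal state (assumed)
    dt = t[1] - t[0]
    i_out = np.empty_like(t)
    for k, tk in enumerate(t):
        r = R_ON * x + R_OFF * (1.0 - x)     # memristance from state
        i = v_of_t(tk) / r                   # Ohm's law at this instant
        i_out[k] = i
        x = np.clip(x + MU * i * dt, 0.0, 1.0)  # state drifts with charge
    return i_out

t = np.linspace(0.0, 2.0, 20000)
i = simulate(lambda tk: np.sin(2 * np.pi * tk), t)  # 1 Hz sinusoidal drive
# Whenever v = 0 the current is also 0, so the i-v trajectory always passes
# through the origin: the "pinched" hysteresis loop that identifies a
# memristive element.
```

Plotting `i` against the driving voltage would show the two lobes of the pinched loop, which is exactly the kind of demonstration the emulators in the first paper target.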
Synaptic Behavior in Metal Oxide-Based Memristors
With the end of Moore’s law in sight, new computing paradigms are needed to fulfill the increasing demands on data and processing capabilities. Inspired by the operation of the human brain in its dimensionality, energy efficiency, and underlying functionality, neuromorphic computing systems built from circuit elements that mimic neurobiological activity are a promising way to meet this challenge. As an important component of a neuromorphic computer, the electronic synapse has been intensively studied. Transistors, atomic switches, and memristors have all been proposed to perform synaptic functions. Memristors, with several unique properties, are exceptional candidates for emulating artificial synapses and thus for building artificial neural networks. In this paper, metal oxide-based memristor synapses are reviewed, from materials, properties, and mechanisms to architecture. Synaptic plasticity and learning rules are described. The electrical switching characteristics of a variety of metal oxide-based memristors are discussed, with a focus on their application as biological synapses.
Analog Spiking Neuromorphic Circuits and Systems for Brain- and Nanotechnology-Inspired Cognitive Computing
Human society now faces grand challenges in satisfying the growing demand for computing power while keeping energy consumption sustainable. With CMOS technology scaling coming to an end, innovations are required to tackle these challenges in a radically different way. Inspired by the emerging understanding of computation in the brain and by nanotechnology-enabled, biologically plausible synaptic plasticity, neuromorphic computing architectures are being investigated. A neuromorphic chip that combines CMOS analog spiking neurons with nanoscale resistive random-access memory (RRAM) used as electronic synapses can provide massive neural-network parallelism, high density, and online learning capability, and hence paves the way towards energy-efficient real-time computing systems. However, existing silicon neuron approaches are designed to faithfully reproduce biological neuron dynamics, and are therefore either incompatible with RRAM synapses or require extensive peripheral circuitry to modulate a synapse, and are thus deficient in learning capability. As a result, they eliminate most of the density advantages gained by the adoption of nanoscale devices and fail to realize a functional computing system.
This dissertation describes novel hardware architectures and neuron circuit designs that synergistically assemble the fundamental and significant elements for brain-inspired computing. Versatile CMOS spiking neurons that combine integrate-and-fire operation, the capability to drive dense passive RRAM synapses, dynamic biasing for adaptive power consumption, in situ spike-timing-dependent plasticity (STDP), and competitive learning in compact integrated circuit modules are presented. Real-world pattern learning and recognition tasks using the proposed architecture were demonstrated with circuit-level simulations. A test chip was implemented and fabricated to verify the proposed CMOS neuron and hardware architecture, and the subsequent chip measurement results successfully proved the idea.
The work described in this dissertation realizes a key building block for large-scale integration of spiking neural network hardware and thus serves as a stepping stone towards next-generation energy-efficient brain-inspired cognitive computing systems.
The Department of Electrical and Computer Engineering Newsletter
Summer 2016
News and notes for the University of Dayton's Department of Electrical and Computer Engineering.
Organic electronics for neuromorphic computing
Neuromorphic computing could address the inherent limitations of conventional silicon technology in dedicated machine learning applications. Recent work on silicon-based asynchronous spiking neural networks and large crossbar arrays of two-terminal memristive devices has led to the development of promising neuromorphic systems. However, delivering a compact and efficient parallel computing technology, such as artificial neural networks embedded in hardware, remains a significant challenge. Organic electronic materials offer an attractive alternative for such systems and could provide biocompatible and relatively inexpensive neuromorphic devices with low-energy switching and excellent tunability. Here, we review the development of organic neuromorphic devices. We consider different resistance-switching mechanisms, which typically rely on electrochemical doping or charge trapping, and discuss the challenges the field faces in implementing low-power neuromorphic computing, including device downscaling and improvements in device speed, state retention, and array compatibility. We highlight early demonstrations of device integration into arrays and finally consider future directions and potential applications of this technology.
Memristor based neural networks: Feasibility, theories and approaches
Memristor-based neural networks refer to the utilisation of memristors, recently emerged nanoscale devices, in building neural networks.
The memristor was first postulated by Leon Chua in 1971 as the fourth fundamental passive circuit element and experimentally demonstrated by HP Labs in 2008. The memristor, short for memory resistor, has a peculiar memory effect which distinguishes it from resistors. Applying a bias voltage across a memristor changes its resistance, known as its memristance. In addition, the memristance is retained when the power supply is removed, which demonstrates the non-volatility of the memristor.
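The two properties just described, bias-driven change of memristance and retention at zero bias, can be illustrated with a toy model (units, parameter values, and the class itself are arbitrary assumptions for illustration):

```python
# Toy illustration (not from the thesis) of the two properties above: the
# memristance changes under an applied bias and is retained when the bias
# is removed. All units and parameter values are arbitrary assumptions.

R_ON, R_OFF = 100.0, 16000.0  # resistance bounds, ohms (assumed)

class ToyMemristor:
    def __init__(self, x=0.2):
        self.x = x  # internal state in [0, 1] (assumed initial value)

    @property
    def memristance(self):
        # Resistance interpolates between the two bounds with the state.
        return R_ON * self.x + R_OFF * (1.0 - self.x)

    def apply_bias(self, v, dt, mu=1e4):
        """Drive the state with a constant voltage v for duration dt."""
        i = v / self.memristance                       # Ohm's law
        self.x = min(1.0, max(0.0, self.x + mu * i * dt))

m = ToyMemristor()
r_before = m.memristance
m.apply_bias(v=1.0, dt=0.05)   # positive bias lowers the memristance
r_biased = m.memristance
m.apply_bias(v=0.0, dt=10.0)   # no bias: the state, and hence the
r_after = m.memristance        # memristance, is retained (non-volatility)
assert r_biased < r_before and r_after == r_biased
```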
Memristor-based neural networks are currently being researched in order to replace complementary metal-oxide-semiconductor (CMOS) devices in neuromorphic circuits with memristors and to investigate their potential applications. Current research primarily focuses on the utilisation of memristors as synaptic connections between neurons; however, in many applications it may be possible to let memristors perform computation in a natural way that avoids additional CMOS devices. Examples of such methods utilised in neural networks are presented in this thesis, such as memristor-based cellular neural network (CNN) structures, the memristive spike-timing-dependent plasticity (STDP) model, and the exploration of their potential applications.
This thesis presents a range of studies on memristor-based neural networks, from theory and feasibility to implementation approaches. The studies are divided into two parts: the utilisation of memristors in non-spiking neural networks and in spiking neural networks (SNNs). At the beginning of the thesis, the fundamentals of neural networks and memristors are explored, with an analysis of the physical properties and behaviour of memristors. In the studies of memristor-based non-spiking neural networks, a staircase memristor model is presented, based on memristors which have multi-level resistive states and a delayed-switching effect. This model is adapted to CNNs and echo state networks (ESNs) as applications that benefit from memristive implementations. In the studies of memristor-based SNNs, a trace-based memristive STDP model is proposed and discussed to overcome the incompatibility of the previous model with all-to-all spike interaction. The work also presents applications of the trace-based memristive model in associative learning with retention loss and in supervised learning.
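A generic trace-based STDP rule with all-to-all spike interaction can be sketched as follows. This is a textbook-style illustration only and does not reproduce the thesis's specific memristive model; the time constant, learning rates, and spike times are assumed:

```python
import math

# Generic trace-based STDP sketch. Each neuron keeps an exponentially
# decaying trace of its recent spikes; the weight is potentiated when the
# postsynaptic neuron fires (by an amount proportional to the presynaptic
# trace) and depressed when the presynaptic neuron fires (proportional to
# the postsynaptic trace). Because traces accumulate over all past spikes,
# this gives all-to-all rather than nearest-neighbor spike interaction.

TAU = 20.0                      # trace time constant, ms (assumed)
A_PLUS, A_MINUS = 0.01, 0.012   # learning rates (assumed)

def stdp(pre_spikes, post_spikes, w=0.5, dt=1.0, t_end=100.0):
    """Evolve a single weight given sets of pre/post spike times (ms)."""
    x_pre = x_post = 0.0
    t = 0.0
    while t < t_end:
        decay = math.exp(-dt / TAU)
        x_pre *= decay
        x_post *= decay
        if t in pre_spikes:
            x_pre += 1.0
            w -= A_MINUS * x_post   # depression: pre fires after post
        if t in post_spikes:
            x_post += 1.0
            w += A_PLUS * x_pre     # potentiation: post fires after pre
        t += dt
    return w

# Pre leading post by 5 ms yields net potentiation; the reverse ordering
# yields net depression, relative to the initial weight of 0.5.
w_ltp = stdp(pre_spikes={10.0, 50.0}, post_spikes={15.0, 55.0})
w_ltd = stdp(pre_spikes={15.0, 55.0}, post_spikes={10.0, 50.0})
assert w_ltp > 0.5 > w_ltd
```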
The computational results of experiments with different applications have shown that memristor-based neural networks will be advantageous in building synchronous or asynchronous parallel neuromorphic systems. The work presents several new findings on memristor modelling, memristor-based neural network structures, and memristor-based associative learning. To the best of our knowledge, these studies address unexplored research areas in the context of memristor-based neural networks and therefore form original contributions.
Bio-inspired Neuromorphic Computing Using Memristor Crossbar Networks
Bio-inspired neuromorphic computing systems built with emerging devices such as memristors have become an active research field. Experimental demonstrations at the network level have suggested memristor-based neuromorphic systems as a promising candidate to overcome the von Neumann bottleneck in future computing applications. As a hardware system that offers co-location of memory and data processing, memristor-based networks represent an efficient computing platform with minimal data transfer and high parallelism. Furthermore, active utilization of the dynamic processes during resistive switching in memristors can help realize more faithful emulation of biological device and network behaviors, with the potential to process dynamic temporal inputs efficiently.
In this thesis, I present experimental demonstrations of neuromorphic systems using fabricated memristor arrays as well as network-level simulation results. Models of resistive switching behavior in two types of memristor devices, conventional first-order and recently proposed second-order memristor devices, will first be introduced. Secondly, an experimental demonstration of K-means clustering through unsupervised learning in a memristor network will be presented. The memristor-based hardware system achieved high classification accuracy (93.3%) on the standard IRIS data set, suggesting practical networks can be built with optimized memristor devices. Thirdly, the implementation of a partial differential equation (PDE) solver in memristor arrays will be discussed. This work expands the capability of memristor-based computing hardware from ‘soft’ to ‘hard’ computing tasks, which require very high precision and accurate solutions. In general, first-order memristors are suited to tasks based on vector-matrix multiplications, ranging from K-means clustering to PDE solvers. On the other hand, utilizing internal device dynamics in second-order memristors can allow natural emulation of biological behaviors and enable network functions such as temporal data processing. An effort to explore second-order memristor devices and their network behaviors will be discussed. Finally, we propose ideas to build large-size passive memristor crossbar arrays, including fabrication approaches, guidelines for device structure, and analysis of the parasitic effects in larger arrays.
PhD, Electrical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/147610/1/yjjeong_1.pd
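The vector-matrix multiplication that first-order memristor crossbars perform in one analog step can be sketched as follows; the conductance and voltage values are illustrative assumptions:

```python
import numpy as np

# Sketch of crossbar vector-matrix multiplication: applying a voltage
# vector to the rows of a memristor crossbar yields, by Ohm's and
# Kirchhoff's laws, column currents equal to G^T v, where G holds the
# programmed device conductances. Values below are illustrative only.

G = np.array([[1.0, 0.2],     # conductances in arbitrary units (assumed);
              [0.5, 0.8],     # G[i, j] is the device at row i, column j
              [0.1, 0.4]])
v = np.array([0.3, 0.7, 0.5])  # voltages applied to the three rows

# Each device contributes i = g * v_row; each column wire sums its
# currents, so one analog "read" performs the whole matrix-vector product.
i_cols = G.T @ v
assert np.allclose(i_cols, [1.0 * 0.3 + 0.5 * 0.7 + 0.1 * 0.5,
                            0.2 * 0.3 + 0.8 * 0.7 + 0.4 * 0.5])
```

This in-place multiply-accumulate is what makes crossbars attractive for the K-means and PDE-solver workloads described above: the data never leaves the memory array.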