
    Memristor-Based Volistor Gates Compute Logic with Low Power Consumption

    We introduce a novel volistor logic gate, which uses voltage as input and resistance as output. Volistors rely on the diode-like behavior of rectifying memristors. We show how to realize the first logic level, counted from the input, of any Boolean function with volistor gates in a memristive crossbar network. Unlike stateful logic, there is no need to store the inputs as resistances, and computation is performed directly. The fan-in and fan-out of volistor gates are large and differ from those of traditional memristor circuits. Compared to purely memristive stateful logic, a combination of volistors and stateful inhibition gates can significantly reduce the number of operations required to compute arbitrary multi-output Boolean functions. The power consumption of volistor logic is computed and compared with that of stateful logic using LTspice simulation results: when implemented in a 1 × 8 or an 8 × 1 crosspoint array, volistors consume significantly less power.
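    As a purely behavioral illustration of the voltage-in/resistance-out idea (not the crossbar circuit analysed in the paper), the Python sketch below models an idealized rectifying memristor that switches only under forward bias above an assumed threshold and reads an OR-like result back as a resistance; the device parameters and gate topology are assumptions.

```python
# Toy behavioral model of a "volistor"-style gate: voltages in, resistance out.
# Illustrative sketch only; thresholds and resistances are assumed values.

R_ON, R_OFF = 1e3, 1e6   # low / high resistance states (ohms, assumed)
V_SET = 1.0              # forward switching threshold (volts, assumed)


class RectifyingMemristor:
    """Idealized rectifying memristor: conducts (and can switch) only when
    forward-biased; its resistance encodes one bit (R_ON = 1, R_OFF = 0)."""

    def __init__(self):
        self.resistance = R_OFF  # start in the high-resistance (logic-0) state

    def apply(self, voltage):
        if voltage <= 0:
            return              # reverse bias: the diode-like device blocks
        if voltage >= V_SET:
            self.resistance = R_ON  # forward bias above threshold: set the device

    def as_bit(self):
        return 1 if self.resistance == R_ON else 0


def volistor_or(input_voltages, output=None):
    """OR-like gate: any input driven to a high voltage forward-biases the
    output device and sets it; the result is read back as a resistance."""
    if output is None:
        output = RectifyingMemristor()
    for v in input_voltages:
        output.apply(v)
    return output


if __name__ == "__main__":
    gate = volistor_or([0.0, 0.0, 1.2, 0.0])  # one input at logic 1
    print(gate.resistance, gate.as_bit())     # -> 1000.0, 1
```

    Reading the output as a resistance rather than a voltage is what distinguishes the volistor from conventional voltage-mode logic.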

    Memristive Computing

    Memristive computing refers to the utilization of the memristor, the fourth fundamental passive circuit element, in computational tasks. The existence of the memristor was theoretically predicted in 1971 by Leon O. Chua, but experimentally validated only in 2008 by HP Labs. A memristor is essentially a nonvolatile nanoscale programmable resistor — indeed, a memory resistor — whose resistance, or memristance to be precise, is changed by applying a voltage across, or a current through, the device. Memristive computing is a new area of research, and many of its fundamental questions still remain open. For example, it is yet unclear which applications would benefit the most from the inherent nonlinear dynamics of memristors. In any case, these dynamics should be exploited to allow memristors to perform computation in a natural way instead of attempting to emulate existing technologies such as CMOS logic. Examples of such methods of computation presented in this thesis are memristive stateful logic operations, memristive multiplication based on the translinear principle, and the exploitation of nonlinear dynamics to construct chaotic memristive circuits. This thesis considers memristive computing at various levels of abstraction. The first part of the thesis analyses the physical properties and the current-voltage behaviour of a single device. The middle part presents memristor programming methods and describes microcircuits for logic and analog operations. The final chapters discuss memristive computing in large-scale applications. In particular, cellular neural networks and associative memory architectures are proposed as applications that significantly benefit from memristive implementation. The work presents several new results on memristor modeling and programming, memristive logic, analog arithmetic operations on memristors, and applications of memristors. The main conclusion of this thesis is that memristive computing will be advantageous in large-scale, highly parallel mixed-mode processing architectures. This can be justified by the following two arguments. First, since processing can be performed directly within memristive memory architectures, the required circuitry, processing time, and possibly also power consumption can be reduced compared to a conventional CMOS implementation. Second, intrachip communication can be naturally implemented by a memristive crossbar structure.
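    For a concrete picture of a resistance that is programmed by the applied voltage, the sketch below simulates the widely used linear ion-drift memristor model (Strukov et al., 2008) under a sinusoidal drive; the parameter values and the simple Euler integration are illustrative assumptions, not material taken from the thesis.

```python
import math

# Linear ion-drift memristor model (after Strukov et al., 2008).
# Parameter values are illustrative, not fitted to any particular device.
R_ON, R_OFF = 100.0, 16e3   # bounding resistances (ohms)
D = 10e-9                   # device thickness (m)
MU_V = 1e-14                # ion mobility (m^2 V^-1 s^-1)


def simulate(t_end=2.0, dt=1e-4, v_amp=1.0, freq=1.0):
    """Drive the device with a sine wave and record (t, v, i, M) samples."""
    w = 0.1 * D                                   # state: doped-region width
    samples = []
    t = 0.0
    while t < t_end:
        v = v_amp * math.sin(2 * math.pi * freq * t)
        m = R_ON * (w / D) + R_OFF * (1 - w / D)  # memristance M(w)
        i = v / m
        w += MU_V * (R_ON / D) * i * dt           # dw/dt = mu_v * R_ON/D * i
        w = min(max(w, 0.0), D)                   # keep the state in bounds
        samples.append((t, v, i, m))
        t += dt
    return samples


if __name__ == "__main__":
    data = simulate()
    ms = [m for (_, _, _, m) in data]
    # The memristance sweeps over a range set by the voltage history,
    # which is the pinched-hysteresis behaviour in the i-v plane.
    print("memristance range: %.0f .. %.0f ohm" % (min(ms), max(ms)))
```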

    Teaching Memory Circuit Elements via Experiment-Based Learning

    The class of memory circuit elements, which comprises memristive, memcapacitive, and meminductive systems, is gaining considerable attention in a broad range of disciplines. This is due to the enormous flexibility these elements provide in solving diverse problems in analog/neuromorphic and digital/quantum computation: they can be used in an integrated computing-memory paradigm, in the massively parallel solution of different optimization problems, in learning, in neural networks, and more. The time is therefore ripe to introduce these elements to the next generation of physicists and engineers with appropriate teaching tools that can be easily implemented in undergraduate teaching laboratories. In this paper, we suggest the use of easy-to-build emulators to provide a hands-on experience for students to learn the fundamental properties and realize several applications of these memelements. We provide explicit examples of problems that could be tackled with these emulators, ranging in difficulty from the demonstration of the basic properties of memristive, memcapacitive, and meminductive systems to logic/computation and crossbar memory. The emulators can be built from off-the-shelf components, with a total cost of a few tens of dollars, thus providing a relatively inexpensive platform for the implementation of these exercises in the classroom. We anticipate that this experiment-based learning can be easily adopted and expanded by instructors with many more case studies. Comment: IEEE Circuits and Systems Magazine (in press)

    Neuromorphic, Digital and Quantum Computation with Memory Circuit Elements

    Memory effects are ubiquitous in nature, and the class of memory circuit elements - which includes memristors, memcapacitors and meminductors - shows great potential to understand and simulate the associated fundamental physical processes. Here, we show that such elements can also be used in electronic schemes mimicking biologically-inspired computer architectures, performing digital logic and arithmetic operations, and can expand the capabilities of certain quantum computation schemes. In particular, we discuss a few examples where the concept of memory elements is relevant to the realization of associative memory in neuronal circuits, spike-timing-dependent plasticity of synapses, and digital and field-programmable quantum computing.
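    The spike-timing-dependent plasticity mentioned here is commonly summarized by a pair-based exponential rule; the sketch below implements that textbook rule with assumed amplitudes and time constants, and is not the memristive synapse circuit discussed in the paper.

```python
import math

# Pair-based STDP rule (textbook form): the weight change depends on the
# relative timing of pre- and post-synaptic spikes. Constants are illustrative.
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants (ms)


def stdp_delta_w(t_pre, t_post):
    """Return the weight change for a single pre/post spike pair."""
    dt = t_post - t_pre
    if dt > 0:      # pre fires before post -> potentiation
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:    # post fires before pre -> depression
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0


if __name__ == "__main__":
    for dt in (-40, -10, -1, 1, 10, 40):
        print(f"dt = {dt:>4} ms  ->  dw = {stdp_delta_w(0, dt):+.5f}")
```

    In a memristive synapse the same asymmetry would be imprinted on the device conductance rather than on a stored weight, which is why these elements are attractive for this role.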

    Neural Mechanism of Language

    This paper is based on our previous work on neural coding. It is a self-organized model supported by existing evidence. We first briefly introduce this model, and then use it to explain the neural mechanism of language and reasoning. Moreover, we find that the position of an area determines its importance. Specifically, language-relevant areas occupy the capital position of the cortical kingdom; they are therefore closely related to autonomous consciousness and working memory. In essence, language is a miniature of the real world. In short, this paper aims to bridge the gap between the molecular mechanisms of neurons and advanced functions such as language and reasoning. Comment: 6 pages, 3 figures

    Computers from plants we never made. Speculations

    We discuss possible designs and prototypes of computing systems that could be based on the morphological development of roots, the interaction of roots, analog electrical computation with plants, and plant-derived electronic components. In morphological plant processors, data are represented by the initial configuration of roots and the configuration of sources of attractants and repellents; the results of computation are represented by the topology of the roots' network. Computation is implemented by the roots following gradients of attractants and repellents, as well as interacting with each other. Problems solvable by plant roots include, in principle, shortest path, minimum spanning tree, Voronoi diagram, α-shapes, and convex subdivision of concave polygons. The electrical properties of plants can be modified by loading the plants with functional nanoparticles or coating parts of plants with conductive polymers. Thus, we are in a position to make living variable resistors, capacitors, operational amplifiers, multipliers, potentiometers, and fixed-function generators. The electrically modified plants can implement summation, integration with respect to time, inversion, multiplication, exponentiation, logarithm, and division. Mathematical and engineering problems to be solved can be represented in plant root networks of resistive or reaction elements. Developments in plant-based computing architectures will trigger the emergence of a unique community of biologists, electronic engineers, and computer scientists working together to produce the living electronic devices which future green computers will be made of. Comment: The chapter will be published in "Inspired by Nature. Computing inspired by physics, chemistry and biology. Essays presented to Julian Miller on the occasion of his 60th birthday", Editors: Susan Stepney and Andrew Adamatzky (Springer, 2017)
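    Of the graph problems listed as solvable by roots, the minimum spanning tree is the easiest to state in code; the sketch below runs Prim's algorithm over a handful of assumed attractant coordinates as a software stand-in for the tree-like topology a root network might grow toward those sources.

```python
import math

# Minimum spanning tree over attractant positions (Prim's algorithm).
# A root network foraging toward attractants approximates such a tree;
# the coordinates below are arbitrary illustrative values.


def euclid(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])


def prim_mst(points):
    """Return MST edges as (i, j) index pairs over the given points."""
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j in in_tree:
                    continue
                d = euclid(points[i], points[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        edges.append((best[1], best[2]))
        in_tree.add(best[2])
    return edges


if __name__ == "__main__":
    attractants = [(0, 0), (2, 1), (3, 4), (5, 0), (6, 3)]
    print(prim_mst(attractants))   # edges of the spanning tree
```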

    New Approaches for Memristive Logic Computations

    Over the past five decades, exponential advances in device integration in microelectronics for memory and computation applications have been observed. These advances are closely related to miniaturization in integrated circuit technologies. However, this miniaturization is reaching its physical limit (i.e., the end of Moore's Law), and it is also causing a dramatic problem of heat dissipation in integrated circuits. Additionally, approaching the physical limit of semiconductor devices in the fabrication process increases the delay of moving data between computing and memory units, thus decreasing performance. The market requirements for faster computers with lower power consumption can be addressed by new emerging technologies such as memristors. Memristors are non-volatile, nanoscale devices and can be used for building memory arrays with very high density (extending Moore's Law). Memristors can also be used to perform stateful logic operations, where the same devices are used for logic and memory, enabling in-memory logic. In other words, memristor-based stateful logic enables a new computing paradigm that combines calculation and memory units (versus the von Neumann architecture, which separates them). This reduces the delays between processor and memory by eliminating redundant reloading of reusable values. In addition, memristors consume little power and can therefore reduce the large amounts of power dissipated in silicon chips that are hitting their size limit. The primary focus of this research is to develop circuit implementations for logic computations based on memristors. These implementations significantly improve the performance and decrease the power of digital circuits. This dissertation demonstrates in-memory computing using novel memristive logic gates, which we call volistors (voltage-resistor gates). Volistors capitalize on rectifying memristors, i.e., a type of memristor with diode-like behavior, and use voltage at the input and resistance at the output. In addition, programmable diode gates, i.e., another type of logic gate implemented with rectifying memristors, are proposed. In programmable diode gates, memristors are used only as switches (unlike volistor gates, which utilize both the memory and the switching characteristics of the memristors). The programmable diode gates can be used with CMOS gates to increase the logic density. As an example, circuit implementations for calculating logic functions in generalized ESOP (Exclusive-OR-Sum-of-Products) form and in multilevel XOR networks are described. As opposed to stateful logic gates, a combination of both proposed logic styles decreases the power and improves the performance of digital circuits realizing two-level logic functions in Sum-of-Products or Product-of-Sums form. This dissertation also proposes a general 3-dimensional circuit architecture for in-memory computing. This circuit consists of a number of stacked crossbar arrays, all of which can be used simultaneously for logic computing. These arrays communicate through CMOS peripheral circuits.
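    The generalized ESOP (Exclusive-OR-Sum-of-Products) form mentioned above has a straightforward software analogue: an XOR of product terms over possibly complemented literals. The sketch below evaluates such a form; the cube encoding and the example function are illustrative assumptions, not the memristive crossbar mapping proposed in the dissertation.

```python
# Evaluate a Boolean function given in ESOP (XOR of product terms) form.
# Each product term ("cube") maps a variable name to its required polarity:
# True = positive literal, False = complemented literal.


def eval_cube(cube, assignment):
    """AND of the literals in one product term."""
    return all(assignment[var] == polarity for var, polarity in cube.items())


def eval_esop(cubes, assignment):
    """XOR of all product terms."""
    result = False
    for cube in cubes:
        result ^= eval_cube(cube, assignment)
    return result


if __name__ == "__main__":
    # Example function: f(a, b, c) = (a AND b) XOR ((NOT a) AND c)
    esop = [{"a": True, "b": True}, {"a": False, "c": True}]
    for a in (False, True):
        for b in (False, True):
            for c in (False, True):
                print(a, b, c, "->", eval_esop(esop, {"a": a, "b": b, "c": c}))
```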