2 research outputs found

    An analogue-domain, switch-capacitor-based arithmetic-logic unit

    No full text
    The continuous maturation of novel nanoelectronic devices exhibiting finely tuneable resistive switching is rekindling interest in analogue-domain computation. Regardless of domain, a useful computational module is the arithmetic-logic unit (ALU), which performs one or more fundamental mathematical operations (typically addition and subtraction). In this work we report a design for an analogue ALU (aALU) capable of performing barrel addition and subtraction (i.e. ADD/SUB in modular arithmetic). The circuit requires only 5 minimum-size transistors and 1 capacitor. We show that our aALU is in principle capable of handling 5 bits of information over a single input/output wire. Core power dissipation per operation is estimated to peak at ~59 fJ (input operand-dependent) in TSMC's 65 nm technology.
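The "barrel" behaviour described above is ordinary addition and subtraction that wraps around modulo 2^5, matching the 5-bit capacity reported for the aALU. A minimal functional sketch (an illustration of the arithmetic only, not a model of the authors' switched-capacitor circuit):

```python
# Illustrative model of barrel ADD/SUB: arithmetic modulo 2**5,
# i.e. results wrap around within the 5-bit range 0..31.
# Function names are hypothetical, chosen here for clarity.

MOD = 2 ** 5  # 5 bits of information per operand

def barrel_add(a: int, b: int) -> int:
    """Wrap-around addition, e.g. 30 + 5 -> 3."""
    return (a + b) % MOD

def barrel_sub(a: int, b: int) -> int:
    """Wrap-around subtraction, e.g. 2 - 5 -> 29."""
    return (a - b) % MOD

print(barrel_add(30, 5))  # 3
print(barrel_sub(2, 5))   # 29
```

Because Python's `%` operator always returns a non-negative result for a positive modulus, subtraction wraps correctly without a separate sign check.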

    Physical reservoir computing with dynamical electronics

    Since the advent of the data-driven society, mass information generated from human activity and the natural environment has been collected, stored, processed, and then dispersed under the conventional von Neumann architecture. However, further scaling of computing capability in terms of speed and power efficiency has slowed significantly in recent years due to the fundamental limits of transistors. To meet the increasingly demanding requirements of data-intensive computation, neuromorphic computing is a promising field that takes inspiration from the human brain, an extremely efficient biological computer, to develop unconventional computing paradigms for artificial intelligence. Reservoir computing, a recurrent neural network algorithm invented two decades ago, has received wide attention in the field of neuromorphic computing because of its unique recurrent dynamics and hardware-friendly implementation schemes. Under the concept of reservoir computing, hardware’s intrinsic physical behaviours can be exploited as computing resources to keep machine learning within the physical domain and improve processing efficiency, an approach also known as physical reservoir computing. This thesis focuses on modelling and implementing physical reservoir computing based on dynamical electronics, along with its applications to sensory signals. First, the fundamentals of the reservoir computing algorithm are introduced. Second, based on the reservoir algorithm and its functionalities, two different architectures for physically implementing reservoir computing, delay-based reservoirs and parallel devices, are investigated for temporal signal processing. Third, an efficient implementation architecture, namely the rotating neurons reservoir, is developed. This novel architecture is evaluated both in theoretical analysis and in experiments. 
An electrical prototype of the rotating neurons reservoir exhibits unique advantages such as resource-efficient implementation and low power consumption. More importantly, the theory of the rotating neurons reservoir is highly universal, indicating that a rotational object embedded with dynamical elements can act as a reservoir computer.
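The core reservoir computing idea the abstract builds on can be shown in a few lines: a fixed random recurrent network (the "reservoir") expands an input stream into rich temporal dynamics, and only a linear readout is trained. A minimal echo state network sketch, with all sizes and constants chosen for illustration (none are taken from the thesis):

```python
import numpy as np

# Minimal echo state network (ESN): the reservoir weights are random and
# fixed; only the linear readout W_out is trained (ridge regression).
rng = np.random.default_rng(0)
N = 100                                  # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, N)         # fixed input weights
W = rng.uniform(-0.5, 0.5, (N, N))       # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    """Drive the reservoir with a 1-D input sequence, collect its states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)  # recurrent state update
        states.append(x.copy())
    return np.array(states)

# Toy memory task: reproduce the input delayed by one step.
u = rng.uniform(-1, 1, 500)
X = run_reservoir(u)
washout = 50                             # discard initial transient
A = X[washout:]
target = u[washout - 1 : -1]             # u[t-1] as target at time t
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ target)
pred = A @ W_out
print(np.mean((pred - target) ** 2))     # small error: the reservoir's
                                         # dynamics retain recent inputs
```

In physical reservoir computing, the `run_reservoir` step is replaced by the intrinsic dynamics of a physical system (here, the dynamical electronics the thesis develops), while the trained part remains a simple linear readout.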