97 research outputs found
Computational Capacity and Energy Consumption of Complex Resistive Switch Networks
Resistive switches are a class of emerging nanoelectronics devices that
exhibit a wide variety of switching characteristics closely resembling
behaviors of biological synapses. Assembled into random networks, such
resistive switches produce emergent behaviors far more complex than those of
individual devices. This was previously demonstrated in simulations that
exploit information processing within these random networks to solve tasks that
require nonlinear computation as well as memory. Physical assemblies of such
networks manifest complex spatial structures and basic processing capabilities
often related to biologically-inspired computing. We model and simulate random
resistive switch networks and analyze their computational capacities. We
provide a detailed discussion of the relevant design parameters and establish
the link to the physical assemblies by relating the modeling parameters to
physical parameters. More globally connected networks and increased network
switching activity raise the computational capacity linearly, at the expense
of exponentially growing energy consumption. We discuss a new
modular approach that exhibits higher computational capacities and energy
consumption growing linearly with the number of networks used. The results show
how to optimize the trade-off between computational capacity and energy
efficiency and are relevant for the design and fabrication of novel computing
architectures that harness random assemblies of emerging nanodevices.
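The switching behaviour underlying such networks can be illustrated with a minimal sketch of a single bipolar resistive switch. This is not the paper's network model; the threshold, switching rate, and ON/OFF resistance values below are illustrative assumptions.

```python
class ResistiveSwitch:
    """State w in [0, 1] interpolates between OFF and ON conductances."""

    def __init__(self, r_on=1e3, r_off=1e6, v_th=0.5, rate=0.1):
        self.r_on, self.r_off = r_on, r_off
        self.v_th = v_th          # switching threshold voltage (assumed)
        self.rate = rate          # state change per step above threshold
        self.w = 0.0              # device starts fully OFF

    def resistance(self):
        # Conductances of the ON and OFF fractions add in parallel
        g = self.w / self.r_on + (1.0 - self.w) / self.r_off
        return 1.0 / g

    def step(self, v):
        # Bias beyond +/- v_th grows or shrinks the conducting state
        if v > self.v_th:
            self.w = min(1.0, self.w + self.rate)
        elif v < -self.v_th:
            self.w = max(0.0, self.w - self.rate)
        return self.resistance()

sw = ResistiveSwitch()
r_before = sw.resistance()        # ~1 MOhm: high-resistance (OFF) state
for _ in range(10):
    sw.step(1.0)                  # sustained positive bias switches it ON
r_after = sw.resistance()         # ~1 kOhm: low-resistance (ON) state
print(r_before, r_after)
```

Networks of many such elements, randomly wired, are what give rise to the collective dynamics the abstract describes.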
Searching for the physical nature of intelligence in Neuromorphic Nanowire Networks
The brain's unique information processing efficiency has inspired the development of neuromorphic, or brain-inspired, hardware in an effort to reduce the power consumption of conventional Artificial Intelligence (AI). One example of a neuromorphic system is nanowire networks (NWNs). NWNs have been shown to produce conductance pathways similar to neuro-synaptic pathways in the brain, demonstrating nonlinear dynamics as well as emergent behaviours such as memory and learning. Their synapse-like electro-chemical junctions are connected by a heterogeneous neural network-like structure. This makes NWNs a unique system for realising hardware-based machine intelligence that is potentially more brain-like than existing implementations of AI.
Many of the brain's emergent properties are thought to arise from a unique structure-function relationship. The first part of the thesis establishes structural network characterisation methods in NWNs. Borrowing techniques from neuroscience, it introduces a toolkit for characterising network topology in NWNs. NWNs are found to display a "small-world" structure with highly modular connections, like simple biological systems.
Next, the structure-function link in NWNs is investigated by implementing machine learning benchmark tasks on varying network structures. Highly modular networks exhibit an ability to multitask, while integrated networks suffer from crosstalk interference.
Finally, the above findings are combined to develop and implement neuroscience-inspired learning methods and tasks in NWNs. Specifically, an adaptation of a cognitive task that tests working memory in humans is implemented. Working memory and memory consolidation are demonstrated and found to be attributable to a process similar to synaptic metaplasticity in the brain.
The results of this thesis have created new research directions that warrant further exploration to test the universality of the physical nature of intelligence in inorganic systems beyond NWNs.
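The "small-world" characterisation above rests on two graph measures: high clustering combined with short characteristic path length. A pure-Python sketch of both measures on a toy ring lattice (not the thesis's toolkit, which borrows more elaborate neuroscience methods) shows how a single long-range shortcut shortens paths while clustering stays high:

```python
from collections import deque

def clustering(adj):
    """Mean local clustering coefficient of an undirected graph."""
    total = 0.0
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # Count edges among this node's neighbours (each pair once)
        links = sum(1 for i in nbrs for j in nbrs if i < j and j in adj[i])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def avg_path_length(adj):
    """Mean shortest-path length over all ordered node pairs (BFS)."""
    n, total = len(adj), 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
    return total / (n * (n - 1))

def ring_lattice(n, k=2):
    """Ring where each node links to its k nearest neighbours per side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    return adj

lattice = ring_lattice(20)
shortcut = ring_lattice(20)
shortcut[0].add(10); shortcut[10].add(0)   # one long-range "rewired" edge

print(clustering(lattice), avg_path_length(lattice))
print(clustering(shortcut), avg_path_length(shortcut))
```

Clustering barely changes while the average path length drops, which is the small-world signature reported for NWNs.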
Memristive and tunneling effects in 3D interconnected silver nanowires
Due to their memristive properties, nanowire networks are very promising for
neuromorphic computing applications. Indeed, the resistance of such systems
evolves with the input voltage or current, which confers a synaptic behaviour
on the device. Here, we propose a network of silver nanowires (Ag-NWs) grown
by electrodeposition in a nanoporous membrane with interconnected nanopores.
This bottom-up fabrication method gives a
conducting network with a 3D architecture and a high density of Ag-NWs. The
resulting 3D interconnected Ag-NW network exhibits a high initial resistance as
well as a memristive behavior. This behavior is expected to arise from the
creation and destruction of conducting silver filaments inside the Ag-NW network.
Moreover, after several measurement cycles, the resistance of the network
switches from a high-resistance regime, in the GOhm range, with tunnel
conduction, to a low-resistance regime, in the kOhm range.
Avalanches and the edge-of-chaos in neuromorphic nanowire networks
The brain's efficient information processing is enabled by the interplay between its neuro-synaptic elements and complex network structure. This work reports on the neuromorphic dynamics of nanowire networks (NWNs), a brain-inspired system with synapse-like memristive junctions embedded within a recurrent neural network-like structure. Simulation and experiment elucidate how collective memristive switching gives rise to long-range transport pathways, drastically altering the network's global state via a discontinuous phase transition. The spatio-temporal properties of switching dynamics are found to be consistent with avalanches displaying power-law size and lifetime distributions, with exponents obeying the crackling noise relationship, thus satisfying criteria for criticality. Furthermore, NWNs adaptively respond to time-varying stimuli, exhibiting diverse dynamics tunable from order to chaos. Dynamical states at the edge-of-chaos are found to optimise information processing for increasingly complex learning tasks. Overall, these results reveal a rich repertoire of emergent, collective dynamics in NWNs which may be harnessed in novel, brain-inspired computing approaches.
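The crackling noise relationship mentioned above is a consistency check among the fitted avalanche exponents: if sizes scale as S^(-tau) and lifetimes as T^(-alpha), criticality requires the average-size-versus-duration exponent to equal (alpha - 1)/(tau - 1). A hedged numeric sketch (the exponent values are the textbook mean-field ones, not the paper's measurements):

```python
def crackling_prediction(tau, alpha):
    """Predicted <S>(T) ~ T^gamma exponent from the size exponent tau
    and lifetime exponent alpha: gamma = (alpha - 1) / (tau - 1)."""
    return (alpha - 1.0) / (tau - 1.0)

tau = 1.5       # assumed avalanche size exponent (mean-field value)
alpha = 2.0     # assumed avalanche lifetime exponent (mean-field value)
gamma_pred = crackling_prediction(tau, alpha)
print(gamma_pred)   # 2.0 for this mean-field exponent pair

gamma_fit = 1.95    # hypothetical exponent fitted directly from <S> vs T
consistent = abs(gamma_fit - gamma_pred) < 0.1
print(consistent)
```

Agreement between the independently fitted gamma and the predicted value is what "exponents obeying the crackling noise relationship" refers to.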
Atomic Scale Dynamics Drive Brain-like Avalanches in Percolating Nanostructured Networks.
Self-assembled networks of nanoparticles and nanowires have recently emerged as promising systems for brain-like computation. Here, we focus on percolating networks of nanoparticles which exhibit brain-like dynamics. We use a combination of experiments and simulations to show that the brain-like network dynamics emerge from atomic-scale switching dynamics inside tunnel gaps that are distributed throughout the network. The atomic-scale dynamics emulate leaky integrate-and-fire (LIF) mechanisms in biological neurons, leading to the generation of critical avalanches of signals. These avalanches are quantitatively the same as those observed in cortical tissue and are signatures of the correlations that are required for computation. We show that the avalanches are associated with dynamical restructuring of the networks which self-tune to balanced states consistent with self-organized criticality. Our simulations allow visualization of the network states and detailed mechanisms of signal propagation.
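The leaky integrate-and-fire mechanism invoked above can be sketched in a few lines: an internal variable leaks, integrates input, and emits a spike with a reset when it crosses a threshold. The leak factor and threshold below are illustrative, not the tunnel-gap model's parameters.

```python
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Leaky integration of an input sequence; spike and reset at threshold."""
    v, spikes = 0.0, []
    for i in inputs:
        v = leak * v + i           # decay the state, then add this input
        if v >= threshold:
            spikes.append(1)
            v = 0.0                # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input integrates up to a spike, then resets.
out = simulate_lif([0.3] * 10)
print(out)   # -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

In the percolating networks, avalanches arise when one such firing event pushes neighbouring gaps over their own thresholds.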
Three dimensional nanowire networks for reservoir computing.
Over the past few decades, machine learning has become integral to our daily lives.
Deep learning has revolutionized industry and scientific research, enabling us to solve
complex problems that were previously intractable. Similarly, as computer components
have become smaller and more efficient, humanity has gained unprecedented access
to affordable hardware. However, the looming fundamental limits on transistor sizes
have sparked widespread investigation into alternative means of computation that can
circumvent the restrictions imposed by conventional computer architecture. One such
method, called reservoir computing, maps sequential data onto a higher dimensional
space by using deep neural networks or nonlinear dynamical systems found in nature.
Networks of nanowires are currently under consideration for a wide range of electronic
and optoelectronic applications, and have recently been pursued as potential devices
for reservoir computing. Nanowire devices are usually made by sequential deposition,
which inevitably leads to the stacking of wires on top of one another. This thesis builds a
fully three dimensional simulation of a nanowire network and demonstrates the effect of
stacking on the topology of the resulting networks. Perfectly 2D networks are compared
with quasi-3D networks, and both are compared to the corresponding Watts-Strogatz
networks, which are standard benchmark systems. By investigating quantities such as
clustering, path length, modularity, and small-world propensity it is shown that the
connectivity of the quasi-3D networks is significantly different to that of the 2D networks.
This thesis also explores the effects of stacking on the performance in two reservoir
computing tasks: memory capacity and nonlinear transformation. After developing
a dynamical model that describes the connections between individual nanowires, a
comparison of reservoir computing performance is made between 2D and quasi-3D
networks. Most previous simulations use the signals from every wire in the network.
In this thesis an electrode configuration is used that is a more physically realistic
representation of nanowire networks. The result is that the two different network types
have a strikingly similar performance in reservoir computing tasks, which is surprising
given their radically different topologies. However, there also exist key differences:
for large numbers of wires the upper limit on the performance of the 3D networks is
significantly higher than in the 2D networks. In addition, the 3D networks appear
to be more resilient to changes in the input parameters, generalizing better to noisy
training data. Since previous literature suggests that topology plays an important role
in computing performance, these results may have important implications for future
applications of nanowire networks.
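The memory capacity benchmark used in the thesis can be illustrated with a hedged sketch: train (or, here, pick) a linear readout to reproduce delayed copies of the input and sum the squared correlations over delays. For clarity the "reservoir" below is an idealised delay line whose state explicitly stores past inputs, so the optimal readout is a single state component; a real benchmark would train readouts on nanowire-network states by linear regression.

```python
import random

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def memory_capacity(u, states, max_delay):
    """MC = sum over delays k of the squared correlation between the
    delayed input u(t-k) and the best available state component."""
    depth = len(states[0])
    mc = 0.0
    for k in range(1, max_delay + 1):
        target = u[:-k]                           # u(t-k) for t = k..T-1
        comp = min(k, depth) - 1                  # deepest stored delay
        readout = [states[t][comp] for t in range(k, len(u))]
        mc += corr(target, readout) ** 2
    return mc

random.seed(0)
T, depth = 200, 5
u = [random.uniform(-1.0, 1.0) for _ in range(T)]
# Idealised delay-line "reservoir": the state at time t holds the last
# `depth` inputs, making the optimal readout a single component.
states = [[u[t - d] if t - d >= 0 else 0.0 for d in range(1, depth + 1)]
          for t in range(T)]
mc = memory_capacity(u, states, max_delay=8)
print(mc)   # close to 5: capacity saturates at the delay-line depth
```

The capacity saturating at the storage depth is the idealised ceiling; physical nanowire reservoirs fall below it, which is what the 2D-versus-3D comparison measures.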