10 research outputs found

    High-Speed CMOS-Free Purely Spintronic Asynchronous Recurrent Neural Network

    Neuromorphic computing systems overcome the limitations of traditional von Neumann computing architectures. These systems can be further improved by using emerging technologies that are more efficient than CMOS for neural computation. Recent research has demonstrated that memristors and spintronic devices can boost the efficiency and speed of various neural network designs. This paper presents a biologically inspired, fully spintronic neuron used in a fully spintronic Hopfield RNN. The network is used to solve tasks, and the results are compared against those of current Hopfield neuromorphic architectures that use emerging technologies.
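    The Hopfield dynamics referred to above can be illustrated with a conventional software sketch; note the paper's own network is implemented in spintronic hardware, not code, so the following minimal Python version (Hebbian storage followed by asynchronous sign updates) is only a generic illustration of the model class.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: sum of outer products with a zeroed diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # no self-connections
    return W / len(patterns)

def recall(W, state, sweeps=10):
    """Asynchronous updates: each neuron takes the sign of its local field."""
    state = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state
```

    Storing a single ±1 pattern and flipping one bit of it, `recall` restores the original pattern; the asynchronous (one-neuron-at-a-time) update order mirrors the asynchronous operation highlighted in the title.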

    A drive towards thermodynamic efficiency for dissipative structures in chemical reaction networks

    Dissipative accounts of structure formation show that the self-organisation of complex structures is thermodynamically favoured whenever these structures dissipate free energy that could not be accessed otherwise. These structures therefore open transition channels for the state of the universe to move from a frustrated, metastable state to another metastable state of higher entropy. However, these accounts apply as well to relatively simple dissipative systems, such as convection cells, hurricanes, candle flames, lightning strikes, or mechanical cracks, as they do to complex biological systems. Conversely, interesting computational properties that characterize complex biological systems, such as efficient, predictive representations of environmental dynamics, can be linked to the thermodynamic efficiency of underlying physical processes. However, the potential mechanisms that underwrite the selection of dissipative structures with thermodynamically efficient subprocesses are not completely understood. We address these mechanisms by explaining how bifurcation-based, work-harvesting processes, which are required to sustain complex dissipative structures, might be driven towards thermodynamic efficiency. We first demonstrate a simple mechanism that leads to the self-selection of efficient dissipative structures in a stochastic chemical reaction network when the dissipated driving chemical potential difference is decreased. We then discuss how such a drive can emerge naturally in a hierarchy of self-similar dissipative structures, each feeding on the dissipative structures of the previous level, when moving away from the initial driving disequilibrium.
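    Stochastic chemical reaction networks of the kind used in the first demonstration are typically simulated with Gillespie's stochastic simulation algorithm. The sketch below is a generic SSA, not the paper's specific network: the example species, rate constants, and stoichiometry are illustrative assumptions.

```python
import random

def gillespie(x, rates, stoich, t_max, seed=0):
    """Exact stochastic simulation of a reaction network (Gillespie SSA).

    x: initial molecule counts per species; rates: propensity functions of x;
    stoich: per-reaction change vectors. Returns the sampled trajectory.
    """
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, tuple(x))]
    while t < t_max:
        props = [r(x) for r in rates]
        total = sum(props)
        if total == 0:                      # no reaction can fire
            break
        t += rng.expovariate(total)         # time to next reaction event
        u, acc = rng.random() * total, 0.0
        for p, dx in zip(props, stoich):    # pick a reaction ∝ its propensity
            acc += p
            if u < acc:
                x = [xi + di for xi, di in zip(x, dx)]
                break
        traj.append((t, tuple(x)))
    return traj
```

    For instance, a two-species isomerisation A ⇌ B can be run with `rates = [lambda x: 1.0 * x[0], lambda x: 0.5 * x[1]]` and `stoich = [(-1, 1), (1, -1)]`; the total molecule count is conserved along the trajectory.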

    Brain-inspired methods for achieving robust computation in heterogeneous mixed-signal neuromorphic processing systems

    Neuromorphic processing systems implementing spiking neural networks with mixed-signal analog/digital electronic circuits and/or memristive devices represent a promising technology for edge computing applications that require low power and low latency, and that cannot connect to the cloud for off-line processing, either due to lack of connectivity or for privacy concerns. However, these circuits are typically noisy and imprecise, because they are affected by device-to-device variability and operate with extremely small currents. Achieving reliable computation and high accuracy with this approach is therefore still an open challenge, one that has both hampered progress and limited the widespread adoption of this technology. By construction, these hardware processing systems have many constraints that are biologically plausible, such as heterogeneity and non-negativity of parameters. A growing body of evidence shows that applying such constraints to artificial neural networks, including those used in artificial intelligence, promotes robustness in learning and improves their reliability. Here we delve further into neuroscience and present network-level brain-inspired strategies that further improve reliability and robustness in these neuromorphic systems: we quantify, with chip measurements, the extent to which population averaging is effective in reducing variability in neural responses; we demonstrate experimentally how the neural coding strategies of cortical models allow silicon neurons to produce reliable signal representations; and we show how to robustly implement essential computational primitives, such as selective amplification, signal restoration, working memory, and relational networks, by exploiting such strategies.
    We argue that these strategies can be instrumental in guiding the design of robust and reliable ultra-low-power electronic neural processing systems implemented using noisy and imprecise computing substrates such as subthreshold neuromorphic circuits and emerging memory technologies.
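    The population-averaging strategy quantified above has a simple statistical core: for independent per-neuron noise, the standard deviation of the population-averaged readout shrinks roughly as 1/√N. A toy Python sketch (the noise level and population sizes are illustrative assumptions, not chip measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 1.0      # common input driving every neuron in the population
sigma = 0.5       # per-neuron noise (e.g. device mismatch, trial variability)
trials = 10_000

def readout_std(n_neurons):
    """Std of the population-averaged response across repeated trials."""
    responses = signal + sigma * rng.standard_normal((trials, n_neurons))
    return responses.mean(axis=1).std()

for n in (1, 16, 64):
    print(n, readout_std(n))
```

    Averaging over 64 neurons reduces the readout variability about eightfold relative to a single neuron, which is why pooling redundant, mismatched silicon neurons can still yield a reliable population signal.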

    What can we know about that which we cannot even imagine?

    In this essay I will consider a sequence of questions. The first questions concern the biological function of intelligence in general, and cognitive prostheses of human intelligence in particular. These will lead into questions concerning human language, perhaps the most important cognitive prosthesis humanity has ever developed. While it is traditional to rhapsodize about the cognitive power encapsulated in human language, I will emphasize how horribly limited human language is -- and therefore how limited our cognitive abilities are, despite their being augmented with language. This will lead to questions of whether human mathematics, being ultimately formulated in terms of human language, is also deeply limited. I will then combine these questions to pose a partial, sort-of, sideways answer to the guiding concern of this essay: what can we ever discern about that which we cannot even conceive?

    A Computational Study Of The Influence Of Cortical Processes On The Olfactory Bulb

    The olfactory bulb sits at the crossroads of input from an animal’s external and internal world. In this neural structure, chemical information from the environment interacts with contextual information emanating from higher cortical regions to shape mental representations of odor. Nevertheless, the factors influencing this interaction, and how the cortex manipulates these factors to the advantage of the animal, remain a mystery. To investigate this question, we have developed a large-scale computational model of the olfactory bulb. This model consists of a new algorithm to determine connectivity between mitral cells and granule cells, based on known anatomical constraints, combined with a dynamical-systems approach utilizing the Izhikevich equations to simulate the network’s behavior. Using this model, we first examine the connectivity and activity patterns of our network to demonstrate the strong relationship between structure and function in the olfactory bulb. We then employ this model to analyze the effects of centrifugal feedback to the olfactory bulb on cortical odor representations; through this analysis, we show that stochastic feedback patterns can evoke distinct trends in convergence and divergence between these representations depending on cortical excitability. Finally, we take advantage of the ease of incorporating new neurons into the model to study neurogenesis in the olfactory bulb, in particular to elucidate possible rules governing the placement of new cells. Through these experiments, our model provides new insight into the olfactory bulb and its role in the greater olfactory system.
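    The Izhikevich equations mentioned above are a standard two-variable spiking neuron model: v' = 0.04v² + 5v + 140 − u + I and u' = a(bv − u), with the reset v ← c, u ← u + d whenever v reaches 30 mV. A minimal single-neuron Euler integration in Python (the parameters below are the textbook regular-spiking values, not necessarily those of the bulb model):

```python
def izhikevich(I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, t_max=200.0):
    """Integrate one Izhikevich neuron with forward Euler; return spike times (ms).

    v: membrane potential (mV); u: recovery variable; I: input current.
    """
    v, u = c, b * c
    spikes, t = [], 0.0
    while t < t_max:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:            # spike detected: record time and reset
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes
```

    With I = 10 this neuron fires tonically, while with I = 0 it settles to rest; sweeping a, b, c, d reproduces the different firing classes (bursting, chattering, etc.) that make this model a common choice for large-scale simulations.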

    Models of Causal Inference in the Elasmobranch Electrosensory System: How Sharks Find Food

    We develop a theory of how the functional design of the electrosensory system in sharks reflects the inevitability of noise in high-precision measurements, and how the Central Nervous System may have developed an efficient solution to the problem of inferring parameters of stimulus sources, such as their location, via Bayesian neural computation. We use the Finite Element Method to examine how the electrical properties of shark tissues, and the geometrical configuration of both the shark body and the electrosensory array, act to focus weak electric fields in the aquatic environment, so that the majority of the voltage drop is signalled across the electrosensory cells. We analyse snapshots of two ethologically relevant stimuli: localized prey-like dipole electric sources, and uniform electric fields resembling motion-induced and other fields encountered in the ocean. We demonstrate that self-movement (or self-state) not only affects the measured field, by perturbing the self-field, but also affects the external field. Electrosensory cells provide input to central brain regions via primary afferent nerves. Inspection of elasmobranch electrosensory afferent spike trains and inter-spike interval distributions indicates that they typically have fairly regular spontaneous inter-spike intervals with skewed, Gaussian-like variability. However, because electrosensory afferent neurons converge onto secondary neurons, we consider the convergent input a "super afferent": the pulse train received by a target neuron approaches a Poisson process with shorter mean intervals as the number of independent convergent spike trains increases. We implement a spiking neural particle filter that takes simulated electrosensory "super afferent" spike trains and can successfully infer the fixed Poisson parameter, or the equivalent real-world state, the distance to a source.
    The circuit obtained by converting the mathematical model to a network structure bears a striking resemblance to the cerebellar-like hindbrain circuits of the dorsal octavolateral nucleus. The elasmobranchs’ ability to sense electric fields down to a limit imposed by thermodynamics seems extraordinary. However, we predict that the theories presented here generalize to other sensory systems, particularly the other octavolateralis senses, which share cerebellar-like circuitry, suggesting that the cerebellum itself also plays a role in dynamic state estimation.
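    The inference step described above can be illustrated with a conventional bootstrap particle filter over a fixed Poisson rate; the thesis's spiking neural implementation is not reproduced here, and the prior range, bin width, and particle count below are illustrative assumptions.

```python
import math
import random

def infer_rate(counts, dt, n_particles=2000, seed=1):
    """Bootstrap particle approximation to the posterior over a fixed Poisson rate.

    counts: spikes observed in successive bins of width dt (s). Each particle
    carries a candidate rate (Hz); weights follow the Poisson likelihood.
    Returns the posterior mean rate.
    """
    rng = random.Random(seed)
    # Flat prior over candidate rates; each particle is one hypothesis.
    particles = [rng.uniform(0.0, 100.0) for _ in range(n_particles)]
    for k in counts:
        # Poisson likelihood of k spikes in one bin (the k! factor is a
        # constant across particles, so it cancels in the normalised weights).
        weights = [math.exp(-lam * dt) * (lam * dt) ** k for lam in particles]
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return sum(particles) / n_particles
```

    Feeding in bins that average two spikes per 50 ms drives the particle cloud to concentrate near 40 Hz; in the thesis's setting the inferred rate stands in for the real-world state, the distance to a source. (For a static parameter like this, a production filter would add rejuvenation to counter particle degeneracy.)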

    Characterising the role of the ascending arousal system in facilitating global brain dynamics in health and neurodegeneration

    The inherent complexity of the brain can be attributed to countless interacting parts, from microcircuit detail up to large oscillatory fluctuations of macroscopic activity. One structure that has previously been shown to influence dynamic macroscopic fluctuations in brain activity is the ascending arousal system. The ascending arousal system comprises multiple nuclei that send diffuse, broad-reaching neuromodulatory inputs across the brain, a function proposed to facilitate global changes in brain activity. However, little is known about the exact mechanisms, or the extent, of the ascending arousal system's influence on large-scale brain dynamics. Hence, this thesis attempts to reveal the role of the ascending arousal system in facilitating the global brain dynamics that are critical for brain function. Specifically, this thesis unpacks the importance of considering the interactions between the cholinergic and noradrenergic systems in facilitating global brain-state dynamics. We reveal that the structural connections between the noradrenergic and cholinergic systems are critical in constraining global brain-state dynamics. We show a causal role of the cholinergic system in facilitating global brain-state dynamics and demonstrate a microcircuit mechanism of global brain-state dynamics. Next, we discuss the importance of viewing the brain through the lens of a complex system to understand both its function and its dysfunction across neurodegenerative diseases. Then, we establish a maladaptive mechanism of the noradrenergic system in the manifestation of freezing of gait in Parkinson's disease. Lastly, we examine the role of the noradrenergic system in other symptom manifestations of Parkinson's disease. Ultimately, this thesis characterises the interactions of the cholinergic and noradrenergic systems in facilitating global brain-state dynamics in both healthy and diseased brains.

    Heterogeneity and Efficiency in the Brain

    No full text