Convergence of Neural Networks with a Class of Real Memristors with Rectifying Characteristics
The paper considers a neural network with a class of real extended memristors obtained via the parallel connection of an ideal memristor and a nonlinear resistor. The resistor has the same rectifying current characteristic used in relevant models in the literature to account for diode-like effects at the interface between the memristor metal and the insulating material. The paper proves some fundamental results on the trajectory convergence of this class of real memristor neural networks under the assumption that the interconnection matrix satisfies some symmetry conditions. First, the paper shows that the functions of the state variables that are invariants of motion for neural networks with ideal memristors become Lyapunov functions, decreasing along the trajectories, for real memristors with rectifying characteristics. This fundamental property is then used to study convergence by means of a reduction-of-order technique combined with a Lyapunov approach. The theoretical predictions are verified via numerical simulations, and the convergence results are illustrated via the application of real memristor neural networks to the real-time solution of some image processing tasks.
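The core convergence argument — a symmetric interconnection matrix admitting a function that decreases along trajectories — can be illustrated in miniature on a generic network. The sketch below (Python; the model, matrix, and parameters are illustrative stand-ins, not the paper's memristor circuit) integrates a Hopfield-type network dx/dt = -x + T f(x) + b with symmetric T and checks numerically that the classical energy function decreases along the trajectory:

```python
import numpy as np

def f(x):
    # smooth sigmoidal activation (stand-in nonlinearity)
    return np.tanh(x)

def energy(x, T, b):
    # classical Hopfield-type energy for symmetric T:
    #   E = -0.5 y^T T y - b^T y + sum_i \int_0^{y_i} f^{-1}(s) ds,  y = f(x)
    # for f = tanh, the integral is y*atanh(y) + 0.5*log(1 - y^2)
    y = np.clip(f(x), -1 + 1e-12, 1 - 1e-12)
    G = y * np.arctanh(y) + 0.5 * np.log(1.0 - y**2)
    return -0.5 * y @ T @ y - b @ y + G.sum()

rng = np.random.default_rng(0)
n = 8
A = rng.normal(size=(n, n))
T = 0.5 * (A + A.T)      # symmetry is the key convergence assumption
b = rng.normal(size=n)

dt = 1e-3                # small Euler step so the decrease is preserved
x = rng.normal(size=n)
energies = []
for _ in range(5000):
    energies.append(energy(x, T, b))
    x = x + dt * (-x + T @ f(x) + b)
```

Along the simulated trajectory the energy is (numerically) non-increasing, which is the Lyapunov-style behavior the paper establishes for its real memristor networks.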
Convergence of Discrete-Time Cellular Neural Networks with Application to Image Processing
The paper considers a class of discrete-time cellular neural networks (DT-CNNs) obtained by applying Euler's discretization scheme to standard CNNs. Let T be the DT-CNN interconnection matrix, which is defined by the feedback cloning template. The paper shows that a DT-CNN is convergent, i.e. each solution tends to an equilibrium point, when T is symmetric and, in the case where T + En is not positive semidefinite, the step size of Euler's discretization scheme does not exceed a given bound (En is the n × n identity matrix). It is shown that two relevant properties hold as a consequence of the local and space-invariant interconnecting structure of a DT-CNN: (1) the bound on the step size can be easily estimated from the elements of the DT-CNN feedback cloning template alone; (2) the bound is independent of the DT-CNN dimension. These two properties make DT-CNNs very effective for computer simulation and for practical application to high-dimensional processing tasks. The results are proved via a Lyapunov approach and LaSalle's invariance principle, in combination with some fundamental inequalities satisfied by the projection operator on a convex set. The results are compared with previous ones in the literature on the convergence of DT-CNNs, and with those obtained for different neural network models such as the Brain-State-in-a-Box model. Finally, the convergence results are illustrated via the application to some relevant 2D and 1D DT-CNNs for image processing tasks.
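As a concrete illustration of the discretization at issue, the sketch below (Python; the matrix, input, and step size are illustrative choices, not taken from the paper, which derives its explicit bound from the cloning template) iterates the Euler map of a standard CNN with a symmetric interconnection matrix and a conservatively small step size, and checks that the iteration settles at an equilibrium point:

```python
import numpy as np

def sat(x):
    # standard CNN output nonlinearity: f(x) = 0.5 * (|x + 1| - |x - 1|)
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def dtcnn_step(x, T, b, h):
    # one Euler step of the CNN equation dx/dt = -x + T f(x) + b
    return x + h * (-x + T @ sat(x) + b)

rng = np.random.default_rng(0)
n = 16
A = rng.normal(size=(n, n))
T = 0.5 * (A + A.T)          # symmetric interconnection matrix
b = rng.normal(size=n)

# T + I is typically not positive semidefinite for this random T, so the
# step size matters; h = 0.05 is an illustrative conservative choice
h = 0.05
x = rng.normal(size=n)
for _ in range(200000):
    x_next = dtcnn_step(x, T, b, h)
    if np.max(np.abs(x_next - x)) < 1e-11:
        break                # iteration has settled at an equilibrium
    x = x_next

residual = np.max(np.abs(dtcnn_step(x, T, b, h) - x))
```

With a larger step size the same iteration can fail to converge, which is exactly why the template-based bound on the Euler step is of practical interest.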
Collective phenomena in networks of spiking neurons with synaptic delays
A prominent feature of the dynamics of large neuronal networks is the synchrony-driven collective oscillations generated by the interplay between synaptic coupling and synaptic delays. This thesis investigates the emergence of delay-induced oscillations in networks of heterogeneous spiking neurons. Building on recent theoretical advances in exact mean-field reductions for neuronal networks, this work explores the dynamics and bifurcations of an exact firing rate model with various forms of synaptic delays. In parallel, the results obtained using the novel firing rate model are compared with extensive numerical simulations of large networks of spiking neurons, which confirm the existence of numerous synchrony-based oscillatory states. Some of these states are novel and display complex forms of partial synchronization and collective chaos. Given the well-known limitation of traditional firing rate models in describing synchrony-based oscillations, many of the oscillatory states found here were largely overlooked in previous studies. This thesis therefore provides a unique exploration of the oscillatory scenarios arising in neuronal networks due to the presence of delays, and may substantially extend the mathematical tools available for modeling the plethora of oscillations detected in electrical recordings of brain activity.
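One widely used exact mean-field reduction of this kind is the firing-rate model of Montbrió, Pazó, and Roxin for heterogeneous quadratic integrate-and-fire neurons; models of the type studied in the thesis extend such equations with synaptic delays. The sketch below (Python; parameter values are illustrative, not taken from the thesis) integrates these rate equations with a single discrete synaptic delay via Euler's method and a delay buffer:

```python
import numpy as np

# Exact firing-rate equations for a population of quadratic
# integrate-and-fire neurons with Lorentzian heterogeneity
# (Montbrio-Pazo-Roxin reduction), with a discrete synaptic delay D:
#   tau * dr/dt = Delta / (pi * tau) + 2 * r * v
#   tau * dv/dt = v**2 + eta - (pi * tau * r)**2 + J * tau * r(t - D)
# Parameters below are illustrative (delayed inhibitory coupling).
tau, Delta, eta, J, D = 1.0, 1.0, 5.0, -10.0, 0.1
dt, T_end = 1e-3, 50.0

steps = int(round(T_end / dt))
lag = int(round(D / dt))
r = np.zeros(steps + 1)      # population firing rate
v = np.zeros(steps + 1)      # mean membrane potential
r[0], v[0] = 0.1, 0.0
for k in range(steps):
    r_del = r[k - lag] if k >= lag else r[0]   # constant history before t=0
    dr = (Delta / (np.pi * tau) + 2.0 * r[k] * v[k]) / tau
    dv = (v[k]**2 + eta - (np.pi * tau * r[k])**2 + J * tau * r_del) / tau
    r[k + 1] = r[k] + dt * dr
    v[k + 1] = v[k] + dt * dv
```

The firing rate remains positive and bounded; in suitable parameter regimes the delayed coupling destabilizes the stationary state and sustained collective oscillations appear, the phenomenon the thesis analyzes in detail.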
Dynamical complexity of large-scale neurocognitive networks in healthy and pathological brain states
Visual Cortex
The neurosciences have experienced tremendous and wonderful progress in many areas, and the spectrum encompassing the neurosciences is expansive. Suffice it to mention a few classical fields: electrophysiology, genetics, physics, computer science, and, more recently, social and marketing neurosciences. Of course, this large growth has resulted in the production of many books. Perhaps the visual system and the visual cortex were in the vanguard because most animals do not produce their own light and thus offer the invaluable advantage of allowing investigators to conduct experiments in full control of the stimulus. In addition, the fascinating evolution of scientific techniques, the immense productivity of recent research, and the ensuing literature make it virtually impossible to publish in a single volume all worthwhile work accomplished throughout the scientific world. The days when a single individual, such as Diderot, could undertake the production of an encyclopedia are gone forever. Indeed, most approaches to studying the nervous system are valid, and neuroscientists produce an almost astronomical amount of interesting data accompanied by extremely worthy hypotheses, which in turn generate new ventures in the search for brain functions. Yet it is fully justified to make an encore and to publish a book dedicated to the visual cortex and beyond. Many reasons validate a book assembling chapters written by active researchers. Each has the opportunity to bind together data and explore original ideas whose fate will not fall into the hands of uncompromising reviewers of traditional journals. This book focuses on the cerebral cortex with a large emphasis on vision. Yet it offers the reader diverse approaches employed to investigate the brain, for instance, computer simulation, cellular responses, or rivalry between various targets and goal-directed actions.
This volume thus covers a large spectrum of research, even though it is impossible to include all topics in the extremely diverse field of the neurosciences.
Genetic determination and layout rules of visual cortical architecture
The functional architecture of the primary visual cortex is set up by neurons that preferentially respond to visual stimuli with contours of a specific orientation in visual space. In primates and placental carnivores, orientation preference is arranged into continuous and roughly repetitive (iso-)orientation domains. Exceptions are pinwheels, which are surrounded by all orientation preferences. The configuration of pinwheels adheres to quantitative species-invariant statistics, the common design. This common design most likely evolved independently at least twice over the past 65 million years, which might indicate a functionally advantageous trait. The possible acquisition of environment-dependent functional traits by genes, the Baldwin effect, makes it conceivable that visual cortical architecture is partially or redundantly encoded by genetic information. In this conception, genetic mechanisms support the emergence of visual cortical architecture or even establish it under unfavorable environments. In this dissertation, I examine the capability of genetic mechanisms to encode visual cortical architecture, and I mathematically dissect the pinwheel configuration under measurement noise as well as in different geometries. First, I theoretically explore possible roles of genetic mechanisms in visual cortical development that were previously excluded from theoretical research, mostly because the information capacity of the genome appeared too small to contain a blueprint for wiring up the cortex. For the first time, I provide a biologically plausible scheme for quantitatively encoding functional visual cortical architecture in genetic information that circumvents the alleged information bottleneck. Key ingredients of this mechanism are active transport and trans-neuronal signaling, as well as the joint dynamics of morphogens and the connectome.
This theory provides predictions for experimental tests and thus may help to clarify the relative importance of genes and environment for complex human traits. Second, I disentangle the link between orientation domain ensembles and the species-invariant pinwheel statistics of the common design. This examination highlights informative measures of pinwheel configurations for model benchmarking. Third, I mathematically investigate the susceptibility of the pinwheel configuration to measurement noise. The results yield a method for extrapolating pinwheel densities to the zero-noise limit and provide an approximate analytical expression for confidence regions of pinwheel centers. Thus, the work facilitates high-precision measurements and enhances benchmarking for devising more accurate models of visual cortical development. Finally, I shed light on genuinely three-dimensional properties of functional visual cortical architectures. I devise maximum entropy models of three-dimensional functional visual cortical architectures in different geometries. This theory enables the examination of possible evolutionary transitions between different functional architectures for which intermediate organizations might still exist.
Networked Dynamical Systems: Privacy, Control, and Cognition
Many natural and man-made systems, ranging from the nervous system to power and transportation grids to societies, exhibit dynamic behaviors that evolve over a sparse and complex network. This networked aspect raises significant challenges and opportunities for the identification, analysis, and control of such dynamic behaviors. While some of these challenges emanate from the networked aspect per se (such as the sparsity of connections between system components and the interplay between nodal communication and network dynamics), others arise from specific application areas (such as privacy concerns in cyber-physical systems or the need for scalable algorithm designs due to the large size of various biological and engineered networks). On the other hand, networked systems provide significant opportunities and allow for performance and robustness levels that are far beyond the reach of centralized systems, with examples ranging from the Internet (of Things) to the smart grid and the brain. This dissertation aims to address several of these challenges and harness these opportunities. The dissertation is divided into three parts. In the first part, we study privacy concerns whose resolution is vital for the utility of networked cyber-physical systems. We study average consensus and convex optimization, two principal distributed computations occurring over networks, and design algorithms with rigorous privacy guarantees that achieve a best-achievable tradeoff between network utility and privacy. In the second part, we analyze networks with resource constraints. More specifically, we study three problems: stabilization under communication (bandwidth and latency) limitations in sensing and actuation, optimal time-varying control scheduling under a limited number of actuators and limited control energy, and structure identification of under-sensed networks (i.e., networks with latent nodes).
Finally, in the last part, we focus on the intersection of networked dynamical systems and neuroscience and draw connections between brain network dynamics and two extensively studied yet not fully understood neurocognitive phenomena: goal-driven selective attention and neural oscillations. Using a novel axiomatic approach, we establish these connections in the form of necessary and/or sufficient conditions on the network structure that match the network output trajectories with experimentally observed brain activity.
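The privacy-aware average consensus studied in the first part can be illustrated with a simple noise-injection scheme (a generic sketch in the spirit of such mechanisms, not the dissertation's algorithm): each node perturbs its state with geometrically decaying noise whose increments telescope, masking early states from observers while still steering the network essentially to the exact average:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
x = rng.normal(size=n)       # private initial values
true_avg = x.mean()

# doubly stochastic Metropolis weights for an undirected path graph
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0 / 3.0
W += np.diag(1.0 - W.sum(axis=1))

phi = 0.5                    # noise decay rate, must satisfy 0 < phi < 1
prev = np.zeros(n)           # previous noise, so injected terms telescope
for k in range(200):
    noise = (phi ** k) * rng.normal(size=n)
    # W is doubly stochastic, so the state average changes only by the
    # telescoping sum of (noise - prev), which vanishes as phi**k -> 0
    x = W @ x + (noise - prev)
    prev = noise
```

Early iterates are masked by O(1) noise, yet because the injected increments cancel over time, all nodes still converge to the true average of the initial values.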