
    Linear and nonlinear approaches to unravel dynamics and connectivity in neuronal cultures

    [eng] In the present thesis, we propose to explore neuronal circuits at the mesoscale, an approach in which one monitors small populations of a few thousand neurons and concentrates on the emergence of collective behavior. In our case, we carried out such an exploration both experimentally and numerically, adopting an analysis perspective centered on time series and dynamical systems. Experimentally, we prepared more than 200 neuronal cultures, which were monitored using fluorescence calcium imaging. By adjusting the experimental conditions, we could set two basic arrangements of neurons, namely homogeneous and aggregated. In the experiments, we carried out two major explorations, development and disintegration: in the former we investigated changes in network behavior as it matured; in the latter we applied a drug that reduced neuronal interconnectivity. All the subsequent analyses and modeling in the thesis are based on these experimental data. Numerically, the thesis comprised two aspects. The first was oriented towards the simulation of neuronal connectivity and dynamics. The second was oriented towards the development of linear and nonlinear analysis tools to unravel dynamic and connectivity aspects of the measured experimental networks. For the first aspect, we developed a sophisticated software package to simulate single-neuron dynamics using a quadratic integrate-and-fire model with adaptation and depression. This model was plugged into a synthetic graph in which the nodes of the network are neurons and the edges their connections. The graph was created using spatial embedding and realistic biology. We carried out hundreds of simulations in which we tuned the density of neurons, their spatial arrangement and the characteristics of the fluorescence signal.
As a key result, we observed that homogeneous networks required a substantial number of neurons to fire and exhibit collective dynamics, and that the presence of aggregation significantly reduced the number of required neurons. For the second aspect, data analysis, we analyzed experiments and simulations to tackle three major topics: network dynamics reconstruction using linear descriptors, dynamics reconstruction using nonlinear descriptors, and the assessment of neuronal connectivity from activity data alone. For the linear study, we analyzed all experiments using the power spectral density (PSD), and observed that it sufficed to describe the development of the network or its disintegration. The PSD also allowed us to distinguish between healthy and unhealthy networks, and revealed dynamical heterogeneities across the network. For the nonlinear study, we used techniques in the context of recurrence plots. We first characterized the embedding dimension m and the time delay δ for each experiment, built the respective recurrence plots, and extracted key information about the dynamics of the system through different descriptors. Experimental results were contrasted with numerical simulations. After analyzing about 400 time series, we concluded that the degree of dynamical complexity in neuronal cultures changes both during development and disintegration. We also observed that the healthier the culture, the higher its dynamical complexity. Finally, for the reconstruction study, we first used numerical simulations to determine the best measure of 'statistical interdependence' between any two neurons, and selected Generalized Transfer Entropy. We then analyzed the experimental data. We concluded that young cultures have a weak connectivity that increases along maturation. Aggregation increases average connectivity and, more interestingly, also the assortativity, i.e., the tendency of highly connected nodes to connect with other highly connected nodes.
In turn, this assortativity may delineate important aspects of the dynamics of the network. Overall, the results show that spatial arrangement and neuronal dynamics are able to shape a very rich repertoire of dynamical states of varying complexity. [cat] The ability of neuronal tissue to process and transmit information efficiently depends on the intrinsic dynamical properties of the neurons and on the connectivity between them. The present thesis explores different experimental and simulation techniques to analyze the dynamics and connectivity of embryonic rat cortical neuronal networks. Experimentally, recording the spontaneous activity of a population of cultured neurons with a fast camera and fluorescence techniques makes it possible to monitor the individual activity of each neuron in a controlled manner, as well as to modify its connectivity. Together, these tools allow the study of the emergent collective behavior of the neuronal population. To simulate the patterns observed in the laboratory, we implemented a random metric model of neuronal growth to simulate the physical network of connections between neurons, and a quadratic integrate-and-fire model with adaptation and depression to model the broad spectrum of neuronal dynamics at a reduced computational cost. We characterized the global and individual dynamics of the neurons and correlated them with the underlying structure using linear and nonlinear time-series techniques. Spectral analysis enabled us to describe the development of the cultures and their changes in connectivity, as well as to distinguish healthy cultures from pathological ones.
Reconstructing the underlying dynamics through embedding methods and recurrence plots allowed us to detect different dynamical transitions, with the corresponding gain or loss of dynamical complexity and richness of the culture, across the different experimental studies. Finally, in order to reconstruct the internal connectivity, we tested through simulations different quantifiers of the statistical dependence between pairs of neurons, ultimately selecting the Generalized Transfer Entropy method. We then characterized the networks with different parameters. Despite displaying some traits of 'small-world' networks, our cultures show an 'exponential' or 'skewed' degree distribution for young and mature cultures, respectively. Additionally, we observed that homogeneous networks are disassortative, whereas networks with an increasing level of spatial aggregation are assortative. This property strongly impacts the transmission, resilience and synchronization of the network.
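The quadratic integrate-and-fire model with adaptation mentioned in the abstract can be illustrated with a minimal sketch. This is not the thesis's simulation package: the Euler scheme and all parameter values are illustrative assumptions, and synaptic depression and network coupling are omitted.

```python
import numpy as np

def simulate_qif(T=1000.0, dt=0.1, I_ext=0.5,
                 v_reset=-1.0, v_peak=10.0,
                 tau_a=200.0, beta=0.3):
    """Quadratic integrate-and-fire neuron with a slow adaptation
    current a:  dv/dt = v**2 + I_ext - a,  da/dt = -a / tau_a.
    On each spike, v is reset and a is incremented by beta.
    All parameters are illustrative, not taken from the thesis."""
    v, a = v_reset, 0.0
    spikes = []
    for i in range(int(T / dt)):
        v += dt * (v * v + I_ext - a)
        a += dt * (-a / tau_a)
        if v >= v_peak:              # spike: reset and build up adaptation
            spikes.append(i * dt)
            v = v_reset
            a += beta
    return np.array(spikes)

spikes = simulate_qif()
```

In this parameter regime the adaptation variable accumulates with each spike and transiently silences the neuron, producing the spike-frequency adaptation that such models use to shape collective dynamics.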
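The recurrence-plot analysis follows a standard pipeline: delay-embed each time series, compute pairwise distances between embedded points, and threshold them. A minimal sketch with illustrative values of the embedding dimension m, the delay, and the threshold (the thesis estimates m and δ per experiment):

```python
import numpy as np

def recurrence_plot(x, m=3, delay=5, eps=None):
    """Binary recurrence matrix of a scalar series x, using time-delay
    embedding with dimension m and lag `delay`. The threshold heuristic
    (a fraction of the maximum distance) is a common convention, assumed
    here for illustration."""
    n = len(x) - (m - 1) * delay
    # embedded points: rows are (x[t], x[t+delay], ..., x[t+(m-1)*delay])
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.2 * d.max()
    return (d <= eps).astype(int)

# a periodic signal yields the characteristic diagonal-line structure
t = np.linspace(0, 20 * np.pi, 500)
R = recurrence_plot(np.sin(t))
rate = R.mean()          # recurrence rate, one of the simplest descriptors
```

Descriptors such as the recurrence rate, determinism, or diagonal-line entropy are then read off this matrix to quantify dynamical complexity.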

    Biologically inspired evolutionary temporal neural circuits

    Biological neural networks have always motivated the creation of new artificial neural networks, in this case a new autonomous temporal neural network system. Among the more challenging problems of temporal neural networks are the design and incorporation of short- and long-term memories, as well as the choice of network topology and training mechanism. In general, delayed copies of network signals can form short-term memory (STM), providing a limited temporal history of events similar to FIR filters, whereas the synaptic connection strengths as well as delayed feedback loops (ER circuits) can constitute longer-term memories (LTM). This dissertation introduces a new general evolutionary temporal neural network framework (GETnet) based on the automatic design of arbitrary neural networks with STM and LTM. GETnet is a step towards the realization of general intelligent systems that need minimal or no human intervention and can be applied to a broad range of problems. GETnet utilizes nonlinear moving-average/autoregressive nodes and sub-circuits that are trained by enhanced gradient descent and by evolutionary search over the architecture, synaptic delay, and synaptic weight spaces. The mixture of Lamarckian and Darwinian evolutionary mechanisms facilitates the Baldwin effect and speeds up the hybrid training. The ability to evolve arbitrary adaptive time-delay connections enables GETnet to find novel answers to many classification and system identification tasks expressed in the general form of desired multidimensional input and output signals. Simulations using the Mackey-Glass chaotic time series and fingerprint perspiration-induced temporal variations are given to demonstrate the above-stated capabilities of GETnet.
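The FIR-style short-term memory described above can be sketched as a single tapped-delay-line node. The class name, weights, and tanh activation are illustrative assumptions, not GETnet's actual design:

```python
from collections import deque
import math

class TappedDelayNode:
    """A node whose short-term memory is a tapped delay line (the FIR
    part): the output is a nonlinear function of a weighted sum of the
    last len(weights) inputs. Purely a sketch of the STM idea."""
    def __init__(self, weights):
        self.weights = list(weights)
        # delay buffer holding the most recent inputs, newest first
        self.buffer = deque([0.0] * len(weights), maxlen=len(weights))

    def step(self, x):
        self.buffer.appendleft(x)
        s = sum(w * v for w, v in zip(self.weights, self.buffer))
        return math.tanh(s)

node = TappedDelayNode([0.5, 0.3, 0.2])
# an impulse echoes through the taps, then falls out of memory
outputs = [node.step(x) for x in (1.0, 0.0, 0.0, 0.0)]
```

The finite buffer gives exactly the "limited temporal history" the abstract attributes to FIR-style STM; longer-term memory would require feedback loops or weight changes, which this sketch leaves out.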

    Dynamical Systems in Spiking Neuromorphic Hardware

    Dynamical systems are universal computers. They can perceive stimuli, remember, learn from feedback, plan sequences of actions, and coordinate complex behavioural responses. The Neural Engineering Framework (NEF) provides a general recipe to formulate models of such systems as coupled sets of nonlinear differential equations and compile them onto recurrently connected spiking neural networks – akin to a programming language for spiking models of computation. The Nengo software ecosystem supports the NEF and compiles such models onto neuromorphic hardware. In this thesis, we analyze the theory driving the success of the NEF, and expose several core principles underpinning its correctness, scalability, completeness, robustness, and extensibility. We also derive novel theoretical extensions to the framework that enable it to far more effectively leverage a wide variety of dynamics in digital hardware, and to exploit the device-level physics in analog hardware. At the same time, we propose a novel set of spiking algorithms that recruit an optimal nonlinear encoding of time, which we call the Delay Network (DN). Backpropagation across stacked layers of DNs dramatically outperforms stacked Long Short-Term Memory (LSTM) networks—a state-of-the-art deep recurrent architecture—in accuracy and training time on a continuous-time memory task and on a chaotic time-series prediction benchmark. The basic component of this network is shown to function on state-of-the-art spiking neuromorphic hardware, including Braindrop and Loihi. This implementation approaches the energy efficiency of the human brain in the former case, and the precision of conventional computation in the latter case.
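The NEF's core recipe for compiling dynamics onto a recurrent network can be illustrated at the rate level. For a desired linear system ẋ = Ax + Bu implemented through a first-order synaptic filter with time constant τ, the standard NEF mapping is A_rec = τA + I and B_rec = τB. The sketch below applies it to a pure integrator; it is a rate-level illustration with the neural encoding abstracted away, not Nengo code:

```python
import numpy as np

# Desired system: a pure integrator, dx/dt = A x + B u with A = 0, B = 1.
tau, dt = 0.1, 0.001
A, B = np.array([[0.0]]), np.array([[1.0]])

# NEF mapping onto a network whose only dynamics is the synaptic filter:
A_rec = tau * A + np.eye(1)
B_rec = tau * B

x = np.zeros(1)
u = np.ones(1)                       # constant input
for _ in range(int(1.0 / dt)):       # simulate 1 second
    # first-order synapse: tau dx/dt = -x + A_rec x + B_rec u
    x += dt * (-x + A_rec @ x + B_rec @ u) / tau

# x now approximates the integral of u over 1 s, i.e. about 1.0
```

Substituting the recurrent and input matrices this way makes the synaptic filter itself realize the desired dynamics; the same trick generalizes to the nonlinear systems the NEF compiles onto spiking populations.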

    Slow dynamics in structured neural network models

    Humans and some other animals are able to perform tasks that require coordination of movements across multiple temporal scales, ranging from hundreds of milliseconds to several seconds. The fast timescale at which neurons naturally operate, on the order of tens of milliseconds, is well-suited to support motor control of rapid movements. In contrast, to coordinate movements on the order of seconds, a neural network should produce reliable dynamics on a similarly 'slow' timescale. Neurons and synapses exhibit biophysical mechanisms whose timescales range from tens of milliseconds to hours, which suggests a possible role of these mechanisms in producing slow reliable dynamics. However, how such mechanisms influence network dynamics is not yet understood. An alternative approach to achieving slow dynamics in a neural network consists in modifying its connectivity structure. Still, the limitations of this approach, and in particular to what degree the weights require fine-tuning, remain unclear. Understanding how both single-neuron mechanisms and the connectivity structure might influence the network dynamics to produce slow timescales is the main goal of this thesis. We first consider the possibility of obtaining slow dynamics in binary networks by tuning their connectivity. It is known that binary networks can produce sequential dynamics. However, if the sequences consist of random patterns, the typical length of the longest sequence that can be produced grows linearly with the number of units. Here, we show that this limitation can be overcome by carefully designing the sequence structure. More precisely, we obtain a constructive proof that yields sequences whose length scales exponentially with the number of units. To achieve this, however, the connectivity matrix needs to be fine-tuned with exponential precision. Next, we focus on the interaction between single-neuron mechanisms and recurrent dynamics.
Particular attention is dedicated to adaptation, which is known to span a broad range of timescales and is therefore particularly interesting for the subject of this thesis. We study the dynamics of a random network with adaptation using mean-field techniques, and we show that the network can enter a state of resonant chaos. Interestingly, the resonance frequency of this state is independent of the connectivity strength and depends only on the properties of the single-neuron model. The approach used to study networks with adaptation can also be applied when considering linear rate units with an arbitrary number of auxiliary variables. Based on a qualitative analysis of the mean-field theory for a random network whose neurons are described by a D-dimensional rate model, we conclude that the statistics of the chaotic dynamics are strongly influenced by the single-neuron model under investigation. Using a reservoir computing approach, we show preliminary evidence that slow adaptation can be beneficial when performing tasks that require slow timescales. The positive impact of adaptation on network performance is particularly strong in the presence of noise. Finally, we propose a network architecture in which the slowing-down effect of adaptation is combined with a hierarchical structure, with the purpose of efficiently generating sequences that require multiple, hierarchically organized timescales.
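The slowing-down effect of adaptation mentioned above can already be seen in a linearized two-variable unit: coupling a fast rate variable to a slow adaptation variable stretches the slowest relaxation time of the coupled system far beyond the membrane timescale. A sketch with assumed parameters, not the thesis's mean-field model:

```python
import numpy as np

# Linearized rate unit with an adaptation variable a:
#   tau_x dx/dt = -x - g a + input
#   tau_a da/dt = -a + x
# The parameters below are illustrative: a 10 ms membrane timescale
# and a 500 ms adaptation timescale with weak negative feedback g.
tau_x, tau_a, g = 0.01, 0.5, 0.1
J = np.array([[-1 / tau_x, -g / tau_x],
              [ 1 / tau_a, -1 / tau_a]])

eigs = np.linalg.eigvals(J)
# slowest relaxation time of the coupled system, in seconds
slowest = 1 / np.abs(eigs.real).min()
```

With these values the slowest mode relaxes over hundreds of milliseconds, an order of magnitude beyond tau_x, which is the single-neuron route to slow dynamics that the thesis contrasts with connectivity-based fine-tuning.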

    Applying the Free-Energy Principle to Complex Adaptive Systems

    The free energy principle is a mathematical theory of the behaviour of self-organising systems that originally gained prominence as a unified model of the brain. Since then, the theory has been applied to a plethora of biological phenomena, extending from single-celled and multicellular organisms through to niche construction and human culture, and even the emergence of life itself. The free energy principle tells us that perception and action operate synergistically to minimize an organism’s exposure to surprising biological states, which are more likely to lead to decay. A key corollary of this hypothesis is active inference—the idea that all behavior involves the selective sampling of sensory data so that we experience what we expect to (in order to avoid surprises). Simply put, we act upon the world to fulfill our expectations. It is now widely recognized that the implications of the free energy principle for our understanding of the human mind and behavior are far-reaching and profound. To date, however, its capacity to extend beyond our brain—to more generally explain living and other complex adaptive systems—has only just begun to be explored. The aim of this collection is to showcase the breadth of the free energy principle as a unified theory of complex adaptive systems—conscious, social, living, or not

    Adaptive networks for robotics and the emergence of reward anticipatory circuits

    Currently the central challenge facing evolutionary robotics is to determine how best to extend the range and complexity of behaviour supported by evolved neural systems. Implicit in the work described in this thesis is the idea that this might best be achieved through devising neural circuits (tractable to evolutionary exploration) that exhibit complementary functional characteristics. We concentrate on two problem domains: locomotion and sequence learning. For locomotion we compare the use of GasNets and other adaptive networks. For sequence learning we introduce a novel connectionist model inspired by the role of dopamine in the basal ganglia (commonly interpreted as a form of reinforcement learning). This connectionist approach relies upon a new neuron model inspired by notions of energy-efficient signalling. Two reward-adaptive circuit variants were investigated. These were applied respectively to two learning problems: first, where action sequences are required to take place in a strict order, and second, where action sequences are robust to arbitrary intermediate states. We conclude the thesis by proposing a formal model of functional integration, encompassing locomotion and sequence learning, extending ideas proposed by W. Ross Ashby. A general model of the adaptive replicator is presented, incorporating subsystems that are tuned to continuous variation and to discrete or conditional events. Comparisons are made with W. Ross Ashby's model of ultrastability and his ideas on adaptive behaviour. This model is intended to support our assertion that GasNets (and similar networks) and reward-adaptive circuits of the type presented here are intrinsically complementary. In conclusion we present some ideas on how the co-evolution of GasNets and reward-adaptive circuits might lead to significant improvements in the synthesis of agents capable of exhibiting complex adaptive behaviour.

    Second Generation General System Theory: Perspectives in Philosophy and Approaches in Complex Systems

    Following the classical work of Norbert Wiener, Ross Ashby, Ludwig von Bertalanffy and many others, the concept of System has been elaborated in different disciplinary fields, allowing interdisciplinary approaches in areas such as Physics, Biology, Chemistry, Cognitive Science, Economics, Engineering, Social Sciences, Mathematics, Medicine, Artificial Intelligence, and Philosophy. The new challenge of Complexity and Emergence has made the concept of System even more relevant to the study of problems with high contextuality. This Special Issue focuses on the nature of new problems arising from the study and modelling of complexity, their possible common aspects, properties and approaches—already partially considered by different disciplines—as well as on new, possibly unitary, theoretical frameworks. This Special Issue aims to introduce fresh impetus into systems research in cases where the detection and correction of mistakes require the development of new knowledge. This book contains contributions presenting new approaches and results, problems and proposals. The context is an interdisciplinary framework dealing, in order, with electronic engineering problems; the problem of the observer; transdisciplinarity; problems of organised complexity; theoretical incompleteness; design of digital systems in a user-centred way; reaction networks as a framework for systems modelling; emergence of a stable system in reaction networks; emergence at the fundamental systems level; and the behavioural realization of memoryless functions.