    Dynamical systems applied to consciousness and brain rhythms in a neural network

    This thesis applies the great advances of modern dynamical systems theory (DST) to consciousness. Consciousness, or subjective experience, is approached here in two different ways: from the global dynamics of the human brain and from Integrated Information Theory (IIT), one of the currently most prestigious theories of consciousness. Before that, a study of a numerical simulation of a network of individual neurons justifies the use of the Lotka-Volterra model for neuron assemblies in both applications. All these proposals are developed following this scheme:
    • First, summarizing the structure, methods and goals of the thesis.
    • Second, introducing a general background in neuroscience and in the global dynamics of the human brain, to better understand those applications.
    • Third, conducting a study of a numerically simulated network of neurons. This network, which displays brain rhythms, can be employed, among other objectives, to justify the use of the Lotka-Volterra model in the applications.
    • Fourth, summarizing concepts from mathematical DST, such as the global attractor and its informational structure, together with their particularization to a Lotka-Volterra system.
    • Fifth, introducing the new mathematical concepts of model transform and instantaneous parameters, which allow simple mathematical models such as Lotka-Volterra to be applied to complex empirical systems such as the human brain.
    • Sixth, using the model transform, and specifically the Lotka-Volterra transform, to calculate global attractors and informational structures in the global dynamics of the human brain.
    • Seventh, presenting what is probably the most prestigious theory of consciousness, the IIT developed by G. Tononi.
    • Eighth, using informational structures to develop a continuous version of IIT.
    • Ninth, establishing some final conclusions and commenting on new open questions arising from this work.
    These nine points correspond to the nine chapters of this thesis.
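
    The Lotka-Volterra description of neuron assemblies mentioned above can be made concrete in a few lines. Below is a minimal sketch, not taken from the thesis, of a generalized Lotka-Volterra system for a handful of coupled assemblies; the growth rates r and the coupling matrix A are illustrative choices.

        # Generalized Lotka-Volterra dynamics for N neural assemblies:
        # x_i' = x_i * (r_i + sum_j A_ij * x_j). All parameter values are
        # illustrative, not taken from the thesis.
        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(0)
        N = 5
        r = rng.uniform(0.5, 1.5, N)                        # intrinsic growth rates
        A = -np.eye(N) + 0.1 * rng.standard_normal((N, N))  # self-limiting coupling

        def lv(t, x):
            return x * (r + A @ x)

        sol = solve_ivp(lv, (0.0, 50.0), rng.uniform(0.1, 1.0, N))
        print("final assembly activities:", sol.y[:, -1])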

    Dynamical principles in neuroscience

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics, including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grant No. NSF/EIA-0130708 and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276; and Fundación BBVA.
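
    As a concrete instance of the rhythmic and chaotic behaviors the review refers to, here is a minimal sketch, assuming the standard Hindmarsh-Rose neuron model with textbook chaotic-bursting parameters; none of this is taken from the review itself.

        # Hindmarsh-Rose neuron: a classic three-variable model exhibiting
        # rhythmic bursting and chaos. Parameter values are the usual
        # textbook choices for chaotic bursting.
        import numpy as np
        from scipy.integrate import solve_ivp

        a, b, c, d = 1.0, 3.0, 1.0, 5.0
        r, s, xR, I = 0.006, 4.0, -1.6, 3.25

        def hindmarsh_rose(t, u):
            x, y, z = u
            return [y - a * x**3 + b * x**2 - z + I,  # membrane potential
                    c - d * x**2 - y,                 # fast recovery variable
                    r * (s * (x - xR) - z)]           # slow adaptation current

        sol = solve_ivp(hindmarsh_rose, (0, 2000), [-1.6, 0.0, 2.0], max_step=0.1)
        print("late-time membrane potential samples:", sol.y[0, -5:])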

    Informational structures: A dynamical system approach for integrated information

    Integrated Information Theory (IIT) has nowadays become the most sensible general theory of consciousness. In addition to very important statements, it opens the door to an abstract (mathematical) formulation of the theory. Given a mechanism in a particular state, IIT identifies a conscious experience with a conceptual structure, an informational object which exists, is composed of identified parts, and is informative, integrated and maximally irreducible. This paper introduces a space-time continuous version of the concept of integrated information. To this aim, a graph-theoretic and dynamical-systems treatment is used to define, for a given mechanism in a state for which a dynamics is settled, an Informational Structure, which is associated with the global attractor at each time of the system. By definition, the informational structure determines all the past and future behavior of the system, possesses an informational nature and, moreover, enriches all the points of the phase space with cause-effect power by means of its associated Informational Field. A detailed description of its inner structure, through its invariants and the connections between them, makes it possible to associate a transition probability matrix to each informational structure and to develop a measure of the level of integrated information of the system. Ministerio de Economía, Industria y Competitividad; Junta de Andalucía; Fondo Europeo de Desarrollo Regional.
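
    To make the final step tangible, here is a toy sketch of turning a small graph of invariants and connections into a row-stochastic transition matrix and scoring its integration; the mutual-information score used here is an illustrative stand-in, not the paper's actual measure.

        # Toy example: from a 4-node graph of invariants to a transition
        # probability matrix, scored by the mutual information between
        # current and next state under a uniform prior. Illustrative only.
        import numpy as np

        adj = np.array([[0, 1, 1, 0],
                        [0, 0, 1, 1],
                        [1, 0, 0, 1],
                        [1, 1, 0, 0]], dtype=float)
        P = adj / adj.sum(axis=1, keepdims=True)  # row-stochastic transitions

        prior = np.full(len(P), 1 / len(P))       # uniform prior over states
        joint = prior[:, None] * P                # p(current, next)
        p_next = joint.sum(axis=0)
        with np.errstate(divide="ignore", invalid="ignore"):
            mi = np.nansum(joint * np.log2(joint / (prior[:, None] * p_next[None, :])))
        print(f"toy integration score: {mi:.3f} bits")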

    Time delays and stimulus-dependent pattern formation in periodic environments in isolated neurons

    The dynamical characteristics of a single isolated Hopfield-type neuron with dissipation and time-delayed self-interaction under periodic stimuli are studied. Sufficient conditions for the hetero-associative stable encoding of periodic external stimuli are obtained. Both discrete and continuously distributed delays are included.
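
    A minimal sketch of the kind of system studied, assuming the common single-discrete-delay form x'(t) = -a*x(t) + b*tanh(x(t - tau)) + A*cos(omega*t); all parameter values are illustrative.

        # Single Hopfield-type neuron with dissipation, delayed
        # self-interaction, and a periodic stimulus, integrated by
        # fixed-step Euler with a history buffer.
        import numpy as np

        a, b, tau = 1.0, 1.5, 2.0      # dissipation, self-coupling, delay
        amp, omega = 0.5, 1.0          # periodic stimulus amp*cos(omega*t)
        dt, T = 0.01, 100.0
        lag = int(tau / dt)

        x = np.zeros(int(T / dt) + lag)  # zero initial history on [-tau, 0]
        for n in range(lag, len(x) - 1):
            t = (n - lag) * dt
            x[n + 1] = x[n] + dt * (-a * x[n] + b * np.tanh(x[n - lag])
                                    + amp * np.cos(omega * t))
        print("late-time response samples:", x[-5:])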

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarize results, and point to relevant references in the literature.
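
    A standard toy example from this literature, reproduced here as a hedged sketch rather than anything specific to the survey, is a polynomial ODE in the spirit of the General Purpose Analog Computer whose solution computes the sine function in continuous time.

        # GPAC-style polynomial ODE "computing" sine: y1' = y2, y2' = -y1
        # with y1(0) = 0, y2(0) = 1, so y1(t) = sin(t).
        import numpy as np
        from scipy.integrate import solve_ivp

        def gpac_sine(t, y):
            return [y[1], -y[0]]  # polynomial right-hand side

        ts = np.linspace(0, 2 * np.pi, 9)
        sol = solve_ivp(gpac_sine, (0, 2 * np.pi), [0.0, 1.0], t_eval=ts, rtol=1e-9)
        print("max error vs sin(t):", np.max(np.abs(sol.y[0] - np.sin(ts))))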

    Exponential multistability of memristive Cohen-Grossberg neural networks with stochastic parameter perturbations

    © 2020 Elsevier Ltd. All rights reserved. This manuscript is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Licence (http://creativecommons.org/licenses/by-nc-nd/4.0/). Because instability is easily induced by parameter disturbances in network systems, this paper investigates the multistability of memristive Cohen-Grossberg neural networks (MCGNNs) under stochastic parameter perturbations. It is demonstrated that stable equilibrium points of MCGNNs can be flexibly located in the odd-sequence or even-sequence regions. Some sufficient conditions are derived to ensure the exponential multistability of MCGNNs under parameter perturbations. It is found that there exist at least (w+2)^l (or (w+1)^l) exponentially stable equilibrium points in the odd-sequence (or even-sequence) regions. Two numerical examples are given to verify the correctness and effectiveness of the obtained results. Peer reviewed.
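
    The flavor of the result can be illustrated with a toy, non-memristive stand-in: a two-neuron Cohen-Grossberg-type network with strong self-excitation, whose coexisting stable equilibria persist under small random parameter perturbations. The model and all values below are illustrative, not the paper's MCGNN.

        # Count distinct equilibria of x' = -(x - W*tanh(x)) reached from
        # random initial states, with W randomly perturbed on each run.
        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(1)
        W = np.array([[2.0, -0.1], [-0.1, 2.0]])  # strong self-excitation

        def cgnn(t, x, Wp):
            return -(x - Wp @ np.tanh(x))  # amplification function set to 1

        finals = []
        for _ in range(50):
            Wp = W + 0.02 * rng.standard_normal(W.shape)  # stochastic perturbation
            x0 = rng.uniform(-3.0, 3.0, 2)
            sol = solve_ivp(cgnn, (0, 100), x0, args=(Wp,))
            finals.append(tuple(np.round(sol.y[:, -1])))
        print("distinct stable equilibria found:", len(set(finals)))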

    The Construction of Arbitrary Stable Dynamics in Non-Linear Neural Networks

    In this paper, two methods for constructing systems of ordinary differential equations realizing any fixed finite set of equilibria in any fixed finite dimension are introduced; no spurious equilibria are possible for either method. By using the first method, one can construct a system with the fewest number of equilibria, given a fixed set of attractors. Using a strict Lyapunov function for each of these differential equations, a large class of systems with the same set of equilibria is constructed. A method of fitting these nonlinear systems to trajectories is proposed. In addition, a general method which will produce an arbitrary number of periodic orbits of shapes of arbitrary complexity is also discussed. A more general second method is given to construct a differential equation which converges to a fixed given finite set of equilibria. This technique is much more general in that it allows this set of equilibria to have any of a large class of indices which are consistent with the Morse inequalities. It is clear that this class is not universal, because there is a large class of additional vector fields with convergent dynamics which cannot be constructed by the above method. The easiest way to see this is to enumerate the set of Morse indices which can be obtained by the above method and compare this class with the class of Morse indices of arbitrary differential equations with convergent dynamics. The former set of indices is a proper subclass of the latter; therefore, the above construction cannot be universal. In general, it is a difficult open problem to construct a specific example of a differential equation with a given fixed set of equilibria, permissible Morse indices, and permissible connections between stable and unstable manifolds. A strict Lyapunov function is given for this second case as well. This strict Lyapunov function, as above, enables the construction of a large class of examples consistent with these more complicated dynamics and indices. The determination of all the basins of attraction in the general case for these systems is also difficult and open. Air Force Office of Scientific Research (F49620-86-C-0037).
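
    One classical way to realize a prescribed finite set of stable equilibria, sketched below purely for illustration, is a gradient flow x' = -grad V(x) whose strict Lyapunov function V is built from the target points. This is not the paper's construction, and the saddle points forced by the Morse inequalities appear between the attractors.

        # Gradient flow with V(x) = prod_k ||x - p_k||^2, which vanishes
        # exactly at the chosen target points, making each of them a
        # stable equilibrium of x' = -grad V(x).
        import numpy as np
        from scipy.integrate import solve_ivp

        targets = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.5]])

        def grad_V(x):
            d2 = np.array([np.sum((x - p) ** 2) for p in targets])
            g = np.zeros_like(x)
            for k, p in enumerate(targets):        # product rule over factors
                g += 2 * (x - p) * np.prod(np.delete(d2, k))
            return g

        sol = solve_ivp(lambda t, x: -grad_V(x), (0, 50), [0.2, 0.3])
        print("converged near:", np.round(sol.y[:, -1], 3))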

    Dynamical models in neuroscience: the delay FitzHugh-Nagumo equation

    The first mathematical model able to describe the prototype of an excitable system comparable to a neuron was developed by R. FitzHugh and J. Nagumo in 1961. This model, however schematic, represents an important starting point for neuroscientific research into neuronal dynamics, and it is indeed the forerunner of a series of works aimed at improving the accuracy and predictive power of mathematical models for the sciences. The high degree of complexity in the study of neurons and of inter-neuronal dynamics means, however, that many of the features and potentialities of the field are not yet fully understood. This work examines a model inspired by the original work of FitzHugh and Nagumo. The model introduces a time-delayed self-coupling term into the system of differential equations, and it thereby becomes representative of mean-field models able to describe the macroscopic states of an ensemble of neurons. The introduction of the delay serves a more realistic description of neuronal systems and produces richer and more complex dynamics than in the original version of the model. The existence of a limit-cycle solution of the model including the delay term will be shown, where this solution cannot be interpreted within the framework of Hopf bifurcations. In order to explore some of the basic features of neuron modeling, the framework of dynamical systems theory is mainly used, supplemented where necessary with notions from physiology. A concluding section examines in more depth the numerical integration of delay differential equations.
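
    A minimal sketch of the delayed model discussed above, assuming a self-coupling term c*v(t - tau) added to the standard FitzHugh-Nagumo equations; the parameter values are illustrative, chosen only to exhibit delay-driven oscillations.

        # FitzHugh-Nagumo unit with time-delayed self-coupling, integrated
        # by fixed-step Euler with a history buffer.
        import numpy as np

        a, b, eps = 0.7, 0.8, 0.08   # classic FHN recovery parameters
        c, tau, I = 0.8, 5.0, 0.5    # delayed self-coupling and drive
        dt, T = 0.01, 300.0
        lag = int(tau / dt)

        v = np.full(int(T / dt) + lag, -1.0)  # constant initial history
        w = np.zeros_like(v)
        for n in range(lag, len(v) - 1):
            v[n + 1] = v[n] + dt * (v[n] - v[n] ** 3 / 3 - w[n]
                                    + c * v[n - lag] + I)
            w[n + 1] = w[n] + dt * eps * (v[n] + a - b * w[n])
        print("late-time v samples:", np.round(v[-5:], 3))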