6 research outputs found

    Deep learning for chaos detection

    Get PDF
    In this article, we study how a chaos detection problem can be solved using Deep Learning techniques. We consider two classical test examples: the Logistic map as a discrete dynamical system and the Lorenz system as a continuous dynamical system. We train three types of artificial neural networks (a multi-layer perceptron, a convolutional neural network, and a long short-term memory cell) to classify time series from these systems as regular or chaotic. This approach allows us to study biparametric and triparametric regions of the Lorenz system thanks to its low computational cost compared to traditional techniques.
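As a hedged illustration of how training data for such classifiers could be produced (this is not the authors' code), the sketch below generates Logistic-map time series and labels each one as regular or chaotic via the sign of the numerically estimated Lyapunov exponent, a standard ground-truth criterion; the initial condition, orbit length, and burn-in are illustrative assumptions.

```python
import numpy as np

def logistic_series(r, x0=0.4, n=200, burn=100):
    """Orbit of the Logistic map x -> r*x*(1-x), transients discarded."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = x
    return out

def lyapunov_label(r, x0=0.4, n=5000, burn=100):
    """Label 1 (chaotic) if the Lyapunov exponent
    lambda = <log|f'(x)|> = <log|r*(1-2x)|> is positive, else 0 (regular)."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        acc += np.log(abs(r * (1 - 2 * x)))
    return int(acc / n > 0)
```

A labelled dataset for the networks would then pair `logistic_series(r)` with `lyapunov_label(r)` over a sweep of the parameter `r`.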

    Dynamics of excitable cells: spike-adding phenomena in action

    Get PDF
    We study the dynamics of action potentials of some electrically excitable cells: neurons and cardiac muscle cells. Bursting, following a fast–slow dynamics, is the most characteristic behavior of these dynamical systems, and the number of spikes may change due to the spike-adding phenomenon. Using analytical and numerical methods, we give, focusing on the paradigmatic 3D Hindmarsh–Rose neuron model, a review of recent results on the global organization of the parameter space of neuron models with bursting regions occurring between saddle-node and homoclinic bifurcations (fold/hom bursting). We provide a generic overview of the different bursting regimes that appear in the parametric phase space of the model and the bifurcations among them. These techniques are applied in two realistic frameworks: insect movement gait changes and the appearance of Early Afterdepolarizations in cardiac dynamics.
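For concreteness, here is a minimal sketch of the 3D Hindmarsh–Rose model named above, integrated with a fixed-step RK4 scheme and with spikes counted as upward threshold crossings; the parameter values (I = 2, r = 0.006, etc.), the integration step, and the threshold are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def hr_rhs(u, I=2.0, a=1.0, b=3.0, c=1.0, d=5.0, r=0.006, s=4.0, xr=-1.6):
    """3D Hindmarsh-Rose vector field: x is the membrane potential,
    y the fast recovery variable, z the slow adaptation current."""
    x, y, z = u
    return np.array([y - a * x**3 + b * x**2 - z + I,
                     c - d * x**2 - y,
                     r * (s * (x - xr) - z)])

def integrate(u0, T=2000.0, dt=0.05):
    """Fixed-step fourth-order Runge-Kutta integration."""
    steps = int(T / dt)
    traj = np.empty((steps, 3))
    u = np.asarray(u0, float)
    for i in range(steps):
        k1 = hr_rhs(u)
        k2 = hr_rhs(u + 0.5 * dt * k1)
        k3 = hr_rhs(u + 0.5 * dt * k2)
        k4 = hr_rhs(u + dt * k3)
        u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = u
    return traj

def count_spikes(x, thr=1.0):
    """Count upward crossings of the threshold (one per spike)."""
    return int(np.sum((x[:-1] < thr) & (x[1:] >= thr)))
```

Sweeping `I` and counting spikes per burst is the kind of computation behind spike-adding diagrams, although the paper's parameter-space sweeps rely on far more careful continuation tools.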

    Synaptic dependence of dynamic regimes when coupling neural populations

    Get PDF
    In this article we focus on the study of the collective dynamics of neural networks. The analysis of two recent models of coupled “next-generation” neural mass models allows us to observe different global mean dynamics of large neural populations. These models describe the mean dynamics of all-to-all coupled networks of quadratic integrate-and-fire spiking neurons. In addition, one of these models considers the influence of the synaptic adaptation mechanism on the macroscopic dynamics. We show how both models are related through a parameter, and we study the evolution of the dynamics when switching from one model to the other by varying that parameter. Interestingly, we have detected three main dynamical regimes in the coupled models: Rössler-type (funnel type), bursting-type, and spiking-like (oscillator-type) dynamics. This result raises the question of which regime is the most suitable for realistic simulations of large neural networks and shows that chaotic collective dynamics can emerge when synaptic adaptation is very weak.
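The "next-generation" description referred to above can be illustrated with the exact firing-rate mean-field of an all-to-all network of quadratic integrate-and-fire neurons (the Montbrió–Pazó–Roxin equations). The sketch below, with illustrative parameter values and no synaptic adaptation, is an assumption-laden stand-in for the two coupled models actually studied in the paper.

```python
import numpy as np

def mpr_rhs(r, v, eta=-5.0, J=15.0, delta=1.0):
    """Mean-field of an all-to-all network of QIF neurons:
    r = population firing rate, v = mean membrane potential.
    eta: mean excitability, J: synaptic strength, delta: heterogeneity."""
    dr = delta / np.pi + 2.0 * r * v
    dv = v * v + eta + J * r - (np.pi * r) ** 2
    return dr, dv

def simulate(r0=0.1, v0=-2.0, T=40.0, dt=1e-3):
    """Forward-Euler integration of the mean-field equations."""
    steps = int(T / dt)
    r, v = r0, v0
    rs = np.empty(steps)
    for i in range(steps):
        dr, dv = mpr_rhs(r, v)
        r, v = r + dt * dr, v + dt * dv
        rs[i] = r
    return rs
```

Note that the firing rate stays positive by construction (at r = 0 one has dr/dt = delta/pi > 0), which is one reason these mean-field reductions are convenient for large-scale simulation.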

    Numerical solution of inpainting methods

    No full text
    Image restoration, or inpainting, is a form of interpolation used to fill in the damaged parts of a digital image using the information obtained from its intact regions. Harmonic inpainting and TV inpainting are two inpainting methods based on second-order equations that do not yield sufficiently good results (they fail the good-continuation principle). Cahn-Hilliard inpainting for binary images and TV-H^{-1} inpainting are two inpainting methods based on the (fourth-order) Cahn-Hilliard equation. In this work, the numerical solution of the four methods is derived and implemented in MATLAB in order to restore an image.
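As a hedged, language-swapped illustration (the thesis implements the methods in MATLAB, not Python), the sketch below solves only the simplest of the four, Harmonic inpainting: Jacobi iteration of the Laplace equation on the damaged region, with the intact pixels acting as Dirichlet boundary data. The grid size, iteration count, and initial guess are arbitrary choices.

```python
import numpy as np

def harmonic_inpaint(img, mask, iters=2000):
    """Fill the masked (damaged) pixels by repeatedly replacing each one
    with the average of its four neighbours -- Jacobi iterations for the
    Laplace equation. Intact pixels are never modified (Dirichlet data).
    The hole is assumed to lie away from the image border, since np.roll
    wraps around the edges."""
    u = img.astype(float).copy()
    u[mask] = img[~mask].mean()          # crude initial guess inside the hole
    for _ in range(iters):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                      + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u[mask] = avg[mask]              # update damaged pixels only
    return u

# Damage a horizontal linear ramp with a square hole, then restore it.
# A linear ramp is harmonic, so Harmonic inpainting recovers it exactly.
img = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
mask = np.zeros(img.shape, dtype=bool)
mask[12:20, 12:20] = True
damaged = img.copy()
damaged[mask] = 0.0
restored = harmonic_inpaint(damaged, mask)
```

The same loop structure carries over to TV inpainting with a nonlinear update, whereas the fourth-order Cahn-Hilliard-based methods need a stiffer time discretization.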

    Deep Learning from a Mathematical Point of View

    No full text
    Artificial Neural Networks are a Machine Learning algorithm based on the structure of biological neurons (these neurons are organized in layers). Deep Learning is the branch of Machine Learning that includes all the techniques used to build Deep Neural Networks (Artificial Neural Networks with at least two hidden layers) that are able to learn from data with several levels of abstraction. A Feed-forward Neural Network with fully-connected layers is a Deep Neural Network whose information flows forwards, with the neurons of consecutive layers fully connected. Its architecture is based on the weights of the connections between the neurons and on the bias that each neuron adds to its received information. The values of these parameters are fitted during training. This learning process reduces to an optimization problem that can be solved using Gradient Descent or more recent algorithms such as Scheduled Restart Stochastic Gradient Descent, with Back Propagation being the algorithm used to compute the required derivatives. If the network is not able to learn correctly, overfitting or underfitting can arise. Other parameters of the neural network (hyperparameters) are not tuned during training. To perform, for example, image classification or prediction tasks, other types of Deep Neural Networks, such as Convolutional Neural Networks or Recurrent Neural Networks, must be used.
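The training loop described above can be sketched in a few lines: a fully-connected feed-forward network fitted by gradient descent, with the derivatives computed by back-propagation. The XOR task, layer width, learning rate, random seed, and cross-entropy loss are illustrative assumptions, not choices taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)          # XOR targets

# Weights and biases: the parameters fitted during training
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Forward pass through the fully-connected layers
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: back-propagation of the cross-entropy loss
    # (for sigmoid output + cross-entropy, the output-layer error is p - y)
    dp = p - y
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = dp @ W2.T * h * (1 - h)                   # chain rule through layer 1
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient-descent update of all parameters
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Here the hidden width, learning rate, and iteration count play the role of the hyperparameters mentioned in the abstract: they are fixed before training rather than learned.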

    Connecting chaotic regions in the Coupled Brusselator System

    No full text
    A family of vector fields describing two Brusselators linearly coupled by diffusion is considered. This model is a well-known example of how identical oscillatory systems can be coupled with a simple mechanism to create chaotic behavior. In this paper we discuss the relevance and possible relation of two chaotic regions. One of them is located using numerical techniques. The other one was first predicted by theoretical results and later studied via numerical and continuation techniques. As a conclusion, within the limits of our exploration, the two regions are not connected; moreover, the former is large, whereas the latter is quite small and hence might not be detected without the support of theoretical results. Our study includes a detailed analysis of singularities and local bifurcations that permits a global parametric study of the system.
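For reference, the sketch below integrates two Brusselators coupled linearly by diffusion, the family of vector fields considered here; the kinetic and coupling parameters (A = 2, B = 6, weak diffusion) are illustrative values in the oscillatory regime (B > 1 + A^2), not the values at which the paper locates the chaotic regions.

```python
import numpy as np

def coupled_brusselator(u, A=2.0, B=6.0, d1=0.05, d2=0.05):
    """Two Brusselators (x1, y1) and (x2, y2) coupled by linear diffusion
    with strengths d1 (in x) and d2 (in y)."""
    x1, y1, x2, y2 = u
    f = lambda x, y: A - (B + 1.0) * x + x * x * y   # Brusselator x-kinetics
    g = lambda x, y: B * x - x * x * y               # Brusselator y-kinetics
    return np.array([f(x1, y1) + d1 * (x2 - x1),
                     g(x1, y1) + d2 * (y2 - y1),
                     f(x2, y2) + d1 * (x1 - x2),
                     g(x2, y2) + d2 * (y1 - y2)])

def rk4(u0, T=200.0, dt=0.01):
    """Fixed-step fourth-order Runge-Kutta integration."""
    steps = int(T / dt)
    u = np.asarray(u0, float)
    traj = np.empty((steps, 4))
    for i in range(steps):
        k1 = coupled_brusselator(u)
        k2 = coupled_brusselator(u + 0.5 * dt * k1)
        k3 = coupled_brusselator(u + 0.5 * dt * k2)
        k4 = coupled_brusselator(u + dt * k3)
        u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = u
    return traj
```

Locating the chaotic regions themselves requires sweeping the coupling and kinetic parameters and applying chaos indicators or continuation software, well beyond this direct-simulation sketch.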