
    "0-1" test chaosu

    The goal of this thesis is to study the 0-1 test for chaos, implement it in Matlab, and test it on suitable models. Elementary tools of dynamical systems analysis are introduced that are later used in the main results part of the thesis. The 0-1 test for chaos is presented in detail, properly defined, and implemented in Matlab. The test is then applied to two one-dimensional discrete models, the first in explicit and the second in implicit form. In both cases, simulations were carried out as a function of the state parameter, and the main results are presented in the form of the 0-1 test for chaos, phase diagrams, and bifurcation diagrams.
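    The thesis's Matlab implementation is not reproduced in this abstract; as an illustration, here is a minimal Python sketch of the standard Gottwald-Melbourne 0-1 test, applied to the logistic map. The map, parameter values, and function names are my own illustrative choices, not taken from the thesis.

```python
import numpy as np

def zero_one_test(phi, c, n_cut=None):
    """Gottwald-Melbourne 0-1 test for chaos on a scalar time series.

    Returns K: close to 1 for chaotic dynamics, close to 0 for regular."""
    N = len(phi)
    j = np.arange(1, N + 1)
    # translation variables driven by the series at frequency c
    p = np.cumsum(phi * np.cos(j * c))
    q = np.cumsum(phi * np.sin(j * c))
    n_cut = n_cut or N // 10          # use n << N for the displacement statistics
    n = np.arange(1, n_cut + 1)
    M = np.array([np.mean((p[k:] - p[:-k]) ** 2 + (q[k:] - q[:-k]) ** 2)
                  for k in n])
    # modified mean-square displacement: subtract the oscillatory term
    D = M - np.mean(phi) ** 2 * (1 - np.cos(n * c)) / (1 - np.cos(c))
    # K = correlation coefficient of D(n) with n
    return np.corrcoef(n, D)[0, 1]

def logistic(r, x0=0.1, N=3000, transient=200):
    """Iterate the logistic map x -> r*x*(1-x), discarding a transient."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    out = np.empty(N)
    for i in range(N):
        x = r * x * (1 - x)
        out[i] = x
    return out

# median over random c avoids resonant choices of the test frequency
rng = np.random.default_rng(0)
cs = rng.uniform(np.pi / 5, 4 * np.pi / 5, 20)
K_chaotic = np.median([zero_one_test(logistic(3.99), c) for c in cs])
K_regular = np.median([zero_one_test(logistic(3.2), c) for c in cs])
print(K_chaotic, K_regular)
```

    For the chaotic regime (r = 3.99) K should approach 1, while for the periodic regime (r = 3.2) it should stay near 0, which is the binary verdict the test is named for.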

    Mathematical modelling and brain dynamical networks

    In this thesis, we study the dynamics of the Hindmarsh-Rose (HR) model, which describes the spike-bursting behaviour of the membrane potential of a single neuron. We study the stability of the HR system and compute its Lyapunov exponents (LEs). We consider coupled HR systems to create an undirected brain dynamical network (BDN) of Nn neurons. We then study the upper bound of the mutual information rate (MIR) and a synchronisation measure, and their dependence on the strengths of the electrical and chemical couplings. We analyse the dynamics of neurons in various regions of parameter-space plots for two elementary examples of 3 neurons with two different types of electrical and chemical couplings. We plot the upper bound Ic, the order parameter rho (the measure of synchronisation), and the two largest Lyapunov exponents LE1 and LE2 versus the chemical coupling gn and the electrical coupling gl. We show that, even for a small number of neurons, the dynamics of the system depends on the number of neurons and on the type and strength of the couplings between them. Finally, we evolve a network of Hindmarsh-Rose neurons by increasing the entropy of the system; in particular, we choose the Kolmogorov-Sinai entropy HKS (via the Pesin identity) as the evolution rule. First, we compute HKS for a network of 4 HR neurons connected simultaneously by two undirected electrical and two undirected chemical links. We obtain different entropies for different values of the chemical and electrical couplings. If the entropy of the system is positive, its dynamics is chaotic; if it is close to zero, its trajectory converges to one of the fixed points and loses energy. We then evolve a network of 6 clusters of 10 neurons each. Neurons within each cluster are connected only by electrical links, and their connections form small-world networks; the six clusters connect to each other only by chemical links. We compare the combined effect of the chemical and electrical couplings on the two quantities, the information flow capacity Ic and HKS, in evolving the BDNs, and our results suggest that brain networks might evolve based on the principle of maximisation of their entropies.
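    The Kolmogorov-Sinai entropy enters through the Pesin identity, HKS = sum of the positive Lyapunov exponents. A minimal sketch of that computation follows, using the Hénon map in place of the much heavier HR network purely for illustration; the map, its parameters, and the function name are my choices, not from the thesis.

```python
import numpy as np

def henon_lyapunov(a=1.4, b=0.3, n_iter=20000):
    """Lyapunov spectrum of the Henon map via the Benettin QR method."""
    x, y = 0.1, 0.1
    Q = np.eye(2)
    sums = np.zeros(2)
    for _ in range(n_iter):
        # Jacobian of the map (x, y) -> (1 - a*x**2 + y, b*x)
        J = np.array([[-2 * a * x, 1.0],
                      [b, 0.0]])
        x, y = 1 - a * x * x + y, b * x
        # re-orthonormalise the tangent vectors each step
        Q, R = np.linalg.qr(J @ Q)
        sums += np.log(np.abs(np.diag(R)))
    return sums / n_iter

les = henon_lyapunov()
# Pesin identity: H_KS equals the sum of the positive Lyapunov exponents
h_ks = sum(le for le in les if le > 0)
print(les, h_ks)
```

    As a built-in sanity check, the exponents must sum to log|det J| = log(b), since the Hénon map contracts area at a constant rate; the single positive exponent is then the KS entropy under the Pesin identity.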

    Alternative numerical computation of one-sided Levy and Mittag-Leffler distributions

    We consider here the recently proposed closed-form formula, in terms of Meijer G-functions, for the probability density functions g_α(x) of one-sided Lévy stable distributions with rational index α = l/k, 0 < α < 1. Since one-sided Lévy and Mittag-Leffler distributions are known to be related, this formula could also be useful for calculating the probability density functions ρ_α(x) of the latter. We show, however, that the formula is computationally inviable for fractions with large denominators, being impractical even for some modest values of l and k. We present a fast and accurate numerical scheme, based on an early integral representation due to Mikusiński, for the evaluation of g_α(x) and ρ_α(x), their cumulative distribution functions, and their derivatives for any real index α ∈ (0,1). As an application, we explore some properties of these probability density functions. In particular, we determine the location and value of their maxima as functions of the index α. We show that α ≈ 0.567 and α ≈ 0.605 correspond, respectively, to the one-sided Lévy and Mittag-Leffler distributions with the shortest maxima. We close by discussing how our results can elucidate some recently described dynamical behavior of intermittent systems.
    Comment: 6 pages, 5 figures. New references added, final version to appear in PRE. Numerical code available at http://vigo.ime.unicamp.br/dist
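    The paper's scheme rests on an integral representation of g_α(x); as a hedged sketch of the idea (using the closely related Pollard integral rather than the Mikusiński representation the authors adopt), one can evaluate the density by straightforward quadrature and check it against the closed form available at α = 1/2, the classical Lévy distribution:

```python
import numpy as np
from scipy.integrate import quad

def levy_pdf(x, alpha):
    """One-sided Levy stable density g_alpha(x), Laplace transform e^{-s^alpha},
    via Pollard's integral representation, valid for 0 < alpha < 1."""
    def integrand(u):
        return (np.exp(-x * u - u**alpha * np.cos(np.pi * alpha))
                * np.sin(u**alpha * np.sin(np.pi * alpha)))
    val, _ = quad(integrand, 0, np.inf, limit=200)
    return val / np.pi

# sanity check at alpha = 1/2, where the closed form is known:
# g_{1/2}(x) = exp(-1/(4x)) / (2 sqrt(pi) x^{3/2})
x = 0.8
exact = np.exp(-1 / (4 * x)) / (2 * np.sqrt(np.pi) * x**1.5)
print(levy_pdf(x, 0.5), exact)
```

    The same quadrature idea extends to the Mittag-Leffler densities ρ_α(x) through their known relation to g_α(x), which is the route the paper exploits.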

    Strange nonchaotic stars

    The unprecedented light curves of the Kepler space telescope document how the brightness of some stars pulsates at primary and secondary frequencies whose ratios are near the golden mean, the most irrational number. A nonlinear dynamical system driven by an irrational ratio of frequencies generically exhibits a strange but nonchaotic attractor. For Kepler's "golden" stars, we present evidence of the first observation of strange nonchaotic dynamics in nature outside the laboratory. This discovery could aid the classification and detailed modeling of variable stars.
    Comment: 5 pages, 4 figures, published in Physical Review Letters
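    The defining signature of strange nonchaotic dynamics is a geometrically strange attractor with a negative largest Lyapunov exponent. A minimal sketch using the classic GOPY map (a textbook quasiperiodically forced system driven at the golden-mean frequency ratio, not the stellar model of the paper; all parameters here are illustrative):

```python
import numpy as np

def gopy_lyapunov(sigma=1.5, n_iter=50000, transient=1000):
    """Nontrivial Lyapunov exponent of the GOPY map
    x -> 2*sigma*tanh(x)*cos(2*pi*theta),  theta -> theta + omega (mod 1),
    quasiperiodically forced at the golden-mean frequency ratio omega."""
    omega = (np.sqrt(5) - 1) / 2      # golden mean: the "most irrational" drive
    x, theta = 0.5, 0.1
    s, count = 0.0, 0
    for i in range(transient + n_iter):
        # derivative dx_{n+1}/dx_n, evaluated before updating the state
        deriv = 2 * sigma * np.cos(2 * np.pi * theta) / np.cosh(x) ** 2
        x = 2 * sigma * np.tanh(x) * np.cos(2 * np.pi * theta)
        theta = (theta + omega) % 1
        if i >= transient:
            s += np.log(abs(deriv) + 1e-300)   # guard against log(0)
            count += 1
    return s / count

lam = gopy_lyapunov()
print(lam)
```

    For sigma > 1 the attractor is strange (nowhere differentiable) yet the exponent comes out negative, so nearby trajectories do not separate exponentially: strange but nonchaotic, the same dynamical class the paper identifies in the "golden" stars.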

    Structured chaos shapes spike-response noise entropy in balanced neural networks

    Large networks of sparsely coupled excitatory and inhibitory cells occur throughout the brain. A striking feature of these networks is that they are chaotic. How does this chaos manifest in the neural code? Specifically, how variable are the spike patterns that such a network produces in response to an input signal? To answer this, we derive a bound on the entropy of multi-cell spike-pattern distributions in large recurrent networks of spiking neurons responding to fluctuating inputs. The analysis is based on results from random dynamical systems theory and is complemented by detailed numerical simulations. We find that the spike-pattern entropy is an order of magnitude lower than what would be extrapolated from single cells. This holds despite the fact that network coupling becomes vanishingly sparse as network size grows -- a phenomenon that depends on "extensive chaos," as previously discovered for balanced networks without stimulus drive. Moreover, we show how spike-pattern entropy is controlled by temporal features of the inputs. Our findings provide insight into how neural networks may encode stimuli in the presence of inherently chaotic dynamics.
    Comment: 9 pages, 5 figures
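    The paper's bound is derived analytically, but the comparison it makes — joint spike-pattern entropy versus the extrapolation from single cells — can be illustrated with a plug-in estimate on synthetic spike trains. Everything below (the shared-drive toy model, rates, and names) is my own illustrative construction, not the paper's network:

```python
import numpy as np

def word_entropy(symbols):
    """Plug-in (maximum-likelihood) entropy estimate, in bits."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
T, n_cells = 50000, 5
# toy correlated population: a shared drive gates every cell's spiking
shared = rng.random(T) < 0.2
spikes = np.array([(rng.random(T) < 0.5) & shared for _ in range(n_cells)])

# encode each time bin's population spike pattern as one integer "word"
weights = 2 ** np.arange(n_cells)
words = weights @ spikes                     # shape (T,)

H_joint = word_entropy(words)                                    # network entropy
H_indep = sum(word_entropy(spikes[i]) for i in range(n_cells))   # single-cell sum
print(H_joint, H_indep)
```

    Because the shared drive correlates the cells, the joint spike-word entropy falls well below the sum of single-cell entropies — a toy version of the gap between network spike-pattern entropy and the single-cell extrapolation that the paper quantifies.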