339 research outputs found

    Interlayer antisynchronization in degree-biased duplex networks

    With synchronization being one of nature's most ubiquitous collective behaviors, the field of network synchronization has experienced tremendous growth, leading to significant theoretical developments. However, most previous studies consider uniform connection weights and undirected networks with positive coupling. In the present article, we introduce asymmetry into a two-layer multiplex network by assigning the ratio of the adjacent nodes' degrees as the weights of the intralayer edges. Despite the presence of this degree-biased weighting mechanism and attractive-repulsive coupling strengths, we are able to find the necessary conditions for intralayer synchronization and interlayer antisynchronization, and to test whether these two macroscopic states can withstand demultiplexing of the network. During the occurrence of these two states, we analytically calculate the oscillators' amplitudes. In addition to deriving the local stability conditions for interlayer antisynchronization via the master stability function approach, we also construct a suitable Lyapunov function to determine a sufficient condition for global stability. We provide numerical evidence that negative interlayer coupling strength is necessary for the occurrence of antisynchronization, and that such repulsive interlayer coupling coefficients cannot destroy intralayer synchronization. Comment: 16 pages, 5 figures (accepted for publication in Physical Review E).
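    The degree-biased weighting described above, in which each intralayer edge carries the ratio of its endpoints' degrees, can be sketched as follows. This is an illustrative reconstruction under our own assumptions (function name, NumPy representation, and edge orientation convention are not from the paper):

    ```python
    import numpy as np

    def degree_biased_weights(adj):
        """Given a binary undirected adjacency matrix, return an asymmetric
        weight matrix in which the edge from node i to node j carries the
        degree ratio k_j / k_i, as in the degree-biased weighting scheme."""
        adj = np.asarray(adj, dtype=float)
        deg = adj.sum(axis=1)                    # node degrees k_i
        w = np.zeros_like(adj)
        rows, cols = np.nonzero(adj)
        w[rows, cols] = deg[cols] / deg[rows]    # weight of edge (i, j) = k_j / k_i
        return w

    # Tiny example: a path graph 0-1-2; node 1 has degree 2, nodes 0 and 2 degree 1,
    # so the weight matrix is asymmetric even though the underlying graph is not.
    A = np.array([[0, 1, 0],
                  [1, 0, 1],
                  [0, 1, 0]])
    W = degree_biased_weights(A)
    ```

    Note that the resulting weight matrix is asymmetric whenever adjacent nodes have unequal degrees, which is exactly the source of the asymmetry the paper exploits.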

    Adaptive dynamical networks

    It is a fundamental challenge to understand how the function of a network is related to its structural organization. Adaptive dynamical networks represent a broad class of systems that can change their connectivity over time depending on their dynamical state. The most important feature of such systems is that their function depends on their structure and vice versa. While the properties of static networks have been extensively investigated in the past, the study of adaptive networks is much more challenging. Moreover, adaptive dynamical networks are of tremendous importance for various application fields, in particular for models of neuronal synaptic plasticity and adaptive networks in chemical, epidemic, biological, transport, and social systems, to name a few. In this review, we provide a detailed description of adaptive dynamical networks, show their applications in various areas of research, highlight their dynamical features, describe the arising dynamical phenomena, and give an overview of the available mathematical methods developed for understanding adaptive dynamical networks.
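    The central idea, that structure and dynamics co-evolve, can be illustrated with a minimal toy model. This is our own sketch, not a model from the review: Kuramoto phase oscillators whose coupling weights slowly adapt through a Hebbian-like plasticity rule:

    ```python
    import numpy as np

    def simulate_adaptive_kuramoto(n=10, steps=2000, dt=0.01, eps=0.1, seed=0):
        """Phase oscillators with adaptive coupling (Euler integration):
            dtheta_i/dt = omega_i + (1/n) * sum_j k_ij * sin(theta_j - theta_i)
            dk_ij/dt    = eps * (cos(theta_j - theta_i) - k_ij)
        Weights grow between oscillators that stay in phase, which in turn
        pulls those oscillators closer together: the network's structure
        (k) and its dynamical state (theta) shape each other."""
        rng = np.random.default_rng(seed)
        theta = rng.uniform(0, 2 * np.pi, n)   # initial phases
        omega = rng.normal(0, 0.1, n)          # natural frequencies
        k = np.zeros((n, n))                   # adaptive coupling weights
        for _ in range(steps):
            diff = theta[None, :] - theta[:, None]          # theta_j - theta_i
            theta = theta + dt * (omega + (k * np.sin(diff)).mean(axis=1))
            k = k + dt * eps * (np.cos(diff) - k)
        return theta, k

    theta, k = simulate_adaptive_kuramoto()
    # Kuramoto order parameter r in [0, 1] measures the degree of synchrony.
    r = abs(np.exp(1j * theta).mean())
    ```

    Because the weight update relaxes each k_ij toward cos(theta_j - theta_i), the weights stay bounded in [-1, 1] while clusters of phase-locked oscillators reinforce their own connectivity.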

    Nutrient timing, metabolism, and health in humans


    Self-learning mechanical circuits

    Computation, mechanics and materials merge in biological systems, which can continually self-optimize through internal adaptivity across length scales, from cytoplasm and biofilms to animal herds. Recent interest in such material-based computation uses the principles of energy minimization, inertia and dissipation to solve optimization problems. Although specific computations can be performed using dynamical systems, current implementations of material computation lack the ability to self-learn. In particular, the inverse problem of designing self-learning mechanical systems, which can use physical computations to continuously self-optimize, remains poorly understood. Here we introduce the concept of self-learning mechanical circuits, capable of taking mechanical inputs from changing environments and constantly updating their internal state in response, thus representing an entirely mechanical information processing unit. Our circuits are composed of a new mechanical construct: an adaptive directed spring (ADS), which changes its stiffness in a directional manner, enabling neural network-like computations. We provide both a theoretical foundation and an experimental realization of these elastic learning units and demonstrate their ability to autonomously uncover patterns hidden in environmental inputs. By implementing computations in an embodied physical manner, the system directly interfaces with its environment, thus broadening the scope of its learning behavior. Our results pave the way towards the construction of energy-harvesting, adaptive materials which can autonomously and continuously sense and self-optimize to gain function in different environments.
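    The adaptive directed spring's defining feature, stiffness that updates in response to directional loading, can be caricatured in a few lines. This is a rough toy of our own devising, not the authors' ADS model; the update rule, learning rate, and stiffness bounds are all assumptions:

    ```python
    def ads_step(stiffness, strain, learning_rate=0.05, k_min=0.1, k_max=10.0):
        """Toy update for an adaptive directed spring: stiffness increases
        under tensile loading (strain > 0) and relaxes under compression
        (strain < 0), clipped to a physically plausible range. The sign
        dependence is what makes the adaptation *directed*."""
        stiffness += learning_rate * strain
        return min(max(stiffness, k_min), k_max)

    # Repeated tensile loading stiffens the element step by step.
    k = 1.0
    for _ in range(10):
        k = ads_step(k, strain=1.0)
    ```

    Chaining many such elements, each nudged by the strains its neighbors impose, is the sense in which the circuit performs neural network-like weight updates through purely mechanical means.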

    Brain Computations and Connectivity [2nd edition]

    This is an open access title available under the terms of a CC BY-NC-ND 4.0 International licence. It is free to read on the Oxford Academic platform and offered as a free PDF download from OUP and selected open access locations. Brain Computations and Connectivity is about how the brain works. In order to understand this, it is essential to know what is computed by different brain systems and how the computations are performed. The aim of this book is to elucidate what is computed in different brain systems, and to describe current biologically plausible computational approaches and models of how each of these brain systems computes. Understanding the brain in this way has enormous potential for understanding ourselves better in health and in disease. Potential applications of this understanding are to the treatment of the brain in disease, and to artificial intelligence, which will benefit from knowledge of how the brain performs many of its extraordinarily impressive functions. This book is pioneering in taking this approach to brain function: considering what is computed by many of our brain systems, and how it is computed. It updates, with much new evidence including the connectivity of the human brain, the earlier book Rolls (2021) Brain Computations: What and How, Oxford University Press. Brain Computations and Connectivity will be of interest to all scientists interested in brain function and how the brain works, whether they come from neuroscience; from the medical sciences, including neurology and psychiatry; from computational science, including machine learning and artificial intelligence; or from areas such as theoretical physics.

    Spatial processing of conspecific signals in weakly electric fish: from sensory image to neural population coding

    In this dissertation, I examine how an animal's nervous system encodes spatially realistic conspecific signals in its environment and how the encoding mechanisms support behavioral sensitivity. I begin by modeling changes in the electrosensory signals exchanged by weakly electric fish in a social context. For this behavior, I estimate how the spatial structure of conspecific stimuli influences sensory responses at the electroreceptive periphery. I then quantify how space is represented in the hindbrain, specifically in the primary sensory area called the electrosensory lateral line lobe. I show that behavioral sensitivity is influenced by the heterogeneous properties of the pyramidal cell population. I further demonstrate that this heterogeneity serves to begin segregating spatial and temporal information early in the sensory pathway. Lastly, I characterize the accuracy of spatial coding in this network and predict the role of network elements, such as correlated noise and feedback, in shaping the spatial information. My research provides a comprehensive understanding of spatial coding in the first stages of sensory processing in this system and allows us to better understand how network dynamics shape coding accuracy.

    Biological Neuron Voltage Recordings, Driving and Fitting Mathematical Neuronal Models

    The manual process of comparing biological recordings from electrophysiological experiments to their mathematical models is time-consuming and subjective. To address this problem, we have created a blended system that allows for objective, high-throughput, and computationally inexpensive comparisons of biological and mathematical models by developing a quantitative measure of likeness (an error function). Voltage recordings from biological neurons, mathematically simulated voltage time series, and their transformations are input into the error function. These transformations and measurements are the action potential (AP) frequency, the voltage moving average, voltage envelopes, and the probability of post-synaptic channels being open. The previously recorded biological voltage time series are first translated into mathematical data to input into mathematical neurons, creating what we call a blended system. Using the sea slug Melibe leonina's swimming central pattern generator (CPG) as our circuit of comparison and the source of our biological recordings, we performed a grid search over the conductances of the inhibitory and excitatory synapses and found that a weighted sum of simple functions is required for a comprehensive view of a neuron's rhythmic behavior. The blended system was also shown to act as a rhythm director, like pacemakers and drivers of Dendronotus iris swimming interneuron (Si) cells, and was able to replicate perturbations of biological recordings. Verification steps used different configurations, the calculated means and variances of rhythmic characteristics, as well as recordings created through data augmentation. The form of data augmentation introduced can be generalized to other biological recordings or any time series. With all these tools developed and the parameter dimensions expanded, we posited the hypothesis that there is a contralateral electrical synapse not previously included in the Melibe CPG model.
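    An error function built as a weighted sum of simple per-feature discrepancies, as the abstract describes, might look like the following. This is a minimal sketch under our own assumptions (which features to include, their weights, and the threshold-crossing spike detector are illustrative choices, not the authors' implementation):

    ```python
    import numpy as np

    def spike_frequency(v, dt, thresh=0.0):
        """Estimate AP frequency (Hz) by counting upward threshold crossings."""
        crossings = np.sum((v[:-1] < thresh) & (v[1:] >= thresh))
        return crossings / (len(v) * dt)

    def moving_average(v, window):
        """Smooth a voltage trace with a simple boxcar filter."""
        return np.convolve(v, np.ones(window) / window, mode="valid")

    def blended_error(v_bio, v_model, dt, window=50, w_freq=1.0, w_avg=1.0):
        """Weighted sum of feature-wise discrepancies between a biological and
        a simulated voltage trace; more terms (envelopes, channel-open
        probability) could be added with their own weights."""
        e_freq = abs(spike_frequency(v_bio, dt) - spike_frequency(v_model, dt))
        e_avg = np.mean(np.abs(moving_average(v_bio, window)
                               - moving_average(v_model, window)))
        return w_freq * e_freq + w_avg * e_avg

    # Identical traces score zero error; any mismatch raises the score.
    v = 30.0 * np.sin(np.linspace(0, 20 * np.pi, 2000))
    e_same = blended_error(v, v, dt=1e-4)
    e_diff = blended_error(v, v + 1.0, dt=1e-4)
    ```

    In a grid search over synaptic conductances, each candidate simulation would be scored by such a function against the biological recording, and the parameter set with the lowest blended error would be retained.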