
    Coupled variability in primary sensory areas and the hippocampus during spontaneous activity

    The cerebral cortex is an anatomically divided and functionally specialized structure. It comprises distinct areas, which operate in different states over time. The structural features of spiking activity in sensory cortices have been characterized during spontaneous and evoked activity. However, the coordination among cortical and sub-cortical neurons during spontaneous activity across different states remains poorly characterized. We addressed this issue by studying the temporal coupling of spiking variability recorded from primary sensory cortices and hippocampus of anesthetized or freely behaving rats. During spontaneous activity, spiking variability was highly correlated across primary cortical sensory areas at both small and large spatial scales, whereas the cortico-hippocampal correlation was modest. This general pattern of spiking variability was observed under urethane anesthesia, as well as during waking, slow-wave sleep and rapid-eye-movement sleep, and was unchanged by novel stimulation. These results support the notion that primary sensory areas are strongly coupled during spontaneous activity.

    Funding: project NORTE-01-0145-FEDER-000013, supported by the Northern Portugal Regional Operational Programme (NORTE 2020), under the Portugal 2020 Partnership Agreement, through the European Regional Development Fund (FEDER). NAPV was supported by Centro Universitario do Rio Grande do Norte, the Champalimaud Foundation, and the Brazilian National Council for Scientific and Technological Development (CNPq, Grant 249991/2013-6); CC-S by fellowship SFRH/BD/51992/2012; AJR by IF/00883/2013; and SR by UFRN, CNPq (Research Productivity Grant 308775/2015-5), and the São Paulo Research Foundation (FAPESP) Center for Neuromathematics (Grant 2013/07699-0).
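The temporal coupling of spiking variability described above can be estimated, at its simplest, as the Pearson correlation between binned spike-count series from two recording sites. A minimal sketch (the function name and the toy data are illustrative, not taken from the study):

```python
import numpy as np

def spike_count_correlation(spikes_a, spikes_b, bin_width, duration):
    """Pearson correlation of binned spike counts from two areas.

    spikes_a, spikes_b: arrays of spike times in seconds.
    """
    edges = np.arange(0.0, duration + bin_width, bin_width)
    counts_a, _ = np.histogram(spikes_a, bins=edges)
    counts_b, _ = np.histogram(spikes_b, bins=edges)
    return np.corrcoef(counts_a, counts_b)[0, 1]

# Toy data: two areas driven by a shared slow rate modulation,
# which induces correlated spike-count variability.
rng = np.random.default_rng(0)
t = np.arange(0.0, 100.0, 0.05)                    # 50 ms steps
rate = 5.0 * (1.0 + np.sin(2 * np.pi * 0.1 * t))   # shared modulation (Hz)
spikes_a = t[rng.random(t.size) < rate * 0.05]     # Bernoulli thinning
spikes_b = t[rng.random(t.size) < rate * 0.05]
r = spike_count_correlation(spikes_a, spikes_b, bin_width=1.0, duration=100.0)
```

With a shared modulation the count correlation comes out well above zero; independent homogeneous spiking would give a correlation near zero.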

    Short Conduction Delays Cause Inhibition Rather than Excitation to Favor Synchrony in Hybrid Neuronal Networks of the Entorhinal Cortex

    How stable synchrony in neuronal networks is sustained in the presence of conduction delays is an open question. The dynamic clamp was used to measure phase resetting curves (PRCs) for entorhinal cortical cells, and then to construct networks of two such neurons. PRCs were in general Type I (all advances or all delays) or weakly Type II with a small region at early phases with the opposite type of resetting. We used previously developed theoretical methods based on PRCs under the assumption of pulsatile coupling to predict the delays that synchronize these hybrid circuits. For excitatory coupling, synchrony was predicted and observed only with no delay and for delays greater than half a network period that cause each neuron to receive an input late in its firing cycle and almost immediately fire an action potential. Synchronization for these long delays was surprisingly tight and robust to the noise and heterogeneity inherent in a biological system. In contrast to excitatory coupling, inhibitory coupling led to antiphase for no delay, very short delays and delays close to a network period, but to near-synchrony for a wide range of relatively short delays. PRC-based methods show that conduction delays can stabilize synchrony in several ways, including neutralizing a discontinuity introduced by strong inhibition, favoring synchrony in the case of noisy bistability, and avoiding an initial destabilizing region of a weakly Type II PRC. PRCs can identify optimal conduction delays favoring synchronization at a given frequency, and also predict robustness to noise and heterogeneity.
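The pulsatile-coupling assumption can be illustrated with a minimal event-driven simulation of two pulse-coupled oscillators. This is a sketch under stated assumptions, not the paper's method: a linear Type I PRC stands in for the measured curves, phases are simply clamped at the firing threshold, and all names are hypothetical.

```python
import numpy as np

def simulate_pair(prc, period, delay, phi0, n_events=200):
    """Two identical pulse-coupled oscillators with a conduction delay.

    prc(phi): phase advance (in fractions of a cycle) produced by an
    input arriving at phase phi. Returns the final circular phase
    difference between the two oscillators.
    """
    phase = np.array([0.0, phi0])   # phases in [0, 1)
    t = 0.0
    pending = []                    # (arrival_time, target) pulses in flight
    for _ in range(n_events):
        t_fire = (1.0 - phase) * period     # time until each next spike
        t_next = t_fire.min()
        arrivals = [a for a, _ in pending]
        if arrivals and min(arrivals) - t < t_next:
            # Advance to the next pulse arrival and apply the PRC.
            i = int(np.argmin(arrivals))
            a, target = pending.pop(i)
            phase += (a - t) / period
            t = a
            phase[target] = min(phase[target] + prc(phase[target]), 1.0)
        else:
            # Advance to the next spike: the firer resets and sends a
            # pulse that reaches the other neuron after the delay.
            firer = int(np.argmin(t_fire))
            phase += t_next / period
            t += t_next
            phase[firer] = 0.0
            pending.append((t + delay, 1 - firer))
    d = abs(phase[0] - phase[1]) % 1.0
    return min(d, 1.0 - d)

# Zero-delay excitation with an all-advance (Type I) PRC pulls the
# pair together from an initial phase difference of 0.3.
final_diff = simulate_pair(lambda phi: 0.3 * phi, period=1.0, delay=0.0, phi0=0.3)
```

Sweeping `delay` over a network period with this kind of map is the cheapest way to see which delays favor synchrony versus antiphase for a given PRC shape.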

    Learning, Memory, and the Role of Neural Network Architecture

    The performance of information processing systems, from artificial neural networks to natural neuronal ensembles, depends heavily on the underlying system architecture. In this study, we compare the performance of parallel and layered network architectures during sequential tasks that require both acquisition and retention of information, thereby identifying tradeoffs between learning and memory processes. During the task of supervised, sequential function approximation, networks produce and adapt representations of external information. Performance is evaluated by statistically analyzing the error in these representations while varying the initial network state, the structure of the external information, and the time given to learn the information. We link performance to complexity in network architecture by characterizing local error landscape curvature. We find that variations in error landscape structure give rise to tradeoffs in performance; these include the ability of the network to maximize accuracy versus minimize inaccuracy and produce specific versus generalizable representations of information. Parallel networks generate smooth error landscapes with deep, narrow minima, enabling them to find highly specific representations given sufficient time. While accurate, these representations are difficult to generalize. In contrast, layered networks generate rough error landscapes with a variety of local minima, allowing them to quickly find coarse representations. Although less accurate, these representations are easily adaptable. The presence of measurable performance tradeoffs in both layered and parallel networks has implications for understanding the behavior of a wide variety of natural and artificial learning systems.
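The curvature characterization mentioned above can be illustrated with a finite-difference estimate of the local Hessian of an error landscape. A minimal sketch on a toy quadratic loss (a linear "network"), where the exact Hessian is known; all names are illustrative:

```python
import numpy as np

def finite_difference_hessian(loss, w, eps=1e-4):
    """Estimate the Hessian of a scalar loss at parameter vector w
    with second-order forward differences (exact for quadratic losses)."""
    n = w.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            H[i, j] = (loss(w + e_i + e_j) - loss(w + e_i)
                       - loss(w + e_j) + loss(w)) / eps**2
    return H

# Toy quadratic error landscape: squared error of a linear map.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = rng.normal(size=50)
loss = lambda w: float(np.sum((X @ w - y) ** 2))

H = finite_difference_hessian(loss, np.zeros(3))
# Eigenvalues measure local curvature: large values indicate sharp,
# narrow directions of a minimum; small values indicate flat directions.
curvature = np.linalg.eigvalsh(H)
```

For this quadratic loss the exact Hessian is 2 XᵀX, so the estimate can be checked directly; the same probe applied at a trained network's parameters distinguishes narrow from shallow minima.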

    The Inactivation Principle: Mathematical Solutions Minimizing the Absolute Work and Biological Implications for the Planning of Arm Movements

    An important question in the literature focusing on motor control is to determine which laws drive biological limb movements. This question has prompted numerous investigations analyzing arm movements in both humans and monkeys. Many theories assume that among all possible movements the one actually performed satisfies an optimality criterion. In the framework of optimal control theory, a first approach is to choose a cost function and test whether the proposed model fits with experimental data. A second approach (generally considered as the more difficult) is to infer the cost function from behavioral data. The cost proposed here includes a term called the absolute work of forces, reflecting the mechanical energy expenditure. Contrary to most investigations studying optimality principles of arm movements, this model has the particularity of using a cost function that is not smooth. First, a mathematical theory related to both direct and inverse optimal control approaches is presented. The first theoretical result is the Inactivation Principle, according to which minimizing a term similar to the absolute work implies simultaneous inactivation of agonistic and antagonistic muscles acting on a single joint, near the time of peak velocity. The second theoretical result is that, conversely, the presence of non-smoothness in the cost function is a necessary condition for the existence of such inactivation. Second, during an experimental study, participants were asked to perform fast vertical arm movements with one, two, and three degrees of freedom. Observed trajectories, velocity profiles, and final postures were accurately simulated by the model. In accordance, electromyographic signals showed brief simultaneous inactivation of opposing muscles during movements. Thus, assuming that human movements are optimal with respect to a certain integral cost, the minimization of an absolute-work-like cost is supported by experimental observations. 
Such optimality criteria may be applied to a large range of biological movements.
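As a hedged sketch of the cost's non-smooth term (the notation here is assumed, not taken verbatim from the paper), the absolute work integrates the magnitude of mechanical power delivered at each joint:

```latex
% Absolute work of the net joint torques \tau_i along a trajectory q(t).
% The integrand |\tau_i \dot{q}_i| is non-differentiable wherever joint
% power crosses zero; this non-smoothness is what permits the transient
% simultaneous inactivation of agonist and antagonist muscles.
A(q, \tau) = \int_{0}^{T} \sum_{i} \bigl| \tau_i(t)\, \dot{q}_i(t) \bigr| \, dt
```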

    Rhythm Generation through Period Concatenation in Rat Somatosensory Cortex

    Rhythmic voltage oscillations resulting from the summed activity of neuronal populations occur in many nervous systems. Contemporary observations suggest that coexistent oscillations interact and, in time, may switch in dominance. We recently reported an example of these interactions recorded from in vitro preparations of rat somatosensory cortex. We found that following an initial interval of coexistent gamma (∼25 ms period) and beta2 (∼40 ms period) rhythms in the superficial and deep cortical layers, respectively, a transition to a synchronous beta1 (∼65 ms period) rhythm in all cortical layers occurred. We proposed that the switch to beta1 activity resulted from the novel mechanism of period concatenation of the faster rhythms: gamma period (25 ms) + beta2 period (40 ms) = beta1 period (65 ms). In this article, we investigate in greater detail the fundamental mechanisms of the beta1 rhythm. To do so we describe additional in vitro experiments that constrain a biologically realistic, yet simplified, computational model of the activity. We use the model to suggest that the dynamic building blocks (or motifs) of the gamma and beta2 rhythms combine to produce a beta1 oscillation that exhibits cross-frequency interactions. Through the combined approach of in vitro experiments and mathematical modeling we isolate the specific components that promote or destroy each rhythm. We propose that mechanisms vital to establishing the beta1 oscillation include strengthened connections between a population of deep layer intrinsically bursting cells and a transition from antidromic to orthodromic spike generation in these cells. We conclude that neural activity in the superficial and deep cortical layers may temporally combine to generate a slower oscillation.
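The concatenation arithmetic quoted above can be written out directly (a trivial sketch using the approximate periods from the abstract):

```python
# Approximate cycle periods reported for rat somatosensory cortex (ms).
gamma_period = 25
beta2_period = 40

# Period concatenation: one gamma cycle followed by one beta2 cycle
# yields a single beta1 cycle.
beta1_period = gamma_period + beta2_period   # 65 ms

# The same statement in frequency terms (Hz).
gamma_freq = 1000 / gamma_period    # 40 Hz
beta2_freq = 1000 / beta2_period    # 25 Hz
beta1_freq = 1000 / beta1_period    # about 15.4 Hz
```

Note that concatenation adds periods, not frequencies: the resulting beta1 frequency is lower than either parent rhythm.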

    Pan-cancer analysis of whole genomes

    Cancer is driven by genetic change, and the advent of massively parallel sequencing has enabled systematic documentation of this variation at the whole-genome scale(1-3). Here we report the integrative analysis of 2,658 whole-cancer genomes and their matching normal tissues across 38 tumour types from the Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium of the International Cancer Genome Consortium (ICGC) and The Cancer Genome Atlas (TCGA). We describe the generation of the PCAWG resource, facilitated by international data sharing using compute clouds. On average, cancer genomes contained 4-5 driver mutations when combining coding and non-coding genomic elements; however, in around 5% of cases no drivers were identified, suggesting that cancer driver discovery is not yet complete. Chromothripsis, in which many clustered structural variants arise in a single catastrophic event, is frequently an early event in tumour evolution; in acral melanoma, for example, these events precede most somatic point mutations and affect several cancer-associated genes simultaneously. Cancers with abnormal telomere maintenance often originate from tissues with low replicative activity and show several mechanisms of preventing telomere attrition to critical levels. Common and rare germline variants affect patterns of somatic mutation, including point mutations, structural variants and somatic retrotransposition. A collection of papers from the PCAWG Consortium describes non-coding mutations that drive cancer beyond those in the TERT promoter(4); identifies new signatures of mutational processes that cause base substitutions, small insertions and deletions and structural variation(5,6); analyses timings and patterns of tumour evolution(7); describes the diverse transcriptional consequences of somatic mutation on splicing, expression levels, fusion genes and promoter activity(8,9); and evaluates a range of more-specialized features of cancer genomes(8,10-18).