4,998 research outputs found

    Building population models for large-scale neural recordings: opportunities and pitfalls

    Modern recording technologies now enable simultaneous recording from large numbers of neurons. This has driven the development of new statistical models for analyzing and interpreting neural population activity. Here we provide a broad overview of recent developments in this area. We compare and contrast different approaches, highlight their strengths and limitations, and discuss the biological and mechanistic insights that these methods provide.
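The class of population models this overview surveys can be illustrated with a minimal latent-variable sketch (not from the paper; all parameter values are illustrative assumptions): a single shared latent signal drives Poisson spiking across a population, and PCA on the mean-centred counts recovers the dominant shared component.

```python
# Minimal sketch: shared-latent-factor model of population spiking.
# A 1-D latent signal drives Poisson spiking in N neurons; PCA on the
# counts then recovers the shared component. Parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 1000                          # neurons, time bins

# Latent trajectory: a slow sinusoid (assumed ground truth)
z = np.sin(np.linspace(0, 8 * np.pi, T))

# Each neuron couples to the latent with its own loading and a baseline
loadings = rng.normal(0.0, 1.0, size=N)
rates = np.exp(0.5 + 0.5 * np.outer(loadings, z))   # N x T firing rates
spikes = rng.poisson(rates)                          # simulated spike counts

# PCA on mean-centred counts: first PC tracks the latent (up to sign/scale)
X = spikes - spikes.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
z_hat = Vt[0]                                        # estimated latent time course

corr = abs(np.corrcoef(z_hat, z)[0, 1])
print(f"|corr(latent, PC1)| = {corr:.2f}")
```

In practice the surveyed methods replace this two-step PCA with probabilistic fits (e.g. Poisson factor models), but the shared-variability idea is the same.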

    Neural mechanisms of resistance to peer influence in early adolescence

    During the shift from a parent-dependent child to a fully autonomous adult, peers take on a significant role in shaping the adolescent’s behaviour. Peer-derived influences are not always positive, however. Here we explore neural correlates of inter-individual differences in the probability of resisting peer influence in early adolescence. Using functional magnetic resonance imaging (fMRI), we found striking differences between 10-year-old children with high and low resistance to peer influence in their brain activity during observation of angry hand movements and angry facial expressions: compared with subjects with low resistance to peer influence, individuals with high resistance showed highly coordinated brain activity in neural systems underlying perception of action and decision making. These findings suggest that the probability of resisting peer influence depends on neural interactions during observation of emotion-laden actions.

    Statistical Physics and Representations in Real and Artificial Neural Networks

    This document presents the material of two lectures on statistical physics and neural representations, delivered by one of us (R.M.) at the Fundamental Problems in Statistical Physics XIV summer school in July 2017. In the first part, we consider the neural representations of space (maps) in the hippocampus. We introduce an extension of the Hopfield model, able to store multiple spatial maps as continuous, finite-dimensional attractors. The phase diagram and dynamical properties of the model are analyzed. We then show how spatial representations can be dynamically decoded using an effective Ising model capturing the correlation structure in the neural data, and compare applications to data obtained from hippocampal multi-electrode recordings and by (sub)sampling our attractor model. In the second part, we focus on the problem of learning data representations in machine learning, in particular with artificial neural networks. We start by introducing data representations through some illustrations. We then analyze two important algorithms, Principal Component Analysis and Restricted Boltzmann Machines, with tools from statistical physics.
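The model these lectures extend is the classic Hopfield network; a minimal sketch of that base model (not the multi-map extension itself, and with all sizes and noise levels chosen for illustration) shows Hebbian storage of binary patterns and recall from a corrupted cue via zero-temperature dynamics.

```python
# Minimal Hopfield-network sketch: Hebbian storage of random +/-1 patterns
# and recall by asynchronous zero-temperature updates. Sizes are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 5                            # neurons, stored patterns (low load)

# Random binary patterns and the Hebbian coupling matrix J_ij
patterns = rng.choice([-1, 1], size=(P, N))
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)                 # no self-coupling

# Start from a corrupted copy of pattern 0 (flip 15% of the units)
state = patterns[0].copy()
flip = rng.choice(N, size=N * 15 // 100, replace=False)
state[flip] *= -1

# Asynchronous updates: each unit aligns with its local field until stable
for _ in range(10):
    for i in rng.permutation(N):
        state[i] = 1 if J[i] @ state >= 0 else -1

overlap = (state @ patterns[0]) / N      # 1.0 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

The lectures' extension replaces these discrete point attractors with continuous, finite-dimensional attractors so that multiple spatial maps can coexist in one coupling matrix.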

    Dynamics on the manifold: Identifying computational dynamical activity from neural population recordings

    The question of how the collective activity of neural populations gives rise to complex behaviour is fundamental to neuroscience. At the core of this question lie considerations about how neural circuits can perform computations that enable sensory perception, decision making, and motor control. It is thought that such computations are implemented through the dynamical evolution of distributed activity in recurrent circuits. Thus, identifying dynamical structure in neural population activity is a key challenge towards a better understanding of neural computation. At the same time, interpreting this structure in light of the computation of interest is essential for linking the time-varying activity patterns of the neural population to ongoing computational processes. Here, we review methods that aim to quantify structure in neural population recordings through a dynamical system defined in a low-dimensional latent variable space. We discuss advantages and limitations of different modelling approaches and address future challenges for the field.
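The approach this review covers — fitting a dynamical system in a low-dimensional latent space — can be sketched in its simplest linear form (a toy illustration, not a method from the paper; all parameters are assumptions): simulate rotational latent dynamics observed through a noisy linear readout, recover the latents by PCA, and fit the dynamics matrix by least squares.

```python
# Sketch of latent linear dynamics identification: PCA to find the
# low-dimensional manifold, then least squares for z_{t+1} ~ A z_t.
# All dimensions and noise levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
T, d, N = 2000, 2, 30                    # time steps, latent dim, neurons

# Ground-truth rotational latent dynamics (slightly damped rotation)
theta = 0.05
A = 0.99 * np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
Z = np.zeros((T, d))
Z[0] = [1.0, 0.0]
for t in range(T - 1):
    Z[t + 1] = A @ Z[t] + 0.05 * rng.normal(size=d)

# Observed population activity: linear readout plus observation noise
C = rng.normal(size=(N, d))
X = Z @ C.T + 0.1 * rng.normal(size=(T, N))

# Step 1: project activity onto a low-dimensional manifold (PCA)
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z_hat = Xc @ Vt[:d].T

# Step 2: fit linear latent dynamics by least squares
A_hat, *_ = np.linalg.lstsq(Z_hat[:-1], Z_hat[1:], rcond=None)
A_hat = A_hat.T

# Eigenvalue moduli near 0.99 reveal the slow rotational structure
eig = np.linalg.eigvals(A_hat)
print("estimated eigenvalue moduli:", np.round(np.abs(eig), 3))
```

The eigenvalues of the fitted matrix are invariant to the arbitrary rotation PCA introduces, which is why dynamical structure, rather than individual latent coordinates, is the quantity such methods interpret.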