25 research outputs found

    Adaptive coding for dynamic sensory inference

    Behavior relies on the ability of sensory systems to infer properties of the environment from incoming stimuli. The accuracy of inference depends on the fidelity with which behaviorally relevant properties of stimuli are encoded in neural responses. High-fidelity encodings can be metabolically costly, but low-fidelity encodings can cause errors in inference. Here, we discuss general principles that underlie the tradeoff between encoding cost and inference error. We then derive adaptive encoding schemes that dynamically navigate this tradeoff. These optimal encodings tend to increase the fidelity of the neural representation following a change in the stimulus distribution, and reduce fidelity for stimuli that originate from a known distribution. We predict dynamical signatures of such encoding schemes and demonstrate how known phenomena, such as burst coding and firing rate adaptation, can be understood as hallmarks of optimal coding for accurate inference.
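The cost–error tradeoff described in this abstract can be illustrated with a minimal numerical sketch. This is not the paper's model: it assumes a Gaussian stimulus, an encoder that adds Gaussian noise, mean-squared error as the inference error, and Gaussian channel capacity as the encoding cost. All names and parameters are illustrative.

```python
import numpy as np

# Illustrative setup: stimulus s ~ N(0, S); encoded response r = s + noise,
# with noise ~ N(0, sigma2). Lower noise means higher fidelity but higher cost.
S = 1.0  # stimulus variance

def inference_error(sigma2):
    # Minimum mean-squared error of estimating s from r (posterior variance).
    return S * sigma2 / (S + sigma2)

def encoding_cost(sigma2):
    # Mutual information between s and r for a Gaussian channel (nats).
    return 0.5 * np.log(1.0 + S / sigma2)

def optimal_noise(lam, grid=np.linspace(0.01, 10.0, 1000)):
    """Noise level minimizing the combined objective error + lam * cost."""
    objective = inference_error(grid) + lam * encoding_cost(grid)
    return grid[np.argmin(objective)]
```

A larger cost weight `lam` pushes the optimum toward a noisier, lower-fidelity encoding, which is the qualitative tradeoff the abstract describes.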

    Learning, Memory, and the Role of Neural Network Architecture

    The performance of information processing systems, from artificial neural networks to natural neuronal ensembles, depends heavily on the underlying system architecture. In this study, we compare the performance of parallel and layered network architectures during sequential tasks that require both acquisition and retention of information, thereby identifying tradeoffs between learning and memory processes. During the task of supervised, sequential function approximation, networks produce and adapt representations of external information. Performance is evaluated by statistically analyzing the error in these representations while varying the initial network state, the structure of the external information, and the time given to learn the information. We link performance to complexity in network architecture by characterizing local error landscape curvature. We find that variations in error landscape structure give rise to tradeoffs in performance; these include the ability of the network to maximize accuracy versus minimize inaccuracy and to produce specific versus generalizable representations of information. Parallel networks generate smooth error landscapes with deep, narrow minima, enabling them to find highly specific representations given sufficient time. While accurate, however, these representations are difficult to generalize. In contrast, layered networks generate rough error landscapes with a variety of local minima, allowing them to quickly find coarse representations. Although less accurate, these representations are easily adaptable. The presence of measurable performance tradeoffs in both layered and parallel networks has implications for understanding the behavior of a wide variety of natural and artificial learning systems.
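The abstract's central tool, characterizing local error landscape curvature, is commonly done via the eigenvalues of the Hessian of the loss at a minimum. The sketch below is a generic illustration (not the paper's code): it estimates the Hessian by central finite differences and contrasts a "sharp, narrow" minimum with a "flat" one using toy quadratic landscapes.

```python
import numpy as np

def hessian(f, x, eps=1e-4):
    """Central finite-difference estimate of the Hessian of f at x."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * eps**2)
    return H

# Toy landscapes: a narrow minimum is steep in every direction; a flat
# minimum has at least one shallow direction.
sharp = lambda w: 50.0 * w[0]**2 + 40.0 * w[1]**2
flat  = lambda w: 50.0 * w[0]**2 + 0.1 * w[1]**2

x0 = np.zeros(2)
sharp_eigs = np.linalg.eigvalsh(hessian(sharp, x0))
flat_eigs  = np.linalg.eigvalsh(hessian(flat, x0))
```

The smallest Hessian eigenvalue separates the two cases: it is large at a deep, narrow minimum and near zero along the shallow valley of a flat one, which is the kind of curvature statistic the study uses to distinguish architectures.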

    Efficient and adaptive sensory codes

    The ability to adapt to changes in stimulus statistics is a hallmark of sensory systems. Here, we developed a theoretical framework that can account for the dynamics of adaptation from an information processing perspective. We use this framework to optimize and analyze adaptive sensory codes, and we show that codes optimized for stationary environments can suffer from prolonged periods of poor performance when the environment changes. To mitigate the adverse effects of these environmental changes, sensory systems must navigate tradeoffs between the ability to accurately encode incoming stimuli and the ability to rapidly detect and adapt to changes in the distribution of these stimuli. We derive families of codes that balance these objectives, and we demonstrate their close match to experimentally observed neural dynamics during mean and variance adaptation. Our results provide a unifying perspective on adaptation across a range of sensory systems, environments, and sensory tasks.
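The tradeoff between accurate encoding in a stationary environment and rapid adaptation after a change can be shown with a toy variance-adaptation model. This is an illustrative sketch, not the paper's framework: each tracker is a simple exponential-moving-average estimate of stimulus variance, and the adaptation rates are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stationary environment (variance 1), then a sudden switch to variance 9.
stimuli = np.concatenate([rng.normal(0.0, 1.0, 2000),
                          rng.normal(0.0, 3.0, 2000)])

def track_variance(x, rate):
    """Online variance estimate: est <- est + rate * (s^2 - est)."""
    est, out = 1.0, []
    for s in x:
        est += rate * (s**2 - est)
        out.append(est)
    return np.array(out)

fast = track_variance(stimuli, rate=0.2)    # adapts quickly, jitters more
slow = track_variance(stimuli, rate=0.005)  # stable, but slow to re-adapt
```

Before the switch, the fast tracker fluctuates more around the true variance (poorer stationary encoding); after the switch, it reaches the new variance far sooner (faster change detection) — the tension the abstract's optimal codes are designed to balance.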

    Physical and Topological Constraints on Growth in Human Brain Networks


    From photon to neuron: light, imaging, vision

    A richly illustrated undergraduate textbook on the physics and biology of light. Students in the physical and life sciences, and in engineering, need to know about the physics and biology of light. Recently, it has become increasingly clear that an understanding of the quantum nature of light is essential, both for the latest imaging technologies and to advance our knowledge of fundamental life processes, such as photosynthesis and human vision. From Photon to Neuron provides undergraduates with an accessible introduction to the physics of light and offers a unified view of a broad range of optical and biological phenomena. Along the way, this richly illustrated textbook builds the necessary background in neuroscience, photochemistry, and other disciplines, with applications to optogenetics, superresolution microscopy, the single-photon response of individual photoreceptor cells, and more. With its integrated approach, From Photon to Neuron can be used as the basis for interdisciplinary courses in physics, biophysics, sensory neuroscience, biophotonics, bioengineering, or nanotechnology. The goal is always for students to gain the fluency needed to derive every result for themselves, so the book includes a wealth of exercises, including many that guide students to create computer-based solutions. Supplementary online materials include real experimental data to use with the exercises.
    - Assumes familiarity with first-year undergraduate physics and the corresponding math
    - Overlaps the goals of the MCAT, which now includes data-based and statistical reasoning
    - Advanced chapters and sections also make the book suitable for graduate courses
    - An Instructor's Guide and illustration package is available to professors