
    Review: 'Neurobiology: A Functional Approach'

    The focus of this volume is on how nervous systems work and why they work as they do in the context of “the problems that brains help organisms solve” (p. xix). Accordingly, throughout this 16-chapter publication, the author's emphasis is more on neural architecture and functioning at the circuitry and systems levels of analysis than on cellular and genetic factors. In practice, though, I found a nicely balanced and constantly interwoven discussion of all of these levels of analysis. The opening chapter is an overview of neuroanatomy and organization, neural circuitry, and functional architecture. In order, the following chapters cover neural computation and neural plasticity; embryonic development; the brain’s responses to physical trauma, toxins, and pathogens; distal and proximal sensory systems and processing, and cognitive maps; muscles, glands, and vital bodily functions; posture and locomotion; spatial orientation and processing; stimulus identification; memory and memory dysfunctions; goal-directed actions; and species and gender brain differences. Although earlier chapters do provide a foundation for later material, I found that after the first three or four chapters, the remaining chapters by and large stand alone.

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarize results, and point to relevant references in the literature.

    Evolving Plasticity for Autonomous Learning under Changing Environmental Conditions

    A fundamental aspect of learning in biological neural networks is the plasticity property, which allows them to modify their configurations during their lifetime. Hebbian learning is a biologically plausible mechanism for modeling the plasticity property in artificial neural networks (ANNs), based on the local interactions of neurons. However, the emergence of a coherent global learning behavior from local Hebbian plasticity rules is not well understood. The goal of this work is to discover interpretable local Hebbian learning rules that can provide autonomous global learning. To achieve this, we use a discrete representation to encode the learning rules in a finite search space. These rules are then used to perform synaptic changes, based on the local interactions of the neurons. We employ genetic algorithms to optimize these rules to allow learning on two separate tasks (a foraging and a prey-predator scenario) in online lifetime learning settings. The evolved rules converged to a set of well-defined, interpretable types, which are thoroughly discussed. Notably, the performance of these rules, while adapting the ANNs during the learning tasks, is comparable to that of offline learning methods such as hill climbing. (Comment: Evolutionary Computation Journal)
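    The combination described in this abstract — local Hebbian rules encoded over a finite, discrete search space and optimized by a genetic algorithm — can be sketched in a few lines. The sketch below is illustrative only: the coefficient set, the parameterized rule form (a common generalized Hebb rule, dw = eta * (a*pre*post + b*pre + c*post + d)), the toy correlation-detection fitness task, and the mutation/selection operators are all assumptions, not the paper's actual encoding or tasks.

    ```python
    import random

    # Discrete coefficient alphabet: rules live in a finite search space,
    # mirroring the paper's idea of a discrete rule representation.
    COEFF_SET = (-1.0, 0.0, 1.0)

    def hebbian_update(w, pre, post, rule, eta=0.1):
        """One local synaptic update: dw = eta*(a*pre*post + b*pre + c*post + d)."""
        a, b, c, d = rule
        return w + eta * (a * pre * post + b * pre + c * post + d)

    def random_rule(rng):
        return tuple(rng.choice(COEFF_SET) for _ in range(4))

    def mutate(rule, rng):
        """Point-mutate one coefficient (the basic GA variation operator)."""
        new = list(rule)
        new[rng.randrange(4)] = rng.choice(COEFF_SET)
        return tuple(new)

    def fitness(rule, rng, trials=50):
        """Toy surrogate task: a good rule should strengthen a synapse driven by
        correlated pre/post activity more than one driven by anticorrelated
        activity. Only the pre*post (true Hebbian) term can tell them apart."""
        w_corr, w_anti = 0.0, 0.0
        for _ in range(trials):
            x = rng.random()
            w_corr = hebbian_update(w_corr, x, x, rule)        # correlated pair
            w_anti = hebbian_update(w_anti, x, 1.0 - x, rule)  # anticorrelated pair
        return w_corr - w_anti

    def evolve(generations=30, pop_size=20, seed=0):
        """Elitist GA over the discrete rule space."""
        rng = random.Random(seed)
        pop = [random_rule(rng) for _ in range(pop_size)]
        for _ in range(generations):
            # Same fitness samples for every rule (fresh seeded RNG) for a fair ranking.
            pop.sort(key=lambda r: fitness(r, random.Random(seed)), reverse=True)
            elite = pop[: pop_size // 2]
            pop = elite + [mutate(rng.choice(elite), rng)
                           for _ in range(pop_size - len(elite))]
        return pop[0]

    best = evolve()
    # The evolved rule's leading coefficient should be the pre*post (Hebbian) term.
    ```

    On this toy task the b, pre-only, c, post-only, and constant terms cancel (or average out) between the correlated and anticorrelated conditions, so selection pressure falls almost entirely on the a*pre*post coefficient — a small illustration of how an interpretable rule type can emerge from the search.
    
    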