    Heroes of the engram

    In 1904, Richard Semon introduced the term “engram” to describe the neural substrate responsible for (or at least important in) storing and recalling memories (i.e., a memory trace). The recent introduction of a vast array of powerful new tools to probe and manipulate memory function at the cell and neuronal circuit level has spurred an explosion of interest in studying the engram. However, the present “engram renaissance” did not arise in isolation but rather builds on a long tradition of memory research. We believe it is important to acknowledge the debts our current generation of scientists owes to those who offered key ideas, persevered through failed experiments, and made important discoveries before us. Examining the past can also offer a fresh perspective on the present state and future promise of the field. Given the many empirical advances made in recent years, it seems particularly timely to look back and review the scientists who introduced the seminal terminology, concepts, methodological approaches, and initial data pertaining to engrams. Rather than simply list their many accomplishments, here we color in some details of the lives and milestone contributions of our seven personal heroes of the engram (Richard Semon, Karl Lashley, Donald Hebb, Wilder Penfield, Brenda Milner, James McConnell, and Richard Thompson). In reviewing their historic roles, we also illustrate how their work remains relevant to today’s studies.

    Development and Validation of a Sensitive Entropy-Based Measure for the Water Maze

    In the water maze, mice are trained to navigate to an escape platform located below the water's surface, and spatial learning is most commonly evaluated in a probe test in which the platform is removed from the pool. While contemporary tracking software provides precise positional information on mice for the duration of the probe test, existing performance measures (e.g., percent quadrant time, platform crossings) fail to fully exploit the richness of these positional data. Using the concept of entropy (H), here we develop a new measure that considers both how focused the search is and the degree to which searching is centered on the former platform location. To evaluate how H performs compared to existing measures of water maze performance, we compiled five separate databases containing more than 1600 mouse probe tests. Random selection of individual trials from the respective databases then allowed us to simulate experiments with varying sample and effect sizes. Using this Monte Carlo-based method, we found that H outperformed existing measures in its ability to detect group differences over a range of sample or effect sizes. Additionally, we validated the new measure using three models of experimentally induced hippocampal dysfunction: (1) complete hippocampal lesions, (2) genetic deletion of αCaMKII, a gene implicated in hippocampal behavioral and synaptic plasticity, and (3) a mouse model of Alzheimer's disease. Together, these data indicate that H offers greater sensitivity than existing measures, most likely because it exploits the richness of the precise positional information of the mouse throughout the probe test.
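    The abstract does not spell out the formula for H; as a minimal sketch, one entropy-style score with the two stated properties (sensitivity to both the focus of the search and its centering on the former platform location) can be built from the second-moment matrix of the tracked positions about the platform, via the differential entropy of a fitted Gaussian. The function name and the Gaussian summary are assumptions, not the published measure.

```python
import numpy as np

def entropy_score(xy, platform):
    """Entropy-style water-maze probe-test score (illustrative).

    xy       : (n, 2) array of tracked positions during the probe test
    platform : (2,)  former platform location

    The second-moment matrix about the platform grows with both the
    spread of the search (focus) and the offset of its center from the
    platform (centering); the differential entropy of a Gaussian with
    that matrix gives a single scalar. Lower = better-targeted search.
    """
    d = np.asarray(xy, dtype=float) - np.asarray(platform, dtype=float)
    m = d.mean(axis=0)
    # second moment about the platform = covariance + mean-offset term
    second_moment = np.cov(d.T) + np.outer(m, m)
    sign, logdet = np.linalg.slogdet(2.0 * np.pi * np.e * second_moment)
    return 0.5 * logdet  # differential entropy of the fitted Gaussian
```

A focused search centered on the platform yields a lower score than a diffuse, off-center one, which is the ordering the measure is meant to detect.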

    Neurogenesis-mediated forgetting minimizes proactive interference.

    Established memories may interfere with the encoding of new memories, particularly when existing and new memories overlap in content. By manipulating levels of hippocampal neurogenesis, here we show that neurogenesis regulates this form of proactive interference. Increasing hippocampal neurogenesis weakens existing memories and, in doing so, facilitates the encoding of new, conflicting (but not non-conflicting) information in mice. Conversely, decreasing neurogenesis stabilizes existing memories and impedes the encoding of new, conflicting information. These results suggest that reduced proactive interference is an adaptive benefit of neurogenesis-induced forgetting.

    Flight Lieutenant Peach's observations on Burning Feet Syndrome in Far Eastern Prisoners of War 1942-45.

    Introduction: ‘Burning Feet Syndrome’ affected up to one third of Far Eastern Prisoners of War in World War 2. Recently discovered medical records, produced by RAF Medical Officer Nowell Peach whilst in captivity, are the first to detail neurological examinations of patients with this condition. Methods: The 54 sets of case notes produced at the time were analysed using modern diagnostic criteria to determine whether the syndrome can be retrospectively classed as neuropathic pain. Results: With a history of severe malnutrition raising the possibility of a peripheral polyneuropathy, and a neuroanatomically plausible pain distribution, this analysis showed that Burning Feet Syndrome can now be described as a ‘possible’ neuropathic pain syndrome. Conclusion: After 70 years, the data painstakingly gathered under the worst of circumstances have proved to be of interest and value in the modern diagnosis of neuropathic pain.

    Learning Representations that Support Extrapolation

    Extrapolation -- the ability to make inferences that go beyond the scope of one's experiences -- is a hallmark of human intelligence. By contrast, the generalization exhibited by contemporary neural network algorithms is largely limited to interpolation between data points in their training corpora. In this paper, we consider the challenge of learning representations that support extrapolation. We introduce a novel visual analogy benchmark that allows the graded evaluation of extrapolation as a function of distance from the convex domain defined by the training data. We also introduce a simple technique, temporal context normalization, that encourages representations that emphasize the relations between objects. We find that this technique enables a significant improvement in the ability to extrapolate, considerably outperforming a number of competitive techniques.
    Comment: ICML 202
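    A minimal sketch of temporal context normalization as described above, assuming it amounts to z-scoring each feature across the time steps of its own sequence; the tensor layout, the epsilon, and the absence of any learnable gain/shift parameters are assumptions, not details from the paper.

```python
import numpy as np

def temporal_context_norm(z, eps=1e-8):
    """Normalize activations over the temporal dimension of each sequence.

    z : array of shape (batch, time, features)

    Each feature is z-scored across the time steps of its own sequence,
    so only the relations between the items in a sequence survive --
    absolute magnitudes shared by the whole sequence are removed.
    """
    mu = z.mean(axis=1, keepdims=True)     # per-sequence temporal mean
    sigma = z.std(axis=1, keepdims=True)   # per-sequence temporal std
    return (z - mu) / (sigma + eps)
```

Because the statistics are computed per sequence, a test sequence far from the training domain is mapped into the same normalized range the network saw during training, which is one intuition for why such a step could aid extrapolation.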

    Fragment size correlations in finite systems - application to nuclear multifragmentation

    We present a new method for the calculation of fragment size correlations in a discrete finite system, in which correlations explicitly due to the finite extent of the system are suppressed. To this end, we introduce a combinatorial model that describes the fragmentation of a finite system as a sequence of independent random emissions of fragments. The sequence is accepted when the sum of the fragment sizes equals the total size. The parameters of the model, which may be used to calculate all partition probabilities, are the intrinsic probabilities associated with the fragments. Any fragment size correlation function can be built by calculating the ratio between the partition probabilities in the data sample (resulting from an experiment or from a Monte Carlo simulation) and the 'independent emission' model partition probabilities. This technique is applied to the charge correlations introduced by Moretto and collaborators. It is shown that the percolation and nuclear statistical multifragmentation (SMM) models are almost independent-emission models, whereas the nuclear spinodal decomposition model (BOB) shows strong correlations corresponding to the break-up of the hot dilute nucleus into nearly equal-sized fragments.
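    The independent-emission model described above lends itself to a direct Monte Carlo sketch: draw fragment sizes independently with their intrinsic probabilities and accept the sequence only when the sizes sum exactly to the total size. The fragment sizes and weights below are illustrative.

```python
import random

def sample_partition(total_size, intrinsic, rng, max_tries=100_000):
    """Independent-emission fragmentation of a finite system.

    total_size : conserved total size of the system
    intrinsic  : dict mapping fragment size -> intrinsic probability/weight
    rng        : random.Random instance

    Fragments are emitted independently; a sequence is accepted only if
    its sizes sum exactly to total_size, as in the combinatorial model.
    """
    sizes = list(intrinsic)
    weights = [intrinsic[s] for s in sizes]
    for _ in range(max_tries):
        partition, remaining = [], total_size
        while remaining > 0:
            s = rng.choices(sizes, weights=weights)[0]
            partition.append(s)
            remaining -= s
        if remaining == 0:          # accept only exact partitions
            return sorted(partition)
    raise RuntimeError("no partition accepted within max_tries")
```

Estimating partition probabilities from many accepted samples and dividing the data-sample partition probabilities by them yields the correlation functions the abstract refers to.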

    Yield scaling, size hierarchy and fluctuations of observables in fragmentation of excited heavy nuclei

    Multifragmentation properties measured with INDRA are studied for single sources produced in Xe+Sn reactions in the incident energy range 32-50 A MeV and for quasi-projectiles from Au+Au collisions at 80 A MeV. A comparison of both types of sources is presented concerning Fisher scaling, Zipf's law, fragment size, and fluctuation observables. A Fisher scaling is observed for all the data. The pseudo-critical energies extracted from the Fisher scaling are consistent between Xe+Sn central collisions and Au quasi-projectiles. In the latter case this also corresponds to the energy region at which fluctuations are maximal. The critical energies deduced from the Zipf analysis are higher than those from the Fisher analysis.
    Comment: 30 pages, accepted for publication in Nuclear Physics A, references corrected
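    The Zipf analysis mentioned above orders the fragments of each event by size and examines how the mean size falls with rank; a minimal sketch, assuming a power-law fit Z_n ∝ n^(-λ) to the mean rank-ordered sizes by least squares in log-log space (the function name and fitting choice are illustrative):

```python
import numpy as np

def zipf_exponent(mean_size_by_rank):
    """Fit <Z_n> = c * n**(-lam) to mean fragment size vs. rank.

    mean_size_by_rank : sequence of mean fragment sizes, rank 1 first
                        (rank 1 = largest fragment in each event)

    Returns the Zipf exponent lam from a least-squares line in
    log-log space; lam close to 1 signals a Zipf-like size hierarchy.
    """
    z = np.asarray(mean_size_by_rank, dtype=float)
    n = np.arange(1, len(z) + 1)
    slope, _intercept = np.polyfit(np.log(n), np.log(z), 1)
    return -slope
```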

    Fetal alcohol exposure leads to abnormal olfactory bulb development and impaired odor discrimination in adult mice

    Background: Children whose mothers consumed alcohol during pregnancy exhibit widespread brain abnormalities and a complex array of behavioral disturbances. Here, we used a mouse model of fetal alcohol exposure to investigate relationships between brain abnormalities and specific behavioral alterations during adulthood. Results: Mice drank a 10% ethanol solution