    Neural population coding: combining insights from microscopic and mass signals

    Behavior relies on the distributed and coordinated activity of neural populations. Population activity can be measured using multi-neuron recordings and neuroimaging. Neural recordings reveal how the heterogeneity, sparseness, timing, and correlation of population activity shape information processing in local networks, whereas neuroimaging shows how long-range coupling and brain states impact on local activity and perception. To obtain an integrated perspective on neural information processing, we need to combine knowledge from both levels of investigation. We review recent progress in how neural recordings, neuroimaging, and computational approaches are beginning to elucidate how interactions between local neural population activity and large-scale dynamics shape the structure and coding capacity of local information representations, make them state-dependent, and control distributed populations that collectively shape behavior.

    Combined behavioral and neural investigations of pup retrieval

    The ability to adequately adapt to a dramatically changing environment is crucial for an animal's survival. When female mice give birth, their environment changes drastically and they immediately need to care for their offspring, thereby ensuring the offspring's wellbeing. Pups completely transform the environment around the mouse and trigger a number of new behaviors, as they provide a slew of new sensory inputs, including tactile, olfactory, and auditory. Pups emit ultrasonic vocalizations (USVs) when isolated outside the nest, triggering retrieval behavior in mothers (MTs). After the pups have been returned to the nest and are cared for, USV emission ceases. Interestingly, not only MTs but also virgin mice can perform pup retrieval, provided that they either have experience with pups in their home cage or are repeatedly exposed to pups in a pup retrieval task. These two groups are referred to as experienced virgins (EVs) and naive virgins (NVs). Studies have shown that excitatory neurons in the auditory cortex of MTs and EVs respond more strongly to pup calls over time. However, these studies were performed under unnatural, head-restrained conditions. Here, we provide a framework in which MTs, EVs, and NVs retrieve pups in a semi-natural, freely behaving setting. During the experiment, the animals carry a head-mounted miniscope that allows imaging of activity from many neurons in the auditory cortex. The entire multisensory scene is therefore accessible to the mice, which has been shown to affect auditory responses to pup calls. We show differences in behavioral performance across the three groups: MTs display the most skilled and fine-tuned pup retrieval behavior, already highly effective during the final stage of pregnancy. EVs show slightly reduced pup retrieval abilities that are nonetheless superior to those of NVs, which retrieve pups effectively only after a few days. Additionally, we found that not only pups but also adult mice emitted USVs; intriguingly, the adults vocalized significantly more when pups were present in the behavioral arena than when they were alone. Neurons in the auditory cortex with clear responses to pup calls were scarce in all groups. Nevertheless, the overall neuronal population showed significant responses to pup calls, most clearly in MTs, less so in EVs, and least in NVs. Strikingly, other, more global and behaviorally relevant events, such as pup retrievals and nest entries and exits, showed a distinct neural signature. Despite the scarcity of clear single-cell responses to pup calls, the population of auditory cortex neurons carried information about pup call presence throughout all sessions in all groups, as measured by a decoding analysis. This population code was sparse and dynamic: a few highly informative neurons (high-weight neurons) carried most of the decoding weight in a given session. This sparsity was most pronounced in MTs and least so in NVs. Moreover, these high-weight neurons largely did not overlap with the high-weight neurons for other, non-pup-call event types. When relating single-trial pup call decoding accuracy to the behavioral performance on the same trial, we identified a significant relationship in EVs that was absent in MTs and NVs, suggesting that higher single-trial decoding accuracy was linked to better pup retrieval.
    Altogether, this study shows how different pup exposure regimes affect the learning of an essential offspring-caring behavior, and how these different forms of learning differentially enhance the neural representations of the associated sensory cues.
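
    The decoding analysis is not described in methodological detail in the abstract; below is a minimal sketch of one plausible version, assuming a cross-validated linear (logistic-regression) decoder applied to a trials-by-neurons activity matrix, with "high-weight" neurons read out from the absolute decoder weights. All variable names and data are illustrative, not the study's own.

```python
# Minimal sketch of a population decoding analysis of pup call presence.
# Assumptions (not stated in the abstract): a linear logistic-regression decoder,
# a trials x neurons activity matrix, and binary labels marking whether a pup
# call occurred in each trial. Data below are simulated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_neurons = 200, 80

# Simulated miniscope (calcium-imaging) activity: most neurons uninformative,
# a few carrying the pup-call signal, mimicking a sparse population code.
X = rng.normal(size=(n_trials, n_neurons))
y = rng.integers(0, 2, size=n_trials)        # 1 = pup call present, 0 = absent
X[y == 1, :5] += 1.0                          # only the first 5 neurons are informative

decoder = LogisticRegression(max_iter=1000)

# Cross-validated decoding accuracy for pup call presence in this session.
accuracy = cross_val_score(decoder, X, y, cv=5).mean()

# Fit on all trials to inspect the weights and identify "high-weight" neurons,
# i.e. the few neurons that carry most of the decoding weight in the session.
decoder.fit(X, y)
weights = np.abs(decoder.coef_.ravel())
high_weight_neurons = np.argsort(weights)[::-1][:5]

print(f"decoding accuracy: {accuracy:.2f}")
print(f"high-weight neurons: {high_weight_neurons}")
```

    A linear decoder keeps one interpretable weight per neuron, which makes the sparse, high-weight readout described above straightforward to quantify and to compare across event types and groups.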

    Metacognitive Instruction Delivered through a Socio-Constructive Methodology and the Development of Listening Skills in a Beginner EFL Class

    Listening instruction in EFL has attracted great interest over the last 40 years. This qualitative study reports an action research intervention with 17 beginner EFL learners, with the purpose of determining to what extent the implementation of Vandergrift's (2012) Listening Metacognitive Pedagogical Cycle, delivered through Michaelsen's Team-Based Learning methodology, affects the development of their understanding of main ideas and details and their ability to decode, as well as to what extent it increases their metacognitive awareness. Findings show interesting potential for this approach to listening instruction in the long term. In the short term, however, results are not positive. This is likely due to high cognitive demands that overload beginner EFL students' working memory capacity, to individual neurological differences, and to motivational factors, which represent a limitation of this study. In spite of its qualitative nature, the study accompanies the results with quantitative data to strengthen them. Key words: Metacognition, Team-Based Learning, Listening, EFL, A2.

    The temporal pattern of impulses in primary afferents analogously encodes touch and hearing information

    An open question in neuroscience is the contribution of the temporal relations between individual impulses in primary afferents to conveying sensory information. We investigated this question in touch and hearing, while looking for any shared coding scheme. In both systems, we artificially induced temporally diverse afferent impulse trains and probed the evoked perceptions in human subjects using psychophysical techniques. First, we investigated whether the temporal structure of a fixed number of impulses conveys information about the magnitude of tactile intensity. We found that clustering the impulses into periodic bursts elicited graded increases of intensity as a function of burst impulse count, even though fewer afferents were recruited throughout the longer bursts. The interval between successive bursts of peripheral neural activity (the burst-gap) has been demonstrated in our lab to be the most prominent temporal feature for coding skin vibration frequency, as opposed to either spike rate or periodicity. Second, given the similarities between the tactile and auditory systems, we explored the auditory system for an equivalent neural coding strategy. Using brief acoustic pulses, we showed that the burst-gap is a temporal code for pitch perception shared between the modalities. Following this evidence of parallels in temporal frequency processing, we next assessed the perceptual frequency equivalence between the two modalities using auditory and tactile pulse stimuli with simple and complex temporal features in cross-sensory frequency discrimination experiments. Identical temporal stimulation patterns in tactile and auditory afferents produced equivalent perceived frequencies, suggesting an analogous temporal frequency computation mechanism. The new insights into encoding tactile intensity through the clustering of fixed-charge electric pulses into bursts suggest a novel approach to conveying varying contact forces to neural interface users, requiring no modulation of either stimulation current or base pulse frequency. Increasing control over the temporal patterning of pulses in cochlear implant users might improve pitch perception and speech comprehension. The perceptual correspondence between touch and hearing not only suggests the possibility of establishing cross-modal comparison standards for robust psychophysical investigations, but also supports the plausibility of cross-sensory substitution devices.
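
    As a concrete illustration of the stimulus parameterization discussed above, the sketch below constructs pulse trains in which a fixed total number of impulses is clustered into periodic bursts separated by a fixed burst-gap. The function name and all parameter values are hypothetical, chosen for illustration rather than taken from the study.

```python
# Sketch of periodic-burst pulse trains with an explicit burst-gap parameter.
# The burst-gap is the silent interval between the end of one burst and the
# onset of the next. All values below are illustrative, not the study's.
import numpy as np

def burst_train(n_bursts, pulses_per_burst, intra_burst_interval, burst_gap):
    """Return pulse times (in seconds) for periodic bursts separated by a fixed burst-gap."""
    burst_duration = (pulses_per_burst - 1) * intra_burst_interval
    period = burst_duration + burst_gap               # onset-to-onset burst interval
    times = []
    for b in range(n_bursts):
        onset = b * period
        times.extend(onset + p * intra_burst_interval for p in range(pulses_per_burst))
    return np.array(times)

# Same total impulse count, different clustering: the abstract reports higher
# perceived intensity with more impulses per burst, while the burst-gap (held
# fixed here) is the temporal feature coding perceived frequency/pitch.
train_a = burst_train(n_bursts=10, pulses_per_burst=2, intra_burst_interval=0.002, burst_gap=0.048)
train_b = burst_train(n_bursts=5, pulses_per_burst=4, intra_burst_interval=0.002, burst_gap=0.048)
print(len(train_a), len(train_b))                      # both contain 20 impulses
```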

    The hearing hippocampus

    The hippocampus has a well-established role in spatial and episodic memory, but a broader function has been proposed, including aspects of perception and relational processing. The neural bases of sound analysis have been described in the pathway to auditory cortex, but the wider networks supporting auditory cognition are still being established. We review what is known about the role of the hippocampus in processing auditory information, and how the hippocampus itself is shaped by sound. Examining imaging, recording, and lesion studies in species from rodents to humans, we uncover a hierarchy of hippocampal responses to sound, including during passive exposure, active listening, and the learning of associations between sounds and other stimuli. We describe how the hippocampus' connectivity and computational architecture allow it to track and manipulate auditory information, whether in the form of speech, music, or environmental, emotional, or phantom sounds. Functional and structural correlates of auditory experience are also identified. The extent of auditory-hippocampal interactions is consistent with the view that the hippocampus makes broad contributions to perception and cognition beyond spatial and episodic memory. A deeper understanding of these interactions may unlock applications, including entraining hippocampal rhythms to support cognition and intervening in the links between hearing loss and dementia.

    Using Phonically Based E-books to Develop Reading Fluency

    The purpose of this chapter is to describe the ‘Tales of Jud the Rat’ reading fluency programme and its logic, and to present preliminary results from its use as a form of e-learning. The first section of the chapter provides an overview of the development of the ‘Tales of Jud the Rat’ series. Literature relevant to the neurolinguistic basis of the materials is then reviewed. Results from an initial case study and from the first cohort of children who have worked through the programme with their parents are presented in the third section, while the final section of the chapter evaluates the current status of the programme and indicates its potential uses.

    Ultralow-frequency neural entrainment to pain

    Nervous systems exploit regularities in the sensory environment to predict sensory input, adjust behavior, and thereby maximize fitness. Entrainment of neural oscillations allows temporal regularities of sensory information to be retained, a prerequisite for prediction. Entrainment has been extensively described at the frequencies of periodic inputs most commonly present in visual and auditory landscapes (e.g., >0.5 Hz). An open question is whether neural entrainment also occurs for regularities at much longer timescales. Here, we exploited the fact that the temporal dynamics of thermal stimuli in natural environments can unfold very slowly. We show that ultralow-frequency neural oscillations preserved a long-lasting trace of sensory information through neural entrainment to periodic thermo-nociceptive input as low as 0.1 Hz. Importantly, revealing the functional significance of this phenomenon, both the power and the phase of the entrainment predicted individual pain sensitivity. In contrast, periodic auditory input at the same ultralow frequency did not entrain ultralow-frequency oscillations. These results demonstrate that functionally significant neural entrainment can occur at temporal scales far longer than those commonly explored. The non-supramodal nature of our results suggests that ultralow-frequency entrainment might be tuned to the temporal scale of the statistical regularities characteristic of different sensory modalities.
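
    The abstract does not state how entrainment power and phase were computed; the sketch below shows one plausible way to estimate both at the 0.1 Hz stimulation frequency, by projecting a (here simulated) recording onto a complex sinusoid at that frequency. The sampling rate, duration, and signal model are assumptions.

```python
# Sketch: estimate entrainment power and phase at the 0.1 Hz stimulation frequency.
# The spectral method, sampling rate, and simulated signal are assumptions made
# for illustration; they are not the study's analysis pipeline.
import numpy as np

fs = 100.0                                  # sampling rate (Hz), illustrative
f_stim = 0.1                                # periodic thermo-nociceptive input (Hz)
t = np.arange(0, 600, 1 / fs)               # 10 minutes of signal (60 stimulus cycles)

rng = np.random.default_rng(0)
# Simulated recording: an entrained 0.1 Hz component plus broadband noise.
signal = 0.5 * np.sin(2 * np.pi * f_stim * t + 0.8) + rng.standard_normal(t.size)

# Project the signal onto a complex sinusoid at the stimulation frequency
# (equivalent, up to scaling, to reading out the DFT bin at 0.1 Hz for this window).
basis = np.exp(-2j * np.pi * f_stim * t)
coef = (signal * basis).mean()

power = np.abs(coef) ** 2                   # entrainment power at 0.1 Hz
phase = np.angle(coef)                      # entrainment phase at 0.1 Hz
print(f"power: {power:.3f}, phase: {phase:.2f} rad")
```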

    Voice and speech perception in autism: a systematic review

    Autism spectrum disorders (ASD) are characterized by persistent impairments in social communication and interaction, and by restricted, repetitive behavior. In the original description of autism by Kanner (1943), the presence of emotional impairments was already emphasized (self-absorbed, emotionally cold, distanced, and withdrawn). However, little research has focused on the auditory perception of vocal emotional cues; audio-visual comprehension has been the more commonly explored topic. Like faces, voices play an important role in the social interaction contexts in which individuals with ASD show impairments. The aim of the current systematic review was to integrate evidence from behavioral and neurobiological studies for a more comprehensive understanding of voice processing abnormalities in ASD. Among the different types of information that the human voice may provide, we hypothesize particular deficits in the processing of vocal affect information by individuals with ASD. The relationship between impairments with vocal stimuli and disrupted Theory of Mind in autism is discussed. Moreover, because ASD are characterized by deficits in social reciprocity, we further discuss the abnormal oxytocin system in individuals with ASD as a possible biological marker for abnormal vocal affect information processing and social interaction skills in the ASD population.