Arousal regulates frequency tuning in primary auditory cortex.
Changes in arousal influence cortical sensory representations, but the synaptic mechanisms underlying arousal-dependent modulation of cortical processing are unclear. Here, we use 2-photon Ca2+ imaging in the auditory cortex of awake mice to show that heightened arousal, as indexed by pupil diameter, broadens frequency-tuned activity of layer 2/3 (L2/3) pyramidal cells. Sensory representations are less sparse, and the tuning of nearby cells becomes more similar, when arousal increases. Despite the reduction in selectivity, frequency discrimination by cell ensembles improves due to a decrease in shared trial-to-trial variability. In vivo whole-cell recordings reveal that mechanisms contributing to the effects of arousal on sensory representations include state-dependent modulation of membrane potential dynamics, spontaneous firing, and tone-evoked synaptic potentials. Surprisingly, changes in short-latency tone-evoked excitatory input cannot explain the effects of arousal on the broadness of frequency-tuned output. However, we show that arousal strongly modulates a slow tone-evoked suppression of recurrent excitation underlying lateral inhibition [H. K. Kato, S. K. Asinof, J. S. Isaacson, Neuron, 95, 412-423, (2017)]. This arousal-dependent "network suppression" gates the duration of tone-evoked responses and regulates the broadness of frequency tuning. Thus, arousal can shape tuning via modulation of indirect changes in recurrent network activity.
Transfer Effect of Speech-sound Learning on Auditory-motor Processing of Perceived Vocal Pitch Errors
Speech perception and production are intimately linked. There is evidence that speech motor learning results in changes to auditory processing of speech. Whether speech motor control benefits from perceptual learning in speech, however, remains unclear. This event-related potential study investigated whether speech-sound learning can modulate the processing of feedback errors during vocal pitch regulation. Mandarin speakers were trained to perceive five Thai lexical tones while learning to associate pictures with spoken words over 5 days. Before and after training, participants produced sustained vowel sounds while they heard their vocal pitch feedback unexpectedly perturbed. As compared to the pre-training session, the magnitude of vocal compensation significantly decreased for the control group, but remained consistent for the trained group at the post-training session. However, the trained group had smaller and faster N1 responses to pitch perturbations and exhibited enhanced P2 responses that correlated significantly with their learning performance. These findings indicate that the cortical processing of vocal pitch regulation can be shaped by learning new speech-sound associations, suggesting that perceptual learning in speech can produce transfer effects that facilitate the neural mechanisms underlying the online monitoring of auditory feedback during vocal production.
Evidence that indirect inhibition of saccade initiation improves saccade accuracy
Saccadic eye-movements to a visual target are less accurate if there are distracters close to its location (local distracters). The addition of more distracters, remote from the target location (remote distracters), invokes an involuntary increase in the response latency of the saccade and attenuates the effect of local distracters on accuracy. This may be due to the target and distracters directly competing (direct route) or to the remote distracters acting to impair the ability to disengage from fixation (indirect route). To distinguish between these possibilities, we examined the development of saccade competition by recording saccade latency and accuracy for responses made to a target and local distracter, compared with those made with the addition of a remote distracter. The direct route would predict that the remote distracter impacts on the developing competition between target and local distracter, while the indirect route would predict no change, as the accuracy benefit here derives from accessing the same competitive process but at a later stage. We found that the presence of the remote distracter did not change the pattern of accuracy improvement. This suggests that the remote distracter was acting along an indirect route that inhibits disengagement from fixation, slows saccade initiation, and enables more accurate saccades to be made.
Fast, invariant representation for human action in the visual system
Humans can effortlessly recognize others' actions in the presence of complex transformations, such as changes in viewpoint. Several studies have located the regions in the brain involved in invariant action recognition; however, the underlying neural computations remain poorly understood. We use magnetoencephalography (MEG) decoding and a dataset of well-controlled, naturalistic videos of five actions (run, walk, jump, eat, drink) performed by different actors at different viewpoints to study the computational steps used to recognize actions across complex transformations. In particular, we ask when the brain discounts changes in 3D viewpoint relative to when it initially discriminates between actions. We measure the latency difference between invariant and non-invariant action decoding when subjects view full videos as well as form-depleted and motion-depleted stimuli. Our results show no difference in decoding latency or temporal profile between invariant and non-invariant action recognition in full videos. However, when either form or motion information is removed from the stimulus set, we observe a decrease and delay in invariant action decoding. Our results suggest that the brain recognizes actions and builds invariance to complex transformations at the same time, and that both form and motion information are crucial for fast, invariant action recognition.
Stereotyping starlings are more 'pessimistic'.
Negative affect in humans and animals is known to cause individuals to interpret ambiguous stimuli pessimistically, a phenomenon termed 'cognitive bias'. Here, we used captive European starlings (Sturnus vulgaris) to test the hypothesis that a reduction in environmental conditions, from enriched to non-enriched cages, would engender negative affect, and hence 'pessimistic' biases. We also explored whether individual differences in stereotypic behaviour (repetitive somersaulting) predicted 'pessimism'. Eight birds were trained on a novel conditional discrimination task with differential rewards, in which background shade (light or dark) determined which of two covered dishes contained a food reward. The reward was small when the background was light, but large when the background was dark. We then presented background shades intermediate between the two trained shades to assess the birds' bias to choose the dish associated with the smaller food reward (a 'pessimistic' judgement) when the discriminative stimulus was ambiguous. Contrary to predictions, changes in the level of cage enrichment had no effect on 'pessimism'. However, changes in the latency to choose and the probability of expressing a choice suggested that birds learnt rapidly that trials with ambiguous stimuli were unreinforced. Individual differences in the performance of stereotypies did predict 'pessimism'. Specifically, birds that somersaulted were more likely to choose the dish associated with the smaller food reward in the presence of the most ambiguous discriminative stimulus. We propose that somersaulting is part of a wider suite of behavioural traits indicative of a stress response to captive conditions that is symptomatic of a negative affective state.
A retinotopic attentional trace after saccadic eye movements: evidence from event-related potentials
Saccadic eye movements are a major source of disruption to visual stability, yet we experience little of this disruption. We can keep track of the same object across multiple saccades. It is generally assumed that visual stability is due to the process of remapping, in which retinotopically organized maps are updated to compensate for the retinal shifts caused by eye movements. Recent behavioral and ERP evidence suggests that visual attention is also remapped, but that it may still leave a residual retinotopic trace immediately after a saccade. The current study was designed to further examine electrophysiological evidence for such a retinotopic trace by recording ERPs elicited by stimuli that were presented immediately after a saccade (80 msec SOA). Participants were required to maintain attention at a specific location (and to memorize this location) while making a saccadic eye movement. Immediately after the saccade, a visual stimulus was briefly presented at either the attended location (the same spatiotopic location), a location that matched the attended location retinotopically (the same retinotopic location), or one of two control locations. ERP data revealed an enhanced P1 amplitude for the stimulus presented at the retinotopically matched location, but a significant attenuation for probes presented at the original attended location. These results are consistent with the hypothesis that visuospatial attention lingers in retinotopic coordinates immediately following gaze shifts.