26 research outputs found

    Actions do not clearly impact auditory memory

    When memorizing a list of words, those that are read aloud are remembered better than those read silently, a phenomenon known as the production effect. There have been several attempts to explain the production effect; however, actions alone have not been examined as possible contributors. Stimuli that coincide with our own actions are processed differently from stimuli presented to us passively. These sensory response modulations may affect how action-related inputs are stored in memory. In this study, we investigated whether actions could impact auditory memory. Participants listened to sounds presented either during or in between their actions. We measured electrophysiological responses to the sounds and tested participants’ memory of them. Results showed attenuation of sensory responses for action-coinciding sounds. However, we did not find a significant effect on memory performance. The absence of significant behavioral findings suggests that the production effect may not depend on the effects of actions per se. We conclude that action alone is not sufficient to improve memory performance and thus to elicit a production effect.

    Sensory suppression effects to self-initiated sounds reflect the attenuation of the unspecific N1 component of the auditory ERP

    The suppression of the auditory N1 event-related potential (ERP) to self-initiated sounds has become a popular tool for tapping into sensory-specific forward modeling. It is assumed that processing in the auditory cortex is attenuated due to a match between the sensory stimulation and a specific sensory prediction afforded by a forward model of the motor command. The present study shows that N1 suppression was dramatically increased with long (∼3 s) stimulus onset asynchronies (SOAs), whereas P2 suppression was equal across all SOA conditions (0.8, 1.6, 3.2 s). Thus, the P2 was found to be more sensitive to self-initiation effects than the N1 at short SOAs. Moreover, only the unspecific, but not the sensory-specific, N1 components were suppressed for self-initiated sounds, suggesting that N1-suppression effects mainly reflect an attenuated orienting response. We argue that the N1-suppression effect is a rather indirect measure of sensory-specific forward models.
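    A minimal sketch of how an N1-suppression measure of this kind is often quantified, assuming synthetic single-channel ERPs rather than this study's data; the amplitudes, noise level, 80–120 ms window, and sign convention are illustrative assumptions, not details from the paper.

    ```python
    import numpy as np

    # Hedged sketch (not this study's pipeline): quantify N1 suppression as the
    # attenuation of the self-initiated N1 relative to the externally initiated N1.
    sfreq = 500.0                                  # sampling rate (Hz), assumed
    times = np.arange(-0.1, 0.5, 1.0 / sfreq)      # epoch from -100 to 500 ms
    rng = np.random.default_rng(0)

    def toy_erp(n1_amp_uv):
        """Single-channel toy ERP with an N1-like negative peak near 100 ms."""
        n1 = n1_amp_uv * np.exp(-((times - 0.10) ** 2) / (2 * 0.02 ** 2))
        return n1 + rng.normal(0.0, 0.2, times.size)

    erp_external = toy_erp(-5.0)   # assumed larger N1 to externally initiated sounds
    erp_self = toy_erp(-3.0)       # assumed attenuated N1 to self-initiated sounds

    # Mean amplitude in an N1 window; one common convention takes self minus
    # external, so a positive value indicates an attenuated (less negative) N1
    # for self-initiated sounds.
    win = (times >= 0.08) & (times <= 0.12)
    suppression_uv = erp_self[win].mean() - erp_external[win].mean()
    print(f"N1 suppression: {suppression_uv:.2f} µV")
    ```

    In practice the same mean-amplitude contrast would be computed per participant, channel, and SOA condition before statistical testing.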

    When loading working memory reduces distraction: Behavioral and electrophysiological evidence from an auditory-visual distraction paradigm

    The sensitivity of involuntary attention to top-down modulation was tested using an auditory-visual distraction task and a working memory (WM) load manipulation in subjects performing a simple visual classification task while ignoring contingent auditory stimulation. The sounds were repetitive standard tones (80%) and environmental novel sounds (20%). Distraction caused by the novel sounds was compared across a 1-back WM condition and a no-memory control condition, both involving the comparison of two digits. Event-related brain potentials (ERPs) to the sounds were recorded, and the N1/MMN (mismatch negativity), novelty-P3, and RON components were identified in the novel-minus-standard difference waveforms. Distraction was reduced in the WM condition, both behaviorally and as indexed by an attenuation of the late phase of the novelty-P3. The transient/change detection mechanism indexed by the MMN was not affected by the WM manipulation. Sustained, slow frontal and parietal waveforms related to WM processes were found in the ERPs to the standard sounds. The present results indicate that distraction caused by irrelevant novel sounds is reduced when a WM component is involved in the task, and that this modulation by WM load takes place at a late stage of the orienting response, all in all confirming that involuntary attention is under the control of top-down mechanisms. Moreover, as these results contradict predictions of the load theory of selective attention and cognitive control, it is suggested that the WM load effects on distraction depend on the nature of the distractor-target relationships.
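    A minimal sketch of the novel-minus-standard difference-waveform step mentioned above, using MNE-Python on synthetic evoked responses; the channel names, sampling rate, latencies, and amplitudes are illustrative assumptions, not values from this study.

    ```python
    import numpy as np
    import mne

    # Hedged sketch: build two synthetic evoked responses and compute the
    # novel-minus-standard difference waveform, the contrast in which components
    # such as the MMN, novelty-P3, and RON are typically identified.
    sfreq = 500.0
    info = mne.create_info(ch_names=["Fz", "Cz", "Pz"], sfreq=sfreq, ch_types="eeg")
    times = np.arange(-0.1, 0.6, 1.0 / sfreq)      # -100 to 600 ms, assumed epoch

    def synthetic_erp(peak_amp_uv):
        """Toy ERP: one Gaussian deflection near 300 ms, identical on all channels."""
        erp = peak_amp_uv * 1e-6 * np.exp(-((times - 0.30) ** 2) / (2 * 0.05 ** 2))
        return np.tile(erp, (len(info["ch_names"]), 1))

    evoked_standard = mne.EvokedArray(synthetic_erp(2.0), info, tmin=times[0],
                                      comment="standard")
    evoked_novel = mne.EvokedArray(synthetic_erp(6.0), info, tmin=times[0],
                                   comment="novel")

    # Difference waveform: novel minus standard (weights +1 and -1)
    diff_wave = mne.combine_evoked([evoked_novel, evoked_standard], weights=[1, -1])
    ch, latency = diff_wave.get_peak(tmin=0.2, tmax=0.4)  # e.g. a novelty-P3-like peak
    print(f"Peak of the difference wave at {ch}, {latency * 1000:.0f} ms")
    ```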

    [Data] Neural signatures of memory gain through active exploration in an oculomotor-auditory learning task

    This project contains raw EEG, eye-tracking, and behavioural data from the study described in Sturm, S., Costa-Faidella, J. & SanMiguel, I. (2023). Neural signatures of memory gain through active exploration in an oculomotor-auditory learning task. Authorea Preprint. https://doi.org/10.22541/au.167538067.79823402/v1

    Attention capture by novel sounds: Distraction versus facilitation

    Unexpected sounds have been shown to capture attention, triggering an orienting response. However, opposing effects of this attention capture on the performance of a concomitant visual task have been reported, in some instances leading to distraction and in others to facilitation. Moreover, the orienting response towards the unexpected stimuli can be modulated by working memory (WM) load, but the direction of this modulation has been another matter of controversy. In four experiments, we aimed to establish the critical factors that determine whether novel sounds facilitate or disrupt task performance, and how WM load modulates these effects. Depending on the overall attentional demands of the task, novel sounds led to faster or slower responses. WM load attenuated the effects of novel sounds, independently of their direction (facilitation or distraction). We propose a model in which the unexpected stimuli always generate the same orienting response but result in distraction or facilitation depending critically on the attentional focusing induced by the task at hand and on the temporal relationship between the irrelevant and task-related stimuli.

    Hearing silences: Human auditory processing relies on preactivation of sound-specific brain activity patterns

    The remarkable capabilities displayed by humans in making sense of an overwhelming amount of sensory information cannot be explained easily if perception is viewed as a passive process. Current theoretical and computational models assume that, to achieve meaningful and coherent perception, the human brain must anticipate upcoming stimulation. But how are upcoming stimuli predicted in the brain? We unmasked the neural representation of a prediction by omitting the predicted sensory input. Electrophysiological brain signals showed that when a clear prediction can be formulated, the brain activates a template of its response to the predicted stimulus before it reaches our senses.

    Interrelation of attention and prediction in visual processing: Effects of task-relevance and stimulus probability

    The potentially interactive influence of attention and prediction was investigated by measuring event-related potentials (ERPs) in a spatial cueing task with attention (task-relevant) and prediction (probabilistic) cues. We identified distinct processing stages of this interactive influence. Firstly, in line with the attentional gain hypothesis, larger amplitudes of the contralateral N1 and the Nd1 were observed for attended gratings. Secondly, conforming to the attenuation-by-prediction hypothesis, a smaller negativity was observed in the time window directly following the peak of the N1 component for predicted compared to unpredicted gratings. In line with the hypothesis that attention and prediction interact, unpredicted/unattended stimuli elicited a larger negativity at central-parietal sites, presumably reflecting an increased prediction error signal. Thirdly, larger P3 responses to unpredicted stimuli pointed to the updating of an internal model. Attention and prediction can be considered distinct mechanisms that may interact at different processing stages to optimise perception.