16 research outputs found

    Does the accuracy of visual perception decrease in the area of one's own hand movement? Differences in perceived contrast between the area where the hand moves and an area without movement

    This thesis investigated the reduction of sensory precision in a virtual reality (VR) environment. The study was grounded in the theory of active inference, from which the research question was derived: how does the movement of one's own hand affect the perceived contrast of a target stimulus located in the same region? Twenty-seven participants took part in the experiment; their task was to choose the higher-contrast of two spheres of differing contrast displayed in the VR environment while simultaneously making a hand gesture in the region of the spheres. The analysis showed that the contrast of the sphere located "behind the hand" was not judged as statistically significantly different from the control stimulus.

    Cultural differences in perceiving sounds generated by others: self matters

    Sensory consequences resulting from our own movements receive different neural processing from externally generated sensory consequences (e.g., those produced by a computer), leading to sensory attenuation, i.e., a reduction in perceived loudness or in brain evoked responses. However, findings from different cultural regions disagree about whether sensory attenuation is also present for sensory consequences generated by others. In this study, we performed a cross-cultural comparison (between Chinese and British participants) of the processing of sensory consequences (perceived loudness) from self and others relative to an external source in the auditory domain. We found a cultural difference in the processing of sensory consequences generated by others, with only Chinese, and not British, participants showing the sensory attenuation effect. Sensory attenuation in this case was correlated with independent self-construal scores. The sensory attenuation effect for self-generated sensory consequences was not replicated; however, a correlation with delusional ideation was observed for British participants. These findings are discussed with respect to mechanisms of sensory attenuation.

    Voice-selective prediction alterations in nonclinical voice hearers

    Auditory verbal hallucinations (AVH) are a cardinal symptom of psychosis but also occur in 6-13% of the general population. Voice perception is thought to engage an internal forward model that generates predictions, preparing the auditory cortex for upcoming sensory feedback. Impaired processing of sensory feedback in vocalization seems to underlie the experience of AVH in psychosis, but whether this is the case in nonclinical voice hearers remains unclear. The current study used electroencephalography (EEG) to investigate whether and how hallucination predisposition (HP) modulates the internal forward model in response to self-initiated tones and self-voices. Participants varying in HP (based on the Launay-Slade Hallucination Scale) listened to self-generated and externally generated tones or self-voices. HP did not affect responses to self vs. externally generated tones. However, HP altered the processing of the self-generated voice: increased HP was associated with increased pre-stimulus alpha power and increased N1 response to the self-generated voice. HP did not affect the P2 response to voices. These findings confirm that both prediction and comparison of predicted and perceived feedback to a self-generated voice are altered in individuals with AVH predisposition. Specific alterations in the processing of self-generated vocalizations may establish a core feature of the psychosis continuum.

    The authors gratefully acknowledge all the participants who collaborated in the study, and particularly Dr. Franziska Knolle for feedback on stimulus generation, Carla Barros for help with scripts for EEG time-frequency analysis, and Dr. Celia Moreira for her advice on mixed linear models. This work was supported by the Portuguese Science National Foundation (FCT; grant numbers PTDC/PSI-PCL/116626/2010, IF/00334/2012, PTDC/MHCPCN/0101/2014) awarded to APP.

    Action-outcome learning and prediction shape the window of simultaneity of audiovisual outcomes

    To form a coherent representation of the objects around us, the brain must group the different sensory features composing these objects. Here, we investigated whether actions contribute to this grouping process. In particular, we assessed whether action-outcome learning and prediction contribute to audiovisual temporal binding. Participants were presented with two audiovisual pairs: one pair was triggered by a left action, and the other by a right action. In a later test phase, the audio and visual components of these pairs were presented at different onset times. Participants judged whether they were simultaneous or not. To assess the role of action-outcome prediction on audiovisual simultaneity, each action triggered either the same audiovisual pair as in the learning phase ('predicted' pair), or the pair that had previously been associated with the other action ('unpredicted' pair). We found that the time window within which auditory and visual events appeared simultaneous increased for predicted compared to unpredicted pairs. However, no change in audiovisual simultaneity was observed when audiovisual pairs followed visual cues, rather than voluntary actions. This suggests that only action-outcome learning promotes temporal grouping of audio and visual effects. In a second experiment we observed that changes in audiovisual simultaneity depend not only on our ability to predict what outcomes our actions generate, but also on learning the delay between the action and the multisensory outcome. When participants learned that the delay between action and audiovisual pair was variable, the window of audiovisual simultaneity for predicted pairs increased, relative to a fixed action-outcome delay. This suggests that participants learn action-based predictions of audiovisual outcomes, and adapt their temporal perception of outcome events based on such predictions.
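The "window of simultaneity" in designs like this is typically estimated from the proportion of "simultaneous" judgments across stimulus onset asynchronies (SOAs). The sketch below is an illustration only, not the authors' fitting procedure: it assumes a Gaussian-shaped synchrony-judgment curve, fits it by grid search, and defines the window as the SOA range where the fitted curve exceeds a criterion.

```python
import numpy as np

def simultaneity_window(soas, p_simultaneous, criterion=0.5):
    """Width of the SOA range where a fitted Gaussian exceeds `criterion`.

    Fits p(SOA) = a * exp(-(SOA - mu)^2 / (2 sigma^2)) by grid search
    (a simplification; real analyses often use maximum-likelihood fits).
    """
    best, best_err = None, np.inf
    for mu in np.linspace(soas.min(), soas.max(), 41):
        for sigma in np.linspace(10, 300, 60):
            g = np.exp(-0.5 * ((soas - mu) / sigma) ** 2)
            a = np.clip(np.dot(g, p_simultaneous) / np.dot(g, g), 0, 1)
            err = np.sum((p_simultaneous - a * g) ** 2)
            if err < best_err:
                best, best_err = (a, mu, sigma), err
    a, mu, sigma = best
    if a <= criterion:
        return 0.0
    # Solve a * exp(-x^2 / (2 sigma^2)) = criterion for the half-width x
    return 2 * sigma * np.sqrt(2 * np.log(a / criterion))

# Hypothetical judgment curves: a wider window for 'predicted' pairs
soas = np.linspace(-400, 400, 17)              # SOAs in ms
predicted = np.exp(-0.5 * (soas / 180) ** 2)   # invented example data
unpredicted = np.exp(-0.5 * (soas / 120) ** 2)
print(simultaneity_window(soas, predicted) > simultaneity_window(soas, unpredicted))
```

The widening of the fitted window for predicted pairs mirrors the effect the abstract reports.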

    The detection of phase amplitude coupling during sensory processing

    There is increasing interest in understanding how the phase and amplitude of distinct neural oscillations might interact to support dynamic communication within the brain. In particular, previous work has demonstrated a coupling between the phase of low-frequency oscillations and the amplitude (or power) of high-frequency oscillations during certain tasks, termed phase amplitude coupling (PAC). For instance, during visual processing in humans, PAC has been reliably observed between ongoing alpha (8-13 Hz) and gamma-band (>40 Hz) activity. However, the application of PAC metrics to electrophysiological data can be challenging due to numerous methodological issues and a lack of coherent approaches within the field. Therefore, in this article we outline the various analysis steps involved in detecting PAC, using an openly available MEG dataset from 16 participants performing an interactive visual task. First, we localized gamma and alpha-band power using the Fieldtrip toolbox, and extracted time courses from area V1, defined using a multimodal parcellation scheme. These V1 responses were analyzed for changes in alpha-gamma PAC, using four common algorithms. Results showed an increase in alpha (7-13 Hz)-gamma (40-100 Hz) PAC in response to the visual grating stimulus, though specific patterns of coupling were somewhat dependent upon the algorithm employed. Additionally, post-hoc analyses showed that these results were not driven by the presence of non-sinusoidal oscillations, and that trial length was sufficient to obtain reliable PAC estimates. Finally, throughout the article, methodological issues and practical guidelines for ongoing PAC research are discussed.
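In outline, one common PAC metric (mean vector length) extracts the phase of the alpha-filtered signal and the amplitude envelope of the gamma-filtered signal, then takes the magnitude of their mean complex product. A minimal numpy sketch, not the article's Fieldtrip pipeline: the brick-wall FFT filter and the simulated coupled signal are simplifications for illustration.

```python
import numpy as np

def bandpass_fft(x, fs, lo, hi):
    """Zero-phase brick-wall bandpass via FFT (real pipelines use FIR/IIR filters)."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    X[(freqs < lo) | (freqs > hi)] = 0
    return np.fft.irfft(X, n=len(x))

def analytic(x):
    """Analytic signal via FFT (equivalent to a Hilbert transform)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1
    h[1:(len(x) + 1) // 2] = 2
    if len(x) % 2 == 0:
        h[len(x) // 2] = 1
    return np.fft.ifft(X * h)

def pac_mvl(x, fs, phase_band=(8, 13), amp_band=(40, 100)):
    """Mean-vector-length PAC: |mean(gamma amplitude * exp(i * alpha phase))|."""
    phase = np.angle(analytic(bandpass_fft(x, fs, *phase_band)))
    amp = np.abs(analytic(bandpass_fft(x, fs, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

# Simulated signal whose gamma amplitude rides on the alpha phase
fs, t = 1000, np.arange(0, 10, 1 / 1000)
rng = np.random.default_rng(0)
alpha = np.sin(2 * np.pi * 10 * t)
coupled = alpha + 0.5 * (1 + alpha) * np.sin(2 * np.pi * 60 * t) \
    + 0.1 * rng.standard_normal(len(t))
uncoupled = alpha + 0.5 * np.sin(2 * np.pi * 60 * t) \
    + 0.1 * rng.standard_normal(len(t))
print(pac_mvl(coupled, fs) > pac_mvl(uncoupled, fs))  # → True
```

The same comparison logic underlies the task-induced PAC increase reported above; the article's point is that the four published algorithms differ mainly in how they quantify the phase-amplitude dependence.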

    Attenuation of visual evoked responses to hand and saccade-initiated flashes

    Sensory attenuation refers to reduced brain responses to self-initiated sensations relative to those produced by the external world. It is a low-level process that may be linked to higher-level cognitive tasks such as reality monitoring. The phenomenon is often explained by prediction-error mechanisms assumed to apply universally across sensory modalities; however, it is most widely reported for auditory stimuli resulting from self-initiated hand movements. The present series of event-related potential (ERP) experiments explored the generalizability of sensory attenuation to the visual domain by exposing participants to flashes initiated by either their own button press or a volitional saccade and comparing these conditions to identical, computer-initiated stimuli. The key results showed that the largest reduction of anterior visual N1 amplitude occurred for saccade-initiated flashes, while button press-initiated flashes evoked an intermediate response between the saccade-initiated and externally initiated conditions. This indicates that sensory attenuation occurs for visual stimuli and suggests that the degree of electrophysiological attenuation may relate to the causal likelihood of pairings between the type of motor action and the modality of its sensory response.

    Enhanced alpha-oscillations in visual cortex during anticipation of self-generated visual stimulation

    The perceived intensity of sensory stimuli is reduced when these stimuli are caused by the observer's actions. This phenomenon is traditionally explained by forward models of sensory action-outcome, which arise from motor processing. Although these forward models critically predict anticipatory modulation of sensory neural processing, neurophysiological evidence for anticipatory modulation is sparse and has not been linked to perceptual data showing sensory attenuation. By combining a psychophysical task involving contrast discrimination with source-level time-frequency analysis of MEG data, we demonstrate that the amplitude of alpha-oscillations in visual cortex is enhanced before the onset of a visual stimulus when the identity and onset of the stimulus are controlled by participants' motor actions. Critically, this prestimulus enhancement of alpha-amplitude is paralleled by psychophysical judgments of a reduced contrast for this stimulus. We suggest that alpha-oscillations in visual cortex preceding self-generated visual stimulation are a likely neurophysiological signature of motor-induced sensory anticipation and mediate sensory attenuation. We discuss our results in relation to proposals that attribute generic inhibitory functions to alpha-oscillations in prioritizing and gating sensory information via top-down control
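Prestimulus alpha enhancement of the kind reported here is commonly quantified as band-limited spectral power in a window preceding stimulus onset. A minimal illustration with simulated single-channel data; the study itself used source-level time-frequency analysis of MEG, and the signals and condition labels below are invented for the example.

```python
import numpy as np

def band_power(x, fs, lo=8, hi=13):
    """Mean spectral power in the alpha band via a windowed FFT."""
    X = np.fft.rfft(x * np.hanning(len(x)))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    return np.mean(np.abs(X[band]) ** 2)

# Hypothetical 1 s prestimulus windows for two simulated conditions
fs = 250
t = np.arange(fs) / fs
rng = np.random.default_rng(2)
self_gen = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(fs)  # enhanced alpha
external = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(fs)
print(band_power(self_gen, fs) > band_power(external, fs))  # → True
```

In the actual study this comparison would be made at the source level, per trial, and related to the psychophysical contrast judgments.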

    The alpha wave in the EEG: from neural dynamics to its use in neuroergonomics

    This thesis aims to provide an overview of the alpha oscillation, describing its generation, its possible cognitive functions, and how its variations can characterize an individual's mental state. The work begins by exploring the basics of electroencephalography and how electroencephalographic activity is generated at the neuronal level. It then examines the dynamics of neural oscillations, in particular stimulus-related synchronization and desynchronization events. The alpha rhythm is then introduced and its possible generators are identified. In the past, the alpha oscillation was considered a reflection of an idling state or cortical deactivation, but it is now hypothesized to act through a mechanism of active inhibition. Three theories are presented to explain this mechanism: the "inhibition-timing" hypothesis, the "inhibition by gating" hypothesis, and the "pulsed inhibition" hypothesis. From these, four functional characterizations of alpha are identified, centred on visual attention: alpha the inhibitor, alpha the perceiver, alpha the predictor, and alpha the communicator. The mechanisms of visual attention are described, along with how alpha varies with attentional processes. The studies cited support the hypothesis that alpha serves to disengage brain regions not relevant to the task, especially when a distracting stimulus must be actively suppressed. Finally, the relationship between alpha and neuroergonomic constructs such as mental workload, fatigue, and drowsiness is examined. It is shown that increased mental workload generally entails a decrease in alpha activity, whereas fatigue and drowsiness generally entail an increase in its power.

    Temporal profile of visual processing effectiveness in object and face recognition

    Variations in visual processing effectiveness through time were investigated using random temporal stimulus sampling. Twenty-four adults named photographs of either familiar objects (experiment 1) or famous faces (experiment 2). Stimuli were made by a linear combination of the target image and high-density white visual noise. The signal-to-noise ratio varied throughout the 200 ms stimulus duration. A new temporal sampling function was generated on each trial by the integration of sinusoidal waves of random amplitude and phase, with frequencies between 5 and 55 Hz (in 5 Hz steps). Temporal classification vectors (CVs) were calculated by subtracting the weighted sum of the signal-to-noise ratios associated with errors from that associated with correct responses. Time-frequency classification images (CIs) were obtained by applying the same procedure to the outcome of time-frequency analyses of the sampling functions of each trial. In both experiments, the temporal CVs were highly variable across participants, but the time-frequency CIs were remarkably similar across participants (inter-subject coherence of .93 and .57 for experiments 1 and 2, respectively). T-tests revealed multiple differences between the time-frequency CIs obtained with familiar objects and faces, but also with the non-familiar objects and words analyzed in previous studies. These CIs are therefore sensitive to stimulus type, but also to stimulus familiarity. The present results indicate rapid variations of visual encoding effectiveness in the initial 200 ms of stimulus exposure and suggest that time-frequency CIs tap a highly fundamental aspect of visual processing, hypothetically linked to brain oscillations.
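The sampling-function and classification-vector logic described above can be sketched as follows. The simulated observer, whose accuracy depends on the SNR around a hypothetical 100 ms critical moment, is an invented stand-in for real participants, and mean-difference weighting is a simplification of the weighted-sum procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, dur = 600, 0.2                        # 600 Hz sampling, 200 ms stimulus
t = np.arange(int(fs * dur)) / fs
freqs = np.arange(5, 60, 5)               # 5-55 Hz in 5 Hz steps

def sampling_function():
    """Sum of random-amplitude, random-phase sinusoids, rescaled to a 0-1 SNR profile."""
    amps = rng.random(len(freqs))
    phases = rng.uniform(0, 2 * np.pi, len(freqs))
    s = np.sum(amps[:, None] * np.sin(2 * np.pi * freqs[:, None] * t
                                      + phases[:, None]), axis=0)
    return (s - s.min()) / (s.max() - s.min())

# Simulate an observer whose accuracy depends on the SNR around 100 ms
n_trials = 5000
snr = np.array([sampling_function() for _ in range(n_trials)])
window = np.exp(-0.5 * ((t - 0.1) / 0.01) ** 2)   # hypothetical critical window
p_correct = 0.5 + 0.5 * (snr @ window) / window.sum()
correct = rng.random(n_trials) < p_correct

# Temporal classification vector: SNR profile on correct minus error trials
cv = snr[correct].mean(axis=0) - snr[~correct].mean(axis=0)
print(t[np.argmax(cv)])                   # peaks near the simulated 0.1 s window
```

The time-frequency CIs reported above extend this by applying the same correct-minus-error contrast to a time-frequency decomposition of each trial's sampling function instead of the raw SNR profile.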