Dysregulation of temporal dynamics of synchronous neural activity in adolescents on autism spectrum
Autism spectrum disorder is increasingly understood to arise from atypical signal transfer among multiple interconnected networks in the brain. Relative temporal patterns of neural activity have been shown to underlie both the altered neurophysiology and the altered behaviors in a variety of neurogenic disorders. We assessed variability in brain network dynamics in autism spectrum disorder (ASD) using measures of synchronization (phase-locking) strength and of the timing of synchronization and desynchronization of neural activity (the desynchronization ratio) across frequency bands of resting-state electroencephalography (EEG). Our analysis indicated that frontoparietal synchronization is higher in ASD but is interrupted by more frequent short periods of desynchronization. It also indicated that the relationship between the properties of neural synchronization and behavior differs between ASD and typically developing populations. Recent theoretical studies suggest that neural networks with a high desynchronization ratio have increased sensitivity to inputs. Our results point to the potential significance of this phenomenon in the autistic brain: this sensitivity may disrupt the production of appropriate neural and behavioral responses to external stimuli. Cognitive processes that depend on the integration of activity from multiple networks may, as a result, be particularly vulnerable to disruption.
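The phase-locking measure named in this abstract can be illustrated with a minimal sketch. The abstract does not specify the study's actual pipeline, so the filter order, frequency band, and signal parameters below are standard illustrative choices, not the authors' methods; only the phase-locking-value formula itself (the magnitude of the mean phase-difference vector) is the conventional definition.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(x, y, fs, band):
    """Phase-locking strength between two signals within a frequency band.

    Band-pass filters both signals, extracts instantaneous phase via the
    Hilbert transform, and returns |mean(exp(i * phase difference))|:
    1.0 for perfect phase locking, near 0 for unrelated phases.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Two noisy alpha-band (10 Hz) oscillations with a fixed phase lag lock strongly.
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * rng.standard_normal(t.size)
plv = phase_locking_value(x, y, fs, (8, 12))
print(plv)  # close to 1
```

Note that the phase-locking value captures only synchronization strength; the desynchronization ratio described in the abstract additionally requires tracking when the instantaneous phase difference leaves a locked state over time.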
Neural mechanisms of Event Visibility in sign languages
Event structure in sign languages is reflected in the manual dynamics of verb production. Because signed event structure is visible (iconic), non-signers are able to recognize it despite having no sign lexicon. In this EEG study, hearing non-signers were presented with telic and atelic verb signs, followed by a lexical classification task in their native language. Behavioral data confirmed that non-signers classified both telic and atelic signs with above-chance accuracy. ERP waveforms indicated that non-signers identified the perceptual differences in motion features when viewing telic/atelic signs, and used different processing mechanisms when integrating the perceptual information with linguistic concepts in their native language. Non-signers appeared to segment visual sign language input into discrete events as they attempted to map the observed visual forms to concepts and label them linguistically. This mechanism suggests a potential evolutionary pathway for the co-optation of perceptual features into the linguistic structure of sign languages.
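The ERP comparison underlying this abstract rests on a standard analysis step: averaging many stimulus-locked EEG epochs per condition so that trial-to-trial noise cancels and condition differences emerge. The sketch below is a toy simulation of that logic only; the component latency, amplitudes, and trial counts are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                                 # sampling rate, Hz (illustrative)
epoch = np.arange(-0.2, 0.8, 1 / fs)     # -200 ms to +800 ms around stimulus onset

def average_erp(trials):
    """Average stimulus-locked EEG epochs (trials x samples) into an ERP waveform."""
    return trials.mean(axis=0)

def simulate_condition(n_trials, amplitude):
    """Toy trials: a Gaussian deflection near 300 ms plus unit-variance noise."""
    component = amplitude * np.exp(-((epoch - 0.3) ** 2) / (2 * 0.05 ** 2))
    return component + rng.standard_normal((n_trials, epoch.size))

# Hypothetical telic vs. atelic conditions differing in component amplitude.
erp_telic = average_erp(simulate_condition(100, amplitude=3.0))
erp_atelic = average_erp(simulate_condition(100, amplitude=1.0))

# The difference wave isolates the condition effect, peaking near 300 ms.
diff = erp_telic - erp_atelic
print(epoch[np.argmax(diff)])
```

Averaging works because the stimulus-locked component is identical across trials while the noise is not, so the noise shrinks roughly with the square root of the trial count.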
Application of machine learning to signal entrainment identifies predictive processing in sign language
We present the first analysis of multi-frequency neural entrainment to the dynamic visual features that drive sign language comprehension. Using EEG coherence to optical flow in video stimuli, we classified fluent signers' brain states as reflecting online language comprehension versus non-comprehension while they watched non-linguistic videos matched in low-level spatiotemporal features and high-level scene parameters. The data also indicate that lower frequencies, such as 1 Hz and 4 Hz, contribute substantially to brain-state classification, suggesting that neural coherence to the signal at these frequencies is relevant to language comprehension. These findings suggest that fluent signers rely on predictive processing during online comprehension.
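The coherence features described above can be sketched in a minimal form. The abstract does not give the study's pipeline, so everything below is an assumption for illustration: the sampling rate, the simulated "optical flow" and "EEG" signals, and the choice of magnitude-squared coherence (via Welch's method) evaluated at 1 Hz and 4 Hz as classifier features.

```python
import numpy as np
from scipy.signal import coherence

# Hypothetical setup: one EEG channel and the video's optical-flow magnitude,
# both resampled to a common rate fs. Signals here are simulated stand-ins.
fs = 100
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
flow = np.sin(2 * np.pi * 1 * t) + 0.5 * np.sin(2 * np.pi * 4 * t)
eeg_comprehending = 0.6 * flow + rng.standard_normal(t.size)   # entrained to stimulus
eeg_nonlinguistic = rng.standard_normal(t.size)                # not entrained

def low_freq_coherence(eeg, stim, fs, freqs=(1.0, 4.0)):
    """Magnitude-squared coherence between EEG and stimulus at target frequencies."""
    f, cxy = coherence(eeg, stim, fs=fs, nperseg=fs * 4)  # 0.25 Hz resolution
    return [cxy[np.argmin(np.abs(f - q))] for q in freqs]

print(low_freq_coherence(eeg_comprehending, flow, fs))  # high at 1 and 4 Hz
print(low_freq_coherence(eeg_nonlinguistic, flow, fs))  # near chance level
```

In a classification setting, such per-frequency coherence values would serve as input features to a classifier separating comprehension from non-comprehension states, consistent with the abstract's finding that the 1 Hz and 4 Hz bands carry much of the discriminative signal.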