
    Evidence for Perceptual “Trapping” and Adaptation in Multistable Binocular Rivalry

    When a different pattern is presented to each eye, the perceived image spontaneously alternates between the two patterns (binocular rivalry); the dynamics of these bistable alternations are known to be stochastic. Examining multistable binocular rivalry (involving four dominant percepts), we demonstrated path dependence and on-line adaptation, which were equivalent whether perceived patterns were formed by single-eye dominance or by mixed-eye dominance. The spontaneous perceptual transitions tended to get trapped within a pair of related global patterns (e.g., opponent shapes and symmetric patterns), and during such trapping, the probability of returning to the repeatedly experienced patterns gradually decreased (postselection pattern adaptation). These results suggest that the structure of global shape coding and its adaptation play a critical role in directing spontaneous alternations of visual awareness in perceptual multistability.
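    As a rough illustration of how path dependence and trapping could be quantified from a sequence of reported percepts, the Python sketch below estimates a first-order transition matrix and compares within-pair versus between-pair transition mass. The percept labels, the pairing of states, and the random report sequence are illustrative assumptions, not the study's actual analysis.

        # Minimal sketch: first-order transition probabilities among four percepts.
        # States 0/1 and 2/3 are assumed (hypothetically) to form the related pairs.
        import numpy as np

        def transition_matrix(percepts, n_states=4):
            """Estimate first-order transition probabilities from a percept sequence."""
            counts = np.zeros((n_states, n_states))
            for a, b in zip(percepts[:-1], percepts[1:]):
                counts[a, b] += 1
            row_sums = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

        reports = np.random.randint(0, 4, size=500)   # placeholder report sequence
        T = transition_matrix(reports)
        within_pair = T[0, 1] + T[1, 0] + T[2, 3] + T[3, 2]
        between_pair = T.sum() - np.trace(T) - within_pair
        print(f"within-pair transition mass: {within_pair:.2f}, between-pair: {between_pair:.2f}")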

    Simultaneous shape repulsion and global assimilation in the perception of aspect ratio

    Although local interactions involving orientation and spatial frequency are well understood, less is known about spatial interactions involving higher-level pattern features. We examined interactive coding of aspect ratio, a prevalent two-dimensional feature. We measured perception of two simultaneously flashed ellipses by randomly post-cueing one of them and having observers indicate its aspect ratio. Aspect ratios interacted in two ways. One manifested as an aspect-ratio repulsion effect. For example, when a slightly tall ellipse and a taller ellipse were simultaneously flashed, the less tall ellipse appeared flatter and the taller ellipse appeared even taller. This repulsive interaction was long range, occurring even when the ellipses were presented in different visual hemifields. The other interaction manifested as a global assimilation effect. An ellipse appeared taller when it was part of a global vertical organization than when it was part of a global horizontal organization. The repulsion and assimilation effects dissociated temporally: the former slightly strengthened, whereas the latter disappeared, when the ellipse-to-mask stimulus onset asynchrony was increased from 40 to 140 ms. These results are consistent with the idea that shape perception emerges from rapid lateral and hierarchical neural interactions.

    Visual speech differentially modulates beta, theta, and high gamma bands in auditory cortex

    Speech perception is a central component of social communication. While principally an auditory process, accurate speech perception in everyday settings is supported by meaningful information extracted from visual cues (e.g., speech content, timing, and speaker identity). Previous research has shown that visual speech modulates activity in cortical areas subserving auditory speech perception, including the superior temporal gyrus (STG), potentially through feedback connections from the multisensory posterior superior temporal sulcus (pSTS). However, it is unknown whether visual modulation of auditory processing in the STG is a unitary phenomenon or, rather, consists of multiple temporally, spatially, or functionally distinct processes. To explore these questions, we examined neural responses to audiovisual speech measured from intracranially implanted electrodes within the temporal cortex of 21 patients undergoing clinical monitoring for epilepsy. We found that visual speech modulates auditory processes in the STG in multiple ways, eliciting temporally and spatially distinct patterns of activity that differ across theta, beta, and high-gamma frequency bands. Before speech onset, visual information increased high-gamma power in the posterior STG and suppressed beta power in mid-STG regions, suggesting crossmodal prediction of speech signals in these areas. After sound onset, visual speech decreased theta power in the middle and posterior STG, potentially reflecting a decrease in sustained feedforward auditory activity. These results are consistent with models that posit multiple distinct mechanisms supporting audiovisual speech perception and provide a crucial map for subsequent studies to identify the types of visual features that are encoded by these separate mechanisms. This study was supported by NIH Grant R00 DC013828; A. Beltz was supported by the Jacobs Foundation. A preprint ("Multiple auditory responses to visual speech") is available at http://deepblue.lib.umich.edu/bitstream/2027.42/167729/1/OriginalManuscript.pdf
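    For readers unfamiliar with band-limited power measures, the sketch below shows how theta, beta, and high-gamma power envelopes can be extracted from a single electrode trace using a bandpass filter plus Hilbert transform. The sampling rate, band edges, and the random placeholder signal are assumptions for illustration; this is a generic recipe, not the study's analysis pipeline.

        # Generic band-power envelope via bandpass filtering + Hilbert transform.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def band_power_envelope(signal, fs, low, high, order=4):
            """Instantaneous power of `signal` within the [low, high] Hz band."""
            b, a = butter(order, [low, high], btype="bandpass", fs=fs)
            filtered = filtfilt(b, a, signal)
            return np.abs(hilbert(filtered)) ** 2

        fs = 1000.0                                  # assumed sampling rate (Hz)
        ecog = np.random.randn(int(2 * fs))          # placeholder electrode trace
        bands = {"theta": (4, 8), "beta": (13, 30), "high gamma": (70, 150)}
        power = {name: band_power_envelope(ecog, fs, lo, hi).mean()
                 for name, (lo, hi) in bands.items()}
        print(power)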

    Feature and conjunction information from brief visual displays

    The feature integration theory of object perception (Treisman & Gelade, 1980) suggests that the perception of multidimensional stimuli requires that attention be serially directed to the items in a visual display in order to correctly conjoin features into objects, while the perception of features does not require serial attention. Under conditions in which the serial focusing of attention is disrupted by reducing display duration, available information about conjunctions of two features should not exceed the independent information available about the constituent features. Three experiments using a partial report paradigm employing a location cue were conducted in order to test this prediction. Subjects viewed colored letter displays that varied in cue-display stimulus onset asynchrony. The dependent measure was accuracy of response. Results suggest that a small amount of information from a separate representation of conjunctions of features may be accessible.

    Long-Term Speeding in Perceptual Switches Mediated by Attention-Dependent Plasticity in Cortical Visual Processing

    Binocular rivalry has been extensively studied to understand the mechanisms that control switches in visual awareness, and much has been revealed about the contributions of stimulus and cognitive factors. Because visual processes are fundamentally adaptive, however, it is also important to understand how experience alters the dynamics of perceptual switches. When observers viewed binocular rivalry repeatedly over many days, the rate of perceptual switches increased as much as 3-fold. This long-term rivalry speeding exhibited a pattern of image-feature specificity that ruled out primary contributions from strategic and nonsensory factors and implicated neural plasticity occurring in both low- and high-level visual processes in the ventral stream. Furthermore, the speeding occurred only when the rivaling patterns were voluntarily attended, suggesting that the underlying neural plasticity selectively engages when stimuli are behaviorally relevant. Long-term rivalry speeding may thus reflect broader mechanisms that facilitate quick assessments of signals that contain multiple behaviorally relevant interpretations.

    EEG state-trajectory instability and speed reveal global rules of intrinsic spatiotemporal neural dynamics.

    Spatiotemporal dynamics of EEG/MEG (electro-/magneto-encephalogram) have typically been investigated by applying time-frequency decomposition and examining amplitude-amplitude, phase-phase, or phase-amplitude associations between combinations of frequency bands and scalp sites, primarily to identify neural correlates of behaviors and traits. Instead, we directly extracted global EEG spatiotemporal dynamics as trajectories of k-dimensional state vectors (k = the number of estimated current sources) to investigate potential global rules governing neural dynamics. We chose timescale-dependent measures of trajectory instability (approximately the 2nd temporal derivative) and speed (approximately the 1st temporal derivative) as state variables that succinctly characterized trajectory forms. We compared trajectories across the posterior, central, anterior, and lateral (left and right) scalp regions, as the current sources under those regions may serve distinct functions. We recorded EEG while participants rested with their eyes closed (likely engaged in spontaneous thoughts) to investigate intrinsic neural dynamics. Some potential global rules emerged. Time-averaged trajectory instability from all five regions tightly converged (with their variability minimized) at the level of generating nearly unconstrained but slightly conservative turns (~100° on average) on the timescale of ~25 ms, suggesting that spectral-amplitude profiles are globally adjusted to maintain this convergence. Further, within-frequency and cross-frequency phase relations appear to be independently coordinated to reduce average trajectory speed and increase the variability in trajectory speed and instability in a relatively timescale-invariant manner, and to make trajectories less oscillatory. Future research may investigate the functional relevance of these intrinsic global-dynamics rules by examining how they adjust to various sensory environments and task demands or remain invariant. The current results also provide macroscopic constraints for quantitative modeling of neural dynamics as the timescale dependencies of trajectory instability and speed are relatable to oscillatory dynamics.
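    As a concrete reading of the trajectory measures described above, the sketch below computes timescale-dependent speed (norm of the displacement over a lag) and instability (turn angle between successive displacements) for a T-by-k array of state vectors. The sampling rate, lag, source count, and random-walk data are illustrative assumptions; the study's exact definitions may differ.

        # Speed and turn-angle instability of a k-dimensional state trajectory.
        import numpy as np

        def trajectory_measures(states, lag=1):
            """Per-step speed and turn angle (degrees) at the given lag (in samples)."""
            steps = states[lag:] - states[:-lag]          # displacement vectors
            speed = np.linalg.norm(steps, axis=1)
            v1, v2 = steps[:-1], steps[1:]
            cos = np.sum(v1 * v2, axis=1) / (
                np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1) + 1e-12)
            angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
            return speed, angle

        states = np.cumsum(np.random.randn(5120, 60), axis=0)   # toy trajectory, k = 60
        speed, angle = trajectory_measures(states, lag=13)       # ~25 ms at 512 Hz
        print(f"mean speed {speed.mean():.2f}, mean turn angle {angle.mean():.1f} deg")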

    Visual Attention Modulates Insight Versus Analytic Solving of Verbal Problems

    Behavioral and neuroimaging findings indicate that distinct cognitive and neural processes underlie solving problems with sudden insight. Moreover, people with less focused attention sometimes perform better on tests of insight and creative problem solving. However, it remains unclear whether different states of attention, within individuals, influence the likelihood of solving problems with insight or with analysis. In this experiment, participants (N = 40) performed a baseline block of verbal problems, then performed one of two visual tasks, each emphasizing a distinct aspect of visual attention, followed by a second block of verbal problems to assess change in performance. After participants engaged in a center-focused flanker task requiring relatively focused visual attention, they reported solving more verbal problems with analytic processing. In contrast, after participants engaged in a rapid object identification task requiring attention to broad space and weak associations, they reported solving more verbal problems with insight. These results suggest that general attention mechanisms influence both visual attention task performance and verbal problem solving.

    Probabilistic, entropy-maximizing control of large-scale neural synchronization.

    Oscillatory neural activity is dynamically controlled to coordinate perceptual, attentional, and cognitive processes. On the macroscopic scale, this control is reflected in the U-shaped deviations of EEG spectral-power dynamics from stochastic dynamics, characterized by disproportionately elevated occurrences of the lowest and highest ranges of power. To understand the mechanisms that generate these low- and high-power states, we fit a simple mathematical model of synchronization of oscillatory activity to human EEG data. The results consistently indicated that the majority (~95%) of synchronization dynamics is controlled by slowly adjusting the probability of synchronization while maintaining maximum entropy within the timescale of a few seconds. This strategy appears to be universal as the results generalized across oscillation frequencies, EEG current sources, and participants (N = 52) whether they rested with their eyes closed, rested with their eyes open in a darkened room, or viewed a silent nature video. Given that precisely coordinated behavior requires tightly controlled oscillatory dynamics, the current results suggest that the large-scale spatial synchronization of oscillatory activity is controlled by the relatively slow, entropy-maximizing adjustments of synchronization probability (demonstrated here) in combination with temporally precise phase adjustments (e.g., phase resetting generated by sensorimotor interactions). Interestingly, we observed a modest but consistent spatial pattern of deviations from the maximum-entropy rule, potentially suggesting that the mid-central-posterior region serves as an "entropy dump" to facilitate the temporally precise control of spectral-power dynamics in the surrounding regions.
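    To make the maximum-entropy intuition concrete, the sketch below simulates N sources that each synchronize independently (maximally random given a probability p) while p itself drifts slowly, and compares the occupancy of extreme low- and high-"power" states against a fixed-p baseline. The source count, the drift model, and the use of the synchronized count as a power proxy are illustrative assumptions, not the authors' fitted model.

        # Slowly drifting synchronization probability vs. a fixed-probability baseline.
        import numpy as np

        rng = np.random.default_rng(0)
        N, T = 64, 20000

        # Slowly drifting synchronization probability, reflected back into [0, 1].
        p = 0.5 + np.cumsum(rng.normal(0, 0.01, T))
        p = np.abs(np.mod(p + 1, 2) - 1)

        drifting = rng.binomial(N, p)                 # power proxy under slow p control
        fixed = rng.binomial(N, p.mean(), size=T)     # stochastic baseline, constant p

        for name, x in (("drifting p", drifting), ("fixed p", fixed)):
            low = np.mean(x <= N // 8)                # occupancy of the lowest power range
            high = np.mean(x >= 7 * N // 8)           # occupancy of the highest power range
            print(f"{name}: low-range {low:.3f}, high-range {high:.3f}")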

    A phase-shifting anterior-posterior network organizes global phase relations.

    Prior research has identified a variety of task-dependent networks, formed through inter-regional phase-locking of oscillatory activity, that are neural correlates of specific behaviors. Despite ample knowledge of task-specific functional networks, general rules governing global phase relations have not been investigated. To discover such general rules, we focused on phase modularity, measured as the degree to which global phase relations in EEG comprised distinct synchronized clusters interacting with one another at large phase lags. Synchronized clusters were detected with a standard community-detection algorithm, and the degree of phase modularity was quantified by the index q. Notably, we found that the mechanism controlling phase modularity is remarkably simple. A network comprising anterior-posterior long-distance connectivity coherently shifted phase relations from low angles (|Δθ| < π/4) in low-modularity states (bottom 5% in q) to high angles (|Δθ| > 3π/4) in high-modularity states (top 5% in q), accounting for fluctuations in phase modularity. This anterior-posterior network may play a fundamental functional role as (1) it controls phase modularity across a broad range of frequencies (3-50 Hz examined) in different behavioral conditions (resting with the eyes closed or watching a silent nature video) and (2) neural interactions (measured as power correlations) in beta-to-gamma bands were consistently elevated in high-modularity states. These results may motivate future investigations into the functional roles of phase modularity as well as the anterior-posterior network that controls it.
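    As a toy illustration of the modularity index q described above, the sketch below builds a fully connected graph whose edge weights reflect pairwise phase alignment, detects synchronized clusters with a standard community-detection routine (networkx's greedy modularity algorithm, used here as a stand-in for whatever algorithm the study employed), and computes q. The channel count, weighting scheme, and random phases are assumptions for illustration.

        # Phase modularity of a toy channel set via community detection.
        import numpy as np
        import networkx as nx
        from networkx.algorithms import community

        rng = np.random.default_rng(1)
        n_channels = 16
        phases = rng.uniform(-np.pi, np.pi, n_channels)   # placeholder instantaneous phases

        G = nx.Graph()
        for i in range(n_channels):
            for j in range(i + 1, n_channels):
                # Edge weight reflects phase alignment: 1 for in-phase, 0 for antiphase.
                G.add_edge(i, j, weight=(1 + np.cos(phases[i] - phases[j])) / 2)

        clusters = community.greedy_modularity_communities(G, weight="weight")
        q = community.modularity(G, clusters, weight="weight")
        print(f"{len(clusters)} synchronized clusters, modularity q = {q:.3f}")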