
    Fooling the eyes: the influence of a sound-induced visual motion illusion on eye movements

    The question of whether perceptual illusions influence eye movements is critical for the long-standing debate regarding the separation between action and perception. To test the role of auditory context on a visual illusion and on eye movements, we took advantage of the fact that the presence of an auditory cue can successfully modulate illusory motion perception of an otherwise static flickering object (sound-induced visual motion effect). We found that illusory motion perception modulated by an auditory context consistently affected saccadic eye movements. Specifically, the landing positions of saccades performed towards flickering static bars in the periphery were biased in the direction of illusory motion. Moreover, the magnitude of this bias was strongly correlated with the effect size of the perceptual illusion. These results show that both an audio-visual and a purely visual illusion can significantly affect visuo-motor behavior. Our findings are consistent with arguments for a tight link between perception and action in localization tasks.
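
    The reported correlation between saccade landing bias and illusion magnitude is a straightforward per-participant analysis; the sketch below shows one way such a relationship could be computed in Python, assuming two arrays of per-participant values. The variable names, units, and synthetic data are illustrative only, not the study's.

        # Hypothetical sketch: correlate per-participant saccade landing bias with
        # the size of the sound-induced motion illusion (synthetic data, illustration only).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        illusion_effect = rng.normal(1.0, 0.4, size=20)                  # assumed units: deg
        landing_bias = 0.5 * illusion_effect + rng.normal(0, 0.1, 20)    # assumed units: deg

        r, p = stats.pearsonr(landing_bias, illusion_effect)
        print(f"Pearson r = {r:.2f}, p = {p:.4f}")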

    EyeRIS User's Manual


    Representation, space and Hollywood Squares: Looking at things that aren't there anymore

    It has been argued that the human cognitive system is capable of using spatial indexes or oculomotor coordinates to relieve working memory load (Ballard, Hayhoe, Pook & Rao, 1997), track multiple moving items through occlusion (Scholl & Pylyshyn, 1999), or link incompatible cognitive and sensorimotor codes (Bridgeman & Huemer, 1998). Here we examine the use of such spatial information in memory for semantic information. Previous research has often focused on the role of task demands and the level of automaticity in the encoding of spatial location in memory tasks. We present five experiments in which location is irrelevant to the task, and participants' encoding of spatial information is measured implicitly by their looking behavior during recall. In a paradigm developed from Spivey and Geng (submitted), participants were presented with pieces of auditory, semantic information as part of an event occurring in one of four regions of a computer screen. In front of a blank grid, they were then asked a question relating to one of those facts. Under certain conditions, participants made significantly more saccades during the question period to the empty region of space where the semantic information had previously been presented. Our findings are discussed in relation to previous research on memory and spatial location, the dorsal and ventral streams of the visual system, and the notion of a cognitive-perceptual system using spatial indexes to exploit the stability of the external world.

    Activation of superior colliculi in humans during visual exploration

    Background: Visual, oculomotor, and, more recently, cognitive functions of the superior colliculi (SC) have been documented in detail in non-human primates. Evidence for corresponding functions of the SC in humans is still rare. We examined activity changes in the human tectum and the lateral geniculate nuclei (LGN) in a visual search task using functional magnetic resonance imaging (fMRI) and anatomically defined regions of interest (ROI). Healthy subjects conducted a free visual search task and two voluntary eye movement tasks with and without irrelevant visual distracters. Blood oxygen level dependent (BOLD) signals in the SC were compared to activity in the inferior colliculi (IC) and LGN.
    Results: Neural activity increased during free exploration only in the SC in comparison to both control tasks. Saccade frequency did not exert a significant effect on BOLD signal changes. No corresponding differences between experimental tasks were found in the IC or the LGN. However, while the IC revealed no signal increase from baseline, BOLD signal changes in the LGN were consistently positive in all experimental conditions.
    Conclusion: Our data demonstrate the involvement of the SC in a visual search task. In contrast to the results of previous studies, the signal changes could not be attributed to visual stimulation or oculomotor control alone. Further, we can exclude the influence of nearby neural structures (e.g. pulvinar, tegmentum) or of typical brainstem artefacts on the observed signal changes in the SC. Corresponding to findings in non-human primates, our data support a dependency of SC activity on functions beyond oculomotor control and visual processing.
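
    The ROI analysis described above amounts to averaging the BOLD signal over anatomically defined voxels and comparing task conditions. A simplified sketch of that step follows, assuming the 4D fMRI series and an SC mask are already available as NumPy arrays; the arrays and volume indices are synthetic placeholders, not the study's pipeline.

        # Simplified sketch of an ROI-based BOLD comparison (placeholder data only).
        import numpy as np

        def roi_timecourse(bold_4d: np.ndarray, roi_mask: np.ndarray) -> np.ndarray:
            """Mean BOLD signal over an ROI for each volume of an (x, y, z, t) series."""
            return bold_4d[roi_mask].mean(axis=0)

        rng = np.random.default_rng(3)
        bold = rng.normal(1000, 10, size=(16, 16, 10, 120))   # tiny synthetic 4D series
        sc_mask = np.zeros((16, 16, 10), dtype=bool)
        sc_mask[7:9, 7:9, 4:6] = True                          # placeholder SC ROI

        search_vols = np.arange(0, 60)      # volumes assumed to belong to the search task
        control_vols = np.arange(60, 120)   # volumes assumed to belong to a control task
        tc = roi_timecourse(bold, sc_mask)
        print(tc[search_vols].mean() - tc[control_vols].mean())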

    Cybersickness in Virtual Reality Questionnaire (CSQ-VR): A validation and comparison against SSQ and VRSQ

    Cybersickness is a drawback of virtual reality (VR) that also affects the cognitive and motor skills of users. The Simulator Sickness Questionnaire (SSQ) and its variant, the Virtual Reality Sickness Questionnaire (VRSQ), are two tools that measure cybersickness. However, both tools suffer from important limitations, which raises concerns about their suitability. Two versions of the Cybersickness in VR Questionnaire (CSQ-VR), a paper-and-pencil version and a 3D VR version, were developed. Validation and comparison of the CSQ-VR against the SSQ and VRSQ were performed. Thirty-nine participants were exposed to three rides with linear and angular accelerations in VR. Assessments of cognitive and psychomotor skills were performed at baseline and after each ride. The validity of both versions of the CSQ-VR was confirmed. Notably, the CSQ-VR demonstrated substantially better internal consistency than both the SSQ and the VRSQ. Also, CSQ-VR scores had significantly better psychometric properties in detecting a temporary decline in performance due to cybersickness. Pupil size was a significant predictor of cybersickness intensity. In conclusion, the CSQ-VR is a valid assessment of cybersickness, with psychometric properties superior to those of the SSQ and VRSQ. The CSQ-VR enables the assessment of cybersickness during VR exposure, and it benefits from examining pupil size, a biomarker of cybersickness.
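
    The internal-consistency comparison mentioned above is commonly quantified with Cronbach's alpha; the use of that particular statistic here is an assumption, since the abstract does not name it. A minimal sketch with placeholder item scores:

        # Cronbach's alpha for a questionnaire score matrix (placeholder data only).
        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
            items = np.asarray(items, dtype=float)
            n_items = items.shape[1]
            item_variances = items.var(axis=0, ddof=1)
            total_variance = items.sum(axis=1).var(ddof=1)
            return (n_items / (n_items - 1)) * (1.0 - item_variances.sum() / total_variance)

        # Placeholder data: 39 respondents x 6 items scored 1-7 (illustrative only).
        rng = np.random.default_rng(0)
        scores = rng.integers(1, 8, size=(39, 6))
        print(f"alpha = {cronbach_alpha(scores):.2f}")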

    Bodily awareness and novel multisensory features

    According to the decomposition thesis, perceptual experiences resolve without remainder into their different modality-specific components. Contrary to this view, I argue that certain cases of multisensory integration give rise to experiences representing features of a novel type. Through the coordinated use of bodily awareness (understood here as encompassing both proprioception and kinaesthesis) and the exteroceptive sensory modalities, one becomes perceptually responsive to spatial features whose instances couldn’t be represented by any of the contributing modalities functioning in isolation. I develop an argument for this conclusion focusing on two cases: 3D shape perception in haptic touch and experiencing an object’s egocentric location in crossmodally accessible, environmental space.

    White matter microstructure and atypical visual orienting in 7 month-olds at risk for autism

    Objective: The authors sought to determine whether specific patterns of oculomotor functioning and visual orienting characterize 7-month-old infants who later meet criteria for an autism spectrum disorder (ASD) and to identify the neural correlates of these behaviors. Method: Data were collected from 97 infants, of whom 16 were high-familial-risk infants later classified as having an ASD, 40 were high-familial-risk infants who did not later meet ASD criteria (high-risk negative), and 41 were low-risk infants. All infants underwent an eye-tracking task at a mean age of 7 months and a clinical assessment at a mean age of 25 months. Diffusion-weighted imaging data were acquired for 84 of the infants at 7 months. Primary outcome measures included average saccadic reaction time in a visually guided saccade procedure and radial diffusivity (an index of white matter organization) in fiber tracts that included corticospinal pathways and the splenium and genu of the corpus callosum. Results: Visual orienting latencies were longer in 7-month-old infants who expressed ASD symptoms at 25 months compared with both high-risk negative infants and low-risk infants. Visual orienting latencies were uniquely associated with the microstructural organization of the splenium of the corpus callosum in low-risk infants, but this association was not apparent in infants later classified as having an ASD. Conclusions: Flexibly and efficiently orienting to salient information in the environment is critical for subsequent cognitive and social-cognitive development. Atypical visual orienting may represent an early prodromal feature of an ASD, and abnormal functional specialization of posterior cortical circuits directly informs a novel model of ASD pathogenesis.
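
    Radial diffusivity, the white matter index used above, is conventionally defined as the mean of the two smaller eigenvalues of the diffusion tensor. A minimal sketch of that computation follows; the eigenvalues shown are placeholders, not values from the study.

        # Radial diffusivity from diffusion-tensor eigenvalues (placeholder values only).
        import numpy as np

        def radial_diffusivity(eigenvalues: np.ndarray) -> float:
            """Radial diffusivity: mean of the two smaller diffusion-tensor eigenvalues."""
            l1, l2, l3 = np.sort(eigenvalues)[::-1]   # sort descending: l1 >= l2 >= l3
            return (l2 + l3) / 2.0

        evals = np.array([1.7e-3, 0.4e-3, 0.3e-3])    # illustrative eigenvalues in mm^2/s
        print(f"RD = {radial_diffusivity(evals):.2e} mm^2/s")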

    Studies of the Ability to Hold the Eye in Eccentric Gaze: Measurements in Normal Subjects with the Head Erect

    We studied the ability to hold the eyes at eccentric horizontal or vertical gaze angles in 68 normal humans, age range 19-56. Subjects attempted to sustain visual fixation of a briefly flashed target located 30° eccentrically in the horizontal plane or 15° in the vertical plane, in a dark environment. Conventionally, the ability to hold eccentric gaze is estimated by fitting centripetal eye drifts with exponential curves and calculating the time constant (tc) of these slow phases of gaze-evoked nystagmus. Although the distribution of time-constant measurements (tc) in our normal subjects was extremely skewed, due to occasional test runs that exhibited near-perfect stability (large tc values), we found that log10(tc) was approximately normally distributed within classes of target direction. Therefore, statistical estimation and inference on the effect of target direction were performed on values of z = log10(tc). Subjects showed considerable variation in their eye-drift performance over repeated trials; nonetheless, statistically significant differences emerged: values of tc were significantly higher for gaze elicited to targets in the horizontal plane than in the vertical plane (P < 10^-5), suggesting that eccentric gaze-holding is more stable in the horizontal than in the vertical plane. Furthermore, centrifugal eye drifts were observed in 13.3%, 16.0% and 55.6% of cases for horizontal, upgaze and downgaze tests, respectively. Fifth-percentile values of the time constant were estimated to be 10.2 s, 3.3 s and 3.8 s for horizontal, upward and downward gaze, respectively. The difference between horizontal and vertical gaze-holding may be ascribed to separate components of the velocity-to-position neural integrator for eye movements and to differences in orbital mechanics. Our statistical method for representing the range of normal eccentric gaze stability can readily be applied in a clinical setting to patients who were exposed to environments that may have modified their central integrators and thus require monitoring. Patients with gaze-evoked nystagmus can be flagged by comparison against the normative criteria established above.
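
    The conventional analysis described above (fitting centripetal drift with an exponential curve and log-transforming the fitted time constant) can be sketched as follows. The synthetic trace, the fixed 30° starting eccentricity, the sampling rate, and the noise level are illustrative assumptions, not the paper's data.

        # Sketch: estimate the time constant of centripetal eye drift, then log-transform it.
        import numpy as np
        from scipy.optimize import curve_fit

        def drift_model(t, tc):
            """Exponential decay of eye position from 30 deg eccentricity toward center."""
            return 30.0 * np.exp(-t / tc)

        # Synthetic eye-position trace (deg) sampled at 100 Hz, illustrative only.
        t = np.arange(0.0, 10.0, 0.01)
        eye_pos = drift_model(t, 25.0) + np.random.default_rng(1).normal(0.0, 0.2, t.size)

        (tc_hat,), _ = curve_fit(drift_model, t, eye_pos, p0=[10.0])
        z = np.log10(tc_hat)   # the log10 transform used for the statistical comparisons
        print(f"tc = {tc_hat:.1f} s, log10(tc) = {z:.2f}")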

    Examination of Cybersickness in Virtual Reality: The Role of Individual Differences, Effects on Cognitive Functions & Motor Skills, and Intensity Differences During and After Immersion

    Background: Given that VR is applied in multiple domains, understanding the effects of cybersickness on human cognition and motor skills, as well as the factors contributing to cybersickness, gains urgency. This study aimed to explore the predictors of cybersickness and its interplay with cognitive and motor skills. Methods: 30 participants, 20-45 years old, completed the MSSQ and the CSQ-VR, and were immersed in VR. During immersion, they were exposed to a roller coaster ride. Before and after the ride, participants responded to the CSQ-VR and performed VR-based cognitive and psychomotor tasks. After the VR session, participants completed the CSQ-VR again. Results: Motion sickness susceptibility during adulthood was the most prominent predictor of cybersickness. Pupil dilation emerged as a significant predictor of cybersickness. Experience with videogaming was a significant predictor of both cybersickness and cognitive/motor functions. Cybersickness negatively affected visuospatial working memory and psychomotor skills. The intensities of overall cybersickness, nausea, and vestibular symptoms significantly decreased after removal of the VR headset. Conclusions: In order of importance, motion sickness susceptibility and gaming experience are significant predictors of cybersickness. Pupil dilation appears to be a biomarker of cybersickness. Cybersickness negatively affects visuospatial working memory and psychomotor skills. Cybersickness and its effects on performance should be examined during, and not only after, immersion.
    Comment: 32 pages, 4 figures, 14 tables. The article has been submitted to the Virtual Worlds journal.
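
    A hypothetical sketch of the kind of regression the Results imply, predicting CSQ-VR scores from motion sickness susceptibility, gaming experience, and pupil dilation. The column names, coefficients, and synthetic data are placeholders, not the study's dataset or its exact model.

        # Placeholder multiple-regression sketch for cybersickness predictors.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 30
        df = pd.DataFrame({
            "mssq": rng.normal(15, 5, n),           # motion sickness susceptibility score
            "gaming_hours": rng.normal(8, 4, n),    # weekly videogaming experience
            "pupil_dilation": rng.normal(0.5, 0.2, n),
        })
        # Placeholder outcome loosely tied to the predictors (illustrative only).
        df["csq_vr"] = (2 + 0.6 * df["mssq"] - 0.4 * df["gaming_hours"]
                        + 5 * df["pupil_dilation"] + rng.normal(0, 2, n))

        X = sm.add_constant(df[["mssq", "gaming_hours", "pupil_dilation"]])
        model = sm.OLS(df["csq_vr"], X).fit()
        print(model.summary())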
