
    Miniature Eye Movements Enhance Fine Spatial Details

    Our eyes are constantly in motion. Even during visual fixation, small eye movements continually jitter the location of gaze. It is known that visual percepts tend to fade when retinal image motion is eliminated in the laboratory. However, it has long been debated whether, during natural viewing, fixational eye movements have functions in addition to preventing the visual scene from fading. In this study, we analysed the influence in humans of fixational eye movements on the discrimination of gratings masked by noise that has a power spectrum similar to that of natural images. Using a new method of retinal image stabilization, we selectively eliminated the motion of the retinal image that normally occurs during the intersaccadic intervals of visual fixation. Here we show that fixational eye movements improve discrimination of high spatial frequency stimuli, but not of low spatial frequency stimuli. This improvement originates from the temporal modulations introduced by fixational eye movements in the visual input to the retina, which emphasize the high spatial frequency harmonics of the stimulus. In a natural visual world dominated by low spatial frequencies, fixational eye movements appear to constitute an effective sampling strategy by which the visual system enhances the processing of spatial detail.
    Funding: National Institutes of Health; National Science Foundation
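    A back-of-the-envelope sketch of the mechanism described above (not the authors' analysis): a grating sampled at a jittering gaze position produces a temporal luminance modulation whose power grows with the grating's spatial frequency. The jitter model (independent Gaussian gaze offsets) and the parameter values are illustrative assumptions.

```python
# Minimal sketch: how fixational jitter turns spatial structure into temporal
# modulation, more strongly for high spatial frequencies (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def temporal_modulation_power(spatial_freq_cpd, jitter_std_deg=0.02, n=20000):
    """Variance of the luminance seen at one retinal point while gaze jitters.

    spatial_freq_cpd : grating spatial frequency (cycles per degree)
    jitter_std_deg   : std of the gaze offset around fixation (degrees)
    """
    eye_pos = rng.normal(0.0, jitter_std_deg, n)          # toy jitter model
    luminance = np.sin(2 * np.pi * spatial_freq_cpd * eye_pos)  # unit-contrast grating
    return luminance.var()

for f in (0.5, 2.0, 8.0):   # low to high spatial frequency
    print(f"{f:4.1f} cyc/deg -> temporal power {temporal_modulation_power(f):.4f}")
```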

    Mechanisms of Action and Targets of Nitric Oxide in the Oculomotor System

    Nitric oxide (NO) production by neurons in the prepositus hypoglossi (PH) nucleus is necessary for the normal performance of eye movements in alert animals. In this study, the mechanism(s) of action of NO in the oculomotor system has been investigated. Spontaneous and vestibularly induced eye movements were recorded in alert cats before and after microinjections in the PH nucleus of drugs affecting the NO–cGMP pathway. The cellular sources and targets of NO were also studied by immunohistochemical detection of neuronal NO synthase (NOS) and NO-sensitive guanylyl cyclase, respectively. Injections of NOS inhibitors produced alterations of eye velocity, but not of eye position, for both spontaneous and vestibularly induced eye movements, suggesting that NO produced by PH neurons is involved in the processing of velocity signals but not in eye position generation. The effect of neuronal NO is probably exerted on a rich cGMP-producing neuropil dorsal to the nitrergic somas in the PH nucleus. On the other hand, local injections of NO donors or 8-Br-cGMP produced alterations of eye velocity during both spontaneous eye movements and the vestibulo-ocular reflex (VOR), as well as changes in eye position generation exclusively during spontaneous eye movements. The target of this additional effect of exogenous NO is probably a well defined group of NO-sensitive cGMP-producing neurons located between the PH and the medial vestibular nuclei. These cells could be involved in the generation of eye position signals during spontaneous eye movements but not during the VOR.
    Funding: Fondo de Investigación Sanitaria Grants 94/0388 and 97/2054; Comunidad Autónoma de Madrid Grant 08.5/0019/1997; Dirección General de Investigación Científica y Tecnológica Grant PB 93–117

    Reconstruction of eye movements during blinks

    In eye movement research in reading, the amount of data plays a crucial role for the validation of results. A methodological problem for the analysis of eye movements in reading is blinks, during which readers close their eyes. Blink rate increases with increasing reading time, resulting in high data losses, especially for older adults or reading-impaired subjects. We present a method, based on the symbolic sequence dynamics of the eye movements, that reconstructs the horizontal position of the eyes while the reader blinks. The method makes use of the observed fact that the movements of the eyes before closing or after opening contain information about the eye movements during blinks. Test results indicate that our reconstruction method is superior to methods that use simpler interpolation approaches. In addition, analyses of the reconstructed data show no significant deviation from the usual behavior observed in readers.
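    For orientation, the sketch below implements the kind of "simpler interpolation approach" the reconstruction method is compared against: fill the horizontal gaze position during a blink from the samples on either side. It is not the paper's symbolic-sequence-dynamics method, and the sampling rate, blink duration and fit order are illustrative assumptions.

```python
# Minimal blink-gap filling baseline (illustrative; not the paper's method).
import numpy as np

def fill_blink(t, x, blink_start, blink_end, context=0.05, order=3):
    """Fill the horizontal gaze position x over [blink_start, blink_end)
    with a polynomial fitted to `context` seconds of data on either side."""
    pre  = (t >= blink_start - context) & (t < blink_start)
    post = (t >= blink_end) & (t < blink_end + context)
    gap  = (t >= blink_start) & (t < blink_end)
    coeffs = np.polyfit(t[pre | post], x[pre | post], order)
    x_filled = x.copy()
    x_filled[gap] = np.polyval(coeffs, t[gap])
    return x_filled

# Example: a 250 Hz recording with a 120 ms blink starting at 1.0 s.
rng = np.random.default_rng(1)
t = np.arange(0.0, 2.0, 1 / 250)
x = np.cumsum(rng.normal(0.0, 0.05, t.size))   # toy horizontal gaze trace
x[(t >= 1.0) & (t < 1.12)] = np.nan            # samples lost during the blink
x_reconstructed = fill_blink(t, x, 1.0, 1.12)
```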

    WAYLA - Generating Images from Eye Movements

    We present a method for reconstructing images viewed by observers based only on their eye movements. By exploring the relationships between gaze patterns and image stimuli, the "What Are You Looking At?" (WAYLA) system learns to synthesize photo-realistic images that are similar to the original pictures being viewed. The WAYLA approach is based on the Conditional Generative Adversarial Network (Conditional GAN) image-to-image translation technique of Isola et al. We consider two specific applications: the first, reconstructing newspaper images from gaze heat maps, and the second, detailed reconstruction of images containing only text. The newspaper image reconstruction process is divided into two image-to-image translation operations: the first maps gaze heat maps into image segmentations, and the second maps the generated segmentation into a newspaper image. We validate the performance of our approach using various evaluation metrics, along with human visual inspection. All results confirm the ability of our network to perform image generation tasks using eye-tracking data.
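    The two-stage pipeline (gaze heat map → layout segmentation → newspaper image) can be sketched as the composition of two image-to-image translators. The tiny generators below are stand-ins chosen only to make the example self-contained; WAYLA itself uses pix2pix-style conditional GAN generators (Isola et al.), and the channel counts and image size are assumptions.

```python
# Minimal sketch of chaining two image-to-image translators (not the WAYLA code).
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Placeholder translator network; stands in for a pix2pix generator."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, out_ch, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

# Stage 1: gaze heat map (1 channel) -> layout segmentation (3 classes assumed).
heatmap_to_seg = TinyGenerator(in_ch=1, out_ch=3)
# Stage 2: layout segmentation -> rendered newspaper image (RGB).
seg_to_image = TinyGenerator(in_ch=3, out_ch=3)

gaze_heatmap = torch.rand(1, 1, 256, 256)     # dummy input
segmentation = heatmap_to_seg(gaze_heatmap)   # stage 1
reconstruction = seg_to_image(segmentation)   # stage 2
print(reconstruction.shape)                   # torch.Size([1, 3, 256, 256])
```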

    Modeling the Possible Influences of Eye Movements on the Refinement of Cortical Direction Selectivity

    The second-order statistics of neural activity were examined in a model of the cat LGN and V1 during free viewing of natural images. In the model, the specific patterns of thalamocortical activity required for a Hebbian maturation of direction-selective cells in V1 were found during the periods of visual fixation, when small eye movements occurred, but not when natural images were examined in the absence of fixational eye movements. In addition, simulations of stroboscopic rearing that replicated the abnormal pattern of eye movements observed in kittens chronically exposed to stroboscopic illumination produced results consistent with the reported loss of direction selectivity and preservation of orientation selectivity. These results suggest the involvement of the oculomotor activity of visual fixation in the maturation of cortical direction selectivity.
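    The dependence on second-order statistics can be illustrated with a delay-based Hebbian rule: the weight change for a given input depends on its covariance with the cortical response at a fixed temporal lag, and an asymmetry of these delayed correlations across spatially offset inputs is what a Hebbian rule can exploit to wire direction selectivity. The rule and the toy inputs below are assumptions for illustration, not the model's actual equations.

```python
# Minimal sketch of a delay-based Hebbian update driven by second-order
# (covariance) statistics of pre- and postsynaptic activity (illustrative only).
import numpy as np

def hebbian_delay_update(pre, post, delay, lr=1e-3):
    """Weight change proportional to the covariance between the postsynaptic
    rate and the presynaptic rate `delay` samples earlier."""
    return lr * np.cov(pre[:-delay], post[delay:])[0, 1]

rng = np.random.default_rng(0)
# Toy LGN inputs: input B sees the same jittering stimulus 2 samples later.
lgn_a = rng.normal(size=5000)
lgn_b = np.roll(lgn_a, 2) + 0.1 * rng.normal(size=5000)
v1 = lgn_a + lgn_b   # toy cortical response

# The input that leads the cortical response by the delay is strengthened,
# the other is not: the asymmetry underlying direction selectivity.
print("leading pathway :", hebbian_delay_update(lgn_a, v1, delay=2))
print("lagging pathway :", hebbian_delay_update(lgn_b, v1, delay=2))
```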

    Motion transparency: depth ordering and smooth pursuit eye movements

    When two overlapping, transparent surfaces move in different directions, there is ambiguity with respect to the depth ordering of the surfaces. Little is known about the surface features that are used to resolve this ambiguity. Here, we investigated the influence of different surface features on the perceived depth order and the direction of smooth pursuit eye movements. Surfaces containing more dots, moving opposite to an adapted direction, moving at a slower speed, or moving in the same direction as the eyes were more likely to be seen in the back. Smooth pursuit eye movements showed an initial preference for surfaces containing more dots, moving in a non-adapted direction, moving at a faster speed, and being composed of larger dots. After 300 to 500 ms, smooth pursuit eye movements adjusted to perception and followed the surface whose direction had to be indicated. The differences between perceived depth order and initial pursuit preferences, and the slow adjustment of pursuit, indicate that perceived depth order is not determined solely by the eye movements. The common effect of dot number and motion adaptation suggests that global motion strength can induce a bias to perceive the stronger motion in the back.

    A Computational Model of Spatial Memory Anticipation during Visual Search

    Some visual search tasks require memorizing the locations of stimuli that have previously been scanned. The eye movements made during search raise the question of how we are able to maintain a coherent memory despite frequent and drastic changes in perception. In this article, we present a computational model that is able to anticipate the consequences of eye movements on visual perception in order to update a spatial memory.
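    One simple way to picture this anticipation (an assumption for illustration, not the paper's dynamic-field model): keep the memory of scanned locations in gaze-centred coordinates and, for each planned eye movement, shift every stored location by the opposite of the saccade vector so that the memory stays aligned with the post-saccadic view.

```python
# Minimal sketch of gaze-centred spatial memory with saccade anticipation
# (illustrative; not the paper's model).
import numpy as np

class GazeCentredMemory:
    def __init__(self):
        self.locations = []            # remembered targets in retinal coordinates

    def store(self, retinal_xy):
        self.locations.append(np.asarray(retinal_xy, dtype=float))

    def anticipate_saccade(self, saccade_xy):
        """Remap every stored location before the eyes actually move."""
        shift = np.asarray(saccade_xy, dtype=float)
        self.locations = [loc - shift for loc in self.locations]

memory = GazeCentredMemory()
memory.store([5.0, 2.0])               # a stimulus 5 deg right, 2 deg up of gaze
memory.anticipate_saccade([5.0, 2.0])  # plan a saccade straight to it
print(memory.locations[0])             # ~[0, 0]: the stimulus will land at the fovea
```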

    Brief report: how adolescents with ASD process social information in complex scenes. Combining evidence from eye movements and verbal descriptions

    We investigated attention, encoding and processing of social aspects of complex photographic scenes. Twenty-four high-functioning adolescents (aged 11–16) with ASD and 24 typically developing matched control participants viewed and then described a series of scenes, each containing a person. Analyses of eye movements and verbal descriptions provided converging evidence that both groups displayed general interest in the person in each scene, but the salience of the person was reduced for the ASD participants. Nevertheless, the verbal descriptions revealed that participants with ASD frequently processed the observed person’s emotion or mental state without prompting. They also often mentioned eye-gaze direction, and there was evidence from eye movements and verbal descriptions that gaze was followed accurately. The combination of evidence from eye movements and verbal descriptions provides a rich insight into the way stimuli are processed overall. The merits of using these methods within the same paradigm are discussed.