
    The discrimination of abrupt changes in speed and direction of visual motion

    A random dot pattern that moved within an invisible aperture was used to present two motions contiguously in time. The motions differed slightly either in speed (Experiments 1 and 3) or in direction (Experiments 2 and 4), and the subject had to discriminate the sign of the change (e.g. increment or decrement). The same discrimination task was performed when the two motions were temporally separated by 1 s. In Experiments 1 and 2, discrimination thresholds were measured with motion durations of 0.125, 0.25, 0.5 and 1.0 s and mean speeds of 2, 4, 8, and 16°/s. In Experiments 3 and 4, thresholds were measured with aperture widths of 5 and 20 cm. The discrimination of contiguous motions progressively deteriorated with decreasing duration and mean speed of motion. For the lowest duration, the Weber fraction for contiguous speeds was more than three times as high as the Weber fraction for separate speeds. For the same low duration, the thresholds for discrimination of direction of contiguous motions were only about 50% higher than the thresholds for separate motions. The Weber fraction for contiguous speeds was about three times higher with the smaller aperture than with the larger one, provided the ratio ‘aperture width/mean speed’ (i.e. the lifetime of the moving dots) was less than 0.3 s. Aperture width did not affect the discrimination of direction of contiguous motions. The discrimination of contiguous motions is discussed together with the known data for detection of changes in speed and direction. It is suggested that both detection of changes in speed and discrimination of the sign of speed changes may be performed by a common visual mechanism.
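    The comparisons above are stated in terms of Weber fractions, i.e. the just-discriminable change divided by the base value. A minimal sketch, using hypothetical numbers (the abstract reports ratios between conditions, not raw thresholds):

```python
def weber_fraction(threshold_delta, base_value):
    """Weber fraction: the just-discriminable change in a stimulus
    dimension, divided by the base value of that dimension."""
    return threshold_delta / base_value

# Hypothetical illustration (not values from the study): at a mean speed
# of 8 deg/s, a speed-change threshold of 1.2 deg/s gives a Weber
# fraction of 0.15; "three times higher" would then mean 0.45.
print(weber_fraction(1.2, 8.0))  # 0.15
```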

    Animated Edge Textures in Node-Link Diagrams: a Design Space and Initial Evaluation

    Network edge data attributes are usually encoded using color, opacity, stroke thickness and stroke pattern, or some combination thereof. In addition to these static variables, it is also possible to animate dynamic particles flowing along the edges. This opens a larger design space of animated edge textures, featuring additional visual encodings that have potential not only in terms of visual mapping capacity but also playfulness and aesthetics. Such animated edge textures have been used in several commercial and design-oriented visualizations, but to our knowledge almost always in a relatively ad hoc manner. We introduce a design space and Web-based framework for generating animated edge textures, and report on an initial evaluation of particle properties (particle speed, pattern and frequency) in terms of visual perception.
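    As a rough illustration of the particle properties evaluated (speed, pattern, frequency), the sketch below computes the positions of equally spaced particles flowing along a single edge over time. This is one plausible parameterization assumed for illustration, not the authors' Web-based framework:

```python
def particle_positions(t, edge_length, speed, spacing):
    """Positions (in [0, edge_length)) of particles flowing along an edge
    at time t. Particles are `spacing` units apart and advance at `speed`
    units per second, wrapping around so the flow texture is continuous.
    Particle frequency at a fixed point on the edge is speed / spacing."""
    offset = (speed * t) % spacing
    positions = []
    p = offset
    while p < edge_length:
        positions.append(p)
        p += spacing
    return positions

# A 10-unit edge, speed 2 units/s, spacing 4 units, at t = 0.5 s:
# the pattern has shifted by 1 unit, so particles sit at 1, 5 and 9.
print(particle_positions(0.5, 10.0, 2.0, 4.0))  # [1.0, 5.0, 9.0]
```

Varying `speed` animates the texture, while `spacing` controls how dense the pattern looks; both could in principle encode independent edge attributes.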

    Perceptual judgment and saccadic behavior in a spatial distortion with briefly presented stimuli.

    When observers are asked to localize the peripheral position of a small probe with respect to the mid-position of a spatially extended comparison stimulus, they tend to judge the probe as being more peripheral than the mid-position of the comparison stimulus. This relative mislocalization seems to emerge from differences in absolute localization; that is, the comparison stimulus is localized more towards the fovea than the probe. The present study compared saccadic behaviour and relative localization judgements in three experiments and determined the quantitative relationship between both measures. The results showed corresponding effects in localization errors and saccadic behaviour. Moreover, it was possible to estimate the amount of the relative mislocalization by means of the saccadic amplitude.

    The reference frame for encoding and retention of motion depends on stimulus set size

    The goal of this study was to investigate the reference frames used in perceptual encoding and storage of visual motion information. In our experiments, observers viewed multiple moving objects and reported the direction of motion of a randomly selected item. Using a vector-decomposition technique, we computed performance during smooth pursuit with respect to a spatiotopic (nonretinotopic) and to a retinotopic component and compared them with performance during fixation, which served as the baseline. For the stimulus encoding stage, which precedes memory, we found that the reference frame depends on the stimulus set size. For a single moving target, the spatiotopic reference frame had the most significant contribution, with some additional contribution from the retinotopic reference frame. When the number of items increased (Set Sizes 3 to 7), the spatiotopic reference frame was able to account for the performance. Finally, when the number of items became larger than 7, the distinction between reference frames vanished. We interpret this finding as a switch to a more abstract nonmetric encoding of motion direction. We found that the retinotopic reference frame was not used in memory. Taken together with other studies, our results suggest that, whereas a retinotopic reference frame may be employed for controlling eye movements, perception and memory use primarily nonretinotopic reference frames. Furthermore, the use of nonretinotopic reference frames appears to be capacity limited. In the case of complex stimuli, the visual system may use perceptual grouping in order to simplify the complexity of stimuli, or resort to a nonmetric abstract coding of motion information.

    Misperceptions in the Trajectories of Objects undergoing Curvilinear Motion

    Trajectory perception is crucial in scene understanding and action. A variety of trajectory misperceptions have been reported in the literature. In this study, we quantify earlier observations that reported distortions in the perceived shape of bilinear trajectories and in the perceived positions of their deviation. Our results show that bilinear trajectories with deviation angles smaller than 90 deg are perceived as smoothed, while those with deviation angles larger than 90 deg are perceived as sharpened. The sharpening effect is weaker in magnitude than the smoothing effect. We also found a correlation between the distortion of perceived trajectories and the perceived shift of their deviation point. Finally, using a dual-task paradigm, we found that reducing attentional resources allocated to the moving target causes an increase in the perceived shift of the deviation point of the trajectory. We interpret these results in the context of interactions between motion and position systems.

    Neural correlates of audiovisual motion capture

    Visual motion can affect the perceived direction of auditory motion (i.e., audiovisual motion capture). It is debated, though, whether this effect occurs at perceptual or decisional stages. Here, we examined the neural consequences of audiovisual motion capture using the mismatch negativity (MMN), an event-related brain potential reflecting pre-attentive auditory deviance detection. In an auditory-only condition, occasional changes in the direction of a moving sound (deviant) elicited an MMN starting around 150 ms. In an audiovisual condition, auditory standards and deviants were synchronized with a visual stimulus that moved in the same direction as the auditory standards. These audiovisual deviants did not evoke an MMN, indicating that visual motion reduced the perceptual difference between the sound motion of standards and deviants. The inhibition of the MMN by visual motion provides evidence that auditory and visual motion signals are integrated at early sensory processing stages.

    A First- and Second-Order Motion Energy Analysis of Peripheral Motion Illusions Leads to Further Evidence of “Feature Blur” in Peripheral Vision

    Anatomical and physiological differences between the central and peripheral visual systems are well documented. Recent findings have suggested that vision in the periphery is not just a scaled version of foveal vision, but rather is relatively poor at representing spatial and temporal phase and other visual features. Shapiro, Lu, Huang, Knight, and Ennis (2010) recently examined a motion stimulus (the “curveball illusion”) in which the shift from foveal to peripheral viewing results in a dramatic spatial/temporal discontinuity. Here, we apply a similar analysis to a range of other spatial/temporal configurations that create perceptual conflict between foveal and peripheral vision. To elucidate how the differences between foveal and peripheral vision affect suprathreshold vision, we created a series of complex visual displays that contain opposing sources of motion information. The displays (referred to as the peripheral escalator illusion, the peripheral acceleration and deceleration illusions, the rotating reversals illusion, and the disappearing squares illusion) create dramatically different perceptions when viewed foveally versus peripherally. We compute the first-order and second-order directional motion energy available in the displays using a three-dimensional Fourier analysis in (x, y, t) space. The peripheral escalator, acceleration and deceleration illusions and the rotating reversals illusion all show a similar trend: in the fovea, the first-order and second-order motion energy can be perceptually separated from each other; in the periphery, the perception seems to correspond to a combination of the multiple sources of motion information. The disappearing squares illusion shows that the ability to assemble the features of Kanizsa squares becomes slower in the periphery. The results lead us to hypothesize “feature blur” in the periphery (i.e., the peripheral visual system combines features that the foveal visual system can separate). Feature blur is of general importance because humans frequently bring information from the periphery to the fovea and vice versa.
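    The idea behind a directional motion-energy analysis can be shown in a much-reduced form. The sketch below drops one spatial dimension and takes a 2D Fourier transform of a (t, x) space-time plot: motion in one direction concentrates energy in the quadrants where spatial and temporal frequency have opposite signs. This captures only first-order energy and is an illustrative simplification, not the authors' three-dimensional (x, y, t) analysis:

```python
import numpy as np

def motion_direction_energy(stim):
    """Net directional motion energy of a (t, x) space-time image.
    Rightward drift puts Fourier energy where spatial and temporal
    frequencies have opposite signs; leftward drift, where they have
    the same sign. Returns rightward minus leftward energy."""
    F = np.fft.fftshift(np.fft.fft2(stim))
    T, X = stim.shape
    ft = np.fft.fftshift(np.fft.fftfreq(T))[:, None]  # temporal freqs
    fx = np.fft.fftshift(np.fft.fftfreq(X))[None, :]  # spatial freqs
    energy = np.abs(F) ** 2
    rightward = energy[(fx * ft) < 0].sum()
    leftward = energy[(fx * ft) > 0].sum()
    return rightward - leftward

# A sinusoidal grating drifting rightward (phase advances with t);
# the frequency 6/64 cycles/sample fits the array exactly, so all
# energy lands on the opposite-sign quadrants.
t, x = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
stim = np.cos(2 * np.pi * (6 / 64) * (x - t))
print(motion_direction_energy(stim) > 0)  # True
```

Second-order (contrast-defined) motion, which the study also analyzes, would require a rectifying nonlinearity before the Fourier step; it is omitted here for brevity.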

    When here becomes there: attentional distribution modulates foveal bias in peripheral localization

    Much research concerning attention has focused on changes in the perceptual qualities of objects while attentional states were varied. Here, we address a complementary question, namely how perceived location can be altered by the distribution of sustained attention over the visual field. We also present a new way to assess the effects of distributing spatial attention across the visual field. We measured magnitude judgments relative to an aperture edge to test perceived location across a large range of eccentricities (30°), and manipulated spatial uncertainty in target locations to examine perceived location under three different distributions of spatial attention. Across three experiments, the results showed that changing the distribution of sustained attention significantly alters known foveal biases in peripheral localization.