3,983 research outputs found

    Visuomotor Adaptation Without Vision?

    In 1995, an aftereffect following treadmill running was described, in which people would inadvertently advance when attempting to run in place on solid ground with their eyes closed. Although originally induced by treadmill running, the running-in-place aftereffect is argued here to result from the absence of sensory information specifying advancement during running. In a series of experiments in which visual information was systematically manipulated, aftereffect strength (AE), measured as the proportional increase (post-test/pre-test) in forward drift while attempting to run in place with eyes closed, was found to be inversely related to the amount of geometrically correct optical flow provided during induction. In particular, experiment 1 (n=20) demonstrated that the aftereffect was not limited to treadmill running, but could also be strongly generated by running behind a golf cart with the eyes closed (AE=1.93), but not with the eyes open (AE=1.16). Conversely, experiment 2 (n=39) showed that simulating an expanding flow field, albeit crudely, during treadmill running was insufficient to eliminate the aftereffect. Reducing ambient auditory information by means of earplugs doubled the total distances inadvertently advanced while attempting to run in place, both before and after adaptation, but did not influence the ratio of change produced by adaptation. It is concluded that the running-in-place aftereffect may result from a recalibration of visuomotor control systems that takes place even in the absence of visual input.
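    As a minimal illustration of the AE measure used above (a ratio of post-test to pre-test forward drift; the function name and example drift values below are hypothetical, not taken from the study's data), one could compute it as follows:

```python
def aftereffect_strength(pre_drift_m: float, post_drift_m: float) -> float:
    """Aftereffect strength (AE): proportional increase in forward drift,
    i.e. post-test drift divided by pre-test drift while attempting to
    run in place with eyes closed."""
    if pre_drift_m <= 0:
        raise ValueError("pre-test drift must be positive")
    return post_drift_m / pre_drift_m

# Hypothetical drift values chosen to reproduce the reported ratios:
print(aftereffect_strength(1.00, 1.93))  # eyes closed during induction -> AE = 1.93
print(aftereffect_strength(1.00, 1.16))  # eyes open during induction   -> AE = 1.16
```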

    More than just perception-action recalibration: walking through a virtual environment causes rescaling of perceived space.

    Egocentric distances in virtual environments are commonly underperceived by up to 50% of the intended distance. However, a brief period of interaction in which participants walk through the virtual environment while receiving visual feedback can dramatically improve distance judgments. Two experiments were designed to explore whether the increase in postinteraction distance judgments is due to perception–action recalibration or the rescaling of perceived space. Perception–action recalibration as a result of walking interaction should only affect action-specific distance judgments, whereas rescaling of perceived space should affect all distance judgments based on the rescaled percept. Participants made blind-walking distance judgments and verbal size judgments in response to objects in a virtual environment before and after interacting with the environment through either walking (Experiment 1) or reaching (Experiment 2). Size judgments were used to infer perceived distance under the assumption of size–distance invariance, and these served as an implicit measure of perceived distance. Preinteraction walking and size-based distance judgments indicated an underperception of egocentric distance, whereas postinteraction walking and size-based distance judgments both increased as a result of the walking interaction, indicating that walking through the virtual environment with continuous visual feedback caused rescaling of the perceived space. However, interaction with the virtual environment through reaching had no effect on either type of distance judgment, indicating that physical translation through the virtual environment may be necessary for a rescaling of perceived space. Furthermore, the size-based distance and walking distance judgments were highly correlated, even across changes in perceived distance, providing support for the size–distance invariance hypothesis.
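    A rough sketch of how a verbal size judgment can be converted into an implicit distance estimate under size–distance invariance (the geometry below is the standard textbook form of the hypothesis; the function names and example numbers are illustrative, not the authors' analysis code):

```python
import math

def visual_angle(physical_size_m: float, physical_distance_m: float) -> float:
    """Visual angle (radians) subtended by an object of the given physical
    size at the given physical distance."""
    return 2.0 * math.atan(physical_size_m / (2.0 * physical_distance_m))

def implied_distance(judged_size_m: float, theta_rad: float) -> float:
    """Perceived distance implied by a size judgment under size-distance
    invariance: judged_size = 2 * perceived_distance * tan(theta / 2)."""
    return judged_size_m / (2.0 * math.tan(theta_rad / 2.0))

# Illustrative example: a 0.5 m wide object at 10 m that is judged to be
# 0.35 m wide implies a perceived distance of about 7 m (~30% underperception).
theta = visual_angle(0.5, 10.0)
print(round(implied_distance(0.35, theta), 2))  # ~7.0
```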

    Visual Learning In The Perception Of Texture: Simple And Contingent Aftereffects Of Texture Density

    Novel results elucidating the magnitude, binocularity and retinotopicity of aftereffects of visual texture density adaptation are reported, as is a new contingent aftereffect of texture density which suggests that the perception of visual texture density is quite malleable. Texture aftereffects contingent upon orientation, color and temporal sequence are discussed. A fourth effect is demonstrated in which auditory contingencies are shown to produce a different kind of visual distortion. The merits and limitations of error-correction and classical conditioning theories of contingent adaptation are reviewed. It is argued that a third kind of theory, which emphasizes coding efficiency and informational considerations, merits close attention. It is proposed that malleability in the registration of texture information can be understood as part of the functional adaptability of perception.

    Auditory environmental context affects visual distance perception

    In this article, we show that visual distance perception (VDP) is influenced by the auditory environmental context through reverberation-related cues. We performed two VDP experiments in two dark rooms with extremely different reverberation times: an anechoic chamber and a reverberant room. Subjects assigned to the reverberant room perceived the targets as farther away than subjects assigned to the anechoic chamber, and we found a positive correlation between the maximum perceived distance and the auditorily perceived room size. In a second experiment, the subjects from Experiment 1 were switched between rooms. Subjects preserved their responses from the previous experiment provided these were compatible with their present perception of the environment; if not, perceived distance was biased towards the auditorily perceived boundaries of the room. The results of both experiments show that the auditory environment can influence VDP, presumably through reverberation cues related to the perception of room size.

    Perceiving audiovisual synchrony as a function of stimulus distance

    Integrated master's dissertation in Psychology (specialization in Experimental Psychology). Audiovisual perception is still an intriguing phenomenon, especially when we consider the physical and neuronal differences underlying the perception of sound and light. Physically, there is a delay of ~3 ms/m between the emission of a sound and its arrival at the observer, whereas the speed of light makes the corresponding delay negligible. On the other hand, acoustic transduction is a very fast process (~1 ms) while photo-transduction is quite slow (~50 ms). Nevertheless, audio and visual stimuli that are temporally mismatched can be perceived as a coherent audiovisual stimulus, although a sound delay is often required to achieve a better perception. A Point of Subjective Simultaneity (PSS) that requires a sound delay might point either to a perceptual mechanism that compensates for the physical differences or to one that compensates for the transduction differences in the perception of audiovisual synchrony. In this study we analyze the PSS as a function of stimulus distance to understand whether individuals take into account sound velocity or whether they compensate for differences in transduction time when judging synchrony. Using Point Light Walkers (PLW) as visual stimuli and the sound of steps as audio stimuli, we developed presentations in a virtual reality environment with several temporal alignments between sound and image (-285 ms to +300 ms of audio asynchrony, in steps of 30 ms) at different distances from the observer (10, 15, 20, 25, 30, 25 meters), in conditions that differed in the number of depth cues. The results show a relation between PSS and stimulation distance that is congruent with the difference in propagation velocity between sound and light (Experiment 1). It therefore appears that the perception of synchrony across several distances is made possible by a compensatory mechanism for the slower velocity of sound relative to light. Moreover, the number and quality of depth cues appear to be of great importance in triggering such a compensatory mechanism (Experiment 2).
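    The two compensation accounts contrasted above make different quantitative predictions for how the PSS should vary with distance. A minimal sketch of those predictions, assuming a speed of sound of ~343 m/s and the transduction latencies cited above (all numbers and function names are illustrative, not the thesis' analysis code):

```python
SPEED_OF_SOUND_M_PER_S = 343.0   # ~2.9 ms of acoustic travel time per metre
AUDIO_TRANSDUCTION_MS = 1.0      # fast acoustic transduction (~1 ms)
VISUAL_TRANSDUCTION_MS = 50.0    # slow photo-transduction (~50 ms)

def pss_if_propagation_is_compensated(distance_m: float) -> float:
    """If the brain compensates for acoustic travel time, the audio delay
    tolerated at the PSS should grow with distance at roughly 2.9 ms/m."""
    return 1000.0 * distance_m / SPEED_OF_SOUND_M_PER_S

def pss_if_transduction_is_compensated(distance_m: float) -> float:
    """If only the transduction difference is compensated, the PSS should not
    depend on distance; the constant ~49 ms offset shown here is purely
    illustrative of a distance-independent prediction."""
    return VISUAL_TRANSDUCTION_MS - AUDIO_TRANSDUCTION_MS

for d in (10, 15, 20, 25, 30):
    print(f"{d:2d} m: propagation account {pss_if_propagation_is_compensated(d):5.1f} ms, "
          f"transduction account {pss_if_transduction_is_compensated(d):5.1f} ms")
```

    The distance-dependent pattern reported for Experiment 1 corresponds to the first of these two predictions.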

    A systematic review on perceptual-motor calibration to changes in action capabilities

    Perceptual-motor calibration has been described as a mapping between perception and action that allows possible opportunities for action to be distinguished from impossible ones. To avoid movement errors, actors must calibrate rapidly to immediate changes in their capabilities; this study therefore sought to explain under what conditions calibration is most efficient. A systematic search of seven databases was conducted to identify literature concerning changes in calibration in response to changes in action capabilities. Twenty-three papers satisfied the inclusion criteria. The data revealed that calibration occurs rapidly when there is a good match between the task that requires calibration and the sources of perceptual-motor information available for exploration (e.g. when exploring maximal braking capabilities by experiencing braking). Calibration can take more time when the available perceptual-motor information is less relevant. The review also identified a number of limitations in the field of perceptual-motor research. Most notably, the mean participant age in the included studies was between 18 and 33 years, limiting the generalizability of the results to other age groups, and the terminology used in the field is inconsistent. We argue that investigating calibration in older cohorts should be a focus of future research because of the possible implications of impaired calibration in an aging society.