    Bimanual Size Estimation: No Automatic Integration of Information across the Hands

    Sensory input is often integrated to yield a single estimate of the underlying physical property. Here we investigate whether size estimates from the left and right hand are automatically integrated. Six subjects participated in a bimanual matching task. Subjects were presented with (virtual) objects to be felt with either hand or with both hands. Their task was to reproduce the sizes after presentation. The bimanual stimuli either had the same size for each hand or there was a size conflict between the hands. We showed that there is no automatic integration and that subjects retained access to both hands' size estimates.
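    For context, the "automatic integration" this abstract rules out is usually formalized as maximum-likelihood cue combination (our gloss; the model is standard in the literature but not spelled out in the abstract). Two independent size estimates \hat{s}_L and \hat{s}_R from the left and right hand would be fused into a single reliability-weighted average whose variance is lower than either input, leaving no access to the individual estimates:

        \hat{s}_{LR} = w_L \hat{s}_L + w_R \hat{s}_R,
        \qquad w_i = \frac{1/\sigma_i^2}{1/\sigma_L^2 + 1/\sigma_R^2},
        \qquad \sigma_{LR}^2 = \frac{\sigma_L^2\,\sigma_R^2}{\sigma_L^2 + \sigma_R^2} \le \min(\sigma_L^2, \sigma_R^2)

    Retained access to both single-hand estimates is therefore evidence against this mandatory-fusion account.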

    Roughness and spatial density judgments on visual and haptic textures using virtual reality

    The purpose of this study is to investigate multimodal visual-haptic texture perception, for which we used virtual reality techniques. Participants judged a broad range of textures according to their roughness and their spatial density under visual, haptic, and visual-haptic exploration conditions. Participants were well able to differentiate between the textures using both the roughness and the spatial density judgments. When provided with visual-haptic textures, subjects' performance increased (for both judgments), indicating sensory combination of visual and haptic texture information. Most interestingly, performance for density and roughness judgments did not differ significantly, indicating that these estimates are highly correlated. This may be because our textures were generated in virtual reality using a haptic point-force display (PHANToM). In conclusion, it seems that the roughness and spatial density estimates were based on the same physical parameters, given the display technology used.
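    The bimodal improvement reported here is what the same cue-combination account predicts (again our gloss, not the abstract's): with unimodal judgment variances \sigma_V^2 and \sigma_H^2, the optimally combined visual-haptic estimate has variance

        \sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2},

    which is never larger than the variance of the better single modality, so visual-haptic performance should be at least as good as the best unimodal condition.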

    Effects of virtual acoustics on dynamic auditory distance perception

    Sound propagation encompasses various acoustic phenomena, including reverberation. Current virtual acoustic methods, ranging from parametric filters to physically accurate solvers, can simulate reverberation with varying degrees of fidelity. We investigate the effects of reverberant sounds generated using different propagation algorithms on acoustic distance perception, i.e., how far away humans perceive a sound source to be. In particular, we evaluate two classes of methods for real-time sound propagation in dynamic scenes, based on parametric filters and ray tracing. Our study shows that the more accurate method produces less distance compression than the approximate, filter-based method. This suggests that accurate reverberation in VR results in a better reproduction of acoustic distances. We also quantify the levels of distance compression introduced by different propagation methods in a virtual environment.
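    As background, auditory distance compression is commonly summarized with a power function (a standard model in this literature; the abstract does not name the model the authors fit):

        \hat{d} = k\,d^a, \qquad 0 < a < 1

    where \hat{d} is perceived and d physical distance; an exponent a closer to 1 means less compression, which is the pattern the study reports for the more accurate, ray-tracing-based propagation method.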

    Combining Locations from Working Memory and Long-Term Memory into a Common Spatial Image

    This research uses a novel integration paradigm to investigate whether target locations read in from long-term memory (LTM) differ from perceptually encoded inputs in spatial working memory (SWM) with respect to systematic spatial error and/or noise, and whether SWM can simultaneously encompass both of these sources. Our results provide evidence for a composite representation of space in SWM derived from both perception and LTM, albeit with a loss in spatial precision of locations retrieved from LTM. More generally, the data support the concept of a spatial image in working memory and extend its potential sources to representations retrieved from LTM.

    Perception of 3-D Location Based on Vision, Touch, and Extended Touch

    Perception of the near environment gives rise to spatial images in working memory that continue to represent the spatial layout even after cessation of sensory input. As the observer moves, these spatial images are continuously updated. This research is concerned with (1) whether spatial images of targets are formed when they are sensed using extended touch (i.e., using a probe to extend the reach of the arm) and (2) the accuracy with which such targets are perceived. In Experiment 1, participants perceived the 3-D locations of individual targets from a fixed origin and were then tested with an updating task involving blindfolded walking followed by placement of the hand at the remembered target location. Twenty-four target locations, representing all combinations of two distances, two heights, and six azimuths, were perceived by vision or by blindfolded exploration with the bare hand, a 1-m probe, or a 2-m probe. Systematic errors in azimuth were observed for all targets, reflecting errors in representing the target locations and in updating. Overall, updating after visual perception was best, but the quantitative differences between conditions were small. Experiment 2 demonstrated that auditory information signifying contact with the target was not a factor. Overall, the results indicate that 3-D spatial images can be formed of targets sensed by extended touch and that perception by extended touch, even out to 1.75 m, is surprisingly accurate.
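    A minimal sketch of the spatial-updating computation the walking task requires (illustrative only; the function name and the 2-D simplification are ours, not the paper's): after the observer walks, the remembered egocentric target location must be translated by the displacement and rotated by the heading change.

        import numpy as np

        def update_egocentric(target_xy, displacement_xy, heading_change_rad):
            """Update a remembered egocentric target location after self-motion.

            target_xy: target position in the pre-movement egocentric frame (m).
            displacement_xy: observer displacement in that same frame (m).
            heading_change_rad: change in facing direction (radians, positive
                = counterclockwise, i.e., a left turn).
            Returns the target position in the post-movement egocentric frame.
            """
            # Translate: express the target relative to the new vantage point.
            shifted = np.asarray(target_xy, dtype=float) - np.asarray(displacement_xy, dtype=float)
            # Rotate by -heading_change to realign with the new facing direction.
            c, s = np.cos(-heading_change_rad), np.sin(-heading_change_rad)
            rot = np.array([[c, -s], [s, c]])
            return rot @ shifted

        # Example: a target 2 m ahead; the observer walks 1 m forward, turns 90° left.
        print(update_egocentric([0.0, 2.0], [0.0, 1.0], np.pi / 2))
        # -> roughly [1.0, 0.0]: the target is now 1 m to the observer's right.

    Systematic azimuth errors of the kind the study reports would show up as consistent angular offsets between responses and the output of this ideal computation.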

    Touch-Screen Technology for the Dynamic Display of 2D Spatial Information Without Vision: Promise and Progress

    Many developers wish to capitalize on touch-screen technology for developing aids for the blind, particularly by incorporating vibrotactile stimulation to convey patterns on their surfaces, which otherwise are featureless. Our belief is that they will need to take into account basic research on haptic perception in designing these graphics interfaces. We point out constraints and limitations in haptic processing that affect the use of these devices. We also suggest ways to use sound to augment basic information from touch, and we include evaluation data from users of a touch-screen device with vibrotactile and auditory feedback that we have been developing, called a vibro-audio interface.

    Using sound in multi-touch interfaces to change materiality and touch behavior

    Current developments in multimodal interfaces allow us to interact with digitally represented objects. Sadly, these representations are often poor due to technical limitations in representing some of the sensory properties. Here we explore the possibility of overcoming these limitations by exploiting multisensory integration processes, and we propose a sound-based interaction technique to alter the perceived materiality of a surface being touched and to shape users' touch behavior. The latter can be seen both as a cue to, and as a means to reinforce, the altered perception. We designed a prototype that dynamically alters the texture-related sound feedback based on touch behavior, as in natural surface touch interactions. A user study showed that the frequency of the sound feedback alters texture perception (coldness and material type) and touch behavior (velocity and pressure). We conclude by discussing lessons learnt from this work in terms of HCI applications and questions opened by this research.
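    A minimal sketch of the kind of feedback loop the prototype implies (the function name, constants, and the specific mapping are our assumptions; the abstract gives no parameters): touch velocity and pressure drive the center frequency of the texture-related sound feedback in real time.

        def texture_feedback_hz(velocity_mm_s: float, pressure_n: float,
                                base_hz: float = 400.0) -> float:
            """Map touch behavior to a sound-feedback frequency (illustrative).

            Faster strokes raise the frequency (finer-sounding texture); harder
            presses lower it (duller material). All constants are arbitrary.
            """
            freq = base_hz * (1.0 + 0.002 * velocity_mm_s) / (1.0 + 0.3 * pressure_n)
            return max(50.0, min(freq, 8000.0))  # clamp to an audible, safe range

        # Example: a slow, light stroke vs. a fast, firm one.
        print(texture_feedback_hz(velocity_mm_s=50.0, pressure_n=0.5))   # ~383 Hz
        print(texture_feedback_hz(velocity_mm_s=400.0, pressure_n=2.0))  # ~450 Hz

    In the study itself the causal arrow of interest runs the other way as well: shifting this frequency changed perceived coldness and material type, and users' stroke velocity and pressure shifted in response.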