3,614 research outputs found
Generalized Movement Representation in Haptic Perception
The extraction of spatial information by touch often involves exploratory movements, with tactile and kinesthetic signals combined to construct a spatial haptic percept. However, the body has many sensory surfaces that can move independently, giving rise to the source binding problem: when there are multiple tactile signals originating from sensory surfaces with multiple movements, are the tactile and kinesthetic signals bound to one another? We studied haptic signal combination by applying the tactile signal to a stationary fingertip while another body part (the other hand or a foot) or a visual target moves, and using a task that can only be done if the tactile and kinesthetic signals are combined. We found that both direction and speed of movement transfer across limbs, but only direction transfers between visual target motion and the tactile signal. In control experiments, we excluded the role of explicit reasoning or knowledge of motion kinematics in this transfer. These results demonstrate the existence of two motion representations in the haptic system—one of direction and another of speed or amplitude—that are both source-free or unbound from their sensory surface of origin. These representations may well underlie our flexibility in haptic perception and sensorimotor control
Engineering data compendium. Human perception and performance. User's guide
The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability for systems designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use
Bodily awareness and novel multisensory features
According to the decomposition thesis, perceptual experiences resolve without remainder into their different modality-specific components. Contrary to this view, I argue that certain cases of multisensory integration give rise to experiences representing features of a novel type. Through the coordinated use of bodily awareness—understood here as encompassing both proprioception and kinaesthesis—and the exteroceptive sensory modalities, one becomes perceptually responsive to spatial features whose instances couldn’t be represented by any of the contributing modalities functioning in isolation. I develop an argument for this conclusion focusing on two cases: 3D shape perception in haptic touch and experiencing an object’s egocentric location in crossmodally accessible, environmental space
Change blindness: eradication of gestalt strategies
Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research 43: 149-164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial positions of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This suggests two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) it lends further weight to the argument that objects may be stored in and retrieved from a pre-attentional store during this task
Prop-Based Haptic Interaction with Co-location and Immersion: an Automotive Application
Most research on 3D user interfaces aims at providing only a single sensory modality. One challenge is to integrate several sensory modalities into a seamless system while preserving each modality's immersion and performance factors. This paper concerns manipulation tasks and proposes a visuo-haptic system integrating immersive visualization, force and tactile feedback with co-location. An industrial application is presented
Multimodal Human-Machine Interface For Haptic-Controlled Excavators
The goal of this research is to develop a human-excavator interface for the haptic-controlled excavator that makes use of the multiple human sensing modalities (visual, auditory, haptic) and efficiently integrates these modalities to ensure an intuitive, efficient interface that is easy to learn and use, and is responsive to operator commands. Two empirical studies were conducted to investigate conflict in the haptic-controlled excavator interface and to identify the level of force feedback for best operator performance
Beyond Gazing, Pointing, and Reaching: A Survey of Developmental Robotics
Developmental robotics is an emerging field located at the intersection of developmental psychology and robotics that has lately attracted considerable attention. This paper gives a survey of a variety of research projects dealing with or inspired by developmental issues, and outlines possible future directions