    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing during the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al, 2003, Vision Research 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this, we changed the spatial positions of the rectangles in the second presentation by shifting them by ±1 degree along imaginary spokes emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This suggests two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) it gives further weight to the argument that objects may be stored in, and retrieved from, a pre-attentional store during this task.
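    The ±1 degree spoke-shift manipulation described above can be illustrated with a short sketch (hypothetical Python, not taken from the study; the function name, the random inward/outward choice, and the example ring layout are assumptions): each rectangle centre, expressed in degrees of visual angle relative to fixation, is moved radially along the line joining it to the fixation point.

        import math
        import random

        def jitter_along_spokes(centres, shift_deg=1.0, rng=random):
            """Shift each stimulus centre radially (along the imaginary spoke
            from central fixation at the origin) by +/- shift_deg of visual
            angle. `centres` holds (x, y) positions in degrees."""
            jittered = []
            for x, y in centres:
                r = math.hypot(x, y)
                if r == 0:
                    jittered.append((x, y))  # no spoke direction at fixation
                    continue
                # choose an inward or outward shift at random
                new_r = r + rng.choice((-shift_deg, shift_deg))
                jittered.append((x * new_r / r, y * new_r / r))
            return jittered

        # Example: eight rectangle centres on an assumed 4-degree ring
        ring = [(4 * math.cos(a), 4 * math.sin(a))
                for a in (i * math.pi / 4 for i in range(8))]
        second_presentation = jitter_along_spokes(ring)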

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a form that would aid its accessibility, interpretability, and applicability for system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, which contains a description of the program and instructions for its use.

    Should learners use their hands for learning? Results from an eye‐tracking study

    Given the widespread use of touch screen devices, the effect of the users' fingers on information processing and learning is of growing interest. The present study drew on cognitive load theory and embodied cognition perspectives to investigate the effects of pointing and tracing gestures on the surface of a multimedia learning instruction. Learning performance, cognitive load, and visual attention were examined in a one-factorial experimental design with pointing and tracing gestures as the between-subjects factor. The pointing and tracing group was instructed to use their fingers during the learning phase to make connections between corresponding text and picture information, whereas the control group was instructed not to use their hands for learning. The results showed a beneficial effect of pointing and tracing gestures on learning performance, a significant shift in visual attention, and deeper processing of information in the pointing and tracing group, but no effect on subjective ratings of cognitive load. Implications for future research and practice are discussed.

    Haptic feedback in eye typing

    Proper feedback is essential in gaze-based interfaces, where the same modality is used for both perception and control. We measured how vibrotactile feedback, a form of haptic feedback, compares with the commonly used visual and auditory feedback in eye typing. Haptic feedback was found to produce results close to those of auditory feedback; both were easy to perceive, and participants liked both the auditory "click" and the tactile "tap" of the selected key. Implementation details (such as the placement of the haptic actuator) were also found to be important.

    Digital haptics improve speed of visual search performance in a dual-task setting.

    Dashboard-mounted touchscreen tablets are now common in vehicles. Screen and phone use in cars likely shifts drivers' attention away from the road and contributes to the risk of accidents. However, vision is subject to multisensory influences from the other senses, and haptics may help maintain or even increase visual attention to the road while still allowing for reliable dashboard control. Here, we provide a proof of concept for the effectiveness of digital haptic technologies (hereafter digital haptics), which use ultrasonic vibrations on a tablet screen to render haptic perceptions. Healthy human participants (N = 25) completed a divided-attention paradigm. The primary task was a centrally presented visual conjunction search task, and the secondary task entailed control of laterally presented sliders on the tablet. Sliders were presented visually, haptically, or visuo-haptically and were vertical, horizontal, or circular. We reasoned that the primary task would be performed best when the secondary task was haptic-only. Reaction times (RTs) on the visual search task were indeed fastest when the tablet task was haptic-only. This was not due to a speed-accuracy trade-off: there was no evidence that visual search accuracy was modulated by the modality of the tablet task. These results provide the first quantitative support for introducing digital haptics into vehicles and similar contexts.
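    To make the speed-accuracy check described above concrete, a minimal sketch follows (hypothetical Python; the record fields and values are assumptions, not the study's data or analysis code): it summarises mean search RT and accuracy per tablet-task modality, so a faster condition can be checked against its accuracy.

        from statistics import mean

        # Hypothetical per-trial records: tablet-task modality, visual-search
        # RT in milliseconds, and whether the search response was correct.
        trials = [
            {"modality": "haptic",       "rt_ms": 812, "correct": True},
            {"modality": "visual",       "rt_ms": 934, "correct": True},
            {"modality": "visuo-haptic", "rt_ms": 901, "correct": False},
            # ... remaining trials
        ]

        def summarise(trials):
            """Mean RT and accuracy per modality; a genuinely faster condition
            should not show a matching drop in accuracy."""
            by_mod = {}
            for t in trials:
                by_mod.setdefault(t["modality"], []).append(t)
            return {m: {"mean_rt_ms": mean(t["rt_ms"] for t in ts),
                        "accuracy": mean(1.0 if t["correct"] else 0.0 for t in ts)}
                    for m, ts in by_mod.items()}

        print(summarise(trials))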