
    Amplifying head movements with head-mounted displays


    Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain

    Room-scale Virtual Reality (VR) has become an affordable consumer reality, with applications ranging from entertainment to productivity. However, the limited physical space available for room-scale VR in the typical home or office environment poses a significant problem. To solve this, physical spaces can be extended by amplifying the mapping of physical to virtual movement (translational gain). Although amplified movement has been used since the earliest days of VR, little is known about how it influences reach-based interactions with virtual objects, now a standard feature of consumer VR. Consequently, this paper explores the picking and placing of virtual objects in VR for the first time, with translational gains of between 1x (a one-to-one mapping of a 3.5m*3.5m virtual space to the same-sized physical space) and 3x (10.5m*10.5m virtual mapped to 3.5m*3.5m physical). Results show that reaching accuracy is maintained for gains of up to 2x; going beyond this diminishes accuracy and increases simulator sickness and perceived workload. We suggest gain levels of 1.5x to 1.75x can be utilized without compromising the usability of a VR task, significantly expanding the bounds of interactive room-scale VR.
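
    To make the translational-gain mapping concrete, here is a minimal sketch, assuming the gain is applied uniformly to horizontal movement about the centre of the tracked space; the vector type and function name are illustrative, not taken from the paper.

```python
# Minimal sketch of translational gain (illustrative, not the paper's implementation):
# the tracked position is measured relative to the centre of the physical play space
# and its horizontal components are scaled before being used in the virtual scene.
# gain = 1.0 reproduces a one-to-one mapping; gain = 3.0 maps a 3.5 m square room
# onto a 10.5 m square virtual area.

from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float


def apply_translational_gain(physical: Vec3, room_centre: Vec3, gain: float) -> Vec3:
    """Scale horizontal movement about the room centre; height (y) is left unchanged
    so floor contact and reach height remain physically plausible."""
    return Vec3(
        room_centre.x + gain * (physical.x - room_centre.x),
        physical.y,  # no vertical amplification
        room_centre.z + gain * (physical.z - room_centre.z),
    )


# Example: standing 1.0 m from the room centre appears as 1.75 m away at a 1.75x gain.
print(apply_translational_gain(Vec3(1.0, 1.6, 0.0), Vec3(0.0, 0.0, 0.0), 1.75))
```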

    Spatial audio in small display screen devices

    Our work addresses the problem of (visual) clutter in mobile device interfaces. The solution we propose involves the translation of a technique from the graphical to the audio domain for exploiting space in information representation. This article presents an illustrative example in the form of a spatialised audio progress bar. In usability tests, participants performed background monitoring tasks significantly more accurately using this spatialised audio (as compared with a conventional visual) progress bar. Moreover, their performance in a simultaneously running, visually demanding foreground task was significantly improved in the eyes-free monitoring condition. These results have important implications for the design of multi-tasking interfaces for mobile devices.
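
    As a rough illustration of the underlying idea, the sketch below maps a progress value onto a position in the listener's auditory space using simple constant-power stereo panning; the actual system used spatialised audio rendering, and all names here are assumptions for illustration.

```python
# Illustrative sketch of a spatialised audio progress bar (names assumed): the task's
# progress is mapped to a position in the listener's auditory space, here via simple
# constant-power stereo panning from far left (0%) to far right (100%).

import math


def progress_to_stereo_gains(progress: float) -> tuple[float, float]:
    """Map progress in [0, 1] to constant-power (left, right) channel gains."""
    progress = min(max(progress, 0.0), 1.0)
    angle = progress * (math.pi / 2)  # 0 -> hard left, pi/2 -> hard right
    return math.cos(angle), math.sin(angle)


# Example: at 50% progress the source sits centrally, with equal left/right gains.
print(progress_to_stereo_gains(0.5))
```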

    Visual circuit flying with augmented head-tracking on limited field of view flight training devices

    The virtual reality technique of amplified head rotations was applied to a fixed-base, low-fidelity flight simulator, enabling users to fly a visual circuit, a task previously severely restricted by the limited field of view and fixed field of regard. An exploratory experiment with nine pilots was conducted to test this technique on a fixed-base simulator across three displays: single monitor, triple monitor and triple projector. Participants started airborne on the downwind leg of a visual circuit with the primary task of completing the circuit to a full-stop landing, while having a secondary task of spotting pop-up traffic in the vicinity, simulated by blimps. Data were collected to study effects on flight performance, workload and simulator sickness. The results showed very few significant differences between displays, in itself remarkable considering the differences in display size and field of view. The triple monitor setup was found to be the best compromise, delivering flight performance and traffic detection scores just below the triple projector but without some peculiar track deviations during flight and with a lower chance of simulator sickness. With participants quickly adapting to this technique and giving favorable feedback, these findings demonstrate the potential value of upgrading flight training devices to improve their utility, and pave the way for future research into this domain.
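
    A hedged sketch of the amplified head-rotation technique follows, assuming a simple linear gain on tracked head yaw clamped to the virtual field of regard; the function and parameter names are illustrative rather than the authors' implementation.

```python
# Sketch of amplified head rotation (illustrative names, not the authors' code):
# the tracked head yaw is multiplied by a gain so a comfortable physical rotation
# sweeps the full field of regard on a display with a limited field of view.

def amplified_camera_yaw(head_yaw_deg: float, gain: float,
                         max_virtual_yaw_deg: float = 180.0) -> float:
    """Return the virtual camera yaw (degrees) for a tracked head yaw, clamped so
    the view cannot wrap past directly behind the pilot."""
    virtual_yaw = gain * head_yaw_deg
    return max(-max_virtual_yaw_deg, min(max_virtual_yaw_deg, virtual_yaw))


# Example: with a 3x gain, turning the head 30 degrees looks 90 degrees into the circuit.
print(amplified_camera_yaw(30.0, 3.0))
```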

    Augmenting low-fidelity flight simulation training devices via amplified head rotations

    Due to economic and operational constraints, there is an increasing demand from aviation operators and training manufacturers to extract maximum training usage from the lower-fidelity suite of flight simulators. It is possible to augment low-fidelity flight simulators to achieve performance equivalent to high-fidelity setups, but at reduced cost and with greater mobility. In particular, for visual manoeuvres, the virtual reality technique of head-tracking amplification for virtual view control enables full field-of-regard access even with limited field-of-view displays. This research quantified the effects of this technique on piloting performance, workload and simulator sickness by applying it to a fixed-base, low-fidelity, low-cost flight simulator. In two separate simulator trials, participants had to land a simulated aircraft from a visual traffic circuit pattern whilst scanning for airborne traffic. Initially, a single augmented display was compared to the common triple-display setup in front of the pilot. Starting from the base leg, pilots exhibited tighter turns closer to the desired ground track and conducted visual scans more actively using the augmented display. This was followed by a second experiment to quantify the scalability of augmentation towards larger displays and fields of view, with task complexity increased by starting the traffic pattern from the downwind leg. Triple displays in front of the pilot yielded the best compromise, delivering flight performance and traffic detection scores just below the triple projectors but without an increase in track deviations, and the pilots were also less prone to simulator sickness symptoms. This research demonstrated that head augmentation offers quick user adaptation, low cost and ease of systems integration, together with the capability to negate the impact of display size, without incurring significant penalties in workload or simulator sickness. The impact of this research is that it facilitates future flight training solutions using this augmentation technique to meet budgetary and mobility requirements, enabling deployment of simulators in large numbers to deliver expanded mission rehearsal previously unattainable within this class of low-fidelity simulators, and with no restrictions on transfer to other training media.

    Hair Bundles of a Jawless Vertebrate Employ Tetrapod-Like Tuned Mechanical Amplification

    In the hearing and balance organs of tetrapod vertebrates, mechanical signals are transduced by an elegant organelle called the hair bundle. Deflections of this structure apply forces to mechanically gated ion channels. Hair bundles are not passive receivers of stimuli, but are instead active participants in the process of sensory transduction. They expend chemical energy to exert mechanical work, and can harness this active process to amplify their mechanical response to stimuli. Furthermore, the active process is tuned, allowing a given hair bundle to preferentially amplify a particular frequency; this feature is valuable in the analysis of complex sounds. Hair bundles can also enter an unstable regime in which their active process drives spontaneous oscillations. Studying this epiphenomenon can reveal mechanisms underlying the amplifying abilities of hair bundles. Despite the importance of amplification in hearing, little is known regarding the evolution of the active process; it is unclear whether the active process is exclusive to tetrapods. It would be instructive, for instance, to know whether the active process predates the array of auditory specializations seen throughout vertebrates. Here, we approach this problem by investigating the mechanical activity of hair bundles from the inner ears of two jawless vertebrates, the sea lamprey Petromyzon marinus and the American brook lamprey Lampetra appendix. We observe spontaneous oscillations in both of these animals. In the latter species, we also show evidence that these oscillations stem from mechanisms similar to those driving the spontaneous oscillations of tetrapod vertebrates. Furthermore, we find that hair bundles exhibiting these movements can entrain to and mechanically amplify particular stimulus frequencies. Taken together, our findings from a group distantly related to the tetrapods suggest that the active process of hair bundles is a trait ancestral to all vertebrate ears.
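
    The abstract gives no equations, but a common way to formalise a tuned, actively amplifying and spontaneously oscillating hair bundle is the normal form of a Hopf bifurcation, sketched here purely for context and not taken from the paper.

```latex
% Standard Hopf-oscillator description of an active, tuned amplifier (context only,
% not from the paper). Here z(t) is the complex bundle displacement, \omega_0 the
% characteristic frequency, \mu the control parameter and F the stimulus force:
% for \mu > 0 the bundle oscillates spontaneously, while near the bifurcation it
% selectively amplifies stimuli close to \omega_0.
\[
  \dot{z} = (\mu + i\omega_0)\, z - |z|^{2} z + F e^{i\omega t}
\]
```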

    Full color hybrid display for aircraft simulators

    A full-spectrum color monitor, connected to the camera and lens system of a television camera supported by a gantry frame over a terrain model simulating an aircraft landing zone, projects the monitor image onto a lens or screen visually accessible to a trainee in the simulator. A digital computer renders a pattern corresponding to the lights of the landing strip on a monochromatic display, and an optical system projects this calligraphic image onto the same lens so that it is superposed on the video representation of the landing field. The optical system includes a four-color wheel rotated between the calligraphic display and the lens, together with apparatus for synchronizing the generation of the calligraphic pattern with the color segments on the wheel. A servo feedback system responsive to the servo motors on the gantry frame provides an input to the computer so that the calligraphically generated signal corresponds in shape, size and location to the video signal.
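
    As a loose illustration of the synchronization described above (names and data layout assumed, not from the original text), the sketch below selects which landing-strip lights to draw on the monochromatic display according to the colour segment currently in the optical path.

```python
# Loose sketch of the colour-wheel synchronization (names and data layout assumed):
# only the landing-strip lights whose colour matches the wheel segment currently in
# the optical path are drawn on the monochromatic calligraphic display, so the
# superposed image appears in full colour over one wheel revolution.

SEGMENTS = ["red", "green", "blue", "white"]  # four 90-degree colour segments


def segment_in_path(wheel_angle_deg: float) -> str:
    """Return the colour segment currently between the calligraphic display and the lens."""
    return SEGMENTS[int(wheel_angle_deg % 360.0) // 90]


def lights_to_draw(lights: list[dict], wheel_angle_deg: float) -> list[dict]:
    """Select the lights whose colour matches the active segment for this refresh."""
    active = segment_in_path(wheel_angle_deg)
    return [light for light in lights if light["colour"] == active]


# Example: with the wheel at 100 degrees, only the green lights are drawn this refresh.
runway = [{"colour": "green", "pos": (0, 0)}, {"colour": "red", "pos": (5, 0)}]
print(lights_to_draw(runway, 100.0))
```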

    Enhancing Visual Exploration through Augmented Gaze: High Acceptance of Immersive Virtual Biking by Oldest Olds

    The diffusion of virtual reality applications dedicated to aging urges us to appraise their acceptance by target populations, especially the oldest olds. We investigated whether immersive virtual biking, and specifically a visuomotor manipulation aimed at improving visual exploration (augmented gaze), was well accepted by elders living in assisted residences. Twenty participants (mean age 89.8 years, five males) performed three 9-minute virtual biking sessions, pedalling on a cycle ergometer while wearing a head-mounted display that immersed them in a 360-degree pre-recorded biking video. In the second and third sessions, the relationship between horizontal head rotation and the contingent visual shift was experimentally manipulated (augmented gaze), the visual shift being twice (gain = 2.0) or thrice (gain = 3.0) the amount of head rotation. User experience, motion sickness and visual exploration were measured. We found (i) very high user experience ratings, regardless of the gain; (ii) no effect of gain on motion sickness; and (iii) increased visual exploration (slope = +46%) and decreased head rotation (slope = −18%) with augmented gaze. The improvement in visual exploration capacity, coupled with the absence of signs of intolerance, suggests that augmented gaze can be a valuable tool for improving the “visual usability” of certain virtual reality applications for elders, including the oldest olds.