Crossmodal audio and tactile interaction with mobile touchscreens
Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense that may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device.
This thesis begins with a literature review of related work followed by a definition of crossmodal icons. Two icons may be considered to be crossmodal if and only if they provide a common representation of data, which is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons with results showing that rhythm, texture and spatial location are effective.
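The crossmodal icon concept (one abstract message rendered interchangeably as sound or vibration) can be illustrated with a minimal sketch. The class name, parameter mappings, and rendering dictionaries below are hypothetical, not the thesis's implementation; only the three parameters (rhythm, texture, spatial location) come from the source.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CrossmodalIcon:
    """One abstract message, presentable via audio or tactile output.

    The thesis identifies rhythm, texture and spatial location as
    effective crossmodal parameters; the mappings here are illustrative.
    """
    rhythm: tuple   # inter-onset intervals in ms, shared by both modalities
    texture: str    # e.g. "smooth" / "rough", mapped to timbre or vibration waveform
    location: str   # e.g. "left" / "right", mapped to stereo pan or actuator position

    def render_audio(self):
        # Hypothetical mapping: the shared parameters drive an audio earcon.
        return {"onsets_ms": self.rhythm, "timbre": self.texture, "pan": self.location}

    def render_tactile(self):
        # ...and the same parameters drive an equivalent tactile icon (a "tacton").
        return {"pulses_ms": self.rhythm, "waveform": self.texture, "actuator": self.location}

# A single message definition yields equivalent presentations in both modalities.
new_message = CrossmodalIcon(rhythm=(120, 120, 240), texture="rough", location="left")
```

Because both renderings are derived from one shared parameter set, a user trained on the tactile form can, in principle, recognise the audio form and vice versa, which is the transfer effect the third experiment measures.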
A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained on the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained on the audio equivalents.
Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and higher text entry speeds compared to standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently under varying levels of background noise or vibration, and the levels at which these performance decreases occur were established.
The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback to enable application and interface designers to employ such feedback in their systems.
Multimodal feedback for mid-air gestures when driving
Mid-air gestures in cars are being used by an increasing number of drivers on the road. Usability concerns mean good feedback is important, but a balance needs to be found between supporting interaction and reducing distraction in an already demanding environment. Visual feedback is most commonly used, but takes visual attention away from driving. This thesis investigates novel non-visual alternatives to support the driver during mid-air gesture interaction: Cutaneous Push, Peripheral Lights, and Ultrasound feedback. These modalities lack the expressive capabilities of high-resolution screens, but are intended to allow drivers to focus on the driving task. A new form of haptic feedback, Cutaneous Push, was defined: six solenoids were embedded along the rim of the steering wheel, creating three bumps under each palm. Studies 1, 2, and 3 investigated the efficacy of novel static and dynamic Cutaneous Push patterns in simulated driving, and their impact on driving performance. The results showed pattern identification rates of up to 81.3% for static patterns, 73.5% for dynamic patterns, and 100% recognition of directional cues. Cutaneous Push notifications did not affect driving behaviour or workload and showed very high user acceptance. Cutaneous Push patterns have the potential to make driving safer by providing non-visual, instantaneous messages, for example to indicate an approaching cyclist or obstacle. Studies 4 and 5 looked at novel uni- and bimodal feedback combinations of Visual, Auditory, Cutaneous Push, and Peripheral Lights for mid-air gestures, and found that non-visual feedback modalities, especially when combined bimodally, offered just as much support for interaction without negatively affecting driving performance, visual attention, or cognitive demand.
These results provide compelling support for using non-visual feedback from in-car systems, supporting input whilst letting drivers focus on driving. Studies 6 and 7 investigated the above bimodal combinations as well as uni- and bimodal Ultrasound feedback during the Lane Change Task, to assess the impact of gesturing and feedback modality on car control during more challenging driving. The results of Study 7 suggest that Visual and Ultrasound feedback are not appropriate for in-car use unless combined multimodally; if Ultrasound is used unimodally, it is more useful in a binary scenario. Findings from Studies 5, 6, and 7 suggest that multimodal feedback significantly reduces eyes-off-the-road time compared to Visual feedback without compromising driving performance or perceived user workload, and thus can potentially reduce crash risk. Novel design recommendations for providing feedback during mid-air gesture interaction in cars are provided, informed by the experimental findings.
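The static Cutaneous Push setup described above (six solenoids along the wheel rim, three bumps under each palm) lends itself to a simple activation-vector encoding. The sketch below is a hypothetical illustration of that configuration, not the thesis's implementation; the function and variable names are invented for this example.

```python
# Hypothetical encoding of static Cutaneous Push patterns: six solenoids
# along the steering wheel rim, three under each palm. Each solenoid is
# either raised (1) or flat (0) for the duration of the cue.
LEFT_PALM, RIGHT_PALM = range(0, 3), range(3, 6)

def static_pattern(raised):
    """Return a 6-element activation vector from a set of solenoid indices."""
    return [1 if i in raised else 0 for i in range(6)]

# Illustrative directional cues: raise all three bumps under one palm,
# e.g. to indicate a cyclist or obstacle approaching from that side.
cue_left = static_pattern(set(LEFT_PALM))    # [1, 1, 1, 0, 0, 0]
cue_right = static_pattern(set(RIGHT_PALM))  # [0, 0, 0, 1, 1, 1]
```

Directional cues of this kind are maximally distinct (no overlapping solenoids), which is consistent with the 100% recognition of directional cues reported above; richer static patterns would mix indices across both palms and are harder to identify.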
Human emotional response to automotive steering wheel vibration: development of a driver emotional semantic scale
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. The 21st-century automobile has become more than a simple tool for transportation: it is a brand image and a way for drivers to express their personal taste. This has made it increasingly important for automotive manufacturers to design the driver experience and driver feeling so as to tailor them to drivers' preferences and interests. Currently there is not enough information on how to design or brand the communication of meaningful feedback from the automobile to the driver. With the development of new advanced technologies such as electric steer-by-wire systems and electric automobiles, providing meaningful feedback to the driver plays a central role in the experience of using the new driving technology. It is therefore important to understand how to assess the emotional response to the stimuli reaching the driver, so that the perceived experience can be optimised at a later stage. Steering wheel vibration feedback plays an important role in the driver's control input when driving. There is currently a lack of research on formal assessment criteria for the driver emotional response to automotive steering wheel vibration feedback, so this thesis proposes a new Driver Emotional Semantic (DES) Scale to answer the research question: "How can the emotional response to steering wheel vibration be assessed?". The study starts with a comparison of a questionnaire survey (Exp. 1) and a laboratory test (Exp. 2) to identify whether a correlation exists between the emotional ratings measured from the driver's expected perception of the vibration and the experienced emotional feeling of steering wheel vibration. The work then defines a semantic scale to capture the vibrational vocabulary drivers use to express their feeling of perceived vibration during real-road driving scenarios.
Experiment 3 was therefore carried out to gather the underlying semantic descriptors used by drivers during driving scenarios. To test the reliability of the descriptive pairs of the resulting DES rating scale, two evaluations of the assessment criteria were carried out: in real-road scenarios (Exp. 4) and in a laboratory setting (Exp. 5). The findings of this thesis suggest that the scale dimensions found in the field study captured the driver's semantic experience of automotive steering wheel vibration character more accurately than the dimensionality derived from the laboratory experiment. The results suggest four main vibrotactile semantic descriptors for assessing the human emotional response to automotive steering wheel vibration: pleasant, smooth, sharp, and powerful. The final proposed DES scale could help automotive research and industry determine and customise aspects of the automobile according to drivers' preferences of felt experience.
Human factors considerations for ultrasound induced mid-air haptic feedback
The engineering design process can be complex and often involves reiteration of design activities in order to improve outcomes. Traditionally, the design process consists of many physical elements, for example clay/foam modelling and, more recently, Additive Manufacturing (AM), with an iterative cycle of user testing of these physical prototypes. The time associated with creating physical prototypes can lengthen the time it takes to develop a product, and thus comes at a burdensome financial and labour cost. Due to these constraints of the conventional design process, more research is being conducted into applications of Virtual Reality (VR) to complement stages of the design process that would otherwise take a significant amount of time and money. VR enables users to create 3D virtual designs and prototypes for evaluation, thus facilitating the rapid correction of design and usability issues. However, VR is not without its pitfalls: for example, it often only facilitates an audio-visual simulation, thus hindering evaluation of the tactile element of design, which is critical to the success of many products.
This issue already has a wide body of research associated with it, which explores applications of haptic (tactile) feedback to VR to create a more realistic and accurate virtual experience. However, current haptic technologies can be expensive, cumbersome, hard to integrate with existing design tools, and have limited sensorial output (for example, vibrotactile feedback). Ultrasound Haptic Feedback (UsHF) appears to be a promising technology that offers affordable, unencumbered, integrable and versatile use. The technology achieves this by using ultrasound to create mid-air haptic feedback which users can feel without being attached to a device. However, due to the novel nature of the technology, there is little to no literature dedicated to investigating how users perceive and interpret UsHF stimuli, and how their perception affects the user experience.
The research presented in this thesis concerns the human factors of UsHF for engineering design applications. The PhD was borne out of interest from Ultraleap (previously Ultrahaptics), an SME technology developer, in how their mid-air haptic feedback device could be used within the field of engineering. Six studies (five experimental and one qualitative) were conducted in order to explore the human factors of UsHF, with a view to understanding its viability for use in engineering design. This was achieved by exploring the tactile ability of users in mid-air object size discrimination, absolute tactile thresholds, perception of intensity differences, and normalisation of UsHF intensity. These measures were also tested against individual differences in age, gender, and fingertip/hand size during the early stages, with later stages focusing on the same measures when UsHF was compared to 2D multimodal and physical environments.
The findings demonstrated no evidence of individual differences in UsHF tactile acuity and perception of UsHF stimuli. However, the results did highlight clear limitations in object size discrimination and absolute tactile thresholds. Interestingly, the results also demonstrated psychophysical variation in the perception of UsHF intensity differences, with intensity differences having a significant effect on how object size is perceived. Comparisons between multimodal UsHF and physical size discrimination were also conducted and found size discrimination accuracy of physical objects to be better than visuo-haptic (UsHF) size discrimination. Qualitative studies revealed an optimistic attitude towards VR for engineering design applications, particularly within the design, review, and prototyping stages, with many suggesting the addition of haptic feedback could be beneficial to the process.
This thesis offers a novel contribution to the field of human factors for mid-air haptics, and in particular for the use of this technology as part of the engineering design process. The results indicate that UsHF in its current state could not offer a replacement for all physical prototypes within the design process; however, UsHF may still have a place in the virtual design process where haptic feedback is required but is less reliant on the accurate portrayal of virtual objects, for example during early-stage evaluations supplemented by later physical prototypes, simply to indicate contact with virtual objects, or when sharing designs with stakeholders and multidisciplinary teams.