
    Development of augmented reality serious games with a vibrotactile feedback jacket

    Background: In the past few years, augmented reality (AR) has rapidly advanced and has been applied in many fields. One successful class of AR applications is immersive and interactive serious games, which can be used for education and learning. Methods: In this project, a prototype of an AR serious game is developed and demonstrated. Players use a head-mounted device and a vibrotactile feedback jacket to explore and interact with the AR serious game. Fourteen vibration actuators are embedded in the vibrotactile feedback jacket to generate an immersive AR experience. These vibration actuators are triggered in accordance with the designed game scripts, and various vibration patterns and intensity levels are synthesized in different game scenes. This article presents the entire software development of the AR serious game, including game scripts, game scenes with AR effects, signal processing flow, behavior design, and communication configuration. Graphics computations are processed on the system's graphics processing unit. Results/Conclusions: The performance of the AR serious game prototype is evaluated and analyzed. The computation loads and resource utilization of normal game scenes and computation-heavy scenes are compared. With 14 vibration actuators placed at different body positions, the vibrotactile feedback jacket can generate various vibration patterns and intensity levels, providing different real-world feedback. The prototype of this AR serious game can be valuable for building large-scale AR or virtual reality educational and entertainment games. Possible future improvements of the proposed prototype are also discussed in this article.
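
    The abstract does not include the jacket's control protocol, so the sketch below is only a hypothetical illustration of the kind of mapping it describes: game-script events drive per-actuator vibration patterns and intensity levels across the 14 actuators. The packet format, event names, and `send` callback are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: game events -> vibration patterns on a 14-actuator
# jacket. Packet format and event names are assumptions, not the paper's API.
from dataclasses import dataclass
from typing import List

NUM_ACTUATORS = 14  # number of actuators described in the abstract

@dataclass
class VibrationFrame:
    """One step of a pattern: an intensity (0-255) for each actuator."""
    intensities: List[int]
    duration_ms: int

# Hypothetical game-script entries: each event name maps to a short pattern.
PATTERNS = {
    "impact_front": [
        VibrationFrame([255 if i < 4 else 0 for i in range(NUM_ACTUATORS)], 120),
        VibrationFrame([80 if i < 4 else 0 for i in range(NUM_ACTUATORS)], 200),
    ],
    "heartbeat": [
        VibrationFrame([120] * NUM_ACTUATORS, 100),
        VibrationFrame([0] * NUM_ACTUATORS, 400),
    ],
}

def encode_frame(frame: VibrationFrame) -> bytes:
    """Pack a frame into an assumed byte format:
    [0xAA, duration_hi, duration_lo, 14 intensity bytes]."""
    header = bytes([0xAA, frame.duration_ms >> 8, frame.duration_ms & 0xFF])
    return header + bytes(frame.intensities)

def play_event(event: str, send) -> None:
    """Send every frame of an event's pattern to the jacket via `send`."""
    for frame in PATTERNS[event]:
        send(encode_frame(frame))

if __name__ == "__main__":
    # Stand-in for a serial/Bluetooth write: print the packets instead.
    play_event("impact_front", send=lambda packet: print(packet.hex()))
```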

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
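
    As a rough illustration of the parameters listed above, the following sketch shows one possible way to represent a Tacton as a structured message built from frequency, amplitude, duration, rhythm and body location. The class names and example Tactons are assumptions, not an API from the paper.

```python
# Illustrative sketch only: a Tacton modelled as a rhythm of pulses at a
# body location. Names and example values are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class TactilePulse:
    frequency_hz: float  # carrier frequency of the pulse
    amplitude: float     # relative intensity, 0.0-1.0
    duration_ms: int     # length of the pulse

@dataclass
class Tacton:
    """A structured, abstract tactile message: a rhythm of pulses
    delivered at a particular body location."""
    name: str
    location: str               # e.g. "left wrist"
    rhythm: List[TactilePulse]  # ordered pulses; gaps are implied between them

# Two example Tactons distinguished by rhythm and amplitude rather than
# by any visual cue (values are illustrative).
new_message = Tacton("new_message", "left wrist",
                     [TactilePulse(250, 0.5, 100), TactilePulse(250, 0.5, 100)])
urgent = Tacton("urgent", "left wrist",
                [TactilePulse(250, 1.0, 300)] * 3)
```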

    Relative vibrotactile spatial acuity of the torso

    While tactile acuity for pressure has been extensively investigated, far less is known about acuity for vibrotactile stimulation. Vibrotactile acuity is important, however, because such stimulation is used in many applications, including sensory substitution devices. We tested discrimination of vibrotactile stimulation from eccentric rotating mass motors with in-plane vibration. In three experiments, we tested gradually decreasing center-to-center (c/c) distances, from 30 mm (experiment 1) to 13 mm (experiment 3). Observers judged whether a second vibrating stimulator (‘tactor’) was to the left of, to the right of, or in the same place as a first one, whose onset preceded the second by 250 ms (with a 50-ms inter-stimulus interval). The results show that while accuracy tends to decrease as the tactors move closer together, discrimination accuracy remains well above chance at the smallest distance, placing the threshold for vibrotactile stimulation below 13 mm, lower than recent estimates. The results cast new light on vibrotactile sensitivity and can also inform the design of devices that convey information through vibrotactile stimulation.
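
    The following minimal sketch illustrates the trial structure described above (the second tactor judged as left of, right of, or in the same place as the first) and how per-distance accuracy can be compared against chance. The timing constants come from the abstract; the simulated observer and all function names are illustrative assumptions, not the study's code or data.

```python
# Sketch of the discrimination task's structure and accuracy summary.
# The "observer" below is simulated purely for illustration.
import random
from collections import defaultdict

ONSET_ASYNCHRONY_MS = 250  # first tactor leads the second by 250 ms (abstract)
ISI_MS = 50                # 50-ms gap between the two stimulations (abstract)
RESPONSES = ("left", "right", "same")
CHANCE_LEVEL = 1 / len(RESPONSES)

def run_trial(cc_distance_mm, true_offset, respond):
    """One trial: the second tactor is 'left', 'right', or 'same' relative to
    the first; `respond` supplies the observer's judgement."""
    return respond(cc_distance_mm, true_offset) == true_offset

def summarize(trials):
    """Accuracy per centre-to-centre distance."""
    hits = defaultdict(list)
    for distance_mm, correct in trials:
        hits[distance_mm].append(correct)
    return {d: sum(v) / len(v) for d, v in sorted(hits.items())}

if __name__ == "__main__":
    # Simulated observer (illustrative only) that becomes noisier as the
    # tactors move closer together.
    def noisy_observer(distance_mm, truth):
        p_correct = min(0.95, 0.4 + distance_mm / 60)
        return truth if random.random() < p_correct else random.choice(RESPONSES)

    trials = [(d, run_trial(d, random.choice(RESPONSES), noisy_observer))
              for d in (30, 20, 13) for _ in range(200)]
    for d, acc in summarize(trials).items():
        print(f"{d} mm c/c: accuracy {acc:.2f} (chance = {CHANCE_LEVEL:.2f})")
```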

    Designing smart garments for rehabilitation


    Seven Years after the Manifesto: Literature Review and Research Directions for Technologies in Animal Computer Interaction

    As technologies diversify and become embedded in everyday lives, both the technologies we expose to animals and the new technologies being developed for animals within the field of Animal Computer Interaction (ACI) are increasing. As we approach seven years since the ACI manifesto, which grounded the field within Human Computer Interaction and Computer Science, this thematic literature review looks at the technologies developed for (non-human) animals. Technologies analysed include tangible and physical, haptic and wearable, olfactory, screen technologies and tracking systems. The discussion explores what exactly ACI is while questioning what it means to be an animal, considering the impact of, and the loop between, machine and animal interactivity. The findings of this review are expected to form a first grounding foundation of ACI technologies, informing future research in animal computing as well as suggesting future areas for exploration.

    A transdisciplinary collaborative journey leading to sensorial clothing

    Recent science funding initiatives have enabled participants from a diverse array of disciplines to engage in common spaces for developing solutions for new wearables. These initiatives include collaborations between the arts and sciences, fields which have traditionally contributed very different forms of knowledge, methodology, and results. However, many such collaborations turn out to be science communication and dissemination activities that make no concrete contribution to technological innovation. Magic Lining, a transdisciplinary collaborative project involving artistic and scientific partners working in the fields of e-textile design, cognitive neuroscience and human-computer interaction, creates a shared experiential knowledge space. This article focuses on the research question of how a transdisciplinary collaborative design process involving material explorations, prototyping, first-person-perspective and user studies can lead to the creation of a garment that invites various perceptual and emotional responses in its wearer. The article reflects on the design journey, highlighting the transdisciplinary team's research-through-design experience and shared language for knowledge exchange. This process has revealed new research paths for an emerging field of 'sensorial clothing', combining the various team members' fields of expertise and resulting in a wearable prototype.

    This work was partially supported by the VERTIGO project as part of the STARTS program of the European Commission, based on technological elements from the project Magic Shoes (grant PSI2016-79004-R, Ministerio de Economía, Industria y Competitividad of Spain, AEI/FEDER). The work was also supported by the project Magic outFIT, funded by the Spanish Agencia Estatal de Investigación (PID2019-105579RB-I00/AEI/10.13039/501100011033). Aleksander Väljamäe's work was supported by the Estonian Research Council grant PUT1518, and Ana Tajadura-Jiménez's work was supported by the RYC-2014-15421 grant, Ministerio de Economía, Industria y Competitividad of Spain.

    Designing for Mixed Reality Urban Exploration

    This paper introduces a design framework for mixed reality urban exploration (MRUE), based on a concrete implementation in a historical city. The framework integrates different modalities, such as virtual reality (VR), augmented reality (AR), and haptics-audio interfaces, as well as advanced features such as personalized recommendations, social exploration, and itinerary management. It makes it possible to address a number of concerns regarding information overload, safety, and quality of the experience that are not sufficiently tackled in traditional, non-integrated approaches. This study presents an integrated mobile platform built on top of this framework and reflects on the lessons learned.
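
    As a purely illustrative sketch of the kind of separation such a framework implies, the code below keeps a shared service (an itinerary of points of interest) independent from the presentation modality (AR or haptics-audio). The class and method names are assumptions, not the authors' platform.

```python
# Illustrative only: shared exploration services decoupled from modalities.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List

@dataclass
class PointOfInterest:
    name: str
    lat: float
    lon: float

class Modality(ABC):
    """One way of presenting a point of interest (VR, AR, or haptics-audio)."""
    @abstractmethod
    def present(self, poi: PointOfInterest) -> None: ...

class ARModality(Modality):
    def present(self, poi: PointOfInterest) -> None:
        print(f"[AR] overlay anchored at {poi.name}")

class HapticsAudioModality(Modality):
    def present(self, poi: PointOfInterest) -> None:
        print(f"[haptics/audio] vibration cue and narration for {poi.name}")

@dataclass
class Itinerary:
    """Shared service: an ordered list of stops, independent of modality."""
    stops: List[PointOfInterest] = field(default_factory=list)

def explore(itinerary: Itinerary, modality: Modality) -> None:
    # The same itinerary (and any recommendation or routing logic behind it)
    # can be rendered through whichever modality suits the current context.
    for poi in itinerary.stops:
        modality.present(poi)

if __name__ == "__main__":
    tour = Itinerary([PointOfInterest("Old town square", 0.0, 0.0)])  # placeholder data
    explore(tour, HapticsAudioModality())
```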

    Designing Tactile Interfaces for Abstract Interpersonal Communication, Pedestrian Navigation and Motorcyclists Navigation

    The tactile medium of communication is appropriate for displaying information in situations where the auditory and visual channels are saturated. There are situations where a subject's ability to receive information through either of these channels is severely restricted by the environment they are in or by any physical impairments the subject may have. In this project, we have focused on two groups of users who need sustained visual and auditory focus in their task: Soldiers on the battlefield and motorcyclists. Soldiers on the battlefield use their visual and auditory capabilities to maintain awareness of their environment and guard themselves from enemy assault. One of the major challenges to coordination in a hazardous environment is maintaining communication between team members while mitigating cognitive load. Compromise in communication between team members may result in mistakes that adversely affect the outcome of a mission. We have built two vibrotactile displays, Tactor I and Tactor II, each with nine actuators arranged in a three-by-three matrix with differing contact areas, which can represent a total of 511 shapes. We used two dimensions of the tactile medium, shapes and waveforms, to represent verb phrases, and evaluated users' ability to perceive verb phrases from the tactile code. We evaluated the effectiveness of communicating verb phrases while users were performing two tasks simultaneously. The results showed that performing an additional visual task did not affect the accuracy or the time taken to perceive tactile codes.

    Another challenge in coordinating Soldiers on a battlefield is navigating them to their respective assembly areas. We have developed HaptiGo, a lightweight haptic vest that provides pedestrians with both navigational intelligence and obstacle detection capabilities. HaptiGo consists of optimally placed vibrotactile sensors that utilize natural, small-form-factor interaction cues, thus emulating the sensation of being passively guided towards the intended direction. We evaluated HaptiGo and found that it was able to successfully navigate users, with timely alerts of incoming obstacles, without increasing cognitive load, thereby increasing their environmental awareness. Additionally, we show that users are able to respond to directional information without training.

    The needs of motorcyclists are different from those of Soldiers. Motorcyclists' need to maintain visual and auditory situational awareness at all times is crucial since they are highly exposed on the road. Route guidance systems, such as the Garmin, have been well tested for drivers of automobiles but remain much less safe for use by motorcyclists. Audio/visual routing systems decrease motorcyclists' situational awareness and vehicle control, and thus increase the chances of an accident. To enable motorcyclists to take advantage of route guidance while maintaining situational awareness, we created HaptiMoto, a wearable haptic route guidance system. HaptiMoto uses tactile signals to encode the distance and direction of approaching turns, thus avoiding interference with audio/visual awareness. Evaluations show that HaptiMoto is intuitive for motorcyclists and a safer alternative to existing solutions.
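
    The 511 shapes follow directly from the three-by-three layout: each shape is a non-empty subset of the nine actuators, giving 2^9 - 1 = 511 possibilities. The sketch below illustrates that count and a hypothetical pairing of shapes with waveforms to denote verb phrases; the mapping and names are illustrative, not the thesis's implementation.

```python
# Illustrative sketch: shapes on a 3x3 tactor grid as 9-bit masks, paired
# with waveforms to form a hypothetical tactile code for verb phrases.
ROWS, COLS = 3, 3
ACTUATORS = [(r, c) for r in range(ROWS) for c in range(COLS)]

def shape_count() -> int:
    """Number of distinct non-empty actuator subsets on the 3x3 grid."""
    return 2 ** len(ACTUATORS) - 1

def shape_to_mask(active_cells) -> int:
    """Encode a shape (a collection of (row, col) cells) as a 9-bit mask."""
    mask = 0
    for r, c in active_cells:
        mask |= 1 << (r * COLS + c)
    return mask

# Hypothetical tactile code: a verb phrase is a (shape, waveform) pair.
TACTILE_CODE = {
    "move forward": (shape_to_mask([(0, 1), (1, 1), (2, 1)]), "ramp_up"),
    "hold position": (shape_to_mask([(1, 1)]), "steady"),
}

if __name__ == "__main__":
    assert shape_count() == 511  # 2**9 - 1 non-empty shapes
    for phrase, (mask, waveform) in TACTILE_CODE.items():
        print(f"{phrase!r}: shape mask {mask:09b}, waveform {waveform}")
```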
