668 research outputs found

    Vibrotactile and Force Collaboration within 3D Virtual Environments

    In a three-dimensional (3D) virtual environment (VE), proper collaboration between vibrotactile and force cues - two cues of the haptic modality - is important for facilitating users' task performance. Many studies report that collaboration between multi-sensory cues follows maximum likelihood estimation (MLE). However, an existing work found that MLE yields mean and amplitude mismatches when interpreting the collaboration between vibrotactile and force cues. We therefore proposed mean-shifted MLE and conducted a human study to investigate these mismatches. For the study, we created a VE that replicated the visual scene, the 3D interactive task, and the cues of the existing work. Our participants were biased toward relying on the vibrotactile cue for their tasks, departing from the unbiased reliance on both cues in the existing work. Assessments of task completion time and task accuracy validated the replication. Based on task accuracy, we found that MLE explained the cue collaboration to a certain degree, agreeing with the existing work. Mean-shifted MLE remedied the mean mismatch but retained the amplitude mismatch. Further examination revealed that the collaboration between the two cues may not be entirely additive. This insight supports proper modeling of the collaboration between vibrotactile and force cues to aid interactive tasks in VEs.
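The standard MLE account of cue combination referenced in this abstract weights each cue's estimate by its reliability (inverse variance). A minimal sketch of that textbook model follows; the variable names and example variances are illustrative assumptions, not values from the study:

```python
import numpy as np

def mle_combine(mu_vib, var_vib, mu_force, var_force):
    """Standard MLE cue combination: each cue is weighted by its
    inverse variance; the combined variance never exceeds either cue's."""
    w_vib = (1 / var_vib) / (1 / var_vib + 1 / var_force)
    w_force = 1.0 - w_vib
    mu = w_vib * mu_vib + w_force * mu_force
    var = 1.0 / (1 / var_vib + 1 / var_force)
    return mu, var

# A more reliable (lower-variance) vibrotactile cue dominates the combined
# estimate, mirroring the biased reliance reported in the abstract.
mu, var = mle_combine(mu_vib=1.0, var_vib=0.1, mu_force=2.0, var_force=0.4)
```

The mean-shifted variant proposed in the paper would add a correction term to the combined mean; its exact form is not given in the abstract, so it is omitted here.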

    Two-handed navigation in a haptic virtual environment

    This paper describes the initial results of a study examining a two-handed interaction paradigm for tactile navigation by blind and visually impaired users. Participants were set the task of navigating a virtual maze environment using their dominant hand to move the cursor, while receiving contextual information in the form of tactile cues presented to their non-dominant hand. Results suggest that most participants were comfortable with the two-handed style of interaction even with little training. Two sets of contextual cues were examined, with information presented either through static patterns or through tactile flow of raised pins. The initial results suggest that while both sets of cues were usable, participants performed significantly better and faster with the static cues.

    Touch- and Walkable Virtual Reality to Support Blind and Visually Impaired People's Building Exploration in the Context of Orientation and Mobility

    Access to digital content and information is becoming increasingly important for successful participation in today's increasingly digitized civil society. Such information is mostly presented visually, which restricts access for blind and visually impaired people. The most fundamental barrier is often basic orientation and mobility (and consequently, social mobility), including gaining knowledge about unknown buildings before visiting them. To bridge such barriers, technological aids should be developed and deployed. A trade-off is needed between technologically low-threshold, accessible, and disseminable aids on the one hand and interactive-adaptive but complex systems on the other. Adapting virtual reality (VR) technology spans a wide range of development and decision options. The main benefits of VR technology are increased interactivity, updatability, and the possibility to explore virtual spaces and models as proxies of real ones without real-world hazards or dependence on the limited availability of sighted assistants. However, virtual objects and environments have no physicality. This thesis therefore investigates which VR interaction forms are reasonable (i.e., offer adequate dissemination potential) for making virtual representations of real buildings touchable or walkable in the context of orientation and mobility. Although there are already developments and evaluations of VR technology that are disjoint in content and technology, empirical evidence is lacking. Additionally, this thesis provides a survey of the different interactions. After considering human physiology, assistive media (e.g., tactile maps), and technological characteristics, the current state of the art of VR is introduced, and its application for blind and visually impaired users, and the way to get there, is discussed by introducing a novel taxonomy. In addition to the interaction itself, characteristics of the user and the device, the application context, and user-centered development and evaluation serve as classifiers. The following chapters are justified and motivated by explorative approaches, i.e., 'small scale' (using so-called data gloves) and 'large scale' (using avatar-controlled VR locomotion). These chapters present empirical studies with blind and visually impaired users and give formative insight into how virtual objects within hands' reach can be grasped using haptic feedback and how different kinds of VR locomotion can be applied to explore virtual environments. From this, device-independent technological possibilities, as well as challenges for further improvement, are derived. On the basis of this knowledge, subsequent research can focus on aspects such as the specific design of interactive elements, temporally and spatially collaborative application scenarios, and the evaluation of an entire application workflow (i.e., scanning the real environment and exploring it virtually for training purposes, as well as designing the entire application in a long-term accessible manner).

    HapticHead - Augmenting Reality via Tactile Cues

    Information overload is increasingly becoming a challenge in today's world. Humans have only a limited amount of attention to allocate between sensory channels and tend to miss or misjudge critical sensory information when multiple activities are going on at the same time. For example, people may miss the sound of an approaching car when walking across the street while looking at their smartphones. Some sensory channels may also be impaired due to congenital or acquired conditions. Among sensory channels, touch is often experienced as obtrusive, especially when it occurs unexpectedly. Since tactile actuators can simulate touch, targeted tactile stimuli can provide users of virtual reality and augmented reality environments with important information for navigation, guidance, alerts, and notifications. This dissertation presents HapticHead, a tactile user interface around the head intended to relieve or replace a potentially impaired visual channel. It is a high-resolution, omnidirectional, vibrotactile display that presents general, 3D directional, and distance information through dynamic tactile patterns. The head is well suited for tactile feedback because it is sensitive to mechanical stimuli and provides a large spherical surface area that enables the display of precise 3D information and allows the user to intuitively rotate the head toward a stimulus based on natural mapping. This thesis presents basic research on tactile perception on the head and studies on various use cases of head-based tactile feedback. Several investigations and user studies were conducted on (a) the funneling illusion and localization accuracy of tactile stimuli around the head, (b) the ability of people to discriminate between different tactile patterns on the head, (c) approaches to designing tactile patterns for complex arrays of actuators, (d) increasing the immersion and presence level of virtual reality applications, and (e) assisting people with visual impairments in guidance and micro-navigation. In summary, tactile feedback around the head proved highly valuable as an additional information channel in various application scenarios. Most notable is the navigation of visually impaired individuals through a micro-navigation obstacle course, which is an order of magnitude more accurate than the previous state of the art, which used a tactile belt as the feedback modality. The ability of the HapticHead tactile user interface to safely navigate people with visual impairments around obstacles and on stairs, with a mean deviation from the optimal path of less than 6 cm, may ultimately improve the quality of life for many people with visual impairments.

    16th Sound and Music Computing Conference SMC 2019 (28–31 May 2019, Malaga, Spain)

    The 16th Sound and Music Computing Conference (SMC 2019) took place in Malaga, Spain, 28-31 May 2019, and was organized by the Application of Information and Communication Technologies Research group (ATIC) of the University of Malaga (UMA). The associated SMC 2019 Summer School took place 25-28 May 2019, and the First International Day of Women in Inclusive Engineering, Sound and Music Computing Research (WiSMC 2019) took place on 28 May 2019. The SMC 2019 topics of interest included a wide selection of topics related to acoustics, psychoacoustics, music, technology for music, audio analysis, musicology, sonification, music games, machine learning, serious games, immersive audio, sound synthesis, etc.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Towards EEG-Based Haptic Interaction within Virtual Environments

    Current virtual environments (VEs) enable users to perceive haptic stimuli that facilitate 3D interaction, but they lack brain-interface content. Using electroencephalography (EEG), we conducted a feasibility study exploring event-related potential (ERP) patterns in users' brain responses during haptic interaction within a VE. The interaction consisted of flying a virtual drone along a curved transmission line to detect defects under haptic stimuli (e.g., force increases and/or vibrotactile cues). We found variations in the peak amplitudes and latencies (as ERP patterns) of the responses at about 200 ms after stimulus onset. The largest negative peak occurred 200-400 ms after onset in all vibration-related blocks. Moreover, the peak amplitudes and latencies were differentiable among the vibration-related blocks. These findings imply that decoding brain responses during haptic interaction within VEs is feasible.
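The peak measures described above (amplitude and latency of the largest negative deflection in a post-stimulus window) can be extracted from a stimulus-locked average in a few lines. The following is an illustrative sketch, not the study's actual pipeline; the synthetic waveform and sampling rate are assumptions:

```python
import numpy as np

def negative_peak(erp, fs, t_start=0.2, t_end=0.4):
    """Find the most negative sample of a stimulus-locked ERP average
    within [t_start, t_end) seconds after onset.
    Returns (peak amplitude, latency in seconds)."""
    i0, i1 = int(t_start * fs), int(t_end * fs)
    window = erp[i0:i1]
    idx = int(np.argmin(window))          # index of the most negative sample
    return float(window[idx]), (i0 + idx) / fs

fs = 250  # assumed sampling rate in Hz
t = np.arange(0, 0.6, 1 / fs)
# Synthetic negativity peaking 300 ms after onset, standing in for real data.
erp = -np.exp(-((t - 0.3) ** 2) / 0.001)
amp, latency = negative_peak(erp, fs)
```

Comparing such (amplitude, latency) pairs across stimulus blocks is one straightforward way to test whether the ERP patterns are differentiable, as the abstract reports.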