
    Virtual reality for assembly methods prototyping: a review

    Assembly planning and evaluation is an important component of the product design process, in which details about how the parts of a new product will be put together are formalized. A well-designed assembly process should take into account factors such as optimum assembly time and sequence, tooling and fixture requirements, ergonomics, operator safety, and accessibility, among others. Existing computer-based tools to support virtual assembly either concentrate solely on representing the geometry of parts and fixtures and evaluating clearances and tolerances, or use simulated human mannequins to approximate human interaction in the assembly process. Virtual reality technology has the potential to support the integration of natural human motions into the computer-aided assembly planning environment (Ritchie et al. in Proc I MECH E Part B J Eng 213(5):461–474, 1999). This would allow evaluation of an assembler's ability to manipulate and assemble parts, and would reduce the time and cost of product design. This paper provides a review of the research in virtual assembly and categorizes the different approaches. Finally, critical requirements and directions for future research are presented.

    “You, Move There!”: Investigating the Impact of Feedback on Voice Control in Virtual Environments

    Current virtual environment (VE) input techniques often overlook speech as a useful control modality. Speech could improve interaction in multimodal VEs by enabling users to address objects, locations, and agents, yet research on how to design effective speech interaction for VEs is limited. This paper investigates the effect of agent feedback on speech-based VE experiences. In a lab study, users commanded agents to navigate a VE, receiving either auditory, visual, or behavioural feedback. Based on post-interaction semi-structured interviews, we find that the type of feedback given by agents is critical to user experience. Specifically, auditory mechanisms were preferred, allowing users to engage with other modalities seamlessly during interaction. Although command-like utterances were frequently used, this was perceived as contextually appropriate, ensuring users were understood. Many participants also found it difficult to discover speech-based functionality. Drawing on these findings, we discuss key challenges for designing speech input for VEs.

    An Overview of Self-Adaptive Technologies Within Virtual Reality Training

    This overview presents the current state of the art of self-adaptive technologies within virtual reality (VR) training. VR training and assessment is increasingly used in five key areas: medical training, industrial and commercial training, serious games, rehabilitation, and remote training such as Massive Open Online Courses (MOOCs). Adaptation can be applied to five core technologies of VR: haptic devices, stereo graphics, adaptive content, assessment, and autonomous agents. Automation of VR training can contribute to the automation of actual procedures, including remote and robot-assisted surgery, which reduces injury and improves the accuracy of the procedure. Automated haptic interaction can enable tele-presence and tactile interaction with virtual artefacts in either remote or simulated environments. Automation, machine learning, and data-driven features play an important role in providing trainee-specific, individually adaptive training content. Data from trainee assessment can form an input to autonomous systems for customised training and automated difficulty levels that match individual requirements. Self-adaptive technology has previously been developed within individual technologies of VR training. One conclusion of this research is that, although no such framework yet exists, an enhanced portable framework that combines automation of the core technologies into a reusable automation framework for VR training would be beneficial.
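The assessment-driven difficulty adaptation described in this abstract can be sketched as a simple feedback rule: recent trainee scores adjust the next session's difficulty level. The function below is purely illustrative (its name, thresholds, and the success-rate metric are assumptions, not any surveyed system's actual implementation).

```python
# Illustrative sketch of data-driven difficulty adaptation in VR training:
# trainee assessment scores feed a rule that adjusts the next session's
# difficulty level. Names and thresholds are hypothetical.

def next_difficulty(current_level: int, recent_scores: list[float],
                    target: float = 0.75, step: int = 1,
                    min_level: int = 1, max_level: int = 10) -> int:
    """Raise difficulty when the trainee consistently exceeds the target
    success rate; lower it when they fall well below it."""
    if not recent_scores:
        return current_level
    mean_score = sum(recent_scores) / len(recent_scores)
    if mean_score > target + 0.1:
        current_level += step      # trainee is under-challenged
    elif mean_score < target - 0.1:
        current_level -= step      # trainee is overwhelmed
    return max(min_level, min(max_level, current_level))

print(next_difficulty(5, [0.9, 0.95, 0.88]))  # → 6
```

A real adaptive trainer would likely weight multiple assessment dimensions (time, errors, ergonomics) rather than a single score, but the feedback-loop shape is the same.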

    Realistic Interaction with Virtual Objects within Arm's Reach

    The automotive industry requires realistic virtual reality applications more than other domains to increase the efficiency of product development. Currently, the visual quality of virtual environments resembles reality, but interaction within these environments is usually far from what is known in everyday life. Several realistic research approaches exist; however, they are still not all-encompassing enough to be usable in industrial processes. This thesis realizes lifelike direct multi-hand and multi-finger interaction with arbitrary objects, and proposes algorithmic and technical improvements that also approach lifelike usability. In addition, the thesis proposes methods to measure the effectiveness and usability of such interaction techniques, and discusses different types of grasping feedback that support the user during interaction. Realistic and reliable interaction is reached through the combination of robust grasping heuristics and plausible pseudophysical object reactions. The easy-to-compute grasping rules use the objects' surface normals and mimic human grasping behavior. The novel concept of Normal Proxies increases grasping stability and diminishes challenges induced by adverse normals. The intricate act of picking up thin and tiny objects remains challenging for some users. These cases are further supported by the consideration of finger pinches, which are measured with a specialized finger-tracking device. With regard to typical object constraints, realistic object motion is geometrically calculated as a plausible reaction to user input. The resulting direct finger-based interaction technique enables realistic and intuitive manipulation of arbitrary objects. The thesis proposes two methods to prove and compare effectiveness and usability. An expert review indicates that experienced users quickly familiarize themselves with the technique.
A quantitative and qualitative user study shows that direct finger-based interaction is preferred over indirect interaction in the context of functional car assessments. While controller-based interaction is more robust, direct finger-based interaction provides greater realism and becomes nearly as reliable when the pinch-sensitive mechanism is used. At present, the haptic channel is not used in industrial virtual reality applications; it is therefore free to convey grasping feedback, which improves the user's understanding of the grasping situation. This thesis realizes novel pressure-based tactile feedback at the fingertips. As alternatives, vibro-tactile feedback at the same location is realized, as well as visual feedback by coloring the grasp-involved finger segments. The feedback approaches are also compared within the user study, which reveals that grasping feedback is a requirement for judging grasp status, and that tactile feedback improves interaction independently of the display system used. The considerably stronger vibro-tactile feedback can quickly become annoying during interaction. The interaction improvements and hardware enhancements make it possible to interact with virtual objects in a realistic and reliable manner. By addressing realism and reliability, this thesis paves the way for the virtual evaluation of human-object interaction, which is necessary for a broader application of virtual environments in the automotive industry and other domains.
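The normal-based grasping heuristics summarized above can be illustrated with a minimal antipodal-contact test: a pair of finger contacts counts as a grasp when their surface normals point roughly toward each other. This sketch is an assumption for illustration only; the `is_grasp` function, its threshold, and the two-contact model are not the thesis' actual grasping rules or its Normal Proxies mechanism.

```python
import math

# Hypothetical sketch of a surface-normal grasp heuristic: two finger
# contacts form a grasp when their normals are nearly antipodal
# (pointing toward each other), as in a thumb-index pinch.

def is_grasp(n1, n2, max_angle_deg=30.0):
    """n1, n2: unit surface normals at the two contact points."""
    dot = sum(a * b for a, b in zip(n1, n2))
    # Opposing normals give a dot product near -1, i.e. a small
    # deviation angle from the perfect antipodal configuration.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, -dot))))
    return angle <= max_angle_deg

# Thumb and index pressing on opposite faces of a box:
print(is_grasp((1, 0, 0), (-1, 0, 0)))  # → True
# Two fingers on perpendicular faces do not oppose each other:
print(is_grasp((1, 0, 0), (0, 1, 0)))   # → False
```

A rule of this kind is cheap to evaluate per frame, which is presumably why the thesis describes its grasping rules as "easy-to-compute"; the quality of the mesh normals then becomes critical, motivating the Normal Proxies concept.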


    Combining physical constraints with geometric constraint-based modeling for virtual assembly

    The research presented in this dissertation aims to create a virtual assembly environment capable of simulating the constant and subtle interactions (hand-part, part-part) that occur during manual assembly, and providing appropriate feedback to the user in real time. A virtual assembly system called SHARP (System for Haptic Assembly and Realistic Prototyping) is created, which utilizes simulated physical constraints for part placement during assembly. The first approach taken in this research utilized Voxmap PointShell (VPS) software to implement collision detection and physics-based modeling in SHARP. A volumetric approach, in which complex CAD models were represented by numerous small cubic voxel elements, was used to obtain fast physics update rates (500–1000 Hz). A novel dual-handed haptic interface was developed and integrated into the system, allowing the user to simultaneously manipulate parts with both hands. However, the coarse model approximations used for collision detection and physics-based modeling only allowed assembly when the minimum clearance was at least ~8–10%. To address the low-clearance assembly problem, the second effort focused on importing accurate parametric CAD (B-Rep) models into SHARP. These accurate B-Rep representations are used for collision detection as well as for simulating physical contacts more accurately. A new hybrid approach is presented which combines the simulated physical constraints with geometric constraints that can be defined at runtime. Different case studies are used to identify the combination of methods (collision detection, physical constraints, geometric constraints) best capable of simulating intricate interactions and environment behavior during manual assembly. An innovative automatic constraint recognition algorithm is created and integrated into SHARP.
The feature-based approach utilized for the algorithm design facilitates faster identification of the potential geometric constraints that need to be defined. This approach results in optimized system performance while providing a more natural user experience for assembly.
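Automatic geometric constraint recognition of the kind described here can be sketched as a proximity test between mating features: for example, a concentric-constraint candidate is recognized when a pin's axis becomes nearly parallel and close to a hole's axis. The function below is a hedged illustration under assumed thresholds and a simplified axis-based feature representation; it is not SHARP's actual algorithm.

```python
import math

# Hypothetical sketch of feature-based constraint recognition: a
# concentric (pin-in-hole) constraint candidate is flagged when two
# cylindrical-feature axes are nearly parallel and nearly coincident.
# Thresholds and the feature representation are illustrative.

def concentric_candidate(axis_a, point_a, axis_b, point_b,
                         max_angle_deg=5.0, max_offset=0.002):
    """axis_*: unit direction vectors; point_*: points on each axis (metres)."""
    dot = abs(sum(x * y for x, y in zip(axis_a, axis_b)))
    angle = math.degrees(math.acos(min(1.0, dot)))   # angle between axes
    # Perpendicular distance from point_b to axis a:
    d = [p - q for p, q in zip(point_b, point_a)]
    along = sum(x * y for x, y in zip(d, axis_a))
    perp = math.sqrt(max(0.0, sum(x * x for x in d) - along * along))
    return angle <= max_angle_deg and perp <= max_offset

# Pin axis nearly parallel to the hole axis and 1 mm off-centre:
print(concentric_candidate((0, 0, 1), (0, 0, 0), (0, 0, 1), (0.001, 0, 0)))  # → True
```

Once a candidate is recognized, a system like the one described could activate the geometric constraint at runtime and let it guide the part into place alongside the simulated physical contacts.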

    Development of actuated Tangible User Interfaces: new interaction concepts and evaluation methods

    Riedenklau E. Development of actuated Tangible User Interfaces: new interaction concepts and evaluation methods. Bielefeld: Universität Bielefeld; 2016. Making information understandable and literally graspable is the main goal of tangible interaction research. By giving digital data physical representations (Tangible User Interface Objects, or TUIOs), they can be used and manipulated like everyday objects with the users' natural manipulation skills. Such physical interaction is basically uni-directional, directed from the user to the system, which limits the possible interaction patterns. In other words, the system has no means to actively support the physical interaction. Within the frame of tabletop tangible user interfaces, this problem was addressed by the introduction of actuated TUIOs, which are controllable by the system. Within the frame of this thesis, we present the development of our own actuated TUIOs and address multiple interaction concepts we identified as research gaps in the literature on actuated Tangible User Interfaces (TUIs). Gestural interaction is a natural means for humans to communicate non-verbally using their hands. TUIs should be able to support gestural interaction, since our hands are already heavily involved in the interaction; this has rarely been investigated in the literature. For a tangible social network client application, we investigate two methods for collecting user-defined gestures that our system should be able to interpret for triggering actions. Versatile systems often understand a wide palette of commands, and another approach for triggering actions is the use of menus. We explore the design space of menu metaphors used in TUIs and present our own actuated dial-based approach. Rich interaction modalities may support the understandability of the represented data and make interaction with it more appealing, but they also place high demands on real-time processing.
We highlight new research directions for integrated, feature-rich, and multi-modal interaction, such as graphical display, sound output, tactile feedback, our actuated menu, and automatically maintained relations between actuated TUIOs within a remote collaboration application. We also tackle the introduction of further sophisticated measures for the evaluation of TUIs, to provide further evidence for the theories on tangible interaction. We tested our enhanced measures in a comparative study. Since one of the key factors in effective manual interaction is speed, we benchmarked the human hand's manipulation speed and compared it with the capabilities of our own implementation of actuated TUIOs and of the systems described in the literature. After briefly discussing applications that lie beyond the scope of this thesis, we conclude with a collection of design guidelines gathered in the course of this work and integrate them, together with our findings, into a larger framework.

    A Haptic Feedback System for Lower Limb Amputees Based on Gait Event Detection

    Lower limb amputation has significant effects on a person's quality of life and ability to perform activities of daily living. Prescription of a prosthetic device post-amputation aims to help restore some degree of mobility; however, studies have shown evidence of low balance confidence and a higher risk of falling among the amputee community, especially those with above-knee amputation. While advanced prostheses offer better control, they often lack a form of feedback that gives the prosthetic user awareness of limb position while walking. This research presents the development and evaluation of a wearable skin-stretch haptic feedback system intended to deliver cues of two crucial gait events, namely Initial Contact (IC) and Toe-off (TO), to its wearer. The system comprises a haptic module that applies lateral skin-stretch to the upper leg or the trunk, driven by a gait event detection module based on an Inertial Measurement Unit (IMU) attached at the shank. The design and development iterations of the haptic module are presented, and the characterization of the feedback parameters is discussed. The validation of the gait event detection module is carried out, and finally the integration of the haptic feedback system is described. Experimental work with healthy subjects and an amputee indicated good perceptibility of the feedback during static and dynamic (walking) conditions, although a higher magnitude of stretch was required to perceive the feedback in the dynamic condition. User response times during dynamic activity showed that the haptic feedback system is suitable for delivering cues of IC and TO within the duration of the stance phase. In addition, feedback delivered in discernible patterns can be learned and adapted to by the subjects.
Finally, a case study was carried out with an above-knee amputee to assess the effects of the haptic feedback on spatio-temporal gait parameters and on the vertical ground reaction force during treadmill and overground walking. The research presented in this report introduces a novel design of a haptic feedback device. The outcome includes a well-controlled skin-stretch effect, and the work contributes to the field by investigating skin-stretch feedback for conveying discrete event information rather than direction information, as presented in other studies. In addition, it was found that a stretch magnitude as small as 3 mm could be perceived within a duration as short as 150 ms in the dynamic condition, making skin-stretch a suitable alternative to other widely investigated haptic modalities, such as vibration, for ambulatory feedback applications. With continuous training, the haptic feedback system could benefit lower limb amputees by creating awareness of limb placement during ambulation, potentially reducing visual dependency and increasing walking confidence.
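IMU-based detection of IC and TO, as used by the gait event detection module above, is commonly done with rule-based analysis of the shank's sagittal-plane angular velocity: a large positive mid-swing peak, with IC and TO appearing as local minima after and before it. The sketch below illustrates that general rule under assumed signal conventions and thresholds; it is not the thesis' actual detection algorithm.

```python
# Illustrative rule-based gait event detection from a shank-mounted IMU:
# locate the mid-swing peak of sagittal angular velocity, then take the
# minimum after it as initial contact (IC) and the minimum before it as
# toe-off (TO). Signal conventions and thresholds are hypothetical.

def detect_events(gyro_z, swing_peak_threshold=2.0):
    """gyro_z: samples of sagittal angular velocity (rad/s) for one stride.
    Returns (ic_index, to_index), or None if no swing peak is found."""
    peak = max(range(len(gyro_z)), key=lambda i: gyro_z[i])
    if gyro_z[peak] < swing_peak_threshold:
        return None                      # no swing phase in this window
    ic = min(range(peak, len(gyro_z)), key=lambda i: gyro_z[i])
    to = min(range(0, peak + 1), key=lambda i: gyro_z[i])
    return ic, to

# Synthetic stride: TO dip, swing peak, IC dip.
print(detect_events([-1, -3, 0, 4, 0, -2, -0.5]))  # → (5, 1)
```

In a wearable system the detected indices would be translated to timestamps that trigger the skin-stretch actuator, so detection latency directly bounds how early in the stance phase the cue can arrive.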

    A reconfigurable tactile display based on polymer MEMS technology

    This research focuses on the development of polymer microfabrication technologies for the realization of two major components of a pneumatic tactile display: a microactuator array and a complementary microvalve (control) array. The concept, fabrication, and characterization of a kinematically stabilized polymeric microbubble actuator (an "endoskeletal microbubble actuator") are presented. A systematic design and modeling procedure was carried out to generate an optimized geometry of the corrugated diaphragm that satisfies the membrane deflection, force, and stability requirements set forth by the tactile display goals. A refreshable Braille cell was developed as a tactile display prototype, based on a 2×3 endoskeletal microbubble array and an array of commercial valves. The prototype can provide both a static display (which meets the displacement and force requirements of a Braille display) and vibratory tactile sensations. The device was also designed to meet the criteria of lightness and compactness to permit portable operation, and the design is scalable with respect to the number of tactile actuators while remaining simple to fabricate. To further reduce the size and cost of the tactile display, a microvalve array can be integrated into the tactile display system to control the pneumatic fluid that actuates the microbubble actuators. A piezoelectrically driven and hydraulically amplified polymer microvalve has been designed, fabricated, and tested. An incompressible elastomer is used as a solid hydraulic medium to convert the small axial displacement of a piezoelectric actuator into a large valve head stroke while maintaining a large blocking force. The function of the microvalve as an on-off switch for a pneumatic microbubble tactile actuator was demonstrated.
To further reduce the cost of the microvalve, a laterally stacked multilayer PZT actuator was fabricated using diced PZT multilayers, high-aspect-ratio SU-8 photolithography, and molding of electrically conductive polymer composite electrodes. Ph.D. Committee Chair: Allen, Mark; Committee Members: Bucknall, David; Book, Wayne; Griffin, Anselm; Yao, Donggan.
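The hydraulic amplification principle behind the microvalve follows from volume conservation in the incompressible elastomer: the piezo's small stroke over a large area becomes a large stroke over the smaller valve-head area, at the cost of proportionally less force. The numbers below are illustrative assumptions, not dimensions from the dissertation.

```python
# Back-of-the-envelope sketch of hydraulic stroke amplification with an
# incompressible medium: A_piezo * dx_piezo = A_head * dx_head.
# All dimensions are hypothetical examples.

def amplified_stroke(piezo_stroke_um, piezo_area_mm2, head_area_mm2):
    """Valve-head stroke (um) from volume conservation in the elastomer."""
    return piezo_stroke_um * piezo_area_mm2 / head_area_mm2

# A 2 um piezo stroke over 25 mm^2 driving a 1 mm^2 valve head:
print(amplified_stroke(2.0, 25.0, 1.0))  # → 50.0 (um)
```

The same area ratio divides the available force, which is why the design must start from a piezo stack with a large blocking force.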

    Touch- and Walkable Virtual Reality to Support Blind and Visually Impaired People's Building Exploration in the Context of Orientation and Mobility

    Access to digital content and information is becoming increasingly important for successful participation in today's increasingly digitized civil society. Such information is mostly presented visually, which restricts access for blind and visually impaired people. The most fundamental barrier is often basic orientation and mobility (and consequently, social mobility), including gaining knowledge about unknown buildings before visiting them.
To bridge such barriers, technological aids should be developed and deployed. A trade-off is needed between technologically low-threshold, accessible, and disseminable aids and interactive-adaptive but complex systems. The adaptation of virtual reality (VR) technology spans a wide range of development and decision options. The main benefits of VR technology are increased interactivity, updatability, and the possibility to explore virtual spaces as proxies of real ones without real-world hazards or the limited availability of sighted assistants. However, virtual objects and environments have no physicality. Therefore, this thesis aims to research which VR interaction forms are reasonable (i.e., offer a reasonable dissemination potential) for making virtual representations of real buildings touchable or walkable in the context of orientation and mobility. Although there are already disjoint developments and evaluations of VR technology in terms of content and technique, there is a lack of empirical evidence. Additionally, this thesis provides a survey of the different interactions. Having considered human physiology, assistive media (e.g., tactile maps), and technological characteristics, the current state of the art of VR is introduced, and its application for blind and visually impaired users, and the way to get there, is discussed by introducing a novel taxonomy. In addition to the interaction itself, characteristics of the user and the device, the application context, and the user-centered development and evaluation are used as classifiers. The following chapters are thus justified and motivated by explorative approaches, i.e., at 'small scale' (using so-called data gloves) and at 'large scale' (using avatar-controlled VR locomotion).
The following chapters present empirical studies with blind and visually impaired users and give formative insight into how virtual objects within hands' reach can be grasped using haptic feedback, and how different kinds of VR locomotion implementations can be applied to explore virtual environments. From these, device-independent technological possibilities, as well as challenges for further improvement, are derived. On the basis of this knowledge, subsequent research can focus on aspects such as the specific design of interactive elements, temporally and spatially collaborative application scenarios, and the evaluation of an entire application workflow (i.e., scanning the real environment and exploring it virtually for training purposes, as well as designing the entire application in a long-term accessible manner).