3,562 research outputs found

    Integrating virtual reality and augmented reality in a collaborative user interface

    Applications that adopt a collaborative system allow multiple users to interact with one another in the same virtual space, in either Virtual Reality (VR) or Augmented Reality (AR). This paper aims to integrate the VR and AR spaces in a Collaborative User Interface that enables users to cooperate across different types of interfaces in a single shared space. A gesture interaction technique is proposed as the interaction tool in both virtual spaces, as it provides more natural interaction with virtual objects. The integration of the VR and AR spaces provides cross-discipline shared data interchange through the network protocol of a client-server architecture.
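The abstract above describes a client-server data interchange that keeps VR and AR users synchronized in one shared space. The paper does not publish its protocol, so the message format and class below are purely illustrative: a minimal sketch of a hub that relays each object-transform update to every connected client except its sender.

```python
import json

# Hypothetical message format for the shared VR/AR space: each client
# publishes object transforms; the server rebroadcasts them to the others.
def encode_update(client_id, obj_id, position, rotation):
    """Serialize one object-transform update as a JSON message."""
    return json.dumps({
        "client": client_id,   # which user moved the object
        "object": obj_id,      # shared object identifier
        "pos": list(position), # x, y, z in the shared space
        "rot": list(rotation), # orientation as quaternion x, y, z, w
    })

class SharedSpaceServer:
    """In-process stand-in for the client-server hub: relays each
    update to every connected client except its sender."""
    def __init__(self):
        self.clients = {}  # client_id -> inbox (list of decoded updates)

    def connect(self, client_id):
        self.clients[client_id] = []

    def publish(self, message):
        update = json.loads(message)
        for cid, inbox in self.clients.items():
            if cid != update["client"]:
                inbox.append(update)

# A VR user moves a cube; the AR user's client receives the update.
server = SharedSpaceServer()
server.connect("vr_user")
server.connect("ar_user")
server.publish(encode_update("vr_user", "cube_1", (0.5, 1.0, -2.0), (0, 0, 0, 1)))
```

In a real deployment the inboxes would be network sockets rather than lists, but the relay logic, one authoritative server fanning updates out to all other clients, is the same.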

    Context-Aware Mobile Augmented Reality Visualization in Construction Engineering Education

    Recent studies suggest that the number of students pursuing science, technology, engineering, and mathematics (STEM) degrees has been generally decreasing. An extensive body of research cites the lack of motivation and engagement in the learning process as a major underlying reason for this decline. It has been argued that, if properly implemented, instructional technology can enhance student engagement and the quality of learning. Therefore, the main goal of this research is to implement and assess the effectiveness of augmented reality (AR)-based pedagogical tools on student learning. For this purpose, two sets of experiments were designed and implemented in two different construction and civil engineering undergraduate-level courses at the University of Central Florida (UCF). The first experiment was designed to systematically assess the effectiveness of a context-aware mobile AR tool (CAM-ART) in a real classroom-scale environment. This tool was used to enhance traditional lecture-based instruction and information delivery by augmenting the contents of an ordinary textbook with computer-generated three-dimensional (3D) objects and other virtual multimedia (e.g. sound, video, graphs). The experiment was conducted on two separate control and test groups, and pre- and post-performance data as well as student perceptions of using CAM-ART were collected through several feedback questionnaires. In the second experiment, a building design and assembly task competition was designed and conducted using a mobile AR platform. The pedagogical value of mobile AR-based instruction and information delivery to student learning in a large-scale classroom setting was also assessed and investigated. As in the first experiment, students were divided into control and test groups. Students' performance data as well as their feedback, suggestions, and workload were systematically collected and analyzed.
Data analysis showed that the mobile AR framework had a measurable and positive impact on students' learning. In particular, students in the test group (who used the AR tool) performed slightly better with respect to certain measures and spent more time on collaboration, communication, and exchanging ideas in both experiments. Overall, students ranked the effectiveness of the AR tool very high and stated that it has good potential to reform traditional teaching methods.

    Investigating Real-time Touchless Hand Interaction and Machine Learning Agents in Immersive Learning Environments

    The recent surge in the adoption of new technologies and innovations in connectivity, interaction technology, and artificial realities can fundamentally change the digital world. eXtended Reality (XR), with its potential to bridge virtual and real environments, creates new possibilities to develop more engaging and productive learning experiences. Evidence is emerging that this sophisticated technology offers new ways to improve the learning process for better student interaction and engagement. Recently, immersive technology has garnered much attention as an interactive technology that facilitates direct interaction with virtual objects in the real world. Furthermore, these virtual objects can be surrogates for real-world teaching resources, allowing for virtual labs. Thus, XR could enable learning experiences that would not be possible in impoverished educational systems worldwide. Interestingly, concepts such as virtual hand interaction and techniques such as machine learning are still not widely investigated in immersive learning. Hand interaction technologies in virtual environments can support the kinesthetic learning pedagogical approach, and the need for their touchless interaction has increased exceptionally in the post-COVID world. By implementing and evaluating real-time hand interaction technology for kinesthetic learning and machine learning agents for self-guided learning, this research addresses these underutilized technologies to demonstrate the efficiency of immersive learning. This thesis explores different hand-tracking APIs and devices to integrate real-time hand interaction techniques. These hand interaction techniques and integrated machine learning agents using reinforcement learning are evaluated with different display devices to test compatibility. The proposed approach aims to provide self-guided, more productive, and interactive learning experiences.
Further, this research investigates ethics, privacy, and security issues in XR and covers the future of immersive learning in the Metaverse.
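The thesis above builds on real-time hand tracking for touchless interaction, but its abstract does not specify the recognition logic. As a hedged illustration of how such touchless selection commonly works with hand-tracking APIs that report 3-D fingertip positions per frame, the sketch below fires a "pinch" select when thumb and index fingertips come close enough; the function names and threshold are assumptions, not the thesis's implementation.

```python
import math

def fingertip_distance(a, b):
    """Euclidean distance between two 3-D fingertip positions (metres)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def detect_pinch(thumb_tip, index_tip, threshold=0.03):
    """Touchless 'pinch' select: true when thumb and index fingertips
    are within `threshold` metres of each other (3 cm is illustrative)."""
    return fingertip_distance(thumb_tip, index_tip) < threshold

# One tracked frame each: fingertips 1 cm apart vs. 10 cm apart.
pinch = detect_pinch((0.10, 0.20, 0.30), (0.10, 0.21, 0.30))
no_pinch = detect_pinch((0.10, 0.20, 0.30), (0.10, 0.30, 0.30))
```

Real systems typically add hysteresis (separate enter/exit thresholds) and per-frame smoothing so tracking jitter does not toggle the gesture, but the distance test above is the core of the technique.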

    Evaluating the development of wearable devices, personal data assistants and the use of other mobile devices in further and higher education institutions

    This report presents a technical evaluation and case studies of the use of wearable and mobile computing devices in further and higher education. The first section provides a technical evaluation of the current state of the art in wearable and mobile technologies and reviews several innovative wearable products that have been developed in recent years. The second section examines three scenarios for further and higher education where wearable and mobile devices are currently being used: (i) the delivery of lectures over mobile devices, (ii) the augmentation of the physical campus with a virtual and mobile component, and (iii) the use of PDAs and mobile devices in field studies. The first scenario explores the use of web lectures, including an evaluation of IBM's Web Lecture Services and 3Com's learning assistant. The second scenario explores models for a campus without walls, evaluating the Handsprings to Learning projects at East Carolina University and ActiveCampus at the University of California, San Diego. The third scenario explores the use of wearable and mobile devices for field trips, examining the San Francisco Exploratorium's tool for capturing museum visits and the Cybertracker field computer. The third section of the report explores the uses and purposes of wearable and mobile devices in tertiary education, identifying key trends and issues to be considered when piloting the use of these devices in educational contexts.

    Spatial Interaction for Immersive Mixed-Reality Visualizations

    Growing amounts of data, both in personal and professional settings, have caused an increased interest in data visualization and visual analytics. Especially for inherently three-dimensional data, immersive technologies such as virtual and augmented reality and advanced, natural interaction techniques have been shown to facilitate data analysis. Furthermore, in such use cases, the physical environment often plays an important role, both by directly influencing the data and by serving as context for the analysis. Therefore, there has been a trend to bring data visualization into new, immersive environments and to make use of the physical surroundings, leading to a surge in mixed-reality visualization research. One of the resulting challenges, however, is the design of user interaction for these often complex systems. In my thesis, I address this challenge by investigating interaction for immersive mixed-reality visualizations regarding three core research questions: 1) What are promising types of immersive mixed-reality visualizations, and how can advanced interaction concepts be applied to them? 2) How does spatial interaction benefit these visualizations and how should such interactions be designed? 3) How can spatial interaction in these immersive environments be analyzed and evaluated? To address the first question, I examine how various visualizations such as 3D node-link diagrams and volume visualizations can be adapted for immersive mixed-reality settings and how they stand to benefit from advanced interaction concepts. For the second question, I study how spatial interaction in particular can help to explore data in mixed reality. There, I look into spatial device interaction in comparison to touch input, the use of additional mobile devices as input controllers, and the potential of transparent interaction panels. 
Finally, to address the third question, I present my research on how user interaction in immersive mixed-reality environments can be analyzed directly in the original, real-world locations, and how this can provide new insights. Overall, with my research, I contribute interaction and visualization concepts, software prototypes, and findings from several user studies on how spatial interaction techniques can support the exploration of immersive mixed-reality visualizations.
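The abstract above compares spatial device input with touch for exploring mixed-reality visualizations. A common building block for such spatial input, pointing a tracked device at a visualization, is casting a ray from the device pose and intersecting it with the data's plane. The sketch below is an illustrative geometry helper under that assumption, not the thesis's actual implementation.

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3-D point where a device ray hits a plane, or None if
    the ray is parallel to the plane or the plane lies behind the device."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray runs parallel to the plane
    # Solve origin + t * direction on the plane for the ray parameter t.
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:
        return None  # intersection is behind the device
    return tuple(o + t * d for o, d in zip(origin, direction))

# Tracked device at the origin pointing along -z toward a data plane at z = -2.
hit = ray_plane_intersection((0, 0, 0), (0, 0, -1), (0, 0, -2), (0, 0, 1))
```

Mapping the resulting hit point into the visualization's data coordinates then yields the selected data item, which is the step where spatial pointing and touch input can be compared on equal terms.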