366 research outputs found

    Effects of Augmented Reality on Student Achievement and Self-Efficacy in Vocational Education and Training

    This study aimed to test the impact of augmented reality (AR) use on student achievement and self-efficacy in vocational education and training. For this purpose, a marker-based AR application, called HardwareAR, was developed. HardwareAR provides information about the characteristics of hardware components, ports, and assembly. The research design was quasi-experimental with pre-test and post-test and included a control group. The study was conducted with 46 undergraduate students in a Computer Hardware course. A computer hardware course achievement test, a motherboard assembly self-efficacy questionnaire, and an unstructured observation form were used for data collection. The control group learned the theoretical and applied information about motherboard assembly from their textbooks (print material), while students in the experimental group used the HardwareAR application for the same purpose. It was found that the use of AR had a positive impact on student achievement in motherboard assembly, whereas it had no impact on students’ self-efficacy related to theoretical knowledge and assembly skills. On the other hand, the use of AR helped learners complete the assembly process in a shorter time and with less support. It is concluded that, compared to control group students, experimental group students were more successful in the computer hardware course, which indicates that the AR application can be effective in increasing achievement. The AR application had no effect on students’ self-efficacy regarding motherboard assembly theoretical knowledge or assembly skills; this result may have been affected by the fact that students already had high levels of theoretical knowledge and assembly skills before the implementation. Observations showed that the AR application enabled students to assemble the motherboard in a shorter time and with less support. The simultaneous interaction between virtual objects and the real world provided by the AR application is thought to be effective in reducing assembly time, and students who could see the process steps and instructions directly through HardwareAR completed the assembly with less help. Considering these results, it can be argued that, thanks to the simultaneous interaction it provides, AR offers an important alternative for topics that require learner application and practice.
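    A pre-test/post-test control-group design like the one above is commonly analysed by comparing post-test (or gain) scores between the two groups. Purely as a hypothetical illustration, and not the authors' actual analysis, the sketch below runs an independent-samples t-test on invented post-test achievement scores.

```python
# Hypothetical sketch of a pre-test/post-test control-group comparison.
# The score arrays are invented placeholders, not the study's data.
from scipy import stats

control_post = [62, 58, 71, 65, 60, 68, 63, 57, 66, 61]       # control group post-test scores
experimental_post = [74, 70, 79, 68, 81, 77, 72, 69, 80, 75]  # AR (HardwareAR) group post-test scores

# Independent-samples t-test on post-test achievement scores (Welch's variant)
t_stat, p_value = stats.ttest_ind(experimental_post, control_post, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 would indicate a significant group difference
```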

    Comparative study of AR versus video tutorials for minor maintenance operations

    Augmented Reality (AR) has become a mainstream technology in the development of solutions for repair and maintenance operations. Although most AR solutions are still limited to specific contexts in industry, some consumer electronics companies have started to offer pre-packaged AR solutions as alternatives to video-based tutorials (VT) for minor maintenance operations. In this paper, we present a comparative study of the acquired knowledge and user perception achieved with AR and VT solutions in maintenance tasks of IT equipment. The results indicate that both systems help users acquire knowledge in various aspects of equipment maintenance. Although no statistically significant differences were found between the AR and VT solutions, users scored higher with the AR version in all cases. Moreover, the users explicitly preferred the AR version when evaluating three different usability and satisfaction criteria. For the AR version, a strong and significant correlation was found between satisfaction and achieved knowledge. Since the AR solution achieved similar learning results with higher usability scores than the video-based tutorials, these results suggest that AR solutions are the most effective approach to replacing the typical paper-based instructions in consumer electronics. This work was supported by the Spanish MINECO and EU ERDF programs under grant RTI2018-098156-B-C55.
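    The abstract reports a non-parametric between-group comparison and a correlation between satisfaction and achieved knowledge. As a hypothetical sketch only (the exact tests and data are not restated here), the following compares invented AR and VT knowledge scores with a Mann-Whitney U test and correlates invented AR satisfaction ratings with knowledge using Spearman's rho.

```python
# Hypothetical sketch: between-group comparison (AR vs. video tutorial) and a
# satisfaction/knowledge correlation, both on invented placeholder scores.
from scipy import stats

knowledge_ar = [8, 7, 9, 6, 8, 9, 7, 8]   # knowledge test scores, AR group
knowledge_vt = [7, 6, 8, 6, 7, 8, 6, 7]   # knowledge test scores, video-tutorial group
satisfaction_ar = [4.5, 4.0, 4.8, 3.9, 4.6, 4.9, 4.1, 4.4]  # satisfaction ratings, AR group

# Non-parametric between-group comparison (Wilcoxon-Mann-Whitney style)
u_stat, p_between = stats.mannwhitneyu(knowledge_ar, knowledge_vt, alternative="two-sided")

# Correlation between satisfaction and achieved knowledge within the AR group
rho, p_corr = stats.spearmanr(satisfaction_ar, knowledge_ar)

print(f"U = {u_stat:.1f}, p = {p_between:.3f}; Spearman rho = {rho:.2f}, p = {p_corr:.3f}")
```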

    An evaluation of the Microsoft HoloLens for a manufacturing-guided assembly task

    Many studies have confirmed the benefits of using Augmented Reality (AR) work instructions over traditional digital or paper instructions, but few have compared the effects of different AR hardware for complex assembly tasks. For this research, previously published data using Desktop Model Based Instructions (MBI), Tablet MBI, and Tablet AR instructions were compared to new assembly data collected using AR instructions on the Microsoft HoloLens Head Mounted Display (HMD). Participants completed a mock wing assembly task, and measures such as completion time, error count, Net Promoter Score, and qualitative feedback were recorded. The HoloLens condition yielded faster completion times than all other conditions, and HoloLens users also had lower error rates than those who used the non-AR conditions. Despite the performance benefits of the HoloLens AR instructions, users of this condition reported lower Net Promoter Scores than users of the Tablet AR instructions. The qualitative data showed that some users found the HoloLens device uncomfortable and the tracking not always exact. Although user feedback favored the Tablet AR condition, the HoloLens condition resulted in significantly faster assembly times. As a result, it is recommended to use the HoloLens for complex guided assembly instructions with minor changes, such as allowing the user to toggle the AR instructions on and off at will. The results of this paper can help manufacturing stakeholders better understand the benefits of different AR technologies for manual assembly tasks.
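    The Net Promoter Score mentioned above is derived from 0-10 likelihood-to-recommend ratings as the percentage of promoters (9-10) minus the percentage of detractors (0-6). A minimal sketch with invented ratings:

```python
# Minimal Net Promoter Score (NPS) calculation on invented 0-10 ratings.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)   # ratings of 9 or 10
    detractors = sum(1 for r in ratings if r <= 6)  # ratings of 0 to 6
    return 100.0 * (promoters - detractors) / len(ratings)

hololens_ratings = [9, 7, 10, 6, 8, 9, 5, 10]    # placeholder survey responses
tablet_ar_ratings = [10, 9, 9, 8, 10, 9, 7, 10]

print(net_promoter_score(hololens_ratings))   # 25.0
print(net_promoter_score(tablet_ar_ratings))  # 75.0
```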

    Augmenting Reality with Intelligent Interfaces

    It is clear that our daily reality will increasingly interface with virtual inputs. We already integrate the virtual into real life through constantly evolving sensor technologies embedded into our smartphones, digital assistants, and connected devices. Simultaneously, we seek more virtual input into our reality through intelligent interfaces for the applications that these devices can run in a context-rich, socially connected, and personalized way. As we progress toward a future of ubiquitous Augmented Reality (AR) interfaces, it will be important to consider how this technology can best serve the populations that can benefit most from these intelligent interfaces. This paper proposes a new terminological framework to discuss the way AR interacts with users. An intelligent interface that combines digital objects with a real-world context can be referred to as a Pose-Interfaced Presentation (PIP): Pose refers to the user's location and orientation in space; Interfaced means that the program responds to a user's intentions and actions in an intelligent way; and Presentation refers to the virtual object or data layered onto the user's perceptive field. Finally, various benefits of AR are described, with examples from education, worker training, and ESL learning.
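    To make the Pose component of the proposed PIP terminology concrete, here is a minimal, hypothetical sketch of a pose record (position plus orientation) of the kind an AR runtime might track; the field names are illustrative and are not taken from the paper.

```python
# Hypothetical sketch of the "Pose" in Pose-Interfaced Presentation (PIP):
# the user's position and orientation in space, as an AR runtime might track it.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float   # position in metres, world coordinates
    y: float
    z: float
    qx: float  # orientation as a unit quaternion
    qy: float
    qz: float
    qw: float

# A Presentation (virtual object or data) would then be anchored relative to
# such a pose, e.g. placed a fixed distance in front of the user's viewpoint.
user_pose = Pose(x=0.0, y=1.6, z=0.0, qx=0.0, qy=0.0, qz=0.0, qw=1.0)
```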

    Real-time assistance to manual assembly through depth camera and visual feedback

    The current fourth industrial revolution significantly impacts production processes. The personalized production paradigm enables customers to order unique products, so operators assemble an enormous variety of components, adapting their process from product to product with limited learning opportunities. Digital technologies are increasingly adopted in production processes to improve performance and quality. In this framework, this research proposes a hardware/software architecture to assist, in real time, operators involved in manual assembly processes. A depth camera captures human motions in relation to the workstation environment, while visual feedback guides the operator through consecutive assembly tasks. An industrial case study validates the architecture.
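    The abstract describes the architecture only at a high level. As an illustration of the general pattern (capture a depth frame, infer the operator's progress, show the next instruction), the following hypothetical loop is a sketch, not the authors' implementation; grab_depth_frame, detect_completed_step, and show_instruction are invented placeholders for the depth-camera driver, the motion analysis, and the visual-feedback display.

```python
# Hypothetical outline of a real-time assembly-assistance loop.
# The three callables passed in are invented placeholders standing in for the
# depth-camera capture, the step-detection logic, and the on-screen guidance.

assembly_steps = ["pick housing", "insert gear", "fasten screws", "attach cover"]

def assist_operator(grab_depth_frame, detect_completed_step, show_instruction):
    current = 0
    show_instruction(assembly_steps[current])
    while current < len(assembly_steps):
        depth_frame = grab_depth_frame()                      # depth image of the workstation
        if detect_completed_step(depth_frame, assembly_steps[current]):
            current += 1                                      # advance when the step is detected
            if current < len(assembly_steps):
                show_instruction(assembly_steps[current])
    show_instruction("assembly complete")
```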

    Are Reality, Simulation, and Augmented Reality Interchangeable?

    When learning, students often ask why they should learn a topic or where they will use the knowledge. Real-life experiences make learning more meaningful for students, so learning environments in which students can acquire real-life experiences are important. However, due to student profiles, crowded classes, inadequate course hours, technological advances, natural disasters, etc., conventional instruction methods often cannot meet student requirements, and students cannot practice. This negatively affects students' learning achievement and psychomotor skills, so effective real-life educational experiences are required to improve them. Thus, the present study investigated the learning achievement and psychomotor skill levels of college students in an ICT course and whether augmented reality applications and simulations can substitute for real-life experiences. The study data were collected from 63 college students. Descriptive statistics, two-way ANOVA, and the Wilcoxon Signed Rank Test were employed to answer the research questions. The findings demonstrated that augmented reality and simulation-assisted learning environments were as effective as real-life learning environments in improving the learning achievement and psychomotor skills of the students in the ICT course. Thus, augmented reality or simulation applications could be employed in learning environments that lack real-life experiences.
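    The abstract names two-way ANOVA and the Wilcoxon Signed Rank Test. Purely as a hypothetical illustration of the latter (a paired, within-subject comparison of two learning conditions), the sketch below uses invented scores and is not the study's data or analysis.

```python
# Hypothetical Wilcoxon signed-rank comparison of paired scores from two
# learning conditions (e.g. AR/simulation-assisted vs. real-life); data invented.
from scipy import stats

scores_real_life = [78, 82, 69, 91, 74, 85, 80, 77, 88, 73]
scores_ar        = [80, 79, 72, 90, 76, 84, 83, 75, 89, 74]

stat, p = stats.wilcoxon(scores_real_life, scores_ar)
print(f"W = {stat}, p = {p:.3f}")  # a large p suggests no detectable difference between conditions
```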

    Current challenges and future research directions in augmented reality for education

    The progression and adoption of innovative learning methodologies signify that a considerable part of society is open to new technologies and ideas and is thus advancing. The latest innovation in teaching is the use of Augmented Reality (AR). Applications using this technology have been deployed successfully in STEM (Science, Technology, Engineering, and Mathematics) education for delivering the practical and creative parts of teaching. Since a large volume of published studies on AR in education already reports advantages, limitations, effectiveness, and challenges, classifying these projects allows the success achieved in different educational settings to be reviewed and current challenges and future research areas to be identified. Due to COVID-19, the landscape of technology-enhanced learning has shifted toward blended learning, personalized learning spaces, and user-centered approaches with safety measures. The main contributions of this paper are a review of the current literature, an investigation of the challenges, the identification of future research areas, and the development of two case studies that highlight the first steps needed to address these research areas. Ultimately, this research details the research gaps that must be closed to facilitate real-time touchless hand interaction, kinesthetic learning, and machine learning agents within a remote learning pedagogy.

    Distributed Self-Deployment in Visual Sensor Networks

    Autonomous decision making in a variety of wireless sensor networks, and specifically in visual sensor networks (VSNs), has become a highly researched field in recent years, with applications ranging from military operations to civilian environmental monitoring. To make VSNs useful in any type of setting, a number of fundamental problems must be solved, such as sensor node localization, self-deployment, and target recognition. This presents a plethora of challenges, as low cost, low energy consumption, and excellent scalability are desired. This thesis describes the design and implementation of a distributed self-deployment method for wireless visual sensor networks. Algorithms are developed for both centralized and distributed self-deployment schemes, given a set of randomly placed sensor nodes. In order to self-deploy these nodes, the fundamental problem of localization must first be solved. To this end, visual structured marker detection is used to obtain coordinate data with reference to artificial markers, from which the location of a node in an absolute coordinate system is deduced. Once localization is complete, the nodes in the VSN are deployed, in either centralized or distributed fashion, to pre-defined target locations. In centralized mode a single processing node makes the vast majority of decisions; since this node has knowledge of all events in the VSN, it can make optimal decisions, at the expense of time and scalability. The distributed mode offers better performance in terms of time and scalability, but the final deployment result may be sub-optimal. Software is developed for both modes of operation, and a GUI provides an easy control interface that also allows the progress of the VSN in the testing environment to be visualized. The algorithms are tested on an actual testbed consisting of five custom-built Mobile Sensor Platforms (MSPs), each configured with a camera and an ultrasonic range sensor. Visual marker detection uses the camera, and the ultrasonic ranger is used for obstacle avoidance during motion. Eight markers are placed in an area measuring 4 × 4 meters, surrounded by a white background. Both algorithms are evaluated for speed and accuracy. Experimental results show that localization using the visual markers has an accuracy of about 96% in ideal lighting conditions, and the proposed self-deployment algorithms perform as desired. The MSPs suffer from some physical design limitations, such as lacking wheel encoders for reliable movement in straight lines; experiments show that over 1 meter of travel the MSPs deviate from the path by an average of 7.5 cm laterally. Finally, the time needed for each algorithm to complete is recorded: centralized and distributed modes require an average of 34.3 and 28.6 seconds, respectively, meaning that distributed self-deployment is approximately 16.5% faster than centralized deployment.
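    The reported speedup can be checked directly from the two mean completion times; the sketch below does that arithmetic and, purely as a hypothetical illustration, shows a greedy nearest-target assignment of the kind a deployment step might use. The coordinates are invented and this is not the thesis's algorithm.

```python
import math

# Checking the reported speedup from the mean completion times.
centralized_s = 34.3
distributed_s = 28.6
speedup = (centralized_s - distributed_s) / centralized_s * 100
print(f"{speedup:.1f}% faster")  # ~16.6%, consistent with the ~16.5% reported

# Hypothetical greedy deployment step: each node claims the nearest
# still-unclaimed target location (coordinates in metres are invented).
nodes =   [(0.5, 0.5), (3.2, 1.0), (1.8, 3.5)]
targets = [(1.0, 1.0), (3.0, 3.0), (2.0, 0.5)]

assignment = {}
free_targets = list(targets)
for node in nodes:
    nearest = min(free_targets, key=lambda t: math.dist(node, t))
    assignment[node] = nearest
    free_targets.remove(nearest)
print(assignment)
```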

    Investigating Real-time Touchless Hand Interaction and Machine Learning Agents in Immersive Learning Environments

    The recent surge in the adoption of new technologies and innovations in connectivity, interaction technology, and artificial realities can fundamentally change the digital world. eXtended Reality (XR), with its potential to bridge the virtual and real environments, creates new possibilities to develop more engaging and productive learning experiences. Evidence is emerging that this sophisticated technology offers new ways to improve the learning process for better student interaction and engagement. Recently, immersive technology has garnered much attention as an interactive technology that facilitates direct interaction with virtual objects in the real world. Furthermore, these virtual objects can be surrogates for real-world teaching resources, allowing for virtual labs. Thus, XR could enable learning experiences that would not be possible in impoverished educational systems worldwide. Interestingly, concepts such as virtual hand interaction and techniques such as machine learning are still not widely investigated in immersive learning. Hand interaction technologies in virtual environments can support the kinesthetic learning pedagogical approach, and the need for their touchless nature has increased exceptionally in the post-COVID world. By implementing and evaluating real-time hand interaction technology for kinesthetic learning and machine learning agents for self-guided learning, this research addresses these underutilized technologies to demonstrate the efficiency of immersive learning. This thesis explores different hand-tracking APIs and devices to integrate real-time hand interaction techniques. These hand interaction techniques and integrated machine learning agents using reinforcement learning are evaluated with different display devices to test compatibility. The proposed approach aims to provide self-guided, more productive, and interactive learning experiences. Further, this research investigates ethics, privacy, and security issues in XR and covers the future of immersive learning in the Metaverse.
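    The thesis explores several hand-tracking APIs; as one hypothetical example of real-time touchless interaction (not the thesis's own code or chosen API), the sketch below uses MediaPipe Hands on a webcam feed and treats a small thumb-index fingertip distance as a "pinch" selection gesture.

```python
# Hypothetical sketch of touchless "pinch" detection from hand-tracking landmarks.
# MediaPipe Hands serves only as an example hand-tracking API; the landmarks used
# are thumb tip (4) and index fingertip (8), and the 0.05 threshold is arbitrary.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)   # webcam as a stand-in for an XR headset camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        thumb_tip, index_tip = lm[4], lm[8]
        dist = ((thumb_tip.x - index_tip.x) ** 2 + (thumb_tip.y - index_tip.y) ** 2) ** 0.5
        if dist < 0.05:   # fingertips close together: treat as a touchless "select"
            print("pinch detected: trigger a selection in the virtual environment")
    if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
        break

cap.release()
```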
