6,402 research outputs found

    The Limited Effect of Graphic Elements in Video and Augmented Reality on Children’s Listening Comprehension

    There is currently significant interest in the use of instructional strategies in learning environments thanks to the emergence of new multimedia systems that combine text, audio, graphics and video, such as augmented reality (AR). In this light, this study compares the effectiveness of AR and video for listening comprehension tasks. The sample consisted of thirty-two elementary school students with different levels of reading comprehension. First, the experience, instructions and objectives were introduced to all the students. Next, they were divided into two groups to perform activities: one group watched an educational video story about the dog Laika and her space journey, available through the mobile app Blue Planet Tales, while the other performed an activity using AR, in which the contents of the same story were visualized by means of the app Augment Sales. Once the activities were completed, participants answered a comprehension test. The results (p = 0.180) indicate that there are no significant differences between lesson format and test performance. However, there are differences among the participants in the AR group according to their reading comprehension level. With respect to the time taken to complete the comprehension test, there is no significant difference between the two groups, but there is a difference between participants with high and low levels of comprehension. Finally, the System Usability Scale (SUS) questionnaire was used to measure the usability of the AR app on a smartphone. An average score of 77.5 out of 100 was obtained, which indicates that the app has a fairly good user-centered design.
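    The 77.5 average reported above follows from the standard SUS scoring procedure, which the abstract does not spell out. The minimal Python sketch below shows how ten item ratings (each 1-5) are conventionally converted to a 0-100 score; the example responses are hypothetical, not data from the study.

    def sus_score(responses):
        """Convert ten SUS item ratings (1-5) to the conventional 0-100 score."""
        assert len(responses) == 10
        total = 0
        for i, r in enumerate(responses, start=1):
            # Odd-numbered (positively worded) items contribute (rating - 1);
            # even-numbered (negatively worded) items contribute (5 - rating).
            total += (r - 1) if i % 2 == 1 else (5 - r)
        return total * 2.5  # scale the 0-40 sum to 0-100

    # Hypothetical participant: a raw sum of 31 yields the reported average of 77.5.
    print(sus_score([4, 2, 4, 2, 4, 2, 4, 1, 4, 2]))  # 77.5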

    Adaptive User Perspective Rendering for Handheld Augmented Reality

    Handheld Augmented Reality commonly implements some variant of magic lens rendering, which turns only a fraction of the user's real environment into AR while the rest remains unaffected. Since handheld AR devices are commonly equipped with video see-through capabilities, AR magic lens applications often suffer from spatial distortions, because the AR environment is presented from the perspective of the camera of the mobile device. Recent approaches counteract this distortion based on estimations of the user's head position, rendering the scene from the user's perspective. To this end, approaches usually apply face-tracking algorithms on the front camera of the mobile device. However, this demands high computational resources and therefore commonly affects the performance of the application beyond the already high computational load of AR applications. In this paper, we present a method to reduce the computational demands of user perspective rendering by applying lightweight optical flow tracking and an estimation of the user's motion before head tracking is started. We demonstrate the suitability of our approach for computationally limited mobile devices and compare it to device perspective rendering, head-tracked user perspective rendering, and fixed point of view user perspective rendering.
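    The abstract mentions lightweight optical flow tracking used to estimate the user's motion before full head tracking starts, without giving details. A minimal Python/OpenCV sketch of that general idea, using sparse Lucas-Kanade flow between consecutive front-camera frames, might look as follows; the function name and the median-displacement heuristic are illustrative assumptions, not the paper's actual implementation.

    import cv2
    import numpy as np

    def estimate_user_motion(prev_gray, curr_gray):
        """Rough 2D motion estimate between consecutive front-camera frames."""
        # Track only a handful of corners so the step stays cheap on mobile hardware.
        prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                           qualityLevel=0.3, minDistance=7)
        if prev_pts is None:
            return np.zeros(2)
        # Sparse Lucas-Kanade optical flow from the previous frame to the current one.
        curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                          prev_pts, None)
        ok = status.flatten() == 1
        if not ok.any():
            return np.zeros(2)
        # Median displacement of the tracked points as a cheap proxy for apparent motion.
        return np.median((curr_pts - prev_pts)[ok].reshape(-1, 2), axis=0)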

    ENHANCING USERS’ EXPERIENCE WITH SMART MOBILE TECHNOLOGY

    The aim of this thesis is to investigate mobile guides for use with smartphones. Mobile guides have been used successfully to provide information, personalisation and navigation for the user. The researcher also wanted to ascertain how and in what ways mobile guides can enhance users' experience. This research involved designing and developing web-based applications to run on smartphones. Four studies were conducted, two of which involved testing a particular application. The applications tested were a museum mobile guide application and a university mobile guide mapping application. Initial testing examined the prototype for the ‘Chronology of His Majesty Sultan Haji Hassanal Bolkiah’ application. The results were used to assess the potential of using similar mobile guides in Brunei Darussalam’s museums. The second study involved testing the ‘Kent LiveMap’ application for use at the University of Kent. Students at the university tested this mapping application, which uses crowdsourcing of information to provide live data. The results were promising and indicate that users' experience was enhanced when using the application. Overall, results from testing the two applications developed as part of this thesis show that mobile guides have the potential to be implemented in Brunei Darussalam’s museums and on campus at the University of Kent. However, modifications to both applications are required to fulfil their potential and take them beyond the prototype stage so that they are fully functional and commercially viable.
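    The ‘Kent LiveMap’ application is described as crowdsourcing information to provide live data, but the abstract gives no implementation detail. Purely as an illustration of the general pattern (not the thesis's actual design), a crowdsourced live feed can be reduced to an endpoint that accepts user reports and returns the most recent ones; the minimal Python/Flask sketch below uses hypothetical field names.

    import time
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    reports = []  # in-memory store of crowd-sourced reports (illustrative only)

    @app.route("/reports", methods=["POST"])
    def add_report():
        # Hypothetical payload: {"lat": ..., "lon": ..., "text": "..."}
        data = request.get_json()
        reports.append({"lat": data["lat"], "lon": data["lon"],
                        "text": data.get("text", ""), "ts": time.time()})
        return jsonify({"ok": True})

    @app.route("/reports", methods=["GET"])
    def live_feed():
        # Serve only reports from the last 15 minutes as the "live" map layer.
        cutoff = time.time() - 15 * 60
        return jsonify([r for r in reports if r["ts"] >= cutoff])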

    A MOBILE PLATFORM FOR LOCATION-BASED SERVICE APPLICATIONS USING AUGMENTED REALITY: ONLINE MAP, TRACKING AND NAVIGATION ON GOOGLE ANDROID SMARTPHONE DEVICE (TOC, Abstract, Chapter 1 and Reference only)

    This project paper is about Augmented Reality (AR) using location-based visualization and its implementation on smartphone devices, partly because smartphones, which come packed with built-in sensors, have grown in popularity over the years. The project explores the interactive location-based services that AR enables on Android devices. Because advances in mobile technology such as the Global Positioning System (GPS), compass and accelerometer sensors make it possible to identify the location and orientation of the device, location-based applications with augmented reality views are possible. AR combines the real world with the virtual by integrating information into the user's environment in real time and by offering interaction techniques that present rich, intuitive data about the real world. An AR application typically takes the image from the integrated camera and the positioning location as a representation of the real world and projects objects on top of this image to create the AR view. The research was initiated by exploring and reviewing literature in the related domain and existing AR applications available on Android devices. A number of AR applications are available, and the rapid development of Android smartphone devices has provided an improved platform for mobile AR technologies. Developing an application helped the researcher explore the topic while working through this technology. The aim of this study is to develop a combination of location-based information and AR features by blending visual, map-based and non-map-based elements, such as the live projection of a nearby landmark on the camera preview of mobile devices, utilizing free and open source software development tools. In the context of this paper, a prototype application based on the Android platform and the Mixare engine library is developed. This paper presents the initial thoughts on the application and the overall process that leads to the final system development. The report describes MyARTGuide, a prototype augmented reality application designed to run on Android-based smartphones. The user can look through their phone, as if taking a picture, at the augmented world, which leads to a better user experience. MyARTGuide was developed for experimentation, simulation and testing of the AR functionality defined in the project objectives; it is not a fully functional product. Thus, there are still areas that can be improved and new features that can be added. With the use of AR and Android technology it is possible to spread this experience, as will be shown in this report. (ABSTRACT BY AUTHOR)
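    The abstract describes projecting nearby landmarks onto the camera preview using GPS and compass readings. The standard geometry behind such an overlay, independent of Mixare or MyARTGuide specifically, is to compute the bearing from the device to the point of interest and map its difference from the compass azimuth into a horizontal screen position; the Python sketch below illustrates that idea with assumed parameter names.

    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial great-circle bearing (degrees clockwise from north) to a POI."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

    def marker_screen_x(poi_bearing, device_azimuth, fov_deg, screen_width):
        """Horizontal pixel position of a POI marker, or None if outside the camera FOV."""
        delta = (poi_bearing - device_azimuth + 540.0) % 360.0 - 180.0  # signed angle
        if abs(delta) > fov_deg / 2.0:
            return None
        return int(screen_width / 2.0 + (delta / fov_deg) * screen_width)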

    The Challenges of Evaluating the Usability of Augmented Reality (AR)

    Augmented reality (AR) is a new and emerging technology whose usability should be evaluated in order to improve the user’s experience with the device or application. This is often done through usability testing and heuristic evaluations. However, AR technology presents some challenges when completing these usability evaluations. Practitioners need to keep in mind the hardware limitations of AR devices that may not be present with other computerized technology, recognize that the consistency of the users’ environment plays a larger role in the AR experience, recognize that a novelty effect may occur and affect subjective scores, and choose heuristic sets that will best evaluate AR applications. Practitioners need to be aware of these challenges and overcome them in order to accurately assess the usability of these products and gain insights about what should be changed to make the overall experience with the product better.

    An effective approach to develop location-based augmented reality information support

    Using location-based augmented reality (AR) for pedestrian navigation can greatly improve user performance and reduce travel time. Pedestrian navigation differs in many ways from the conventional navigation systems used in cars or other vehicles. A major issue with using location-based AR for navigation to a specific landmark is usability, especially when the active screen is overcrowded with augmented POI markers that overlap each other at the same time. This paper describes the user journey map approach that led to new insights about how users were using location-based AR for navigation. These insights led to a deep understanding of the challenges that users face when using a location-based AR application for pedestrian navigation and, more generally, helped the development team appreciate the variety of user experiences in the software requirements specification phase. To prove the concept, a prototype of an intuitive location-based AR application was built and compared with an existing standard location-based AR application. The user evaluation results reveal that the overall functional requirements gathered from the user journey meet the same success rate criteria as the standard location-based AR application. Nevertheless, the field study participants highlighted that the extended features in our prototype could significantly enhance the user's ability to locate the right object in a particular place when compared with the standard location-based AR application (as shown by the time required).
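    One of the usability problems named above is a screen overcrowded with overlapping POI markers. The abstract does not say how the prototype's extended features address this, so the Python sketch below shows only a generic screen-space remedy: a greedy grouping of markers that fall within a minimum pixel distance of one another. The threshold and data layout are assumptions for illustration.

    def cluster_markers(markers, min_dist_px=48):
        """Greedily group screen-space POI markers that would overlap each other."""
        clusters = []
        for x, y, label in markers:
            for c in clusters:
                if (x - c["x"]) ** 2 + (y - c["y"]) ** 2 < min_dist_px ** 2:
                    c["labels"].append(label)
                    # Re-centre the cluster on the running mean of its members.
                    n = len(c["labels"])
                    c["x"] += (x - c["x"]) / n
                    c["y"] += (y - c["y"]) / n
                    break
            else:
                clusters.append({"x": float(x), "y": float(y), "labels": [label]})
        return clusters

    # A cluster holding several labels can then be drawn as one marker with a count badge.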

    Visualising mixed reality simulation for multiple users

    Cowling, MA (ORCiD: 0000-0003-1444-1563)

    Blended reality seeks to encourage co-presence in the classroom, blending the student experience across virtual and physical worlds. In a similar way, Mixed Reality, a continuum between virtual and real environments, now allows learners to work in both the physical and the digital world simultaneously, especially when combined with an immersive headset experience. This provides innovative new experiences for learning, but faces the challenge that most of these experiences are single-user, leaving others outside the new environment. The question therefore becomes: how can a mixed reality simulation be experienced by multiple users, and how can we present that simulation effectively to users to create a true blended reality environment? This paper proposes a study that uses existing screen production research into the user and the spectator to produce a mixed reality simulation suitable for multiple users. A research method using Design Based Research is also presented to assess the usability of the approach.