
    Interactive augmented reality

    Final degree project carried out in collaboration with the Royal Institute of Technology. Augmented reality can provide a new experience to users by adding virtual objects where they are relevant in the real world. The new generation of mobile phones offers a platform for developing augmented reality applications for industry as well as for the general public. Although some applications are reaching commercial viability, the technology is still limited. The main problem designers face when building an augmented reality application is implementing an interaction method. Interacting through the mobile's keyboard can prevent the user from looking at the screen: mobile devices normally have small keyboards, which are difficult to use without looking at them. Displaying a virtual keyboard on the screen is not a good solution either, as the small screen is needed to display the augmented real world. This thesis proposes a gesture-based interaction approach for this kind of application. The idea is that by holding and moving the mobile phone in different ways, users can interact with virtual content. This approach combines the use of input devices such as keyboards or joysticks with the detection of gestures performed with the body into one scenario: the detection of the phone's movements performed by users. Based on an investigation of people's own preferred gestures, a repertoire of manipulations was defined and used to implement a demonstrator application running on a mobile phone. This demo was tested to evaluate gesture-based interaction within an augmented reality application. The experiment shows that it is possible to implement and use gesture-based interaction in augmented reality. Gestures can be designed to overcome the limitations of augmented reality and offer a natural and easy-to-learn interaction to the user.
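    The phone-movement detection the thesis describes can be sketched as a simple classifier over accelerometer samples. This is a hypothetical illustration, not the thesis's actual implementation; the gesture labels, thresholds, and window handling are all made up for the example:

    ```python
    from typing import List, Tuple

    Sample = Tuple[float, float, float]  # (x, y, z) acceleration in m/s^2

    def classify_gesture(samples: List[Sample], shake_threshold: float = 15.0) -> str:
        """Label a window of accelerometer samples as a coarse gesture."""
        # Peak acceleration magnitude over the window distinguishes a shake
        # from a slower tilt (gravity alone is ~9.8 m/s^2).
        peak = max((x * x + y * y + z * z) ** 0.5 for x, y, z in samples)
        if peak > shake_threshold:
            return "shake"
        # A sustained lateral component suggests a tilt left or right.
        mean_x = sum(x for x, _, _ in samples) / len(samples)
        if mean_x > 3.0:
            return "tilt-right"
        if mean_x < -3.0:
            return "tilt-left"
        return "idle"
    ```

    A real application would feed this from the device's sensor API at a fixed sampling rate and smooth the readings before classifying.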

    A comparison of surface and motion user-defined gestures for mobile augmented reality.

    Augmented Reality (AR) technology permits interaction between the virtual and physical worlds. Recent advancements in mobile devices allow for a better mobile AR experience, in turn improving the user adoption rate and increasing the number of mobile AR applications across a wide range of disciplines. Nevertheless, the majority of mobile AR applications that we have surveyed adopt surface gestures as the default interaction method and do not utilise three-dimensional (3D) spatial interaction, as supported by AR interfaces. This research investigates two types of gestures for interacting in mobile AR applications: surface gestures, which have been deployed by mainstream applications, and motion gestures, which take advantage of the 3D movement of the handheld device. Our goal is to find out whether there exists a gesture-based interaction suitable for handheld devices that can utilise the 3D interaction of mobile AR applications. We conducted two user studies, an elicitation study and a validation study. In the elicitation study, we elicited two sets of gestures, surface and motion, for mobile AR applications. We recruited twenty-one participants to perform twelve common mobile AR tasks, which yielded a total of five hundred and four gestures. We classified and illustrated the two sets of gestures and compared them in terms of goodness, ease of use, and engagement. The elicitation process yielded two separate sets of user-defined gestures: legacy surface gestures, which were familiar and easy to use for the participants, and motion gestures, which were found to be more engaging. From the design patterns of the motion gestures, we proposed a novel interaction technique for mobile AR called TMR (Touch-Move-Release). To validate our elicited gestures in an actual application, we conducted a second study. We developed a mobile AR game similar to Pokémon GO and implemented the selected gestures from the elicitation study.
    The study was conducted with ten participants, and we found that the motion gestures could provide more engagement and a better game experience. Nevertheless, surface gestures were more accurate and easier to use. We discuss the implications of our findings and give design recommendations on the usage of the elicited gestures. Our research can be further explored in the future: it can serve as a "prequel" to the design of better gesture-based interaction techniques for different tasks in various mobile AR applications.
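    The abstract names TMR (Touch-Move-Release) but not its mechanics, so the following is only a plausible sketch of such an interaction as a small state machine: touch binds a virtual object, device movement updates it, and release commits the manipulation. The event names and pose representation are assumptions for illustration:

    ```python
    class TMRController:
        """Hypothetical Touch-Move-Release interaction controller."""

        def __init__(self):
            self.active = False
            self.pose = (0.0, 0.0, 0.0)  # (x, y, z) of the grabbed object

        def on_touch(self, target_pose):
            # Touch: grab the virtual object at its current pose.
            self.active = True
            self.pose = target_pose

        def on_move(self, dx, dy, dz):
            # Move: map the handheld device's motion onto the grabbed object.
            if self.active:
                x, y, z = self.pose
                self.pose = (x + dx, y + dy, z + dz)

        def on_release(self):
            # Release: commit the manipulation and report the final pose.
            self.active = False
            return self.pose
    ```

    In an AR application, `on_move` would be driven by the device's tracked 6-DoF pose rather than raw deltas.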

    Enhancing cultural tourism by a mixed reality application for outdoor navigation and information browsing using immersive devices

    In this paper a mixed reality application is introduced; this application runs on Microsoft HoloLens and has been designed to provide information on a city scale. The application was developed to provide information about historical buildings, thus supporting cultural outdoor tourism. The huge amount of multimedia data stored in the archives of the Italian public broadcaster RAI is used to enrich the user experience. A remote image and video analysis application receives an image stream from the user and identifies known objects framed in the images. The user can select the object (monument/building/artwork) for which augmented content is to be displayed (video, text, audio) and can interact with this content through a set of defined gestures. Moreover, if the object of interest is detected and tracked by the mixed reality application, 3D content can also be overlaid and aligned with the real world.

    Augmented reality meeting table: a novel multi-user interface for architectural design

    Immersive virtual environments have received widespread attention as possible replacements for the media and systems that designers traditionally use, as well as, more generally, as support for collaborative work. Relatively little attention has been given to date, however, to the problem of how to merge immersive virtual environments into real-world work settings, and so to add to the media at the disposal of the designer and the design team rather than replace them. In this paper we report on a research project in which optical see-through augmented reality displays have been developed together with prototype decision support software for architectural and urban design. We suggest that a critical characteristic of multi-user augmented reality is its ability to generate visualisations from a first-person perspective in which the scale of rendition of the design model follows many of the conventions that designers are used to. Different scales of model appear to allow designers to focus on different aspects of the design under consideration. Augmenting the scene with simulations of pedestrian movement appears to assist both in scale recognition and in moving from a first-person to a third-person understanding of the design. This research project is funded by the European Commission IST program (IST-2000-28559).

    A software framework for the development of projection-based augmented reality systems

    Despite the large number of methods and applications of augmented reality, there is little homogenization among the software platforms that support them. An exception may be the low-level control software provided by some high-profile vendors such as Qualcomm and Metaio. However, these provide fine-grained modules for, e.g., element tracking. We are more concerned with the application framework, which includes the control of the devices working together to deliver the AR experience. In this paper we present a software framework that can be used for the development of AR applications based on camera-projector pairs and that is suitable for both fixed and nomadic setups.
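    A core registration step in any camera-projector AR setup is mapping camera pixels into projector coordinates, typically through a planar homography estimated during calibration. The sketch below only illustrates applying such a mapping; the matrix values are made up, and the paper's framework is not claimed to work this way:

    ```python
    def apply_homography(H, point):
        """Project camera pixel (u, v) through a row-major 3x3 homography H."""
        u, v = point
        x = H[0][0] * u + H[0][1] * v + H[0][2]
        y = H[1][0] * u + H[1][1] * v + H[1][2]
        w = H[2][0] * u + H[2][1] * v + H[2][2]
        # Homogeneous divide maps back to 2D projector coordinates.
        return (x / w, y / w)

    # Example matrix: identity plus a translation, shifting camera pixels
    # by (10, 20) in projector space.
    H = [[1.0, 0.0, 10.0],
         [0.0, 1.0, 20.0],
         [0.0, 0.0, 1.0]]
    ```

    In practice the homography would be estimated from corresponding point pairs found by projecting a known calibration pattern and detecting it with the camera.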