    The 16th International Symposium on Wearable Computers, ISWC 2012, Adjunct Proceedings, Newcastle Upon Tyne, UK, June 18-22, 2012

    A pervasive augmented reality serious game

    This paper presents a pervasive augmented reality serious game that uses a multimodal tracking interface to enhance entertainment. The main objective of the research is to design and implement generic pervasive interfaces that are user-friendly and can be used by a wide range of users, including people with disabilities. A pervasive AR racing game has been designed and implemented; the goal of the game is to start the car and drive around the track without colliding with the wall or the objects placed in the gaming arena. Users can interact through a pinch glove, a Wiimote, tangible objects, or the I/O controls of the UMPC. Initial evaluation results showed that multimodal interaction can be beneficial in serious games.
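
    The paper itself does not include code; purely as an illustration of how such a multimodal interface might be organized, the Python sketch below maps events from several input devices onto a shared set of game commands. All device and game names here are assumptions made for the sketch, not the authors' API.

    # Hypothetical sketch of a multimodal input layer for an AR racing game.
    # Device objects and the command set are illustrative assumptions.
    from enum import Enum, auto

    class Command(Enum):
        START_ENGINE = auto()
        STEER_LEFT = auto()
        STEER_RIGHT = auto()
        ACCELERATE = auto()
        BRAKE = auto()

    class InputAdapter:
        """Base class: each device translates its raw events into game commands."""
        def poll(self):
            raise NotImplementedError

    class PinchGloveAdapter(InputAdapter):
        def __init__(self, glove):
            self.glove = glove  # assumed driver object exposing finger contacts
        def poll(self):
            if self.glove.pinched("thumb", "index"):
                yield Command.START_ENGINE
            if self.glove.pinched("thumb", "middle"):
                yield Command.ACCELERATE

    class WiimoteAdapter(InputAdapter):
        def __init__(self, wiimote):
            self.wiimote = wiimote  # assumed driver exposing button/tilt state
        def poll(self):
            roll = self.wiimote.roll()  # tilt angle in degrees
            if roll < -15:
                yield Command.STEER_LEFT
            elif roll > 15:
                yield Command.STEER_RIGHT
            if self.wiimote.button("B"):
                yield Command.BRAKE

    def dispatch(adapters, game):
        """Merge commands from all active devices each frame and hand them to the game."""
        for adapter in adapters:
            for command in adapter.poll():
                game.apply(command)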

    Master Hand Technology For The HMI Using Hand Gesture And Colour Detection

    Master Hand Technology uses different hand gestures and colors to issue commands for human-machine (here, computer) interfacing. Gesture recognition aims to interpret human gestures via mathematical algorithms: gestures made by users with the help of a color band and/or body pose, in two or three dimensions, are translated by software/image processing into predefined commands, and the computer then acts on those commands. A good deal of work has already been done in this field, either by extracting the hand gesture alone or by extracting the hand with the help of color segmentation. In this project, both hand-gesture extraction and color detection are combined for faster, more robust, more accurate, real-time applications. Red, green and blue are detected most efficiently when the RGB color space is used; with the HSV color space the approach can be extended to any number of colors. For hand-gesture detection, the default background is captured and stored; by comparing each newly captured image against this background and applying the necessary extraction and filtering, the hand region can be segmented, and different mathematical algorithms then classify the gestures. All of this work was done in MATLAB. By mapping a portion of the master hand and/or a color marker to the computer's mouse, the computer can be controlled just as with a mouse, and many virtual (augmented reality) or PC-based applications can then be built (e.g. a calculator or a paint program). The system itself need not be within the user's reach, but a camera linked to it must be nearby: by showing different gestures with the master hand, the computer can be controlled remotely, and if the camera is set up online the computer can be controlled even from a very distant place.
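
    The abstract describes the pipeline but not the code (the original implementation was in MATLAB). As a rough sketch of the same idea in Python with OpenCV, the fragment below differences each frame against a stored background to isolate the hand and thresholds in HSV to locate a colored band; the threshold values and HSV range are illustrative assumptions, not the project's actual parameters.

    # Rough Python/OpenCV sketch of the pipeline described above (the project used MATLAB).
    # Threshold values and the HSV range are illustrative assumptions.
    import cv2
    import numpy as np

    def extract_hand(frame, background, diff_thresh=30):
        """Segment the hand by differencing the current frame against the stored background."""
        gray_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray_bg = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray_frame, gray_bg)
        _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
        return cv2.medianBlur(mask, 5)  # remove speckle noise

    def detect_color_marker(frame, lower_hsv, upper_hsv):
        """Locate a colored band by thresholding in HSV space; returns its centroid or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None
        return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

    cap = cv2.VideoCapture(0)
    _, background = cap.read()              # capture and store the default background once
    red_lower = np.array([0, 120, 70])      # assumed HSV range for a red band
    red_upper = np.array([10, 255, 255])

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hand_mask = extract_hand(frame, background)
        marker = detect_color_marker(frame, red_lower, red_upper)
        # hand_mask and marker would feed the gesture classifier / mouse control here.
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()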

    Collaborative Augmented Reality

    Over the past number of years, augmented reality (AR) has become an increasingly pervasive consumer-level technology. The principal drivers of its recent development have been the evolution of mobile and handheld devices, in conjunction with algorithms and techniques from fields such as 3D computer vision. Various commercial platforms and SDKs are now available that let developers quickly build mobile AR apps with minimal understanding of the underlying technology. Much of the focus to date, in both research and commercial settings, has been on single-user AR applications. Just as collaborative mobile applications have played a demonstrated role in the growing popularity of mobile devices, we believe collaborative AR systems present a compelling use case for AR technology. The aim of this thesis is the development of a mobile collaborative augmented reality framework. We identify the elements required in the design and implementation stages of collaborative AR applications. Our solution enables developers to easily create multi-user mobile AR applications in which users can cooperatively interact with the real environment in real time, and it increases the sense of collaborative spatial interaction without requiring complex infrastructure. Assuming the underlying low-level communication and AR libraries have modular structures, the proposed approach is itself modular and flexible enough to adapt to their requirements without any major changes.
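
    The thesis abstract describes the framework only at the architectural level. As an illustrative sketch of the modularity it refers to (all names here are assumptions, not the thesis' API), the communication and AR libraries could sit behind thin adapter interfaces so that either can be swapped without changing application code:

    # Illustrative sketch of a modular collaborative-AR layering; all names are assumptions.
    from abc import ABC, abstractmethod

    class CommunicationAdapter(ABC):
        """Wraps whatever low-level networking library the platform provides."""
        @abstractmethod
        def broadcast(self, message: dict) -> None: ...
        @abstractmethod
        def receive(self) -> list[dict]: ...

    class ARAdapter(ABC):
        """Wraps the underlying AR/tracking library."""
        @abstractmethod
        def local_pose(self) -> dict: ...  # device pose in the shared reference frame
        @abstractmethod
        def render_remote(self, user_id: str, pose: dict) -> None: ...

    class CollaborativeSession:
        """Keeps every participant's view consistent by exchanging poses each frame."""
        def __init__(self, comm: CommunicationAdapter, ar: ARAdapter, user_id: str):
            self.comm, self.ar, self.user_id = comm, ar, user_id

        def update(self) -> None:
            self.comm.broadcast({"user": self.user_id, "pose": self.ar.local_pose()})
            for msg in self.comm.receive():
                if msg["user"] != self.user_id:
                    self.ar.render_remote(msg["user"], msg["pose"])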