    Designing mobile augmented reality art applications: addressing the views of the galleries and the artists

    The utilization of mobile augmented reality to display gallery artworks or museum content in novel ways is a well-established concept in the augmented reality research community. However, the focus of these systems is generally technologically driven, or only addresses the end user and not the views of the gallery or the original artist. In this paper we discuss the design and development of the mobile application 'Taking the Artwork Home', which allows people to digitally curate their own augmented reality art exhibitions in their own homes by digitally 'replacing' the pictures they have on their walls with content from the Peter Scott Gallery in Lancaster. In particular, we present the insights gained from a research-through-design methodology that allowed us to consider how the views of the gallery and artists impacted the system design and therefore the user experience. The final artifact is the result of an iterative evaluation process with over 100 users representing a broad range of demographics, and it continues to be evaluated and enhanced by observing its operation 'in the wild'. Further, we consider the effect the project has had on gallery practices, to enable both augmented reality designers and galleries and museums to maximize the potential of the technology when working together on such projects.
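
    The abstract describes the design methodology rather than the implementation. As a hedged illustration of the underlying idea only (not the authors' system), the 'replace a picture on the wall' step can be approximated with feature matching and a homography warp; the function and parameter names below are hypothetical.

```python
# Illustrative sketch only: overlay gallery artwork onto a tracked wall picture
# using ORB feature matching and a homography (not the authors' implementation).
import cv2
import numpy as np

def replace_artwork(frame, reference, artwork, min_matches=12):
    """Warp `artwork` onto the region of `frame` that matches `reference`."""
    gray_ref = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
    gray_frm = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(1000)
    kp_ref, des_ref = orb.detectAndCompute(gray_ref, None)
    kp_frm, des_frm = orb.detectAndCompute(gray_frm, None)
    if des_ref is None or des_frm is None:
        return frame
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return frame  # picture not found in this frame; leave it unchanged
    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_frm[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return frame
    h, w = reference.shape[:2]
    warped = cv2.warpPerspective(cv2.resize(artwork, (w, h)), H,
                                 (frame.shape[1], frame.shape[0]))
    mask = cv2.warpPerspective(np.full((h, w), 255, np.uint8), H,
                               (frame.shape[1], frame.shape[0]))
    out = frame.copy()
    out[mask > 0] = warped[mask > 0]
    return out
```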

    GPU-based Image Analysis on Mobile Devices

    With the rapid advances in mobile technology, many mobile devices are capable of capturing high-quality images and video with their embedded cameras. This paper investigates techniques for real-time processing of the resulting images, particularly on-device processing utilizing a graphical processing unit. Issues and limitations of image processing on mobile devices are discussed, and the performance of graphical processing units on a range of devices is measured through a programmable shader implementation of Canny edge detection.

    Comment: Proceedings of Image and Vision Computing New Zealand 201
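
    The abstract names the benchmark workload but not the paper's shader code. The sketch below is a CPU-side approximation of the Canny stages that a shader implementation would typically map onto successive fragment-shader passes; it is illustrative only and uses OpenCV rather than the GPU.

```python
# Illustrative CPU sketch of the Canny stages (not the paper's shader code).
import cv2
import numpy as np

def canny_stages(gray, low=50, high=150):
    # Pass 1: Gaussian smoothing to suppress sensor noise.
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.4)
    # Pass 2: gradient magnitude and direction via Sobel filters.
    gx = cv2.Sobel(blurred, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(blurred, cv2.CV_32F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)
    # Passes 3-4: non-maximum suppression and hysteresis thresholding,
    # delegated here to OpenCV's reference implementation.
    edges = cv2.Canny(blurred, low, high)
    return magnitude, direction, edges
```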

    A component-based approach towards mobile distributed and collaborative PTAM

    Having numerous sensors on board, smartphones have rapidly become a very attractive platform for augmented reality applications. Although the computational resources of mobile devices continue to grow, they still cannot match commonly available desktop hardware, which results in downscaled versions of well-known computer vision techniques that sacrifice accuracy for speed. We propose a component-based approach towards mobile augmented reality applications, where components can be configured and distributed at runtime, resulting in a performance increase by offloading CPU-intensive tasks to a server in the network. By sharing distributed components between multiple users, collaborative AR applications can easily be developed. In this poster, we present a component-based implementation of the Parallel Tracking And Mapping (PTAM) algorithm that allows components to be distributed, yielding a mobile, distributed version of the original PTAM algorithm as well as a collaborative scenario.
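
    The poster abstract does not give the component API. The following is a minimal sketch, with hypothetical names, of the runtime-placement idea: the same mapping interface can be bound to a local or a remote implementation, so the CPU-intensive mapping step can be offloaded to a server decided by configuration.

```python
# Hypothetical component-placement sketch, not the authors' API.
import json
import socket

class LocalMapper:
    """Mapping component running on the device."""
    def add_keyframe(self, keyframe):
        # Placeholder for the CPU-intensive mapping/bundle-adjustment step
        # that makes offloading attractive on mobile hardware.
        return keyframe

class RemoteMapper:
    """Mapping component proxied to a server over the network."""
    def __init__(self, host, port=7000):
        self.addr = (host, port)

    def add_keyframe(self, keyframe):
        # Ship the keyframe to the server component and wait for the result.
        with socket.create_connection(self.addr) as sock:
            sock.sendall(json.dumps(keyframe).encode() + b"\n")
            return json.loads(sock.makefile().readline())

def make_mapper(config):
    # Component placement decided from runtime configuration.
    if config.get("offload_mapping"):
        return RemoteMapper(config["server_host"])
    return LocalMapper()
```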

    It's the Human that Matters: Accurate User Orientation Estimation for Mobile Computing Applications

    The ubiquity of Internet-connected and sensor-equipped portable devices has sparked a new set of mobile computing applications that leverage the proliferating sensing capabilities of smartphones. For many of these applications, accurate estimation of the user heading, as opposed to the phone heading, is of paramount importance. This is especially true for crowd-sensing applications, where the phone can be carried in arbitrary positions and orientations relative to the user's body. Current state-of-the-art approaches focus mainly on estimating the phone orientation, require the phone to be placed in a particular position, require user intervention, and/or do not work accurately indoors, which limits their usability across applications. In this paper we present Humaine, a novel system to reliably and accurately estimate the user orientation relative to the Earth coordinate system. Humaine requires no prior configuration or user intervention and works accurately indoors and outdoors for arbitrary cell phone positions and orientations relative to the user's body. The system applies statistical analysis techniques to the inertial sensors widely available on today's cell phones to estimate both the phone and user orientation. An implementation of the system on different Android devices, with 170 experiments performed at different indoor and outdoor testbeds, shows that Humaine significantly outperforms the state-of-the-art in diverse scenarios, achieving a median accuracy of 15° averaged over a wide variety of phone positions. This is 558% better than the state-of-the-art. The accuracy is bounded by the error in the inertial sensor readings and can be enhanced with more accurate sensors and sensor fusion.

    Comment: Accepted for publication in the 11th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (Mobiquitous 2014)
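
    Humaine's statistical user-orientation estimation is not reproduced here. As a hedged sketch of the phone-orientation building block that such systems start from, the standard tilt-compensated compass formula combines accelerometer and magnetometer readings; axis sign conventions depend on the device, and the function name is hypothetical.

```python
# Standard tilt-compensated compass heading (phone orientation only);
# not Humaine's user-orientation algorithm.
import math

def phone_heading(ax, ay, az, mx, my, mz):
    """Heading in degrees from magnetic north, from accelerometer (ax, ay, az)
    and magnetometer (mx, my, mz) readings in the phone's coordinate frame."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, ay * math.sin(roll) + az * math.cos(roll))
    # Rotate the magnetic field vector back into the horizontal plane.
    bx = (mx * math.cos(pitch)
          + my * math.sin(pitch) * math.sin(roll)
          + mz * math.sin(pitch) * math.cos(roll))
    by = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(-by, bx)) % 360.0
```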