
    A component-based approach towards mobile distributed and collaborative PTAM

    Having numerous sensors on board, smartphones have rapidly become a very attractive platform for augmented reality applications. Although the computational resources of mobile devices keep growing, they still cannot match commonly available desktop hardware, which results in downscaled versions of well-known computer vision techniques that sacrifice accuracy for speed. We propose a component-based approach towards mobile augmented reality applications, where components can be configured and distributed at runtime, resulting in a performance increase by offloading CPU-intensive tasks to a server in the network. By sharing distributed components between multiple users, collaborative AR applications can easily be developed. In this poster, we present a component-based implementation of the Parallel Tracking And Mapping (PTAM) algorithm, enabling components to be distributed to obtain a mobile, distributed version of the original PTAM algorithm, as well as a collaborative scenario.
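
    The abstract contains no code; the sketch below is only an illustration, in Python, of the general component idea it describes: the CPU-heavy mapping stage sits behind a common interface so that it can run on the device or be offloaded to a server chosen at runtime, and a server-side instance could be shared by several users for collaboration. All class, method and endpoint names are hypothetical, not taken from the poster.

```python
# Minimal sketch (not the poster's code) of the component idea: the CPU-heavy
# mapping stage sits behind one interface, so the same tracking front end can
# use a local mapper or a remote one offloaded to a server chosen at runtime.
# All names below are hypothetical.

from abc import ABC, abstractmethod


class MappingComponent(ABC):
    """Common interface for the map-building component."""

    @abstractmethod
    def add_keyframe(self, keyframe: bytes) -> None: ...

    @abstractmethod
    def get_map_points(self) -> list: ...


class LocalMapper(MappingComponent):
    """Runs mapping on the device itself."""

    def __init__(self) -> None:
        self._keyframes: list[bytes] = []
        self._points: list = []

    def add_keyframe(self, keyframe: bytes) -> None:
        self._keyframes.append(keyframe)
        # ... local bundle adjustment would update self._points here ...

    def get_map_points(self) -> list:
        return self._points


class RemoteMapper(MappingComponent):
    """Proxies the same interface to a server; several devices could share it."""

    def __init__(self, host: str, port: int) -> None:
        self.endpoint = (host, port)  # hypothetical network endpoint

    def add_keyframe(self, keyframe: bytes) -> None:
        pass  # serialise and send the keyframe to the server (omitted)

    def get_map_points(self) -> list:
        return []  # fetch the shared map maintained on the server (omitted)


def build_mapper(offload: bool) -> MappingComponent:
    """Chosen at runtime: same tracker, different mapper placement."""
    return RemoteMapper("192.0.2.1", 9000) if offload else LocalMapper()
```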

    Optical coherence tomography-based consensus definition for lamellar macular hole.

    Background: A consensus on an optical coherence tomography definition of lamellar macular hole (LMH) and similar conditions is needed. Methods: The panel reviewed relevant peer-reviewed literature to reach an accord on the LMH definition and to differentiate LMH from other similar conditions. Results: The panel reached a consensus on the definition of three clinical entities: LMH, epiretinal membrane (ERM) foveoschisis and macular pseudohole (MPH). The LMH definition is based on three mandatory criteria and three optional anatomical features. The three mandatory criteria are an irregular foveal contour, a foveal cavity with undermined edges and the apparent loss of foveal tissue. Optional anatomical features include epiretinal proliferation, a central foveal bump and disruption of the ellipsoid zone. The ERM foveoschisis definition is based on two mandatory criteria: the presence of ERM and schisis at the level of Henle's fibre layer. Three optional anatomical features can also be present: microcystoid spaces in the inner nuclear layer (INL), an increase in retinal thickness and retinal wrinkling. The MPH definition is based on three mandatory criteria and two optional anatomical features. The mandatory criteria are a foveal-sparing ERM, a steepened foveal profile and an increased central retinal thickness. The optional anatomical features are microcystoid spaces in the INL and a normal retinal thickness. Conclusions: The use of the proposed definitions may provide a uniform language for clinicians and future research.
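
    As an illustration of how each consensus definition reduces to a set of mandatory criteria (with optional features lending support), the sketch below encodes the mandatory criteria as a simple checklist. It is a reading aid only, not a clinical tool, and the boolean field names are shorthand invented here for the OCT findings listed in the abstract.

```python
# Illustration only (not a diagnostic tool): each entity is defined by a set of
# mandatory OCT criteria, with optional anatomical features lending support.
# The boolean field names are shorthand invented here for the findings above.

from dataclasses import dataclass


@dataclass
class OCTFindings:
    # LMH mandatory criteria
    irregular_foveal_contour: bool = False
    foveal_cavity_with_undermined_edges: bool = False
    apparent_loss_of_foveal_tissue: bool = False
    # ERM foveoschisis mandatory criteria
    erm_present: bool = False
    schisis_in_henle_fibre_layer: bool = False
    # MPH mandatory criteria
    foveal_sparing_erm: bool = False
    steepened_foveal_profile: bool = False
    increased_central_retinal_thickness: bool = False


def entities_meeting_mandatory_criteria(f: OCTFindings) -> list[str]:
    """Return the entities whose mandatory criteria are all present."""
    entities = []
    if (f.irregular_foveal_contour
            and f.foveal_cavity_with_undermined_edges
            and f.apparent_loss_of_foveal_tissue):
        entities.append("lamellar macular hole")
    if f.erm_present and f.schisis_in_henle_fibre_layer:
        entities.append("ERM foveoschisis")
    if (f.foveal_sparing_erm
            and f.steepened_foveal_profile
            and f.increased_central_retinal_thickness):
        entities.append("macular pseudohole")
    return entities
```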

    I Can See Your Aim: Estimating User Attention From Gaze For Handheld Robot Collaboration

    This paper explores the estimation of user attention in the setting of a cooperative handheld robot: a robot designed to behave as a handheld tool but that has some level of task knowledge. We use a tool-mounted gaze-tracking system which, after modelling via a pilot study, we use as a proxy for estimating the attention of the user. This information is then used for cooperation with users in a task of selecting and engaging with objects on a dynamic screen. Via a video-game setup, we test various degrees of robot autonomy, from fully autonomous, where the robot knows what it has to do and acts, to no autonomy, where the user is in full control of the task. Our results report performance and subjective metrics and show how the attention model benefits the interaction and the preference of users. Comment: this is a corrected version of the one that was published at IROS 201
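
    The abstract does not specify the attention model itself, so the sketch below is only an assumed stand-in: once gaze has been mapped to screen coordinates, each candidate object is scored with a Gaussian of its distance to the gaze point and the scores are normalised. The function and parameter names are illustrative, not the paper's.

```python
# Assumed stand-in for an attention model (not the paper's): weight each
# on-screen object by a Gaussian of its distance to the estimated gaze point,
# then normalise so the scores sum to one. Names and sigma are illustrative.

import math


def attention_scores(gaze_xy, objects_xy, sigma=50.0):
    """gaze_xy: (x, y) in pixels; objects_xy: name -> (x, y); sigma in pixels."""
    gx, gy = gaze_xy
    weights = {}
    for name, (ox, oy) in objects_xy.items():
        d2 = (gx - ox) ** 2 + (gy - oy) ** 2
        weights[name] = math.exp(-d2 / (2.0 * sigma ** 2))
    total = sum(weights.values()) or 1.0
    return {name: w / total for name, w in weights.items()}


# A robot with high autonomy might act on the top-scoring object once its score
# clears a threshold, while lower autonomy levels defer the decision to the user.
scores = attention_scores((320, 240), {"target_a": (300, 250), "target_b": (600, 100)})
best = max(scores, key=scores.get)
```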

    HOW TO INTERACT WITH AR HEAD MOUNTED DEVICES IN CARE WORK? A STUDY COMPARING HANDHELD TOUCH (HANDS-ON) AND GESTURE (HANDS-FREE) INTERACTION

    In this paper, we describe a study investigating augmented reality (AR) to support caregivers. We implemented a system called Care Lenses that supports various care tasks on AR head-mounted devices. For its application, one question was how caregivers could interact with the system while providing care, that is, while using one or both hands for care tasks. We therefore compared two mechanisms for interacting with Care Lenses: handheld touch (similar to touchpads and touchscreens) and head gestures. We found that certain head gestures were difficult to apply in practice, but that, apart from this, head gesture support was as usable and useful as handheld touch interaction, even though the study participants were much more familiar with handheld touch control. We conclude that head gestures can be a good means of enabling AR support in care, and we provide design considerations to make them more applicable in practice.
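
    The study compares interaction mechanisms rather than describing a gesture recogniser, so the sketch below is only a hedged stand-in for the hands-free path: a minimal threshold detector over a head-pitch stream from the headset's IMU that could confirm a selection while both hands stay on the care task. The thresholds, sample format and "nod to confirm" semantics are assumptions.

```python
# Hedged stand-in: the study compares handheld touch with head gestures but
# does not describe a recogniser, so this is just a minimal threshold detector
# over a head-pitch stream (degrees, negative = looking down) from the HMD's
# IMU. Thresholds and the "nod to confirm" semantics are assumptions.

def detect_nod(pitch_samples, down_threshold=-15.0, up_threshold=-5.0):
    """Return True if the pitch trace dips below down_threshold and recovers."""
    dipped = False
    for pitch in pitch_samples:
        if pitch < down_threshold:
            dipped = True
        elif dipped and pitch > up_threshold:
            return True
    return False


# e.g. a nod could confirm a selection so both hands stay free for the care task
assert detect_nod([0.0, -5.0, -20.0, -18.0, -3.0, 0.0])
```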

    Securing Interactive Sessions Using Mobile Device through Visual Channel and Visual Inspection

    A communication channel established from a display to a device's camera is known as a visual channel, and it is helpful in securing key-exchange protocols. In this paper, we study how the visual channel can be exploited by a network terminal and a mobile device to jointly verify information in an interactive session, and how such information can be jointly presented in a user-friendly manner, taking into account that the mobile device can only capture and display a small region, and that the user may only want to authenticate selected regions of interest. Motivated by applications in kiosk computing and multi-factor authentication, we consider three security models: (1) the mobile device is trusted, (2) at most one of the terminal or the mobile device is dishonest, and (3) both the terminal and the device are dishonest but do not collude or communicate. We give two protocols and investigate them under the above-mentioned models. We point out a form of replay attack that renders some other straightforward implementations cumbersome to use. To enhance user-friendliness, we propose a solution using visual cues embedded into 2D barcodes and incorporate the framework of "augmented reality" for easy verification through visual inspection. We give a proof-of-concept implementation to show that our scheme is feasible in practice. Comment: 16 pages, 10 figures
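
    The abstract outlines the setting rather than the protocol details, so the following sketch only illustrates the general idea under stated assumptions: both the terminal and the phone derive a short digest over the session transcript and the selected region, the terminal embeds its value in the 2D barcode it displays, and the phone overlays its own value so the user can compare the two visually. The digest construction and truncation are inventions for illustration, not the paper's protocol.

```python
# Sketch under assumptions (not the paper's protocol): both sides derive a short
# digest over the session transcript plus the user-selected region; the terminal
# embeds its value in the displayed 2D barcode and the phone overlays its own,
# so the user can visually check that they match. Names/truncation are invented.

import hashlib


def short_digest(session_transcript: bytes, region_id: str) -> str:
    """Digest covering the transcript and the region of interest being verified."""
    h = hashlib.sha256(session_transcript + region_id.encode("utf-8"))
    return h.hexdigest()[:8]  # truncated so a human can compare it at a glance


# Terminal side: value rendered inside the 2D barcode shown on the display.
terminal_value = short_digest(b"key-exchange transcript bytes", "region-3")

# Phone side: recomputed over what the phone itself observed and overlaid on the
# camera view; a mismatch would flag tampering or a replayed display.
phone_value = short_digest(b"key-exchange transcript bytes", "region-3")
assert terminal_value == phone_value
```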