6 research outputs found

    A Situative Space Model for Mobile Mixed-Reality Computing


    SenseBelt: A Belt-Worn Sensor to Support Cross-Device Interaction

    Mobile interaction is shifting from a single device to simultaneous interaction with ensembles of devices such as phones, tablets, or watches. Spatially-aware cross-device interaction between mobile devices typically requires a fixed tracking infrastructure, which limits mobility. In this paper, we present SenseBelt – a sensing belt that enhances existing mobile interactions and enables low-cost, ad hoc sensing of cross-device gestures and interactions. SenseBelt enables proxemic interactions between people and their personal devices. SenseBelt also supports cross-device interaction between personal devices and stationary devices, such as public displays. We discuss the design and implementation of SenseBelt together with possible applications. With an initial evaluation, we provide insights into the benefits and drawbacks of a belt-worn mediating sensor to support cross-device interactions.

    Designing Wearable Personal Assistants for Surgeons: An Egocentric Approach


    Investigating Precise Control in Spatial Interactions: Proxemics, Kinesthetics, and Analytics

    Augmented and Virtual Reality (AR/VR) technologies have reshaped the way in which we perceive the virtual world. In fact, recent technological advancements provide experiences that make the physical and virtual worlds almost indistinguishable. However, the physical world affords subtle sensorimotor cues which we subconsciously utilize to perform simple and complex tasks in our daily lives. The lack of this affordance in existing AR/VR systems makes it difficult for them to achieve mainstream adoption over conventional 2D user interfaces. As a case in point, existing spatial user interfaces (SUIs) lack the intuition to perform tasks in a manner that is perceptually familiar to the physical world. The broader goal of this dissertation lies in facilitating an intuitive spatial manipulation experience, specifically for motor control. We begin by investigating the role of proximity to an action on precise motor control in spatial tasks. We do so by introducing a new SUI called the Clock-Maker's Work-Space (CMWS), with the goal of enabling precise actions close to the body, akin to the physical world. On evaluating our setup in comparison to conventional mixed-reality interfaces, we find CMWS to afford precise actions for bi-manual spatial tasks. We further compare our SUI with a physical manipulation task and observe similarities in user behavior across both tasks. We subsequently narrow our focus to studying precise spatial rotation. We utilize haptics, specifically force-feedback (kinesthetics), to augment fine motor control in spatial rotational tasks. By designing three kinesthetic rotation metaphors, we evaluate precise rotational control with and without haptic feedback for 3D shape manipulation. Our results show that haptics-based rotation algorithms allow for precise motor control in 3D space and also help reduce hand fatigue.
In order to understand precise control in its truest form, we investigate orthopedic surgery training by analyzing bone-drilling tasks. We designed a hybrid physical-virtual simulator for bone-drilling training and collected physical data for analyzing precise drilling action. We also developed a Laplacian-based performance metric to help expert surgeons evaluate residents' training progress across successive years of orthopedic residency.

    A Body-and-Mind-Centric Approach to Wearable Personal Assistants
