14 research outputs found

    Design and evaluation of an augmented reality head-mounted display user interface for controlling legged manipulators

    Get PDF
    Designing an intuitive User Interface (UI) for controlling assistive robots remains challenging. Most existing UIs leverage traditional control interfaces such as joysticks, hand-held controllers, and 2D UIs, which leave users' hands with limited availability for other tasks. Furthermore, although there is extensive research regarding legged manipulators, comparatively little focuses on their UIs. Towards extending the state of the art in this domain, we present a user study comparing an Augmented Reality (AR) Head-Mounted Display (HMD) UI we developed for controlling a legged manipulator against off-the-shelf control methods for such robots. We establish this comparison as a baseline across multiple factors relevant to a successful interaction. The results from our user study (N=17) show that although the AR UI increases immersion, off-the-shelf control methods outperformed the AR UI in terms of time performance and cognitive workload. Nonetheless, a follow-up pilot study incorporating the lessons learned shows that AR UIs can outpace hand-held control methods and reduce cognitive requirements when designers include hands-free interactions and cognitive offloading principles in the UI.

    Augmented reality user interfaces for heterogeneous multirobot control

    Get PDF
    Recent advances in the design of head-mounted augmented reality (AR) interfaces for assistive human-robot interaction (HRI) have allowed untrained users to rapidly and fluently control single-robot platforms. In this paper, we investigate how such interfaces transfer onto multirobot architectures, as several assistive robotics applications need to be distributed among robots that differ both physically and in terms of software. As part of this investigation, we introduce a novel head-mounted AR interface for heterogeneous multirobot control. This interface generates and displays dynamic joint-affordance signifiers, i.e. signifiers that combine and show multiple actions from different robots that can be applied simultaneously to an object. We present a user study with 15 participants analysing the effects of our approach on their perceived fluency. Participants were given the task of filling a cup with water using a multirobot platform. Our results show a clear improvement in standard HRI fluency metrics when users applied dynamic joint-affordance signifiers, as opposed to a sequence of independent actions.
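    The abstract gives no implementation details, but the core idea of a joint-affordance signifier, grouping actions that different robots can apply to the same object, can be illustrated with a minimal Python sketch. All names here (Affordance, JointSignifier, build_joint_signifiers) are hypothetical and not taken from the paper.

```python
# Minimal, hypothetical sketch of a "joint-affordance signifier": it groups
# actions that different robots can apply to the same object at the same time,
# so an AR UI can render one combined signifier per object.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Affordance:
    robot: str   # which robot offers the action (e.g. "arm", "mobile_base")
    obj: str     # the object the action applies to (e.g. "cup")
    action: str  # the action itself (e.g. "grasp", "approach")


@dataclass
class JointSignifier:
    obj: str
    affordances: List[Affordance] = field(default_factory=list)


def build_joint_signifiers(affordances: List[Affordance]) -> List[JointSignifier]:
    """Group per-robot affordances by target object and keep only objects
    that at least two different robots can act on jointly."""
    grouped: Dict[str, JointSignifier] = {}
    for a in affordances:
        grouped.setdefault(a.obj, JointSignifier(obj=a.obj)).affordances.append(a)
    return [s for s in grouped.values()
            if len({a.robot for a in s.affordances}) > 1]


if __name__ == "__main__":
    detected = [
        Affordance("arm", "cup", "grasp"),
        Affordance("mobile_base", "cup", "approach"),
        Affordance("arm", "bottle", "grasp"),
    ]
    for sig in build_joint_signifiers(detected):
        print(sig.obj, [f"{a.robot}:{a.action}" for a in sig.affordances])
```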

    Holo-SpoK: Affordance-aware augmented reality control of legged manipulators

    No full text
    Although there is extensive research regarding legged manipulators, comparatively little focuses on their User Interfaces (UIs). Towards extending the state of the art in this domain, in this work we integrate a Boston Dynamics (BD) Spot® with a lightweight 7-DoF Kinova® robot arm and a Robotiq® 2F-85 gripper into a legged manipulator. Furthermore, we jointly control the robotic platform using an affordance-aware Augmented Reality (AR) Head-Mounted Display (HMD) UI developed for the Microsoft HoloLens 2. We named the combined platform Holo-SpoK. Moreover, we explain how this manipulator colocalises with the HoloLens 2 for its control through AR. In addition, we present the details of our algorithms for autonomously detecting grasp-ability affordances and for refining the positions obtained via vision-based colocalisation. We validate the suitability of our proposed methods with multiple navigation and manipulation experiments. To the best of our knowledge, this is the first demonstration of an AR HMD UI for controlling legged manipulators.
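    As a rough illustration of what vision-based colocalisation between an AR headset and a robot can involve (not the authors' actual algorithm), the following Python sketch composes the pose of a shared reference, e.g. a fiducial observed from both frames, to estimate the headset pose in the robot's map frame, and then refines it over repeated estimates. The frame names and the simple averaging refinement are assumptions for demonstration only.

```python
# Hypothetical colocalisation sketch: if both the AR headset and the robot can
# estimate the pose of a shared reference (e.g. a fiducial), the headset-to-map
# transform follows by composing the two poses. 4x4 homogeneous matrices.
import numpy as np


def invert(T: np.ndarray) -> np.ndarray:
    """Invert a rigid 4x4 transform using its rotation/translation structure."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti


def colocalise(T_map_marker: np.ndarray, T_hmd_marker: np.ndarray) -> np.ndarray:
    """Return T_map_hmd, the headset pose expressed in the robot's map frame."""
    return T_map_marker @ invert(T_hmd_marker)


def refine(estimates: list) -> np.ndarray:
    """Very rough refinement: average the translation over repeated estimates
    and keep the latest rotation (a real system would average on SO(3))."""
    T = estimates[-1].copy()
    T[:3, 3] = np.mean([E[:3, 3] for E in estimates], axis=0)
    return T
```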

    Augmented reality control of smart wheelchair using eye-gaze–enabled selection of affordances

    No full text
    In this paper we present a novel augmented reality head-mounted display user interface for controlling a robotic wheelchair for people with limited mobility. To lower the cognitive requirements needed to control the wheelchair, we propose integrating a smart wheelchair with an eye-tracking-enabled head-mounted display. We propose a novel platform that integrates multiple user interface interaction methods for aiming at and selecting affordances derived from on-board perception capabilities such as laser scanners and cameras. We demonstrate the effectiveness of the approach by evaluating our platform in two realistic scenarios: 1) Door detection, where the affordance corresponds to a Door object and the Go-Through action, and 2) People detection, where the affordance corresponds to a Person and the Approach action. To the best of our knowledge, this is the first demonstration of an augmented reality head-mounted display user interface for controlling a smart wheelchair.

    Augmented reality controlled smart wheelchair using dynamic signifiers for affordance representation

    No full text
    The design of augmented reality interfaces for people with mobility impairments is a novel area with great potential, as well as multiple outstanding research challenges. In this paper we present an augmented reality user interface for controlling a smart wheelchair with a head-mounted display to provide assistance for mobility-restricted people. Our motivation is to reduce the cognitive requirements needed to control a smart wheelchair. A key element of our platform is the ability to control the smart wheelchair using the concepts of affordances and signifiers. In addition to the technical details of our platform, we present a baseline study evaluating our platform through user trials with able-bodied individuals and two different affordances: 1) Door Go-Through and 2) People Approach. To present these affordances to the user, we compared fixed symbol-based signifiers against our novel dynamic signifiers in terms of how easy the suggested actions are to understand and how clearly they relate to the objects. Our results show a clear preference for dynamic signifiers. In addition, we show that the task load reported by participants is lower when controlling the smart wheelchair with our augmented reality user interface than with the joystick, which is consistent with their qualitative answers.

    The differential interaction of Brucella and Ochrobactrum with innate immunity reveals traits related to the evolution of stealthy pathogens

    No full text
    Background: During evolution, innate immunity has been tuned to recognize pathogen-associated molecular patterns. However, some alpha-Proteobacteria are stealthy intracellular pathogens not readily detected by this system. Brucella members follow this strategy and are highly virulent, but other Brucellaceae like Ochrobactrum are rhizosphere inhabitants and only opportunistic pathogens. To gain insight into the emergence of the stealthy strategy, we compared these two phylogenetically close but biologically divergent bacteria. Methodology/Principal Findings: In contrast to Brucella abortus, Ochrobactrum anthropi did not replicate within professional and non-professional phagocytes and, whereas neutrophils had a limited action on B. abortus, they were essential to control O. anthropi infections. O. anthropi triggered proinflammatory responses markedly lower than Salmonella enterica but higher than B. abortus. In macrophages and dendritic cells, the corresponding lipopolysaccharides reproduced these grades of activation, and binding of O. anthropi lipopolysaccharide to the TLR4 co-receptor MD-2 and NF-kappaB induction lay between those of B. abortus and enteric bacteria lipopolysaccharides. These differences correlate with reported variations in lipopolysaccharide core sugars, sensitivity to bactericidal peptides and outer membrane permeability. Conclusions/Significance: The results suggest that Brucellaceae ancestors carried molecules not readily recognized by innate immunity, so that non-drastic variations led to the emergence of stealthy intracellular parasites. They also suggest that some critical envelope properties, like selective permeability, are profoundly altered upon modification of pathogen-associated molecular patterns, and that this represents a further adaptation to the host. It is proposed that this adaptive trend is relevant in other intracellular alpha-Proteobacteria like Bartonella, Rickettsia, Anaplasma, Ehrlichia and Wolbachia.

    SUCCOR Risk: Design and Validation of a Recurrence Prediction Index for Early-Stage Cervical Cancer

    Get PDF
    Objective: Based on the SUCCOR study database, our primary objective was to identify the independent clinical-pathological variables associated with the risk of relapse in patients with stage IB1 cervical cancer who underwent a radical hysterectomy. Our secondary goal was to design and validate a risk predictive index (RPI) for classifying patients according to their risk of recurrence. Methods: Overall, 1116 women were included from January 2013 to December 2014. We randomly divided our sample into two cohorts: a discovery cohort and a validation cohort. The discovery cohort was used to identify the independent variables associated with relapse, and with these variables we designed our RPI. The index was then applied to calculate a relapse risk score for each participant in the validation cohort. Results: A previous cone biopsy was the most significant independent variable lowering the rate of relapse (odds ratio [OR] 0.31, 95% confidence interval [CI] 0.17–0.60). Additionally, patients with a tumor diameter >2 cm on preoperative imaging assessment (OR 2.15, 95% CI 1.33–3.5) and those operated on by a minimally invasive approach (OR 1.61, 95% CI 1.00–2.57) were more likely to have a recurrence. Based on these findings, patients in the validation cohort were classified according to the RPI as being at low, medium, or high risk of relapse, with observed relapse rates of 3.4%, 9.8%, and 21.3%, respectively. With a median follow-up of 58 months, the 5-year disease-free survival rates were 97.2% for the low-risk group, 88.0% for the medium-risk group, and 80.5% for the high-risk group (p < 0.001). Conclusion: Conization prior to radical hysterectomy was the most powerful protective variable against relapse. Our risk predictive index was validated to identify patients at risk of recurrence.
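    The abstract reports odds ratios but not the actual scoring weights or cut-offs of the SUCCOR RPI, so the following Python sketch is purely illustrative: it sums log odds ratios for the three reported variables and maps the total to low/medium/high groups using hypothetical thresholds, not the validated index itself.

```python
# Illustrative sketch only: weights are log odds ratios taken from the abstract;
# the group thresholds are assumptions for demonstration, not the SUCCOR cut-offs.
from math import log

# Odds ratios reported in the abstract.
OR = {
    "prior_cone_biopsy": 0.31,   # protective
    "tumor_gt_2cm": 2.15,        # risk factor
    "minimally_invasive": 1.61,  # risk factor
}


def risk_score(prior_cone_biopsy: bool, tumor_gt_2cm: bool,
               minimally_invasive: bool) -> float:
    """Sum the log odds ratios of the factors present for one patient."""
    flags = {"prior_cone_biopsy": prior_cone_biopsy,
             "tumor_gt_2cm": tumor_gt_2cm,
             "minimally_invasive": minimally_invasive}
    return sum(log(OR[name]) for name, present in flags.items() if present)


def risk_group(score: float) -> str:
    """Hypothetical cut-offs mapping the score to three risk groups."""
    if score <= 0.0:
        return "low"
    if score < 1.0:
        return "medium"
    return "high"


print(risk_group(risk_score(prior_cone_biopsy=True, tumor_gt_2cm=False,
                            minimally_invasive=False)))  # -> low
print(risk_group(risk_score(prior_cone_biopsy=False, tumor_gt_2cm=True,
                            minimally_invasive=True)))   # -> high
```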