11 research outputs found

    Auditory display for improving free-hand gesture interaction

    No full text
    Free-hand gesture recognition technologies allow touchless interaction with a range of applications. However, touchless interaction concepts usually provide only primary, visual feedback on a screen. The lack of secondary tactile feedback, such as that of pressing a key or clicking a mouse, is one reason that free-hand gestures have not been adopted as a standard means of input. This work explores the use of auditory display to improve free-hand gesture interaction. Gestures captured with a Leap Motion controller were augmented with auditory icons and continuous, model-based sonification. Three concepts were generated and evaluated using a sphere-selection task and a video-frame selection task. The participants' user experience was evaluated using the NASA TLX and QUESI questionnaires. Results show that the combination of auditory and visual display outperforms both purely auditory and purely visual displays in terms of subjective workload and performance measures.
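    As a rough illustration of the continuous parameter-mapping idea described in this abstract, the sketch below maps the distance between a tracked hand and a target sphere to pitch and the lateral offset to stereo pan. It is not the authors' implementation: the tracker interface, coordinate units, frequency range, and function names are assumptions.

        import math

        # Minimal sketch of a continuous parameter-mapping sonification for a
        # selection task: hand-to-target distance drives pitch, lateral offset
        # drives stereo pan. Tracker interface and units are hypothetical.

        def sonification_params(hand_pos, target_pos,
                                f_min=220.0, f_max=880.0, max_dist=300.0):
            """Map hand-to-target distance (mm) to a pitch in Hz and a pan value."""
            dx, dy, dz = (h - t for h, t in zip(hand_pos, target_pos))
            dist = math.sqrt(dx * dx + dy * dy + dz * dz)
            closeness = max(0.0, 1.0 - min(dist, max_dist) / max_dist)  # 0 far, 1 on target
            freq = f_min * (f_max / f_min) ** closeness  # exponential pitch ramp toward the target
            pan = max(-1.0, min(1.0, dx / max_dist))     # -1 = fully left, +1 = fully right
            return freq, pan

        # Example: hand 120 mm to the right of and 80 mm above the target sphere
        print(sonification_params((120.0, 80.0, 0.0), (0.0, 0.0, 0.0)))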

    Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review

    No full text

    Visual navigation support for liver applicator placement using interactive map displays

    No full text
    Navigated placement of an ablation applicator in liver surgery would benefit from an effective intraoperative visualization of delicate 3D anatomical structures. In this paper, we propose an approach that supports navigated applicator placement in the liver with an interactive as well as an animated map display. By reducing the visual complexity of 3D anatomical structures, we provide only the most important information on and around a planned applicator path. By employing different illustrative visualization techniques, the applicator path and its surrounding critical structures, such as blood vessels, are clearly conveyed in an unobstructed way. To retain contextual information around the applicator path and its tip, we desaturate these structures with increasing distance. To avoid time-consuming and tedious interaction during surgery, our visualization is controlled solely by the position and orientation of a tracked applicator. This enables direct interaction with the map display without interrupting the intervention. Based on our requirement analysis, we conducted a pilot study with eleven participants and an interactive user study with six domain experts to assess task completion time, error rate, visual parameters, and the usefulness of the animation. The outcome of our pilot study shows that our map display facilitates significantly faster decision making (11.8 s vs. 40.9 s) and significantly fewer false assessments of structures at risk (7.4% vs. 10.3%) compared to a currently employed 3D visualization. Furthermore, the animation supports timely perception of the course and depth of upcoming blood vessels and helps to detect possible areas at risk along the path in advance. Hence, the obtained results demonstrate that our proposed interactive map displays have the potential to improve the outcome of navigated liver interventions.
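    The distance-based desaturation of contextual structures described above can be sketched as a simple color mapping. The following snippet is illustrative only, not the authors' implementation; the radii, colors, and function names are assumed. It fades the saturation of a structure linearly with its distance from the planned applicator path.

        import colorsys

        # Illustrative focus-and-context sketch: structures close to the planned
        # applicator path keep their full color, structures farther away are
        # progressively desaturated. Radii and colors are hypothetical.

        def desaturate_by_distance(rgb, dist_mm, full_color_radius=10.0, fade_radius=40.0):
            """Return an RGB color whose saturation falls off linearly with distance."""
            h, l, s = colorsys.rgb_to_hls(*rgb)
            if dist_mm <= full_color_radius:
                factor = 1.0        # inside the focus region: keep full saturation
            elif dist_mm >= fade_radius:
                factor = 0.0        # far context: fully desaturated (gray)
            else:
                factor = 1.0 - (dist_mm - full_color_radius) / (fade_radius - full_color_radius)
            return colorsys.hls_to_rgb(h, l, s * factor)

        # A vessel segment 25 mm from the applicator path keeps half its saturation
        print(desaturate_by_distance((0.8, 0.1, 0.1), 25.0))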

    Auditory feedback to support image-guided medical needle placement

    No full text
    During medical needle placement using image-guided navigation systems, the clinician must concentrate on a screen. To reduce this visual reliance on the screen, this work proposes auditory feedback, either as a stand-alone method or as support for visual feedback, for placing the navigated medical instrument, in this case a needle. An auditory synthesis model using pitch comparison and stereo panning parameter mapping was developed to augment or replace visual feedback for navigated needle placement. In contrast to existing approaches, which augment but still require a visual display, this method allows view-free needle placement. An evaluation with 12 novice participants compared both auditory and combined audiovisual feedback against existing visual methods. With the combined audiovisual display, participants show similar task completion times and report similar subjective workload and accuracy while viewing the screen less than with the conventional visual method. Auditory-only feedback leads to longer task completion times and higher subjective workload than both the combined and the visual feedback. Audiovisual feedback shows promising results and establishes a basis for applying auditory feedback as a supplement to visual information in other navigated interventions, especially those for which viewing the patient is beneficial or necessary.
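    The pitch-comparison and stereo-panning mapping can be illustrated with a short synthesis sketch: a fixed reference tone stands for the target depth, a second tone approaches it in pitch as the depth error shrinks, and lateral deviation shifts a constant-power pan. The mapping ranges, parameter names, and use of NumPy here are assumptions rather than the paper's implementation.

        import numpy as np

        SAMPLE_RATE = 44_100

        def needle_cue(depth_error_mm, lateral_error_mm, duration=0.2,
                       ref_freq=440.0, max_depth_err=50.0, max_lat_err=20.0):
            """Return a (n_samples, 2) float array holding one stereo feedback tone."""
            t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
            # The current pitch rises toward the reference pitch as the depth error shrinks.
            err = float(np.clip(abs(depth_error_mm) / max_depth_err, 0.0, 1.0))
            cur_freq = ref_freq * (1.0 - 0.5 * err)      # up to one octave below the reference
            tone = 0.5 * np.sin(2 * np.pi * ref_freq * t) + 0.5 * np.sin(2 * np.pi * cur_freq * t)
            # Constant-power panning: positive lateral error pushes the cue to the right ear.
            pan = float(np.clip(lateral_error_mm / max_lat_err, -1.0, 1.0))
            left = tone * np.cos((pan + 1.0) * np.pi / 4.0)
            right = tone * np.sin((pan + 1.0) * np.pi / 4.0)
            return np.stack([left, right], axis=1)

        # Example: needle tip 30 mm short of the target depth, 5 mm left of the planned path
        print(needle_cue(depth_error_mm=30.0, lateral_error_mm=-5.0).shape)  # (8820, 2)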

    Optimized approach for the identification of highly efficient correctors of nonsense mutations in human diseases

    No full text
    About 10% of patients with a genetic disease carry a nonsense mutation causing their pathology. One strategy for correcting nonsense mutations is premature termination codon (PTC) readthrough, i.e. incorporation of an amino acid at the PTC position during translation. PTC-readthrough-activating molecules appear to be promising therapeutic tools for these patients. Unfortunately, the molecules known to induce PTC readthrough show low efficacy, probably because the mRNAs carrying a nonsense mutation are scarce, as they are also substrates of the quality-control mechanism called nonsense-mediated mRNA decay (NMD). The screening systems previously developed to identify readthrough-promoting molecules used cDNA constructs encoding mRNAs immune to NMD. As the molecules identified were not selected for their ability to correct nonsense mutations on NMD-prone PTC-mRNAs, they could be unsuitable in the context of nonsense-mutation-linked human pathologies. Here, a screening system based on an NMD-prone mRNA is described. It should be suitable for identifying molecules capable of efficiently rescuing the expression of human genes harboring a nonsense mutation, and should thus favor the discovery of candidate drugs for treating genetic diseases caused by nonsense mutations. One hit selected with this screening system is presented and validated on cells from three cystic fibrosis patients.