
    Optical-Based Tactile Sensors for Minimally Invasive Surgeries: Design, Modeling, Fabrication and Validation

    Loss of tactile perception is the most challenging limitation of state-of-the-art technology for minimally invasive surgery. In conventional open surgery, surgeons rely on their tactile sensation to perceive the tissue type, anatomical landmarks, and instrument-tissue interaction in the patient’s body. To compensate for the loss of tactile feedback in minimally invasive surgery, researchers have proposed various tactile sensors based on electrical and optical sensing principles. Optical-based sensors have shown the greatest compatibility with the functional and physical requirements of minimally invasive surgery. However, the tactile sensors proposed in the literature are typically bulky, expensive, cumbersome to integrate with surgical instruments, and nonlinear in their interaction with biological tissues. In this doctoral study, several optical tactile sensing principles were proposed, modeled, and validated, and various tactile sensors were fabricated and experimentally studied to address these limitations of the state of the art. The thesis first provides a critical review of the tactile sensors proposed in the literature, comparing their advantages and limitations for surgical applications. It then compiles the results of the design, modeling, and validation of a hybrid optical-piezoresistive sensor, a distributed Bragg reflector sensor, and two sensors based on the variable-bending-radius light intensity modulation principle. The performance of each sensor was verified experimentally against the required criteria of accuracy, resolution, range, repeatability, and hysteresis. In addition, a novel image-based intensity estimation technique was proposed, and its applicability to surgical settings was verified experimentally. Concluding remarks and recommendations for future studies close the thesis.
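
    As a rough sketch of the light-intensity-modulation principle mentioned above, the following Python snippet fits a calibration curve that maps photodetector readings back to contact force. The data points, polynomial order, and function names are illustrative assumptions, not the thesis's actual model.

        # Illustrative only: applied force bends the optical element, attenuating
        # transmitted light; a calibration curve then inverts intensity to force.
        import numpy as np

        # Hypothetical calibration data: known forces (N) vs. normalized
        # photodetector intensity, as would be gathered on a test rig.
        forces = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])          # N
        intensity = np.array([1.00, 0.91, 0.80, 0.68, 0.55, 0.41])  # normalized

        # Low-order polynomial fit of the intensity -> force mapping.
        coeffs = np.polyfit(intensity, forces, deg=2)

        def estimate_force(measured_intensity: float) -> float:
            """Map one normalized intensity reading to an estimated force."""
            return float(np.polyval(coeffs, measured_intensity))

        print(f"~{estimate_force(0.75):.2f} N at intensity 0.75")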

    Feeling what you hear: tactile feedback for navigation of audio graphs

    Access to digitally stored numerical data is currently very limited for sight-impaired people. Graphs and visualizations are often used to analyze relationships between numerical data, but the current methods of accessing them are highly visually mediated. Representing data using audio feedback is a common way of making data more accessible, but methods of navigating and accessing the data are often serial in nature and laborious. Tactile or haptic displays could provide additional feedback to support a point-and-click style of interaction for visually impaired users. A requirements capture conducted with sight-impaired computer users produced a review of current accessibility technologies, and guidelines were extracted for using tactile feedback to aid navigation. The results of a qualitative evaluation with a prototype interface are also presented. Providing an absolute-position input device and tactile feedback allowed users to explore the graph using tactile and proprioceptive cues in a manner analogous to point-and-click techniques.
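
    The point-and-click idea described above can be sketched in a few lines: an absolute-position pointer is mapped to the nearest data point, the value is sonified as pitch, and a tactile pulse fires when the pointer is near the curve. This is a hypothetical illustration; the thresholds and names are assumptions, not the paper's implementation.

        import numpy as np

        data = np.array([3.0, 5.0, 4.0, 8.0, 6.0])  # the graph being explored

        def on_pointer(x_norm: float, y_norm: float) -> dict:
            """Handle a pointer sample in [0, 1]^2 from an absolute input device."""
            idx = min(int(x_norm * len(data)), len(data) - 1)
            value = data[idx]
            # Sonify the value as pitch (higher value -> higher pitch).
            pitch_hz = 220 + 660 * (value - data.min()) / (data.max() - data.min())
            # Fire a tactile pulse when the pointer is close to the curve.
            on_curve = abs(y_norm - value / data.max()) < 0.05
            return {"audio_pitch_hz": float(pitch_hz), "tactile_pulse": bool(on_curve)}

        print(on_pointer(0.65, 0.75))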

    Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion

    Previous studies have shown that, in tasks requiring participants to report the direction of apparent motion, task-irrelevant mono-beeps can "capture" visual motion perception when the beeps occur temporally close to the visual stimuli. However, the contributions of the relative timing of multimodal events and of the event structure, which modulates uni- and/or crossmodal perceptual grouping, remain unclear. To examine this question and extend the investigation to the tactile modality, the current experiments presented tactile two-tap apparent-motion streams, with an SOA of 400 ms between successive left-/right-hand middle-finger taps, accompanied by task-irrelevant, non-spatial auditory stimuli. The streams were presented for 90 seconds, and participants' task was to continuously report the perceived (left- or rightward) direction of tactile motion. In Experiment 1, each tactile stimulus was paired with an auditory beep, though odd-numbered taps were paired with an asynchronous beep, with audiotactile SOAs ranging from -75 ms to 75 ms. The perceived direction of tactile motion varied systematically with audiotactile SOA, indicative of a temporal-capture effect. In Experiment 2, two audiotactile SOAs, one short (75 ms) and one long (325 ms), were compared. The long-SOA condition preserved the crossmodal event structure (so the temporal-capture dynamics should have been similar to those in Experiment 1), but both beeps now occurred temporally close to the taps on one side (the even-numbered taps). The two SOAs were found to produce opposite modulations of apparent motion, indicative of an influence of crossmodal grouping. In Experiment 3, only odd-numbered taps, but not even-numbered ones, were paired with auditory beeps. This abolished the temporal-capture effect; instead, a dominant percept of apparent motion from the audiotactile side to the tactile-only side was observed, independently of the SOA variation. These findings suggest that asymmetric crossmodal grouping leads to an attentional modulation of apparent motion, which inhibits crossmodal temporal-capture effects.
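
    To make the timing structure of Experiment 1 concrete, the sketch below schedules the stimulus events as described: taps alternate between hands every 400 ms, every tap is paired with a beep, and the beeps of odd-numbered taps are shifted by the audiotactile SOA (negative values mean the beep leads the tap). The code is purely illustrative scheduling, not the experiment software.

        def schedule(n_taps: int, tap_soa_ms: int = 400, at_soa_ms: int = -75):
            """Return (time_ms, label) events for one apparent-motion stream."""
            events = []
            for i in range(n_taps):
                t = i * tap_soa_ms
                hand = "left" if i % 2 == 0 else "right"
                events.append((t, f"tap-{hand}"))
                # Odd-numbered taps (1st, 3rd, ...) get an asynchronous beep.
                beep_t = t + at_soa_ms if i % 2 == 0 else t
                events.append((beep_t, "beep"))
            return sorted(events)

        for t, label in schedule(4):
            print(f"{t:5d} ms  {label}")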

    EEG Signal Processing and Classification for the Novel Tactile-Force Brain-Computer Interface Paradigm

    The presented study explores the extent to which a tactile-force stimulus delivered to a hand holding a joystick can serve as a platform for a brain-computer interface (BCI). Four pressure directions are used to evoke tactile brain potential responses, thus defining a tactile-force brain-computer interface (tfBCI). We present brain signal processing and classification procedures leading to successful interfacing results. Experimental results with seven subjects performing online BCI experiments provide a validation of the hand-location tfBCI paradigm, and the feasibility of the concept is demonstrated by the remarkable information-transfer rates achieved.
    Comment: 6 pages (in conference proceedings original version); 6 figures; submitted to The 9th International Conference on Signal Image Technology & Internet Based Systems, December 2-5, 2013, Kyoto, Japan; to be available at IEEE Xplore; IEEE Copyright 2013
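
    As a generic sketch of the kind of pipeline the abstract describes (epoching, feature extraction, classification of four pressure directions), the snippet below trains a linear discriminant classifier on synthetic "EEG" epochs. The feature choice and classifier are common BCI defaults assumed here for illustration, not necessarily the authors' exact procedure.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        n_trials, n_channels, n_samples = 200, 8, 128

        # Synthetic epochs: 4 pressure-direction classes with a toy evoked shift.
        y = rng.integers(0, 4, n_trials)
        X = rng.normal(size=(n_trials, n_channels, n_samples)) + y[:, None, None] * 0.1

        # Features: average each epoch over 8 time bins per channel, then flatten.
        feats = X.reshape(n_trials, n_channels, 8, -1).mean(axis=3).reshape(n_trials, -1)

        # Train on the first 150 trials, test on the held-out 50.
        clf = LinearDiscriminantAnalysis().fit(feats[:150], y[:150])
        print("held-out accuracy:", clf.score(feats[150:], y[150:]))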

    An Introduction to 3D User Interface Design

    3D user interface design is a critical component of any virtual environment (VE) application. In this paper, we present a broad overview of three-dimensional (3D) interaction and user interfaces. We discuss the effect of common VE hardware devices on user interaction, as well as interaction techniques for generic 3D tasks and the use of traditional two-dimensional interaction styles in 3D environments. We divide most user-interaction tasks into three categories: navigation, selection/manipulation, and system control. Throughout the paper, our focus is on presenting not only the available techniques but also practical guidelines for 3D interaction design, and on dispelling widely held myths. Finally, we briefly discuss two approaches to 3D interaction design and some example applications with complex 3D interaction requirements. We also present an annotated online bibliography as a reference companion to this article.
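
    Ray-casting is a canonical technique in the selection/manipulation category mentioned above: a ray from the user's hand picks the first object it intersects. The sphere-intersection sketch below is a generic illustration of that technique, not code from the article.

        import numpy as np

        def ray_hits_sphere(origin, direction, center, radius):
            """Distance along the ray to the sphere, or None on a miss."""
            d = np.asarray(direction, float)
            d /= np.linalg.norm(d)
            oc = np.asarray(origin, float) - np.asarray(center, float)
            b = np.dot(oc, d)
            disc = b * b - (np.dot(oc, oc) - radius ** 2)
            if disc < 0:
                return None
            t = -b - np.sqrt(disc)
            return t if t >= 0 else None

        # Select the nearest selectable object along the pointing ray.
        objects = {"cube": ([0, 0, -5], 1.0), "sphere": ([2, 0, -8], 1.0)}
        hits = {name: ray_hits_sphere([0, 0, 0], [0, 0, -1], c, r)
                for name, (c, r) in objects.items()}
        print(min((d, n) for n, d in hits.items() if d is not None))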

    The Iterative Development of the Humanoid Robot Kaspar: An Assistive Robot for Children with Autism

    This paper gives an overview of the design and development of the humanoid robot Kaspar. Since the first Kaspar robot was developed in 2005, the robotic platform has undergone continuous development driven by the needs of users and by technological advancements enabling the integration of new features. We discuss in detail the iterative development of Kaspar’s design and explain the rationale for each development, which has been based on user requirements as well as our years of experience in robot-assisted therapy for children with autism, focusing particularly on how the developments benefit the children we work with. Further to this, we discuss the role and benefits of robotic autonomy for both children and therapists, along with the progress we have made on the Kaspar robot's autonomy towards achieving a semi-autonomous child-robot interaction in a real-world setting.