
    Brain network for small-scale features in active touch

    An important tactile function is the active detection of small-scale features, such as edges or asperities, which depends on fine hand motor control. Using a resting-state fMRI paradigm, we sought to identify the functional connectivity of the brain network that maps tactile inputs to and from regions involved in motor preparation and planning during active touch. Human participants actively located small-scale tactile features rendered by a computer-controlled tactile display. To induce rapid perceptual learning, the contrast between the target and the surround was reduced whenever a criterion level of success was achieved, thereby raising the task difficulty. Multiple cortical and subcortical connections within a parietal-cerebellar-frontal network were identified by correlating behavioral performance with changes in functional connectivity. These cortical areas reflected the perceptual, cognitive, and attentional processes required to detect and use small-scale tactile features for hand dexterity.

    Graphical visualization of contact forces and hand movements during in-hand manipulation

    The paper presents a tool to graphically display the contact forces applied by the fingers of a robotic hand during grasping and in-hand manipulation. The forces are computed in two ways: directly, from the tactile force measurements at the fingertips, and indirectly, from the torques applied by the motors at the finger joints. The tool also allows the user to command and move the real robotic hand, by specifying either the complete hand configuration or any single joint, and to view the hand simulation graphically. Real results are shown using the Allegro hand fitted with WTS-FT tactile sensors.
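    The abstract's second force estimate follows from the standard static-grasp relation tau = J(q)^T F between the fingertip contact force and the joint torques. Below is a minimal Python sketch of that idea; the function name, the 3-joint Jacobian values, and the torque readings are hypothetical illustrations, not the tool's actual implementation or the Allegro hand's API.

    import numpy as np

    def contact_force_from_torques(jacobian, joint_torques):
        # Static grasp relation: tau = J(q)^T F, so the fingertip contact
        # force is recovered with the pseudo-inverse of J^T. `jacobian` is
        # the 3 x n linear Jacobian of the fingertip for an n-joint finger
        # at the current configuration.
        return np.linalg.pinv(jacobian.T) @ joint_torques

    # Hypothetical 3-joint finger: Jacobian at the current pose (m) and
    # motor torque readings (N*m).
    J = np.array([[0.00, -0.10, -0.05],
                  [0.12,  0.00,  0.00],
                  [0.00,  0.08,  0.04]])
    tau = np.array([0.15, -0.02, 0.01])
    print(contact_force_from_torques(J, tau))  # estimated contact force (N)

    Displaying this torque-based estimate alongside the direct tactile reading at the same fingertip makes discrepancies between the two measurement paths easy to spot, which is the point of such a visualization tool.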

    Feeling what you hear: tactile feedback for navigation of audio graphs

    Access to digitally stored numerical data is currently very limited for sight-impaired people. Graphs and visualizations are often used to analyze relationships between numerical data, but the current methods of accessing them are highly visually mediated. Representing data using audio feedback is a common way to make data more accessible, but methods of navigating and accessing the data are often serial in nature and laborious. Tactile or haptic displays could provide additional feedback to support a point-and-click style of interaction for visually impaired users. A requirements capture conducted with sight-impaired computer users produced a review of current accessibility technologies, and guidelines were extracted for using tactile feedback to aid navigation. The results of a qualitative evaluation with a prototype interface are also presented. Providing an absolute-position input device and tactile feedback allowed users to explore the graph using tactile and proprioceptive cues in a manner analogous to point-and-click techniques.

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of parameters can be used to construct Tactons, including the frequency, amplitude, and duration of a tactile pulse, plus higher-level parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of areas, particularly where the visual display is overloaded, limited in size, or unavailable, such as interfaces for blind people or for mobile and wearable devices. This paper describes Tactons, the parameters used to construct them, and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
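    As an illustration of how these construction parameters might be encoded, here is a minimal Python sketch; the Tacton class, the render function, and the 250 Hz two-beat example are hypothetical and are not taken from the paper.

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class Tacton:
        frequency_hz: float  # carrier frequency of the tactile pulse
        amplitude: float     # normalized drive amplitude, 0..1
        duration_s: float    # length of one beat
        rhythm: list         # on/off beat pattern, e.g. [1, 0, 1]
        location: str        # actuator or body site that plays the pattern

    def render(tacton, sample_rate=8000):
        # Render the rhythm as a single actuator drive waveform: each "on"
        # beat is a sinusoidal pulse, each "off" beat is silence.
        n = int(tacton.duration_s * sample_rate)
        t = np.arange(n) / sample_rate
        beat = tacton.amplitude * np.sin(2 * np.pi * tacton.frequency_hz * t)
        silence = np.zeros(n)
        return np.concatenate([beat if on else silence for on in tacton.rhythm])

    # Hypothetical two-beat "new message" Tacton delivered at the wrist.
    waveform = render(Tacton(250.0, 0.8, 0.1, [1, 0, 1], "wrist"))

    Distinct messages can then be built by varying one parameter at a time, for example a different rhythm at the same location for a different notification type.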

    Two-handed navigation in a haptic virtual environment

    This paper describes initial results from a study of a two-handed interaction paradigm for tactile navigation by blind and visually impaired users. Participants were set the task of navigating a virtual maze environment using their dominant hand to move the cursor while receiving contextual information in the form of tactile cues presented to their non-dominant hand. Results suggest that most participants were comfortable with the two-handed style of interaction even with little training. Two sets of contextual cues were examined, with information presented either as static patterns or as tactile flow of raised pins. The initial results suggest that while both sets of cues were usable, participants performed significantly better and faster with the static cues.

    Research issues in implementing remote presence in teleoperator control

    The concept of remote presence in telemanipulation is presented. A conceptual design of a prototype teleoperator system incorporating remote presence is described. The design is presented in functional terms: sensor, display, and control subsystems. An intermediate environment, in which the human operator is made to feel present, is explicated. The intermediate environment differs from the task environment in the quantity and type of information presented to the operator and in the scaling factors that protect the operator from the hazards of the task environment. Potential benefits of remote presence systems, both for manipulation and for the study of human cognition and perception, are discussed.

    Ghost hand: My hand is not mine

    Synchronous visuo-tactile stimulation of the kind used in the rubber hand illusion (RHI)^1-3^ and in the out-of-body experience (OBE)^4,5^ can induce the brain to incorporate external objects or images into part or all of the body image. In both the RHI and the OBE, the participant passively receives visuo-tactile stimulation, so the body image appears only with a sense of ownership (SoO), not with a sense of agency (the registration that we are the initiators of our actions; SoA)^6,7^. Insofar as self-consciousness as a body image is a unity acting in its environment, body image has to be investigated through the relationship between SoO and SoA^8,9^. This requires an experimental condition in which SoO and SoA can be separated independently during active movement. However, no condition opposite to the RHI and OBE, in which a subject feels SoA but not SoO, has been proposed to date^10^. Here we show that a person loses SoO for his own hand, which he can freely move by his own will, when he sees himself in a lateral view through a head-mounted display. It was previously thought that SoO can be produced by synchronous inter-modal stimulation^10^ and that SoO is complemented by SoA^11^. Our findings show that SoO can be lost under a synchronous visuo-proprioceptive condition while SoA is maintained. SoO and SoA are two aspects of body representation, and similar dissociations have been proposed in various contexts, such as body image and body schema^12,13^, and the 'Acting I' and the 'Mine'^14^. Our results suggest that a two-centric self consisting of SoA and SoO can dynamically enhance robust self-consciousness.