
    Biomechanical Texture Coding and Transmission of Texture Information in Rat Whiskers

    Classically, texture discrimination has been thought to be based on ‘global’ codes, i.e. frequency (signal analysis based on Fourier analysis) or intensity (signal analysis based on averaging), both of which rely on integration of the vibrotactile signal across time and/or space. Recently, a novel ‘local’ coding scheme has been formulated, based on the waveform of frictional movements: discrete, short-lasting kinematic events (i.e. stick-slip movements called slips). In the first part of my study I performed biomechanical measurements of relative movements of a rat vibrissa across sandpapers of different roughness. My major finding is that the classic global codes convey some information about texture identity but are consistently outperformed by the slip-based local code. Moreover, the slip code also surpasses the global ones in coding for active scanning parameters. This is remarkable, as it suggests that the slip code would explicitly allow the whisking rat to optimize perception by selecting goal-specific scanning strategies. I therefore provide evidence that short stick-slip events may contribute to the perceptual mechanism by which rodent vibrissae code surface roughness. In the second part, I studied the biomechanics of how such events are transmitted from tip to follicle, where mechano-transduction occurs. For this purpose, ultra-fast videography of the entire beam of a plucked rat whisker rubbing across sandpaper was employed. I found that slip events are conveyed almost instantly from tip to follicle while amplifying moments by a factor of about 1000. From these results, I argue that the mechanics of the whisker serve as a passive amplification device that faithfully represents stick-slip events to the neuronal receptors. Using measures of correlation, I moreover found that among the kinematic variables, acceleration portrays dynamic variables (forces) best. The time series of acceleration at the base of the whisker provided a fair proxy for the time series of forces (dynamical variables) acting on the whisker base. Acceleration measurements (easily done via videography) may therefore provide access to at least the relative amplitude of forces. This may be important for future work in behaving animals, where dynamical variables are notoriously difficult to measure.
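The three candidate codes contrasted above can be sketched concretely. The snippet below computes an intensity code (temporal averaging), a frequency code (Fourier spectrum peak), and a slip code (counting discrete threshold-crossing events) from an acceleration trace; the threshold and exact feature definitions are illustrative assumptions, not the thesis's analysis.

```python
import numpy as np

def texture_codes(accel, fs, slip_thresh=5.0):
    """Compute three candidate texture codes from a whisker-base
    acceleration time series. Illustrative sketch only: the slip
    threshold and feature definitions are assumptions.

    accel : acceleration trace (m/s^2)
    fs    : sampling rate (Hz)
    """
    # 'Global' intensity code: average signal magnitude over the sweep.
    intensity = np.mean(np.abs(accel))

    # 'Global' frequency code: dominant frequency of the power spectrum.
    spectrum = np.abs(np.fft.rfft(accel - accel.mean())) ** 2
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    dominant_freq = freqs[np.argmax(spectrum)]

    # 'Local' slip code: count discrete high-acceleration events,
    # i.e. upward crossings of the threshold.
    above = np.abs(accel) > slip_thresh
    onsets = np.flatnonzero(above[1:] & ~above[:-1])
    slip_count = len(onsets) + (1 if above[0] else 0)

    return intensity, dominant_freq, slip_count
```

Note that the intensity and frequency codes integrate over the whole trace, whereas the slip count is driven entirely by brief local events, which is the distinction the thesis tests.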

    Sensory Communication

    Contains table of contents for Section 2, an introduction, and reports on fifteen research projects. Supported by:
    National Institutes of Health Grant RO1 DC00117
    National Institutes of Health Grant RO1 DC02032
    National Institutes of Health Contract P01-DC00361
    National Institutes of Health Contract N01-DC22402
    National Institutes of Health/National Institute on Deafness and Other Communication Disorders Grant 2 R01 DC00126
    National Institutes of Health Grant 2 R01 DC00270
    National Institutes of Health Contract N01 DC-5-2107
    National Institutes of Health Grant 2 R01 DC00100
    U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-94-C-0087
    U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-95-K-0014
    U.S. Navy - Office of Naval Research/Naval Air Warfare Center Grant N00014-93-1-1399
    U.S. Navy - Office of Naval Research/Naval Air Warfare Center Grant N00014-94-1-1079
    U.S. Navy - Office of Naval Research Subcontract 40167
    U.S. Navy - Office of Naval Research Grant N00014-92-J-1814
    National Institutes of Health Grant R01-NS33778
    U.S. Navy - Office of Naval Research Grant N00014-88-K-0604
    National Aeronautics and Space Administration Grant NCC 2-771
    U.S. Air Force - Office of Scientific Research Grant F49620-94-1-0236
    U.S. Air Force - Office of Scientific Research Agreement with Brandeis University

    Human Inspired Multi-Modal Robot Touch


    How do humans mediate with the external physical world? From perception to control of articulated objects

    Many actions in our daily life involve operating articulated tools. Despite the ubiquity of articulated objects in daily life, the human ability to perceive the properties of, and control, articulated objects has been scarcely studied. Articulated objects are composed of links and revolute or prismatic joints; moving one part of the linkage results in movement of the others. Reaching a position with the tip of a tool requires adapting the motor commands to the position of the end-effector, which differs from reaching the same position with the hand. The dynamic properties of articulated bodies are complex and vary during movement. For instance, apparent mass, a quantity that characterizes the dynamic interaction with the articulated object, varies as a function of configuration. An actuated articulated system can generate a static but position-dependent force field with constant torques about its joints. There is evidence that internal models are involved in the perception and control of tools. In the present work, we investigate several aspects of the perception and control of articulated objects and address two questions. First, how do people perceive kinematic and dynamic properties during haptic interaction with articulated objects? Second, what effect does seeing the tool have on the planning and execution of reaching movements with a complex tool? Does a visual representation of the mechanism's structure help the reaching movement, and how? To address these questions, 3D-printed physical articulated objects and robotic systems were designed and developed for the psychophysical studies. The present work comprises three studies on different aspects of the perception and control of articulated objects. In the first, we performed haptic size discrimination tasks using three different types of objects (wooden boxes, an actuated apparatus with two movable flat surfaces, and large pliers) in unimanual, bimanual grounded, and bimanual free conditions. We found that bimanual integration occurred in particular during free manipulation of objects. The second study concerned visuo-motor reaching with complex tools. We found that seeing the mechanism of the tool, even briefly at the beginning of the trial, improved reaching performance. The last study addressed force perception; the evidence showed that people could use the force field at the end-effector to infer the torques about the joints generated by the articulated system.
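The statics behind the configuration-dependent force field mentioned above can be written down directly: for constant joint torques τ, the end-effector force satisfies τ = J(q)ᵀF, so F = (J(q)ᵀ)⁻¹τ changes as the configuration q changes. A minimal sketch for a planar two-link arm (the link lengths are assumed for illustration, not taken from the study):

```python
import numpy as np

def endpoint_force(q1, q2, tau, l1=0.3, l2=0.3):
    """Endpoint force of a planar 2-link arm holding constant joint
    torques tau = (tau1, tau2). Link lengths l1, l2 are illustrative.

    Statics: tau = J(q)^T F, hence F = (J^T)^{-1} tau. Constant
    torques therefore yield a force field that varies with the arm's
    configuration, as the abstract notes.
    """
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    # Jacobian of the endpoint position w.r.t. the joint angles.
    J = np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                  [ l1 * c1 + l2 * c12,  l2 * c12]])
    return np.linalg.solve(J.T, np.asarray(tau, dtype=float))
```

Evaluating this at two different configurations with the same torques gives two different endpoint forces, which is exactly the position dependence the participants had to cope with.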

    Doctor of Philosophy

    Virtual environments provide a consistent and relatively inexpensive method of training individuals. They often include haptic feedback in the form of forces applied to a manipulandum or thimble to provide a more immersive and educational experience. However, the limited haptic feedback provided in these systems tends to be restrictive and frustrating to use. Providing tactile feedback in addition to this kinesthetic feedback can enhance the user's ability to manipulate and interact with virtual objects while providing a greater level of immersion. This dissertation advances the state of the art by providing a better understanding of tactile feedback and advancing combined tactile-kinesthetic systems. The tactile feedback described within this dissertation is provided by a finger-mounted device called the contact location display (CLD). Rather than displaying the entire contact surface, the device feeds back information only about the center of contact between the user's finger and a virtual surface. In prior work, the CLD used specialized two-dimensional environments to provide smooth tactile feedback. Using polygonal environments would greatly enhance the device's usefulness; however, the surface discontinuities created by the facets of these models are rendered through the CLD regardless of traditional force shading algorithms. To address this issue, a haptic shading algorithm was developed to provide smooth tactile and kinesthetic interaction with general polygonal models. Two experiments were used to evaluate the shading algorithm. To better understand the design requirements of tactile devices, three separate experiments were run to evaluate the perception thresholds for cue localization, backlash, and system delay. These experiments establish quantitative design criteria for tactile devices. These results can serve as the maximum (i.e., most demanding) device specifications for tactile-kinesthetic haptic systems, where the user experiences tactile feedback as a function of his or her limb motions. Lastly, a revision of the CLD was constructed and evaluated. By taking the newly evaluated design criteria into account, the CLD device became smaller and lighter while providing a full two-degree-of-freedom workspace that covers the bottom hemisphere of the finger. Two simple manipulation experiments were used to evaluate the new CLD device.
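The facet-discontinuity problem that motivates the haptic shading algorithm is classically attacked by interpolating vertex normals across each triangle, Phong-style, so the rendered normal varies smoothly instead of jumping at facet edges. The sketch below shows only that generic interpolation step; it is an illustration, not the dissertation's algorithm, which additionally smooths the tactile contact-location cue.

```python
import numpy as np

def shaded_normal(p, v0, v1, v2, n0, n1, n2):
    """Phong-style force-shading sketch: blend the vertex normals
    (n0, n1, n2) of triangle (v0, v1, v2) at contact point p using
    barycentric weights, then renormalize. Generic illustration only.
    """
    # Barycentric coordinates of p within the triangle.
    d0, d1, dp = v1 - v0, v2 - v0, p - v0
    a, b, c = d0 @ d0, d0 @ d1, d1 @ d1
    e, f = dp @ d0, dp @ d1
    denom = a * c - b * b
    w1 = (c * e - b * f) / denom
    w2 = (a * f - b * e) / denom
    w0 = 1.0 - w1 - w2
    # Weighted blend of vertex normals, renormalized to unit length.
    n = w0 * n0 + w1 * n1 + w2 * n2
    return n / np.linalg.norm(n)
```

At a vertex the shaded normal equals that vertex's normal, and it varies continuously as the contact point crosses an edge into a neighboring triangle, which is what removes the kinesthetic "pop" at facet boundaries.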

    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task where there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation [Landman et al., 2003, Vision Research 43, 149–164]. Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This may suggest two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) further weight is given to the argument that objects may be stored in, and retrieved from, a pre-attentional store during this task.

    Sensing the Environment With Whiskers


    Prediction of Choice from Competing Mechanosensory and Choice-Memory Cues during Active Tactile Decision Making

    Perceptual decision making is an active process in which animals move their sense organs to extract task-relevant information. To investigate how the brain translates sensory input into decisions during active sensation, we developed an active touch task for mice in which the mechanosensory input can be precisely measured and which challenges animals to use multiple mechanosensory cues. Male mice were trained to localize a pole using a single whisker and to report their decision by selecting one of three choices. Using high-speed imaging and machine vision, we estimated whisker–object mechanical forces at millisecond resolution. Mice solved the task by a sensory-motor strategy in which both the strength and the direction of whisker bending were informative cues to pole location. We found competing influences of immediate sensory input and choice memory on mouse choice. On correct trials, choice could be predicted from the direction and strength of whisker bending, but not from previous choice. In contrast, on error trials, choice could be predicted from previous choice but not from whisker bending. This study shows that animal choices during active tactile decision making can be predicted from mechanosensory and choice-memory signals, and provides a new task well suited for the future study of the neural basis of active perceptual decisions.
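The decoding result above (choice predicted from bending direction and strength, or from previous choice) can be illustrated with a generic three-way decoder. The sketch below fits a small multinomial logistic regression in NumPy to feature/choice data; it is an assumed stand-in for illustration, not the paper's actual model or data.

```python
import numpy as np

def fit_softmax(X, y, n_classes=3, lr=0.5, steps=2000):
    """Tiny multinomial logistic regression trained by gradient descent.
    Generic decoder sketch: predict a 3-way choice from trial features
    such as whisker-bending direction/strength or previous choice.
    """
    X = np.column_stack([np.ones(len(X)), X])   # prepend a bias column
    W = np.zeros((X.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                    # one-hot choice labels
    for _ in range(steps):
        logits = X @ W
        P = np.exp(logits - logits.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)       # softmax probabilities
        W -= lr * X.T @ (P - Y) / len(X)        # cross-entropy gradient step
    return W

def predict(W, X):
    """Most probable choice for each trial under the fitted weights."""
    X = np.column_stack([np.ones(len(X)), X])
    return np.argmax(X @ W, axis=1)
```

Comparing decoders trained on sensory features against decoders trained on choice history, separately for correct and error trials, is the kind of analysis the abstract's prediction result implies.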

    Active haptic perception in robots: a review

    In the past few years a new scenario for robot-based applications has emerged. Service and mobile robots have opened new market niches, and new frameworks for shop-floor robot applications have been developed. In all these contexts, robots are requested to perform tasks in open-ended, possibly dynamically varying, conditions. These new requirements also call for a change of paradigm in the design of robots: online and safe feedback motion control becomes the core of modern robot systems. Future robots will learn autonomously, interact safely, and possess qualities like self-maintenance. Attaining these features would be relatively easy if a complete model of the environment were available, and if the robot actuators could execute motion commands perfectly relative to this model. Unfortunately, a complete world model is not available, and robots have to plan and execute tasks in the presence of environmental uncertainties, which makes sensing an important component of new-generation robots. For this reason, today's new-generation robots are equipped with more and more sensing components, and consequently they are ready to actively deal with the high complexity of the real world. Complex sensorimotor tasks such as exploration require coordination between the motor system and the sensory feedback. For robot control purposes, sensory feedback should be adequately organized in terms of relevant features and the associated data representation. In this paper, we propose an overall functional picture linking sensing to action in closed-loop sensorimotor control of robots for touch (hands, fingers). Basic qualities of haptic perception in humans inspire the models and categories comprising the proposed classification. The objective is to provide a reasoned, principled perspective on the connections between the different taxonomies used in the robotics and human-haptics literature. The specific case of active exploration is chosen to ground interesting use cases, for two reasons. First, in the literature on haptics, exploration has been treated only to a limited extent compared to grasping and manipulation. Second, exploration involves specific robot behaviors that exploit distributed and heterogeneous sensory data.

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by systems designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.