Sensory Communication
Contains table of contents for Section 2 and reports on five research projects.
Sponsors:
National Institutes of Health Contract 2 R01 DC00117
National Institutes of Health Contract 1 R01 DC02032
National Institutes of Health Contract 2 P01 DC00361
National Institutes of Health Contract N01 DC22402
National Institutes of Health Grant R01-DC001001
National Institutes of Health Grant R01-DC00270
National Institutes of Health Grant 5 R01 DC00126
National Institutes of Health Grant R29-DC00625
U.S. Navy - Office of Naval Research Grant N00014-88-K-0604
U.S. Navy - Office of Naval Research Grant N00014-91-J-1454
U.S. Navy - Office of Naval Research Grant N00014-92-J-1814
U.S. Navy - Naval Air Warfare Center Training Systems Division Contract N61339-94-C-0087
U.S. Navy - Naval Air Warfare Center Training System Division Contract N61339-93-C-0055
U.S. Navy - Office of Naval Research Grant N00014-93-1-1198
National Aeronautics and Space Administration/Ames Research Center Grant NCC 2-77
Hands in the Real World
Robots face a rapidly expanding range of potential applications beyond controlled environments, from remote exploration and search-and-rescue to household assistance and agriculture. Physical interaction is typically delegated to end-effectors (fixtures, grippers, or hands) as these machines perform manual tasks. Yet effective deployment of versatile robot hands in the real world is still limited to a few examples, despite decades of dedicated research. In this paper we review hands that have found application in the field, discussing open challenges with more articulated designs as well as novel trends and perspectives. We hope to encourage the swift development of capable robotic hands for long-term use in varied real-world settings. The first part of the paper centers on progress in artificial hand design, identifying key functions for a variety of environments. The final part focuses on overall trends in hand mechanics, sensors, and control, and on how performance and resiliency are qualified for real-world deployment.
Sensory Communication
Contains table of contents for Section 2, an introduction, and reports on fifteen research projects.
Sponsors:
National Institutes of Health Grant R01 DC00117
National Institutes of Health Grant R01 DC02032
National Institutes of Health Contract P01-DC00361
National Institutes of Health Contract N01-DC22402
National Institutes of Health/National Institute on Deafness and Other Communication Disorders Grant 2 R01 DC00126
National Institutes of Health Grant 2 R01 DC00270
National Institutes of Health Contract N01 DC-5-2107
National Institutes of Health Grant 2 R01 DC00100
U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-94-C-0087
U.S. Navy - Office of Naval Research/Naval Air Warfare Center Contract N61339-95-K-0014
U.S. Navy - Office of Naval Research/Naval Air Warfare Center Grant N00014-93-1-1399
U.S. Navy - Office of Naval Research/Naval Air Warfare Center Grant N00014-94-1-1079
U.S. Navy - Office of Naval Research Subcontract 40167
U.S. Navy - Office of Naval Research Grant N00014-92-J-1814
National Institutes of Health Grant R01-NS33778
U.S. Navy - Office of Naval Research Grant N00014-88-K-0604
National Aeronautics and Space Administration Grant NCC 2-771
U.S. Air Force - Office of Scientific Research Grant F49620-94-1-0236
U.S. Air Force - Office of Scientific Research Agreement with Brandeis University
Multi-touch Interaction Data Analysis System (MIDAS) for 2-D tactile display research
The study of haptic perception and cognition requires data about how humans interact with tactile surfaces while performing cognitive tasks. MIDAS is a set of three tools for the digital capture, coding, analysis, and interpretation of time-series, multi-touch, interactive behaviors on a tactile surface. The MIDAS-logger uses the touchscreen technology of current tablet computers to capture touches (up to ten fingers at high spatial and temporal resolution) through conventional tactile graphics overlaid on the screen. The MIDAS-analyser is a software program for the qualitative and quantitative analysis of MIDAS-logger touch data, which includes a fully interactive visualization of the data and a yoked display of a conventional video recording made simultaneously with the interactions. MIDAS-tactile protocol analysis (TPA) provides a scheme and a method for the rich coding and interpretation of tactile behaviors over multiple spatial and temporal scales. The efficacy of MIDAS was assessed against a set of criteria drawn from the successes and limitations of prior approaches to the study of tactile interactions. To demonstrate the functions of MIDAS, its three components were used to capture, analyze, code, and interpret the behavior of an experienced user and an inexperienced user of tactile graphics as they performed a shape-matching task.
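The kind of capture and replay the MIDAS-logger and MIDAS-analyser perform can be pictured with a minimal record format and query. This is an illustrative sketch only; the field names and window parameter are assumptions, not MIDAS's actual schema:

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    """One finger contact at one point in time (illustrative fields)."""
    t_ms: int        # timestamp in milliseconds
    finger_id: int   # stable id per touch, 0-9 for up to ten fingers
    x: float         # surface coordinates
    y: float

def touches_at(log, t_ms, window_ms=20):
    """Return all samples within a small window around time t_ms,
    grouped by finger, so an analysis tool can replay the interaction
    in step with a yoked video recording."""
    grouped = {}
    for s in log:
        if abs(s.t_ms - t_ms) <= window_ms:
            grouped.setdefault(s.finger_id, []).append(s)
    return grouped
```

A coder scrubbing through a recording would call `touches_at` once per video frame to see which fingers were on the surface at that moment.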
A Person-Centric Design Framework for At-Home Motor Learning in Serious Games
In motor learning, real-time multi-modal feedback is a critical element of guided training. Serious games have been introduced as a platform for at-home motor training because of their highly interactive and multi-modal nature. This dissertation explores the design of a multi-modal environment for at-home training in which an autonomous system observes and guides the user in place of a live trainer, providing real-time assessment, feedback, and difficulty adaptation as the subject masters a motor skill. After an in-depth review of the latest solutions in this field, the dissertation proposes a person-centric approach to the design of this environment, in contrast to the standard techniques in related work, to address many of their limitations. The unique advantages and restrictions of this approach are presented in the form of a case study in which a system entitled the "Autonomous Training Assistant," consisting of both hardware and software for guided at-home motor learning, is designed and adapted for a specific individual and trainer.
In this work, the design of an autonomous motor learning environment is approached from three areas: motor assessment, multi-modal feedback, and serious game design. For motor assessment, a three-dimensional assessment framework is proposed, comprising two spatial domains (posture, progression) and one temporal domain (pacing) of real-time motor assessment. For multi-modal feedback, a rod-shaped device called the "Intelligent Stick" is combined with an audio-visual interface to provide feedback to the subject in three modalities (audio, visual, haptic). Feedback domains are mapped to modalities, and feedback is provided whenever the user's performance deviates from the ideal performance level by more than an adaptive threshold. Approaches for multi-modal integration and feedback fading are discussed. Finally, a novel approach to stealth adaptation in serious game design is presented. This approach allows serious games to incorporate motor tasks in a more natural way, facilitating self-assessment by the subject. Three different stealth adaptation approaches are presented and evaluated using the flow-state ratio metric. The dissertation concludes with directions for future work on the integration of stealth adaptation techniques across the field of exergames.
Doctoral Dissertation, Computer Science, 201
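The adaptive-threshold rule described above can be sketched as follows. This is a minimal illustration assuming a scalar performance measure and a multiplicative fading rule; neither the function name nor the shrink factor comes from the dissertation:

```python
def feedback_events(performance, ideal, threshold, shrink=0.95):
    """Emit a feedback event whenever the user's performance deviates
    from the ideal by more than the current threshold; tighten the
    threshold after each acceptable sample so guidance fades as the
    subject's skill grows (an assumed fading rule)."""
    events = []
    for i, (p, target) in enumerate(zip(performance, ideal)):
        deviation = abs(p - target)
        if deviation > threshold:
            events.append((i, deviation))  # trigger an audio/visual/haptic cue
        else:
            threshold *= shrink            # feedback fading: demand more precision
    return events, threshold
```

In a real trainer each event would be dispatched to whichever modality the deviating domain (posture, progression, or pacing) is mapped to.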
Effectiveness of Vibration-based Haptic Feedback Effects for 3D Object Manipulation
This research explores the development of vibration-based haptic feedback for a mouse-like computer input device. The haptic feedback is intended for use in 3D virtual environments, to provide users with information that is difficult to convey visually, such as collisions between objects. Previous research into vibrotactile haptic feedback generally falls into two broad categories: single-tactor handheld devices, and multiple-tactor devices attached to the body. This research details the development of a vibrotactile feedback device that merges the two, creating a handheld device with multiple tactors.
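One way such a device can convey a collision is to drive the tactor nearest to the collision direction. The sketch below assumes a four-tactor equatorial layout with tactor 0 on the +x axis; the layout and function names are illustrative, not the thesis's exact hardware mapping:

```python
import math

# Four tactors spaced 90 degrees apart around the disk's equator,
# indexed 0..3 starting at the +x axis (an assumed layout).
TACTOR_ANGLES = [0.0, 90.0, 180.0, 270.0]

def tactor_for_collision(dx, dy):
    """Pick the tactor closest to the direction (dx, dy) of a collision
    in the device's horizontal plane, so the vibration appears to come
    from the side of the virtual contact."""
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    # Angular distance on a circle, accounting for wrap-around at 360.
    diffs = [min(abs(angle - a), 360.0 - abs(angle - a)) for a in TACTOR_ANGLES]
    return diffs.index(min(diffs))
```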
Building on previous research, a prototype device was developed. The device consisted of a semi-sphere with a radius of 34 mm, mounted on a PVC disk with a radius of 34 mm and a height of 18 mm. Four tactors were placed equidistantly about the equator of the PVC disk. Unfortunately, vibrations from a single tactor caused the entire device to shake due to the rigid plastic housing for the tactors. This made it difficult to accurately detect which tactor was vibrating. A second prototype was therefore developed with tactors attached to elastic bands. When a tactor vibrates, the elastic bands dampen the vibration, reducing the vibration in the rest of the device. The goal of the second prototype was to increase the accuracy in localizing the vibrating tactor.
An experiment was performed to compare the two devices. The study participants grasped one of the device prototypes as they would hold a computer mouse. During each trial, a random tactor would vibrate. By pushing a key on the keyboard, the participants indicated when they detected vibration. They then pushed another key to indicate which tactor had been vibrating. The procedure was then repeated for the other device. Detection of the vibration was faster (p < 0.01) and more accurate (p < 0.001) with the soft shell design than with the hard shell design. In a post-experiment questionnaire, participants preferred the soft shell design to the hard shell design.
Based on the results of the experiment, a mould was created for building future prototypes. The mould allows for the rapid creation of devices from silicone. Silicone was chosen as a material because it can easily be moulded and is available in different levels of hardness. The hardness of the silicone can be used to control the amount of damping of the vibrations. To increase the vibration damping, a softer silicone can be used. Several recommendations for future prototypes and experiments are made.
Tactile Arrays for Virtual Textures
This thesis describes the development of three new tactile stimulators for active
touch, i.e. devices to deliver virtual touch stimuli to the fingertip in response to
exploratory movements by the user. All three stimulators are designed to provide
spatiotemporal patterns of mechanical input to the skin via an array of contactors,
each under individual computer control. Drive mechanisms are based on
piezoelectric bimorphs in a cantilever geometry.
The first of these is a 25-contactor array (5 × 5 contactors at 2 mm spacing). It
is a rugged design with a compact drive system and is capable of producing strong
stimuli when running from low voltage supplies. Combined with a PC mouse,
it can be used for active exploration tasks. Pilot studies were performed which
demonstrated that subjects could successfully use the device for discrimination of
line orientation, simple shape identification and line following tasks.
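A stimulus for the line-orientation task above can be pictured as a binary activation frame for the 5 × 5 array. This is a toy rasterisation assuming simple on/off contactors; the real stimulator drives each contactor's waveform individually:

```python
import math

def line_frame(angle_deg, size=5):
    """Render a straight line through the centre of a size x size
    contactor array as a binary frame (1 = contactor active)."""
    c = (size - 1) / 2.0
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    frame = [[0] * size for _ in range(size)]
    # Step along the line in half-contactor increments and mark cells.
    for t in range(-size, size + 1):
        x, y = round(c + t * dx / 2), round(c + t * dy / 2)
        if 0 <= x < size and 0 <= y < size:
            frame[y][x] = 1
    return frame
```

Sweeping `angle_deg` over a trial set would give the discrimination stimuli; presenting successive frames as the user moves the mouse gives active exploration.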
A 24-contactor stimulator (6 × 4 contactors at 2 mm spacing) with improved
bandwidth was then developed. This features control electronics designed to transmit
arbitrary waveforms to each channel (generated on-the-fly, in real time) and
software for rapid development of experiments. It is built around a graphics tablet,
giving high precision position capability over a large 2D workspace. Experiments
using two-component stimuli (components at 40 Hz and 320 Hz) indicate that
spectral balance within active stimuli is discriminable independent of overall intensity,
and that the spatial variation (texture) within the target is easier to detect
at 320 Hz than at 40 Hz.
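The two-component stimuli in that experiment can be sketched as a per-channel sample generator, with spectral balance and overall intensity as the two independent variables. The parameter names and the linear mixing rule are illustrative assumptions, not the thesis's drive equations:

```python
import math

def two_component_wave(t, balance, f_low=40.0, f_high=320.0, intensity=1.0):
    """Sample a stimulus made of a 40 Hz and a 320 Hz sinusoid at time t
    (seconds). `balance` in [0, 1] shifts energy from the low to the high
    component, while `intensity` scales the whole stimulus, mirroring the
    idea that spectral balance can vary independently of overall level."""
    low = (1.0 - balance) * math.sin(2 * math.pi * f_low * t)
    high = balance * math.sin(2 * math.pi * f_high * t)
    return intensity * (low + high)
```

In a real-time system, each of the 24 channels would evaluate such a function on the fly at the drive sample rate, with `balance` varying spatially to create texture.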
The third system (again 6 × 4 contactors at 2 mm spacing) was a lightweight modular stimulator developed for fingertip and thumb grasping tasks;
furthermore it was integrated with force-feedback on each digit and a complex
graphical display, forming a multi-modal Virtual Reality device for the display of
virtual textiles. It is capable of broadband stimulation with real-time generated
outputs derived from a physical model of the fabric surface. In an evaluation study,
virtual textiles generated from physical measurements of real textiles were ranked
in categories reflecting key mechanical and textural properties. The results were
compared with a similar study performed on the real fabrics from which the virtual
textiles had been derived. There was good agreement between the ratings of the
virtual textiles and the real textiles, indicating that the virtual textiles are a good
representation of the real textiles and that the system is delivering appropriate
cues to the user.
A high speed sensor system for tactile interaction research
Schürmann C. A high speed sensor system for tactile interaction research. Bielefeld: Bielefeld University Library; 2013.
In this work we describe and implement the first tactile sensor system that combines modularity with very high sensing speed, high sensitivity, and high spatial resolution. This unique combination of features enables researchers to develop novel applications and makes it possible to replace task-specific tactile sensors with a single system.
The very high sensing speed of the system allows for slip detection during robot grasping. Because all sensor cells are sampled at the same high frequency, the system can even detect slip at multiple contact points at the same time. This high speed was made possible by the development of a highly integrated parallel sensor sampling architecture.
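The role of the high sampling rate in slip detection can be pictured as follows: incipient slip shows up as rapid fluctuations in the measured contact force, which only a high frame rate makes visible. This is a simplified per-contact sketch, not Myrmex's actual algorithm:

```python
def detect_slip(force_frames, threshold):
    """Flag contacts whose force changes abruptly between consecutive
    high-rate frames. Each frame maps a contact id to a force reading;
    at a high frame rate, a large frame-to-frame change signals the
    micro-vibrations of incipient slip rather than a deliberate motion."""
    slipping = set()
    for prev, cur in zip(force_frames, force_frames[1:]):
        for contact, force in cur.items():
            if contact in prev and abs(force - prev[contact]) > threshold:
                slipping.add(contact)
    return slipping
```

Because every cell is sampled at the same rate, the same test runs for all contact points in each frame, which is what enables simultaneous multi-contact slip detection.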
The modularity of the system allows it to be employed in a multitude of applications. Tactile-sensitive surfaces of various dimensions can be realized through a very simple 'plug and use' principle, without the need for software configuration by the user. This was made possible by developing a new bus system that allows the relative localization of its participants. Our system can be used to create tactile-sensitive table surfaces with a large number of sensor cells while, thanks to its high-speed design, still providing real-time frame rates.
The flexibility and high performance of the system enabled us to develop a tactile-sensitive object that allows continuous high-speed monitoring of human finger forces. For this we solved the problem of integrating the tactile sensors so as to allow free movement of the object, while maintaining a constantly high rate of data capture and low-latency synchronization to external devices.
The high sensitivity of the system was made possible through technical innovation beyond the state of the art of resistive tactile sensors: we created an optimized sensor cell shape and investigated the behavior of different sensor materials. The knowledge gained in this process was then used to advance the existing method of sensor normalization into a real-time method.
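Sensor normalization of this kind can be sketched as a stateless per-cell mapping from raw readings to a common scale, which is what makes it cheap enough to run in real time per sample. The min-max form below is a generic illustration, not the thesis's specific method:

```python
def normalize_cell(raw, baseline, span):
    """Map a raw resistive reading to [0, 1] using a per-cell baseline
    (unloaded value) and span (response range) obtained during
    calibration, bringing non-uniform cells of an array to a common
    scale. Stateless, so it can run per sample at the full frame rate."""
    if span <= 0:
        return 0.0  # degenerate calibration: treat the cell as dead
    return min(1.0, max(0.0, (raw - baseline) / span))
```

With per-cell `(baseline, span)` tables loaded at startup, every frame of the array can be normalized with one multiply-add and clamp per cell.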
We present a range of tactile interaction scenarios that have been realized with the tactile sensor system, named Myrmex. These scenarios include investigating human grasp force control during a pick-and-place task, a tactile table for integration into an intelligent household, and a tactile table for the manipulation of virtual clay as a form of finger training.
In addition we present a selection of scenarios in which the Myrmex system was employed by other researchers, such as using the sensor modules as (large) tactile fingertips on robot arms to implement tactile servoing or slip detection during object grasping. The system has also been used to study human finger forces and to investigate novel methods for prosthesis control. The positive results from all these scenarios support our conclusion that the Myrmex system is a valuable and reliable tool for research on tactile interactions.