Patient-Specific Prosthetic Fingers by Remote Collaboration - A Case Study
The concealment of amputation through prosthesis usage can shield an amputee
from social stigma and help improve the emotional healing process especially at
the early stages of hand or finger loss. However, traditional prosthesis
fabrication works against this, as patients must make numerous clinic visits
for measurements, fitting, and follow-ups. This paper presents a method
for constructing a prosthetic finger through online collaboration with the
designer. The main input from the amputee is computed tomography (CT) data
covering the affected and the non-affected fingers. These
data are sent over the internet and the prosthesis is constructed using
visualization, computer-aided design and manufacturing tools. The finished
product is then shipped to the patient. A case study with a single patient
having an amputated ring finger at the proximal interphalangeal joint shows
that the proposed method has the potential to address the patient's psychosocial
concerns and minimize public exposure of the finger loss.
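A common geometric step in patient-specific finger prostheses (a hypothetical sketch, not necessarily the authors' exact pipeline) is to mirror the CT-derived surface points of the intact contralateral finger across the hand's midsagittal plane, producing a template for the missing finger:

```python
import numpy as np

def mirror_finger(points, plane_x=0.0):
    """Mirror a 3-D point cloud across the sagittal plane x = plane_x.

    `points` is an (N, 3) array of CT-derived surface points from the
    intact finger; the reflected cloud serves as a geometric template
    for the prosthetic replacement on the affected side.
    """
    mirrored = np.asarray(points, dtype=float).copy()
    mirrored[:, 0] = 2.0 * plane_x - mirrored[:, 0]
    return mirrored

# A point 10 mm to the right of the plane maps to 10 mm to the left:
cloud = np.array([[10.0, 5.0, 2.0]])
template = mirror_finger(cloud, plane_x=0.0)  # → [[-10., 5., 2.]]
```

The mirrored cloud would then feed into the CAD/CAM tools the abstract mentions; the function name and plane convention here are illustrative assumptions.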
Action planning with two-handed tools
In tool use, intended external goals have to be transformed into bodily movements by taking into account the target-to-movement mapping implemented by the tool. In bimanual tool use, this mapping may depend on the part of the tool that is operated and the effector used (e.g. the left and right hands at the handlebar moving in opposite directions to generate the same bicycle movement). In our study, we investigated whether participants represent the behaviour of the tool or only the effector-specific mapping when using two-handed tools. In three experiments, participants touched target locations with a two-jointed lever, using either the left or the right hand. In one condition, the joint of the lever was constant and switching between hands was associated with switching the target-to-movement mapping, whereas in another condition, switching between hands was associated with switching the joint while the target-to-movement mapping remained constant. Results indicate pronounced costs of switching hands in the condition with a constant joint, whereas the costs were smaller with a constant target-to-movement mapping. These results suggest that participants form tool-independent representations of the effector-specific mappings.
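The target-to-movement mapping of a pivoting lever can be made concrete (an illustrative sketch, not the authors' apparatus): with the hand and the tool tip on opposite sides of the pivot, a hand movement produces a scaled, sign-inverted tip movement, and moving the joint changes the scaling.

```python
def tip_displacement(hand_disp, hand_to_pivot, pivot_to_tip):
    """Tip displacement of a rigid lever rotating about a fixed pivot.

    Hand and tip lie on opposite sides of the pivot, so the tip moves
    in the opposite direction, scaled by the lever-arm ratio.
    """
    return -hand_disp * (pivot_to_tip / hand_to_pivot)

# Equal lever arms: moving the hand +2 cm moves the tip -2 cm.
assert tip_displacement(2.0, 10.0, 10.0) == -2.0
# Shifting the joint toward the hand amplifies the inverted movement.
assert tip_displacement(2.0, 5.0, 15.0) == -6.0
```

Switching hands while keeping the joint fixed changes which mapping applies to each effector, which is the manipulation the experiments exploit.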
The effects of instrumental action on perceptual hand maps
Perceiving the external spatial location of body parts using position sense requires that immediate proprioceptive afferent signals be integrated with information about body size and shape. Longo and Haggard (Proc Natl Acad Sci USA 107:11727–11732, 2010) developed a method to measure perceptual hand maps reflecting this metric information about body size and shape. In this paradigm, participants indicate the perceived location of landmarks on their occluded hand by pointing with a long baton held in their other hand. By comparing the relative locations of judgments of different hand landmarks, perceptual hand maps can be constructed and compared to actual hand structure. The maps show large and highly stereotyped distortions. Here, I investigated whether biases related to active motor control of the pointing hand contribute to these distortions. Participants localized the fingertip and knuckle of each finger on their occluded left hand either by actively pointing with a baton held in their right hand (pointing condition) or by giving verbal commands to an experimenter on how to move the baton (verbal condition). Similar distortions were clearly apparent in both conditions, suggesting that they are not an artifact of motor control biases related to the pointing hand.
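Distortions of this kind are typically quantified by a shape index comparing judged and actual landmark positions; the specific measure below (hand width divided by mean finger length) is an illustrative sketch, not necessarily the paradigm's exact metric.

```python
import numpy as np

def shape_index(knuckles, fingertips):
    """Hand width (span from first to last knuckle) divided by the mean
    knuckle-to-fingertip length. Larger values = a wider, stubbier map.
    `knuckles` and `fingertips` are (N, 2) arrays of 2-D landmark positions.
    """
    width = np.linalg.norm(knuckles[-1] - knuckles[0])
    lengths = np.linalg.norm(fingertips - knuckles, axis=1)
    return width / lengths.mean()

# Actual hand vs a judged map in which finger lengths are underestimated:
knuckles = np.array([[0, 0], [2, 0], [4, 0], [6, 0]], dtype=float)
actual_tips = knuckles + [0.0, 8.0]
judged_tips = knuckles + [0.0, 5.0]   # fingers judged shorter than they are
assert shape_index(knuckles, judged_tips) > shape_index(knuckles, actual_tips)
```

A judged map that underestimates finger length yields a higher index than the actual hand, which is the direction of the stereotyped distortion the abstract describes.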
Human muscle spindles act as forward sensory models.
Modern theories of motor control incorporate forward models that combine sensory information and motor commands to predict future sensory states. Such models circumvent unavoidable neural delays associated with on-line feedback control. Here we show that signals in human muscle spindle afferents during unconstrained wrist and finger movements predict future kinematic states of their parent muscle. Specifically, we show that the discharges of type Ia afferents are best correlated with the velocity of length changes in their parent muscles approximately 100-160 ms in the future, and that their discharges vary depending on motor sequences in a way that cannot be explained by the state of their parent muscle alone. We therefore conclude that muscle spindles can act as "forward sensory models": they are affected both by the current state of their parent muscle and by efferent (fusimotor) control, and their discharges represent future kinematic states. If this conjecture is correct, then sensorimotor learning implies learning how to control not only the skeletal muscles but also the fusimotor system.
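The reported ~100-160 ms lead can be illustrated with a lag analysis on synthetic data (a toy sketch, not the authors' analysis): cross-correlate a simulated afferent signal with muscle velocity at increasing lags and find the lag that maximizes the correlation.

```python
import numpy as np

def best_lead(afferent, velocity, dt, max_lag_s=0.3):
    """Return the lead time (s) at which `afferent` best predicts `velocity`.

    A positive result means the afferent discharge correlates best with
    velocity occurring that many seconds in the future.
    """
    max_lag = int(max_lag_s / dt)
    corrs = []
    for lag in range(max_lag + 1):
        a = afferent[: len(afferent) - lag] if lag else afferent
        v = velocity[lag:]
        corrs.append(np.corrcoef(a, v)[0, 1])
    return int(np.argmax(corrs)) * dt

# Synthetic afferent that "knows" muscle velocity 120 ms in advance:
dt = 0.01                                 # 10 ms sampling
t = np.arange(0.0, 5.0, dt)
velocity = np.sin(2 * np.pi * t)          # 1 Hz sinusoidal muscle velocity
afferent = np.roll(velocity, -12)         # shifted 120 ms earlier in time
lead = best_lead(afferent, velocity, dt)  # recovers ~0.12 s
```

In the paper's setting the afferent signal is a measured Ia discharge rather than a shifted copy of velocity, but the lag-maximization logic is the same.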
In what sense does 'nothing make sense except in the light of evolution'?
Dobzhansky argued that biology only makes sense if life on earth has a shared history. But his dictum is often reinterpreted to mean that biology only makes sense in the light of adaptation. Some philosophers of science have argued in this spirit that all work in ‘proximal’ biosciences such as anatomy, physiology and molecular biology must be framed, at least implicitly, by the selection histories of the organisms under study. Others have denied this and have proposed non-evolutionary ways in which biologists can frame these investigations. This paper argues that an evolutionary perspective is indeed necessary, but that it must be a forward-looking perspective informed by a general understanding of the evolutionary process, not a backward-looking perspective informed by the specific evolutionary history of the species being studied. Interestingly, it turns out that there are aspects of proximal biology that even a creationist cannot study except in the light of a theory of their effect on future evolution.
A data-driven approach to remote tactile interaction: From a BioTac sensor to any fingertip cutaneous device
This paper presents a novel approach to remote tactile interaction, wherein a human uses a telerobot to touch a remote environment. The proposed system consists of a BioTac tactile sensor, in charge of registering contact deformations, and a custom cutaneous device, in charge of applying those deformations to the user’s fingertip via a 3-DoF mobile platform. We employ a novel data-driven algorithm to directly map the BioTac’s sensed stimuli to input commands for the cutaneous device’s motors, without using any kind of skin deformation model. We validated the proposed approach by carrying out a remote tactile interaction experiment. Although this work employed a specific cutaneous device, the experimental protocol and algorithm are valid for any similar display.
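At its core, such a data-driven mapping is a regression from sensor readings to motor commands fit on paired calibration data. The sketch below uses a ridge-regularized linear map with made-up dimensions (19 channels, echoing the BioTac's electrode count, to 3 motor commands); the paper's actual algorithm may differ.

```python
import numpy as np

def fit_sensor_to_motor(S, M, reg=1e-3):
    """Fit a ridge-regularized linear map W from sensor readings S (n, d_s)
    to motor commands M (n, d_m), so that S @ W approximates M."""
    d = S.shape[1]
    return np.linalg.solve(S.T @ S + reg * np.eye(d), S.T @ M)

# Toy calibration: 200 paired samples of sensor readings and motor commands.
rng = np.random.default_rng(0)
S = rng.normal(size=(200, 19))       # simulated electrode-like readings
W_true = rng.normal(size=(19, 3))    # unknown ground-truth mapping
M = S @ W_true                       # commands recorded during calibration
W = fit_sensor_to_motor(S, M)        # recovered mapping, W ≈ W_true

# At run time, each new sensor frame maps directly to motor commands:
commands = rng.normal(size=(1, 19)) @ W
```

Skipping an explicit skin-deformation model, as the abstract describes, amounts to letting this learned map absorb both the sensor's and the device's characteristics.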
