Tele-operated high speed anthropomorphic dextrous hands with object shape and texture identification
This paper reports on the development of tele-operated, high-speed, anthropomorphic, dextrous robotic hands. The aim of developing these hands was to achieve a system that interfaces seamlessly between humans and robots. To provide sensory feedback to a remote operator, tactile sensors were developed and mounted on the robotic hands. Two sensing systems were developed: the first, a skin sensor capable of shape reconstruction, is placed on the palm of the hand to feed back the shape of grasped objects; the second is a highly sensitive tactile array for surface texture identification.
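One common route to texture identification with a tactile array is spectral analysis of the vibration induced when the sensor slides over a surface: a periodic texture of spatial period λ stroked at speed v excites a vibration at frequency f = v/λ. The sketch below illustrates that idea on synthetic data; the abstract does not specify the paper's actual method, so the function names and signal model are illustrative assumptions.

```python
import numpy as np

def dominant_vibration_frequency(signal, sample_rate):
    """Return the strongest non-DC frequency component of a tactile signal."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def estimate_texture_period(signal, sample_rate, sliding_speed):
    """Spatial period (m) of a periodic texture: lambda = v / f."""
    f = dominant_vibration_frequency(signal, sample_rate)
    return sliding_speed / f

# Synthetic example: a 1 mm grating stroked at 20 mm/s excites a 20 Hz vibration.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
vibration = np.sin(2 * np.pi * 20.0 * t)
period = estimate_texture_period(vibration, fs, sliding_speed=0.020)
print(round(period * 1000, 2))  # recovered texture period in mm
```

Classifying real textures would add noise handling and richer features, but the frequency-to-period relation above is the core of vibration-based texture sensing.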
Biosensing and Actuation—Platforms Coupling Body Input-Output Modalities for Affective Technologies
Research in the use of ubiquitous technologies, tracking systems and wearables within
mental health domains is on the rise. In recent years, affective technologies have gained
traction and garnered the interest of interdisciplinary fields as the research on such technologies
matured. However, while the role of movement and bodily experience in affective experience is
well-established, how best to address movement and engagement beyond measuring cues and signals
in technology-driven interactions has remained unclear. In a joint industry-academia effort, we aim to
remodel how affective technologies can help address body and emotional self-awareness. We present
an overview of biosignals that have become standard in low-cost physiological monitoring and show
how these can be matched with methods and engagements used by interaction designers skilled in
designing for bodily engagement and aesthetic experiences. Taking both strands of work together offers
unprecedented design opportunities that inspire further research. Through first-person soma design,
an approach that draws upon the designer’s felt experience and puts the sentient body at the forefront,
we outline a comprehensive framework for the creation of novel interactions in the form of couplings that
combine biosensing and body feedback modalities of relevance to affective health. These couplings lie
within the creation of design toolkits that have the potential to render rich embodied interactions to
the designer/user. As a result, we introduce the concept of “orchestration”. By orchestration, we refer
to the design of the overall interaction: coupling sensors to actuation of relevance to the affective
experience; initiating and closing the interaction; habituating; helping improve on the users’ body
awareness and engagement with emotional experiences; soothing, calming, or energising, depending
on the affective health condition and the intentions of the designer. Through the creation of a
range of prototypes and couplings we elicited requirements on broader orchestration mechanisms.
First-person soma design lets researchers look afresh at biosignals that, when experienced through
the body, can reshape affective technologies with novel ways to interpret biodata, feel it,
understand it, and reflect upon our bodies.
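The "orchestration" the authors describe spans initiating the interaction, coupling a sensed biosignal to actuation, and closing it again. A minimal sketch of such a coupling loop is below; the breathing-rate signal, the linear mapping, and all names are illustrative assumptions, not a toolkit from the paper.

```python
# Minimal orchestration sketch: a breathing-rate biosignal coupled to the
# intensity of a calming haptic actuator. Mapping and thresholds are
# illustrative assumptions, not taken from the source.

def actuation_level(breaths_per_min, calm_rate=6.0, max_rate=30.0):
    """Map breathing rate to actuator intensity in [0, 1]:
    faster breathing -> stronger calming feedback."""
    span = max_rate - calm_rate
    level = (breaths_per_min - calm_rate) / span
    return min(1.0, max(0.0, level))

class Orchestration:
    """Initiate, run, and close a sensor-to-actuation coupling."""
    def __init__(self):
        self.active = False

    def initiate(self):
        self.active = True

    def step(self, breaths_per_min):
        # While active, drive the actuator from the live biosignal.
        return actuation_level(breaths_per_min) if self.active else 0.0

    def close(self):
        self.active = False

o = Orchestration()
o.initiate()
print(o.step(18.0))   # elevated breathing -> 0.5 intensity
o.close()
print(o.step(18.0))   # closed interaction -> 0.0
```

A real soma-design coupling would also handle habituation (e.g. slowly attenuating feedback the user has adapted to), which this sketch omits for brevity.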
A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss.
Many upper limb amputees experience an incessant, post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while a multimodal tactile sensor provides temperature, vibration, and skin deformation signals. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain, and increased prosthesis use due to improved functionality and reduced cognitive burden.
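In a tendon-driven hand, artificial proprioception can be as simple as geometry: if a tendon wraps a joint pulley of radius r, a measured tendon excursion d corresponds to a joint rotation θ = d/r. The sketch below illustrates that relation; the pulley radius and excursion values are illustrative assumptions, not BairClaw specifications.

```python
# Hedged sketch of artificial proprioception for a tendon-driven finger:
# joint angle recovered from measured tendon excursion, assuming a fixed
# pulley (moment-arm) radius. Numbers are illustrative, not BairClaw specs.
import math

def joint_angle_deg(tendon_excursion_m, pulley_radius_m):
    """theta = excursion / r (radians), converted to degrees."""
    return math.degrees(tendon_excursion_m / pulley_radius_m)

# 5 mm of tendon travel over a 10 mm pulley -> 0.5 rad, about 28.6 degrees.
print(round(joint_angle_deg(0.005, 0.010), 1))
```

Direct joint-angle encoders, as the abstract mentions, avoid the errors this kinematic estimate accumulates from tendon stretch and friction, which is one reason to measure both angles and tendon tensions.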
Dynamic Facial Prosthetics for Sufferers of Facial Paralysis
Background: This paper discusses the various methods and materials for the fabrication of active artificial facial muscles. The primary use for these will be the reanimation of paralysed or atrophied muscles in sufferers of non-recoverable unilateral facial paralysis.
Method: The prosthetic solution described in this paper is based on sensing muscle motion of the contralateral healthy muscles and replicating that motion across a patient's paralysed side of the face, via solid-state and thin-film actuators. The development of this facial prosthetic device focused on recreating a varying-intensity smile, with emphasis on timing, displacement, and the appearance of the wrinkles and folds that commonly appear around the nose and eyes during the expression. An animatronic face was constructed, with actuations being made to a silicone representation of the musculature using multiple shape-memory alloy cascades. Alongside the artificial-muscle physical prototype, a facial expression recognition software system was constructed. This forms the basis of an automated calibration and reconfiguration system for the artificial muscles following implantation, so as to suit the implantee's unique physiognomy.
Results: An animatronic model face with silicone musculature was designed and built to evaluate the performance of shape-memory alloy artificial muscles, their power control circuitry, and software control systems. A dual facial motion-sensing system was designed to allow real-time control over the model: a piezoresistive flex sensor to measure physical motion, and a computer vision system to evaluate real-to-artificial muscle performance. Analysis of various facial expressions in real subjects was made, which gives useful data upon which to base the system's parameter limits.
Conclusion: The system performed well, and the various strengths and shortcomings of the materials and methods are reviewed and considered for the next research phase, when new polymer-based artificial muscles will be constructed and evaluated.
Key words: artificial muscles, facial prosthetics, stroke rehabilitation, facial paralysis, computer vision, automated facial recognition.
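The sensing-to-actuation path described above (a piezoresistive flex sensor on the healthy side driving an SMA cascade on the paralysed side) can be sketched as a simple mapping: read the sensor through a voltage divider, convert to resistance, and scale the sensed bend to a PWM duty cycle. All component values, resistance ranges, and function names below are assumptions for illustration, not figures from the paper.

```python
# Illustrative flex-sensor-to-SMA mapping. A piezoresistive flex sensor is
# read via a voltage divider (sensor on the low side); its resistance rises
# with bend, and the bend fraction sets the SMA heating PWM duty cycle.
# All resistances and the linear mapping are assumed values.

def flex_resistance(v_out, v_in=5.0, r_fixed=10_000.0):
    """Flex sensor resistance from the divider output voltage."""
    return r_fixed * v_out / (v_in - v_out)

def sma_duty(v_out, r_flat=10_000.0, r_bent=20_000.0):
    """Map sensed bend (flat..fully bent) linearly to PWM duty in [0, 1]."""
    r = flex_resistance(v_out)
    duty = (r - r_flat) / (r_bent - r_flat)
    return min(1.0, max(0.0, duty))

print(sma_duty(2.5))  # flat sensor (10 kOhm) -> 0.0 duty
print(sma_duty(3.0))  # 15 kOhm, halfway bent -> 0.5 duty
```

A deployed controller would add the timing shaping the abstract emphasises (SMA contraction is slow and hysteretic), but the divider-and-scale mapping is the usual first stage of such a pipeline.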
Ubiquitous haptic feedback in human-computer interaction through electrical muscle stimulation
[no abstract]
Robotics of human movements
The construction of robotic systems that can move the way humans do, with respect to agility, stability, and precision, is a necessary prerequisite for the successful integration of robotic systems into human environments. We explain human-centered views on robotics based on three basic ingredients: (1) actuation, (2) sensing, and (3) control, and formulate detailed examples thereof.
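Of the three ingredients, the control loop is the easiest to illustrate in isolation. Below is a minimal sketch of a PD controller driving a single joint (unit inertia, no gravity) to a target angle; the gains and toy dynamics are illustrative assumptions, not a model from the abstract.

```python
# Minimal sketch of the "control" ingredient: a PD law on a single joint
# with unit inertia and no gravity. Gains chosen for critical damping
# (kd = 2*sqrt(kp)); all values are illustrative assumptions.

def simulate_pd(target, kp=25.0, kd=10.0, dt=0.001, steps=5000):
    theta, omega = 0.0, 0.0
    for _ in range(steps):
        torque = kp * (target - theta) - kd * omega  # PD control law
        omega += torque * dt                         # unit-inertia dynamics
        theta += omega * dt                          # semi-implicit Euler step
    return theta

final = simulate_pd(1.0)
print(round(final, 3))  # settles near the 1.0 rad target
```

Human-like agility requires far more than joint-level PD (feedforward dynamics, compliance, whole-body stability), which is exactly the gap the surveyed human-centered view addresses.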