
    Sensors for Robotic Hands: A Survey of State of the Art

    Recent decades have seen significant progress in the field of artificial hands. Most surveys that try to capture the latest developments in this field focus on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. In order to present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot hand usage in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and also hints at future directions in sensor development for artificial hands.

    Functional mimicry of Ruffini receptors with fibre Bragg gratings and deep neural networks enables a bio-inspired large-area tactile-sensitive skin

    Collaborative robots are expected to physically interact with humans in daily living and the workplace, including industrial and healthcare settings. A key enabling technology is tactile sensing, which currently requires addressing the outstanding scientific challenge of simultaneously detecting contact location and intensity by means of soft, conformable artificial skins that adapt over large areas to the complex curved geometries of robot embodiments. In this work, the development of a large-area sensitive soft skin with a curved geometry is presented, allowing for total-body robot coverage through modular patches. The biomimetic skin consists of a soft polymeric matrix, resembling a human forearm, embedded with photonic fibre Bragg grating transducers, which partially mimic Ruffini mechanoreceptor functionality with diffuse, overlapping receptive fields. A convolutional neural network deep learning algorithm and a multigrid neuron integration process were implemented to decode the fibre Bragg grating sensor outputs and infer contact force magnitude and localization across the skin surface. Median errors of 35 mN (interquartile range 56 mN) for force and 3.2 mm (interquartile range 2.3 mm) for localization were achieved. Demonstrations with an anthropomorphic arm pave the way towards artificial-intelligence-based integrated skins enabling safe human–robot cooperation via machine intelligence.
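    A rough sense of what such a learned decoder looks like is sketched below: a small 1-D convolutional network that maps a vector of fibre Bragg grating wavelength-shift readings to a contact force and a 2-D contact location. This is an illustrative sketch in PyTorch, not the authors' model; the number of FBG channels, the layer sizes, and the output units are assumptions.

    ```python
    # Hedged sketch: a small 1-D CNN mapping FBG wavelength shifts to
    # [contact force, x position, y position]. Channel count and layer
    # sizes are illustrative assumptions, not values from the paper.
    import torch
    import torch.nn as nn

    N_FBG = 48  # assumed number of FBG transducers in one skin patch

    class FBGContactNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
                nn.AdaptiveAvgPool1d(8),
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 8, 64), nn.ReLU(),
                nn.Linear(64, 3),  # [force, x, y]
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, N_FBG) wavelength-shift readings
            return self.head(self.features(x.unsqueeze(1)))

    model = FBGContactNet()
    readings = torch.randn(4, N_FBG)   # placeholder sensor samples
    predictions = model(readings)      # shape (4, 3): force and location
    ```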

    All the Feels: A dexterous hand with large area sensing

    High cost and lack of reliability have precluded the widespread adoption of dexterous hands in robotics. Furthermore, the lack of a viable tactile sensor capable of sensing over the entire area of the hand impedes the rich, low-level feedback that would improve learning of dexterous manipulation skills. This paper introduces an inexpensive, modular, robust, and scalable platform, the DManus, aimed at resolving these challenges while satisfying the large-scale data collection capabilities demanded by deep robot learning paradigms. Studies on human manipulation point to the criticality of low-level tactile feedback in performing everyday dexterous tasks. The DManus comes with ReSkin sensing on the entire surface of the palm as well as the fingertips. We demonstrate the effectiveness of the fully integrated system in a tactile-aware task: bin picking and sorting. Code, documentation, design files, detailed assembly instructions, trained models, task videos, and all supplementary materials required to recreate the setup can be found at http://roboticsbenchmarks.org/platforms/dmanus
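    As an illustration of the kind of low-level tactile pipeline such a platform supports, the sketch below trains a small classifier that guesses what is in the gripper from flattened ReSkin magnetometer readings, loosely mirroring the bin picking-and-sorting demonstration. This is not the DManus codebase; the feature dimension, class labels, and classifier choice are assumptions, and the training data here are placeholders.

    ```python
    # Hedged sketch: classify the grasped object type from one ReSkin
    # tactile snapshot. All dimensions and labels below are illustrative.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    N_FEATURES = 5 * 3  # assumed: 5 sensing boards, 3-axis magnetic flux each

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, N_FEATURES))   # placeholder logged grasps
    y_train = rng.integers(0, 3, size=200)         # e.g. 0=soft, 1=rigid, 2=empty

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)

    snapshot = rng.normal(size=(1, N_FEATURES))    # one reading at grasp time
    print("predicted object class:", clf.predict(snapshot)[0])
    ```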

    A future of living machines? International trends and prospects in biomimetic and biohybrid systems

    Research in the fields of biomimetic and biohybrid systems is developing at an accelerating rate. Biomimetics can be understood as the development of new technologies using principles abstracted from the study of biological systems; however, it can also be viewed from an alternate perspective as an important methodology for improving our understanding of the world we live in and of ourselves as biological organisms. A biohybrid entity comprises at least one artificial (engineered) component combined with a biological one. With technologies such as microscale mobile computing, prosthetics, and implants, humankind is moving towards a more biohybrid future in which biomimetics helps us to engineer biocompatible technologies. This paper reviews recent progress in the development of biomimetic and biohybrid systems, focusing particularly on technologies that emulate living organisms: living machines. Based on our recent bibliographic analysis [1], we examine how biomimetics is already creating life-like robots and identify some key unresolved challenges that constitute bottlenecks for the field. Drawing on our recent research in biomimetic mammalian robots, including humanoids, we review the future prospects for such machines and consider some of their likely impacts on society, including the existential risk of creating artifacts with significant autonomy that could come to match or exceed humankind in intelligence. We conclude that living machines are more likely to be a benefit than a threat, but that we should also ensure that progress in biomimetics and biohybrid systems is made with broad societal consent.

    Design and development of robust hands for humanoid robots


    Connecting YARP to the Web with yarp.js

    We present yarp.js, a JavaScript framework that enables robotics networks to interface and interact with external devices by exploiting modern Web communication protocols. By connecting a YARP server module with a browser client on any external device, yarp.js allows on-board sensors to be accessed using standard Web APIs and the acquired data to be streamed through the yarp.js network without the need for any installation. Communication between YARP modules and yarp.js clients is bi-directional, which also opens the possibility for robotics applications to exploit the capabilities of modern browsers to process external data, such as speech synthesis, 3D data visualization, or video streaming, to name a few. yarp.js requires only a browser installed on the client device, allowing for fast and easy deployment of novel applications. The code and sample applications to get started with the proposed framework are available to the community at the yarp.js GitHub repository.
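    For context, the sketch below shows the robot-side half of such a setup: a minimal YARP module (written here with the standard YARP Python bindings rather than yarp.js itself) that streams a scalar sensor value on a named port, which a yarp.js client in a browser could then read. The port name, update rate, and read_sensor stub are assumptions for illustration.

    ```python
    # Hedged sketch: publish a sensor value on a YARP port at ~10 Hz so a
    # remote yarp.js client can read it. Port name and rate are assumptions.
    import time
    import yarp

    def read_sensor() -> float:
        # stand-in for a real on-board sensor read
        return 42.0

    yarp.Network.init()
    port = yarp.BufferedPortBottle()
    port.open("/myrobot/sensor:o")       # hypothetical port name

    try:
        while True:
            bottle = port.prepare()
            bottle.clear()
            bottle.addFloat64(read_sensor())
            port.write()
            time.sleep(0.1)
    finally:
        port.close()
        yarp.Network.fini()
    ```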

    Enabling Force Sensing During Ground Locomotion: A Bio-Inspired, Multi-Axis, Composite Force Sensor Using Discrete Pressure Mapping

    This paper presents a new force sensor design approach that maps local sampling of pressure inside a composite polymeric footpad to forces in three axes, designed for running robots. Conventional multi-axis force sensors made of heavy metallic materials tend to be too bulky and heavy to be fitted in the feet of legged robots, and are vulnerable to inertial noise under high acceleration. To satisfy the requirements of high-speed running, which include mitigating high impact forces, protecting the sensors from ground collision, and enhancing traction, these stiff sensors should be paired with additional layers of durable, soft materials; but this also degrades the integrity of the foot structure. The proposed foot sensor is manufactured as a monolithic, composite structure composed of an array of barometric pressure sensors completely embedded in a protective polyurethane rubber layer. This composite architecture allows the layers to provide compliance and traction during foot collision, while the deformation and the sampled pressure distribution of the structure can be mapped into a three-axis force measurement. Normal and shear forces can be measured upon contact with the ground, which causes the footpad to deform and change the readings of the individual pressure sensors in the array. A one-time training process using an artificial neural network is all that is necessary to relate the normal and shear forces to the multi-axis foot sensor output. The results show that the sensor can predict normal forces along the Z-axis up to 300 N with a root-mean-squared error of 0.66%, and forces up to 80 N along the X- and Y-axes. The experimental results demonstrate a proof of concept for a lightweight, low-cost, yet robust footpad sensor suitable for use in legged robots undergoing ground locomotion. (Funding: United States Defense Advanced Research Projects Agency, Maximum Mobility and Manipulation (M3) Program; Singapore Agency for Science, Technology and Research.)
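    The one-time learned mapping can be pictured with the short sketch below: a small neural network regressor trained to map the embedded barometer array readings to three-axis foot forces. The array size, network size, and the placeholder calibration data are assumptions; the paper's own network and data are not reproduced here.

    ```python
    # Hedged sketch: learn pressure-array -> [Fx, Fy, Fz] once, then reuse it.
    # Array size and network size are illustrative assumptions.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    N_BAROMETERS = 8  # assumed number of embedded barometric sensors

    rng = np.random.default_rng(1)
    pressures = rng.normal(size=(1000, N_BAROMETERS))  # placeholder calibration readings
    forces = rng.normal(size=(1000, 3))                # placeholder ground-truth forces [N]

    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=1)
    net.fit(pressures, forces)                         # the "one-time training process"

    sample = rng.normal(size=(1, N_BAROMETERS))
    print("estimated [Fx, Fy, Fz]:", net.predict(sample)[0])
    ```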

    Design of a Tactile Sensor for Robot Hands


    Peripersonal Space in the Humanoid Robot iCub

    Developing behaviours for interaction with objects close to the body is a primary goal for any organism to survive in the world. Being able to develop such behaviours will be an essential feature of autonomous humanoid robots in order to improve their integration into human environments. Adaptable spatial abilities will make robots safer and improve their social skills and their human-robot and robot-robot collaboration abilities. This work investigated how a humanoid robot can explore and create action-based representations of its peripersonal space, the region immediately surrounding the body where reaching is possible without displacement of the body. It presents three empirical studies based on peripersonal space findings from psychology, neuroscience and robotics. The experiments used a visual perception system based on active vision and biologically inspired neural networks. The first study investigated the contribution of binocular vision in a reaching task. Results indicated that the vergence signal is a useful embodied depth-estimation cue within the peripersonal space of humanoid robots. The second study explored the influence of morphology and postural experience on confidence levels in reaching assessment. Results showed a decrease in confidence when assessing targets located farther from the body, possibly in accordance with errors in depth estimation from vergence at longer distances. Additionally, it was found that a proprioceptive arm-length signal extends the robot’s peripersonal space. The last experiment modelled development of the reaching skill by implementing motor synergies that progressively unlock degrees of freedom in the arm. The model was advantageous when compared to one that included no developmental stages. The contribution to knowledge of this work lies in extending research on biologically inspired methods for building robots, presenting new ways to further investigate the robotic properties involved in dynamic adaptation to body and sensing characteristics, vision-based action, morphology, and confidence levels in reaching assessment. (Funding: CONACyT, the Mexican National Council of Science and Technology.)
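    The vergence-based depth cue from the first study can be made concrete with the short sketch below: for two cameras fixating the same midline point, the distance follows from the eye baseline and the vergence angle by simple triangulation. The 0.068 m baseline is an approximate value for iCub and is an assumption for illustration; the work's own estimator is not reproduced here.

    ```python
    # Hedged sketch: distance to a fixated midline target from the total
    # vergence angle, d = (baseline / 2) / tan(vergence / 2).
    import math

    EYE_BASELINE_M = 0.068  # assumed distance between the robot's cameras

    def depth_from_vergence(vergence_rad: float) -> float:
        return (EYE_BASELINE_M / 2.0) / math.tan(vergence_rad / 2.0)

    # Small vergence -> far target (less reliable); large vergence -> near target.
    for deg in (4.0, 20.0):
        d = depth_from_vergence(math.radians(deg))
        print(f"vergence {deg:>4.1f} deg -> ~{d:.2f} m")
    ```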