101 research outputs found

    Development of Sensing Systems for Improving Surgical Grasper Performance

    Minimally invasive techniques play a vital and increasing role in modern surgery. In these procedures, surgical graspers are essential in replacing the surgeon’s fingertips as the main manipulator of delicate soft tissues. Current graspers lack haptic feedback, restricting the surgeon to visual feedback. Studies show that this can frequently lead to morbidity or task errors due to the inappropriate application of force. Existing research has sought to address these concerns and improve the safety and performance of grasping by providing haptic feedback to the surgeon; however, an effective method of optimising the grasping task has not been found. This thesis explores new sensing approaches intended to reduce errors when manipulating soft tissues, and presents a novel tactile sensor designed for deployment in the grasper jaw. The requirements were first established through discussion with clinical partners and a literature review, resulting in a conceptual approach that uses multi-axis tactile sensing within the grasper jaw as a potential novel solution. As a foundation to the research, a study was conducted using instrumented graspers to investigate the characteristics of grasp force employed by surgeons of varying skill levels. The prevention of tissue slip was identified as key to preventing grasper misuse, avoiding both abrasion caused by slip and crush damage. To detect this phenomenon, a novel method based on an inductive pressure sensing system was proposed. To investigate the efficacy of this technique, experimental and computational modelling investigations were conducted. Computational models were used to better understand the transducer mechanisms, to optimise the sensor geometry and to evaluate performance in slip detection. Prototype sensors were then fabricated and experimentally evaluated for their ultimate use in slip detection within a surgical grasper. The work concludes by considering future challenges to clinical translation and additional opportunities for this research in other domains.
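
    The abstract does not detail the slip-detection algorithm itself, so the following is only a minimal illustrative sketch: it flags incipient slip when high-frequency fluctuations in a grasper-jaw pressure signal exceed a threshold, one common tactile-sensing approach. The function name, sampling rate and threshold are hypothetical.

        # Illustrative sketch only: the thesis's actual inductive-sensing slip
        # algorithm is not given in the abstract. This flags incipient slip when
        # high-frequency fluctuations in a jaw pressure signal exceed a threshold.
        # All names and values are hypothetical.
        import numpy as np

        def detect_slip(pressure, fs=1000.0, window=0.05, threshold=0.02):
            """Return sample indices where slip-like vibration is detected.

            pressure  : 1-D array of jaw pressure samples (arbitrary units)
            fs        : sampling rate in Hz
            window    : length of the moving window in seconds
            threshold : RMS of the high-frequency component that flags slip
            """
            n = max(int(window * fs), 1)
            # Remove the slow grasp-force trend with a moving average,
            # leaving the vibration component associated with micro-slip.
            kernel = np.ones(n) / n
            trend = np.convolve(pressure, kernel, mode="same")
            vibration = pressure - trend
            # Moving RMS of the vibration component.
            rms = np.sqrt(np.convolve(vibration**2, kernel, mode="same"))
            return np.flatnonzero(rms > threshold)

        # Example: a steady grasp with a burst of vibration half-way through.
        t = np.linspace(0, 1, 1000)
        grasp = 1.0 + 0.001 * np.random.randn(t.size)
        grasp[500:550] += 0.05 * np.sin(2 * np.pi * 200 * t[500:550])
        print(detect_slip(grasp)[:5])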

    Parametric mechanical design and optimisation of the Canterbury Hand.

    As part of worldwide research, humanoid robots have been developed for household, industrial and exploratory applications. If such robots are to interact with people and human-created environments, they will require human-like hands. The objective of this thesis was the parametric design and optimisation of a dexterous and anthropomorphic robotic end effector. Known as the ‘Canterbury Hand’, it has 11 degrees of freedom with four fingers and a thumb. The hand has applications for dexterous teleoperation and object manipulation in industrial, hazardous or uncertain environments such as orbital robotics. The human hand was analysed so that the Canterbury Hand could copy its motions, appearance and grasp types. An analysis of the current literature on experimental prosthetic and robotic hands was also carried out. A disadvantage of many of these hand designs was that they were remotely powered using large, heavy actuator packs. The advantage of the Canterbury Hand is that it has been designed to hold the motors, wires and circuit boards entirely within itself, although a belt-carried battery pack is required. The hand was modelled using a parametric 3D computer-aided design (CAD) program. Two different configurations of the hand were created in the model: one, as a dexterous robot hand, used Ø13 mm, 3 W DC motors, while the other used Ø10 mm, 0.5 W DC motors (although this hand is still slightly too large for a general prosthesis). The parts within the hand were modelled to permit changes to the geometry, which was necessary for the optimisation process. The bearing geometry of the finger and thumb linkages, as well as the thumb rotation axis, was optimised for anthropomorphic motion, appearance and increased force output. A design table within a spreadsheet was created to interact with the CAD models of the hand to quickly implement the optimised geometry. The work reported in this thesis has shown the possibilities for parametric design and optimisation of an anthropomorphic, dexterous robotic hand.
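
    As a rough illustration of the design-table idea mentioned above, the sketch below generates a two-row table (one row per motor configuration) that a parametric CAD package could read to regenerate the model. The parameter names, scaling rule and file name are hypothetical; the thesis's actual table layout is not given.

        # Minimal sketch of a spreadsheet design table driving parametric geometry,
        # in the spirit of the approach described above. The actual CAD package and
        # parameter names used in the thesis are not given; the column names and
        # scaling rule below are hypothetical and for illustration only.
        import csv

        BASE = {"motor_diameter_mm": 13.0, "motor_power_w": 3.0,
                "proximal_link_mm": 45.0, "distal_link_mm": 28.0}

        def scaled_configuration(name, motor_diameter_mm, motor_power_w):
            """Scale the linkage lengths with motor diameter (a placeholder rule)."""
            scale = motor_diameter_mm / BASE["motor_diameter_mm"]
            return {"configuration": name,
                    "motor_diameter_mm": motor_diameter_mm,
                    "motor_power_w": motor_power_w,
                    "proximal_link_mm": round(BASE["proximal_link_mm"] * scale, 2),
                    "distal_link_mm": round(BASE["distal_link_mm"] * scale, 2)}

        rows = [scaled_configuration("dexterous_robot_hand", 13.0, 3.0),
                scaled_configuration("compact_hand", 10.0, 0.5)]

        # Most parametric CAD tools can regenerate the model once per table row.
        with open("design_table.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)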

    Real-time immersive human-computer interaction based on tracking and recognition of dynamic hand gestures

    With the fast development and ever-growing use of computer-based technologies, human-computer interaction (HCI) plays an increasingly pivotal role. In virtual reality (VR), HCI technologies provide not only a better understanding of three-dimensional shapes and spaces, but also sensory immersion and physical interaction. With hand-based HCI being a key modality for object manipulation and gesture-based communication, it is challenging to provide users with a natural, intuitive, effortless, precise and real-time method for HCI based on dynamic hand gestures, due to the complexity of hand postures formed by multiple joints with high degrees of freedom, the speed of hand movements with highly variable trajectories and rapid direction changes, and the precision required for interaction between hands and objects in the virtual world. Presented in this thesis is the design and development of a novel real-time HCI system based on a unique combination of a pair of data gloves with fibre-optic curvature sensors to acquire finger joint angles, a hybrid inertial and ultrasonic tracking system to capture hand position and orientation, and a stereoscopic display system to provide immersive visual feedback. The potential and effectiveness of the proposed system are demonstrated through a number of applications, namely hand-gesture-based virtual object manipulation and visualisation, hand-gesture-based direct sign writing, and hand-gesture-based finger spelling. For virtual object manipulation and visualisation, the system is shown to allow a user to select, translate, rotate, scale, release and visualise virtual objects (presented using graphics and volume data) in three-dimensional space using natural hand gestures in real time. For direct sign writing, the system is shown to display immediately the corresponding SignWriting symbols signed by a user using three different signing sequences and a range of complex hand gestures, which consist of various combinations of hand postures (with each finger open, half-bent, closed, adducted or abducted), eight hand orientations in horizontal and vertical planes, three palm-facing directions, and various hand movements (which can have eight directions in horizontal and vertical planes, and can be repetitive, straight or curved, and clockwise or anti-clockwise). The development includes a special visual interface that gives not only a stereoscopic view of hand gestures and movements, but also structured visual feedback for each stage of the signing sequence. An excellent basis is therefore formed for developing a full HCI based on all human gestures by integrating the proposed system with facial expression and body posture recognition methods. Furthermore, for finger spelling, the system is shown to recognise five vowels signed by two hands using British Sign Language in real time.
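
    As an illustration of mapping data-glove joint angles to the per-finger posture labels mentioned above (open, half-bent, closed), the sketch below uses simple angle thresholds. This is not the thesis's recognition method, which the abstract does not detail; the finger names and threshold values are hypothetical.

        # Illustrative sketch only: a threshold classifier that maps data-glove
        # joint angles (degrees) to per-finger posture labels. Thresholds and
        # names are hypothetical, not the thesis's actual recognition method.
        FINGERS = ("thumb", "index", "middle", "ring", "little")

        def finger_posture(flexion_deg):
            """Classify a single finger from its total flexion angle."""
            if flexion_deg < 30.0:
                return "open"
            if flexion_deg < 90.0:
                return "half-bent"
            return "closed"

        def hand_posture(joint_angles):
            """joint_angles: dict mapping finger name -> list of joint angles (deg)."""
            return {finger: finger_posture(sum(angles))
                    for finger, angles in joint_angles.items()}

        # Example: an index-finger point (all fingers curled except the index).
        sample = {"thumb": [40, 35], "index": [5, 5, 5], "middle": [60, 70, 40],
                  "ring": [65, 75, 45], "little": [70, 80, 50]}
        print(hand_posture(sample))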

    Object Manipulation and Grip-Force Control Using Tactile Sensors

    This dissertation describes a new type of tactile sensor and an improved version of the dynamic tactile sensing approach that can provide a regularly updated and accurate estimate of the minimum applied force needed for the control of gripper manipulation. A pre-slip sensing algorithm is proposed and implemented in a two-finger robot gripper. An algorithm that can discriminate between types of contact surface and recognize object shapes at the contact stage is also proposed. A technique for recognizing objects using tactile sensor arrays, and a method based on quadric surface parameters for classifying grasped objects, are described. Tactile arrays can recognize surface types on contact, making it possible for a tactile system to recognize translation, rotation and scaling of an object independently.

    Humanoid Robots

    For many years, humans have tried in many ways to recreate the complex mechanisms that form the human body. Such a task is extremely complicated and the results are not entirely satisfactory. However, with increasing technological advances based on theoretical and experimental research, it has become possible, to some extent, to copy or imitate some systems of the human body. This research is intended not only to create humanoid robots, a great part of them constituting autonomous systems, but also to offer deeper knowledge of the systems that form the human body, with a view to possible applications in rehabilitation technology, drawing together studies related not only to Robotics, but also to Biomechanics, Biomimetics and Cybernetics, among other areas. This book presents a series of studies inspired by this ideal, carried out by various researchers worldwide, seeking to analyse and discuss diverse subjects related to humanoid robots. The contributions explore aspects of robotic hands, learning, language, vision and locomotion.
