30 research outputs found

    Toward Bio-Inspired Tactile Sensing Capsule Endoscopy for Detection of Submucosal Tumors

    Get PDF
    © 2016 IEEE. Here, we present a method for lump characterization using a bio-inspired remote tactile sensing capsule endoscopy system. Current capsule endoscopy relies on cameras to visually diagnose lesions on the surface of the gastrointestinal tract lumen; our approach introduces remote palpation by deploying a bio-inspired tactile sensing surface that deforms when pressed against both hard and soft raised objects. This can complement visual inspection and provide additional information about the structure of lesions. Using classifier systems, we have shown that lumps of different sizes, shapes, and hardnesses can be distinguished in a synthetic test environment. This is a promising early step toward a remote palpation system used inside the GI tract that restores the clinician's sense of touch.
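The abstract does not specify which features or classifier were used, so the following is only an illustrative sketch under invented assumptions: synthetic deflection profiles are reduced to three summary statistics (peak, mean, spread) and separated with a nearest-centroid classifier; all data and feature choices here are placeholders, not the paper's method.

```python
# Illustrative only: the paper's actual features and classifier are not
# given in the abstract. Synthetic deflection profiles are reduced to
# (peak, mean, spread) and classified by nearest centroid.
import numpy as np

def extract_features(deflections):
    d = np.asarray(deflections, dtype=float)
    return np.array([d.max(), d.mean(), d.std()])

class NearestCentroid:
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
                           for c in self.labels_}
        return self

    def predict(self, x):
        # nearest centroid in Euclidean feature space
        return min(self.labels_,
                   key=lambda c: np.linalg.norm(x - self.centroids_[c]))

# Synthetic training data: "hard" lumps give sharp, high-peak deflections;
# "soft" lumps give smoother, lower-peak profiles.
X = [extract_features(d) for d in ([0.9, 0.2, 0.1], [0.8, 0.3, 0.1],
                                   [0.3, 0.25, 0.2], [0.35, 0.3, 0.25])]
y = ["hard", "hard", "soft", "soft"]
clf = NearestCentroid().fit(X, y)
label = clf.predict(extract_features([0.85, 0.25, 0.1]))
```

In practice the real system would extract features from the deforming tactile membrane rather than from hand-written arrays, but the train/predict structure would be similar.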

    Towards tactile sensing active capsule endoscopy

    Get PDF
    Examination of the gastrointestinal (GI) tract has traditionally been performed using tethered endoscopy tools with limited reach and, more recently, with passive untethered capsule endoscopy with limited capability. Inspection of the small intestines is only possible using the latter, a capsule endoscopy with an on-board camera system. Limited to visual means, it cannot detect features beneath the lumen wall if they have not affected the lumen's structure or colour. This work presents an improved capsule endoscopy system with locomotion for active exploration of the small intestines and tactile sensing to detect deformation of the capsule's outer surface as it follows the intestinal wall. In laboratory conditions this system is capable of identifying sub-lumen features such as submucosal tumours. Through an extensive literature review, the current state of GI tract inspection, in particular using remotely operated miniature robotics, was investigated, concluding that no solution currently exists that utilises tactile sensing with a capsule endoscopy. In order to achieve such a platform, further investigation was made into tactile sensing technologies, methods of locomotion through the gut, and methods to support an increased power requirement for additional electronics and actuation. A set of detailed criteria was compiled for a soft-formed sensor and flexible-bodied locomotion system. The sensing system is built on the biomimetic tactile sensing device, Tactip (Chorley 2008, 2010; Winstone 2012, 2013), which has been redesigned to fit the form of a capsule endoscopy. These modifications have required a 360° cylindrical sensing surface with a 360° panoramic optical system. Multi-material 3D printing has been used to build an almost complete sensor assembly with a combination of hard and soft materials, presenting a soft, compliant tactile sensing system that mimics the tactile sensing methods of the human finger.
The cylindrical Tactip has been validated using artificial submucosal tumours in laboratory conditions. The first experiment explored the new form factor and measured the device's ability to detect surface deformation when travelling through a pipe-like structure with varying lump obstructions. Sensor data were analysed and used to reconstruct the test environment as a 3D rendered structure. A second tactile sensing experiment explored the use of classifier algorithms to successfully discriminate between three tumour characteristics: shape, size and material hardness. Locomotion of the capsule endoscopy drew further bio-inspiration from the earthworm's peristaltic locomotion, as earthworms operate in a similar environment. A soft-bodied peristaltic worm robot has been developed that uses a tuned planetary gearbox mechanism to displace tendons that contract each worm segment. Methods have been identified to optimise the gearbox parameters for a pipe-like structure of a given diameter. The locomotion system has been tested within a laboratory-constructed pipe environment, showing that, using only one actuator, three independent worm segments can be controlled. This configuration achieves locomotion capabilities comparable to those of an identical robot with an actuator dedicated to each individual worm segment. This system can be miniaturised more easily due to its reduced part count and number of actuators, and so is more suitable for capsule endoscopy. Finally, these two developments have been integrated to demonstrate successful simultaneous locomotion and sensing, detecting an artificial submucosal tumour embedded within the test environment. The addition of both tactile sensing and locomotion has created a need for additional power beyond what is available from current battery technology. Early-stage work has reviewed wireless power transfer (WPT) as a potential solution to this problem.
Methods for optimisation and miniaturisation to implement WPT on a capsule endoscopy have been identified, with a laboratory-built system that validates the methods found. Future work would see this combined with a miniaturised development of the robot presented. This thesis has developed a novel method for sub-lumen examination. With further efforts to miniaturise the robot, it could provide a comfortable and non-invasive procedure for GI tract inspection, reducing the need for surgical procedures and improving accessibility for earlier-stage examination. Furthermore, these developments have applicability in other domains such as veterinary medicine, industrial pipe inspection and exploration of hazardous environments.
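The single-actuator, three-segment result described above lends itself to a simple phase-offset abstraction. The sketch below is hypothetical and does not model the thesis's actual tuned planetary gearbox: it only shows the idea of one motor angle driving several segments whose contractions follow fixed phase offsets, so a peristaltic wave travels along the body.

```python
import math

# Hypothetical abstraction of single-actuator peristalsis: one motor
# angle drives n segments whose contractions follow fixed phase offsets,
# producing a travelling wave. The thesis's real mechanism (a planetary
# gearbox displacing tendons) is not modelled here.
def segment_contractions(motor_angle, n_segments=3):
    """Contraction level per segment: 0 = relaxed, 1 = fully contracted."""
    return [0.5 * (1.0 + math.cos(motor_angle - k * 2.0 * math.pi / n_segments))
            for k in range(n_segments)]

wave = segment_contractions(0.0)  # segment 0 fully contracted, others partial
```

Sweeping `motor_angle` continuously makes the point of maximum contraction advance from segment to segment, which is the coordination an earthworm achieves with its peristaltic wave.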

    MultiTip: A multimodal mechano-thermal soft fingertip

    Get PDF

    Fine-grained Haptics: Sensing and Actuating Haptic Primary Colours (force, vibration, and temperature)

    Get PDF
    This thesis discusses the development of a multimodal, fine-grained visual-haptic system for teleoperation and robotic applications. This system is primarily composed of two complementary components: an input device known as the HaptiTemp sensor (combining “Haptics” and “Temperature”), a novel thermosensitive GelSight-like sensor, and an output device, an untethered multimodal fine-grained haptic glove. The HaptiTemp sensor is a visuotactile sensor that can sense the haptic primary colours: force, vibration, and temperature. It has novel switchable UV markers that can be made visible using UV LEDs. The switchable markers are a key novelty of the HaptiTemp because they allow analysis of tactile information from gel deformation without impairing the ability to classify or recognise images. The use of switchable markers in the HaptiTemp sensor resolves the trade-off between marker density and capturing high-resolution images with a single sensor. The HaptiTemp sensor can measure vibrations by counting the number of blobs or pulses detected per unit time using a blob detection algorithm. For the first time, temperature detection was incorporated into a GelSight-like sensor, making the HaptiTemp a haptic primary colours sensor. The HaptiTemp sensor can also perform rapid temperature sensing, with a 643 ms response time over the 31°C to 50°C temperature range. This fast temperature response is comparable to the withdrawal reflex response in humans; to the authors' knowledge, this is the first sensor in the robotics community able to trigger a sensory impulse that mimics a human reflex. The HaptiTemp sensor can also perform simultaneous temperature sensing and image classification using a machine vision camera, the OpenMV Cam H7 Plus, a capability that has not previously been reported or demonstrated by any tactile sensor.
The HaptiTemp sensor can be used in teleoperation because it can transmit tactile analysis and image classification results over wireless communication. The HaptiTemp sensor is the closest thing to human skin in tactile sensing, tactile pattern recognition, and rapid temperature response. In order to feel what the HaptiTemp sensor is touching from a distance, a corresponding output device, an untethered multimodal haptic hand wearable, was developed to actuate the haptic primary colours sensed by the HaptiTemp sensor. This wearable communicates wirelessly and has fine-grained cutaneous feedback for feeling the edges or surfaces of the tactile images captured by the HaptiTemp sensor. The untethered multimodal haptic hand wearable has gradient kinesthetic force feedback that can restrict finger movements based on the force estimated by the HaptiTemp sensor. A retractable string from an ID badge holder, equipped with mini-servos that control the stiffness of the wire, is attached to each fingertip to restrict finger movements. Vibrations detected by the HaptiTemp sensor can be actuated by the tapping motion of the tactile pins or by a buzzing mini-vibration motor. There is also a tiny annular Peltier device, or ThermoElectric Generator (TEG), with a mini-vibration motor, forming thermo-vibro feedback in the palm area that can be activated by a ‘hot’ or ‘cold’ signal from the HaptiTemp sensor. The haptic primary colours can also be embedded in a VR environment and actuated by the multimodal hand wearable. A VR application was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel the contours of virtual objects. Collision detection scripts were embedded to activate the corresponding actuator in the multimodal haptic hand wearable whenever the tactile matrix simulator or hand avatar in VR collides with a virtual object. The TEG also becomes warm or cold depending on the virtual object the participant has touched.
Tests were conducted to explore virtual objects in 2D and 3D environments using Leap Motion control and a VR headset (Oculus Quest 2). Moreover, fine-grained cutaneous feedback was developed to feel the edges or surfaces of a tactile image, such as the tactile images captured by the HaptiTemp sensor, or to actuate tactile patterns on 2D or 3D virtual objects. The prototype is like an exoskeleton glove with 16 tactile actuators (tactors) on each fingertip, 80 tactile pins in total, made from commercially available P20 Braille cells. Each tactor can be controlled individually to enable the user to feel the edges or surfaces of images, such as the high-resolution tactile images captured by the HaptiTemp sensor. This hand wearable can be used to enhance the immersive experience in a virtual reality environment. The tactors can be actuated in a tapping manner, creating a form of vibration feedback distinct from the buzzing vibration produced by a mini-vibration motor. The tactile pin height can also be varied, creating a gradient of pressure on the fingertip. Finally, the integration of the high-resolution HaptiTemp sensor and the untethered multimodal, fine-grained haptic hand wearable is presented, forming a visuotactile system for sensing and actuating haptic primary colours. Force, vibration, and temperature sensing tests, with corresponding force, vibration, and temperature actuating tests, have demonstrated a unified visual-haptic system. Aside from sensing and actuating haptic primary colours, touching the edges or surfaces of the tactile images captured by the HaptiTemp sensor was carried out using the fine-grained cutaneous feedback of the haptic hand wearable.
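The vibration measurement described in this abstract, counting blob pulses per unit time, can be sketched as a rising-edge counter over a thresholded signal. The 1D signal, threshold, and window below are invented placeholders standing in for the real blob-detection output of the camera pipeline.

```python
# Sketch of pulse-per-unit-time vibration estimation. The real HaptiTemp
# pipeline counts blobs detected in camera frames; here a synthetic 1D
# signal stands in for the blob-detection output.
def count_pulses(signal, threshold):
    """Count rising edges where the signal crosses above the threshold."""
    count, above = 0, False
    for v in signal:
        if v >= threshold and not above:
            count, above = count + 1, True
        elif v < threshold:
            above = False
    return count

def vibration_frequency_hz(signal, threshold, window_s):
    """Pulses counted over the capture window, expressed per second."""
    return count_pulses(signal, threshold) / window_s

freq = vibration_frequency_hz([0, 1, 0, 1, 0, 1, 0], 0.5, 1.0)  # 3 pulses in 1 s
```

Tracking the "above threshold" state prevents a single sustained pulse from being counted more than once, which is the same debouncing a frame-to-frame blob counter needs.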

    Development of PVDF tactile dynamic sensing in a behaviour-based assembly robot

    Get PDF
    The research presented in this thesis focuses on the development of tactile event signature sensors and their application, especially in reactive behaviour-based robotic assembly systems. In pursuit of practical and economic sensors for detecting part contact, the application of PVDF (polyvinylidene fluoride) film, a mechanical-vibration-sensitive piezo material, is investigated. A Clunk Sensor is developed which remotely detects impact vibrations, and a Push Sensor is developed which senses small changes in the deformation of a compliant finger surface. The Push Sensor is further developed to provide some force-direction and force-pattern sensing capability. By being able to detect changes of state in an assembly, such as a change of contact force, an assembly robot can be well informed of current conditions. The complex structure of assembly tasks provides a rich context within which to interpret changes of state, so simple binary sensors can conveniently supply a lot more information than in the domain of mobile robots. Guarded motions, for example, which require sensing a change of state, have long been recognised as very useful in part-mating tasks. Guarded motions are particularly well suited to be components of assembly behavioural modules. In behaviour-based robotic assembly systems, the high-level planner is endowed with as little complexity as possible while the low-level planning execution agent deals with actual sensing and action. Highly reactive execution agents can provide advantages by encapsulating low-level sensing and action, hiding the details of sensori-motor complexity from the higher levels. Because behaviour-based assembly systems emphasise the utility of this kind of qualitative state-change sensor (as opposed to sensors which measure physical quantities), the robustness and utility of the Push Sensor was tested in an experimental behaviour-based system.
An experimental task of pushing a ring along a convoluted stiff wire was chosen, in which the tactile sensors developed here are aided by vision. Three different methods of combining these different sensors within the general behaviour-based paradigm are implemented and compared. This exercise confirms the robustness and utility of the PVDF-based tactile sensors. We argue that the comparison suggests that, for behaviour-based assembly systems using multiple concurrent sensor systems, bottom-level motor control in terms of force or velocity would be more appropriate than positional control. Behaviour-based systems have traditionally tried to avoid symbolic knowledge. Considering this in the light of the above work, it was found useful to develop a taxonomy of types of knowledge and to refine the prohibition.
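A guarded motion of the kind this abstract relies on, move until a binary state-change sensor fires, can be sketched with stubbed sensing and motion. The function names and step model below are illustrative stand-ins, not taken from the thesis.

```python
# Illustrative guarded motion: advance step by step until a binary
# contact event (e.g. from a PVDF Push Sensor) is reported, then stop.
# step_fn and contact_fn are stand-ins for real motion and sensing.
def guarded_move(step_fn, contact_fn, max_steps):
    """Advance until contact fires; return (steps_taken, contact_detected)."""
    for step in range(max_steps):
        if contact_fn():
            return step, True
        step_fn()
    return max_steps, False

# Example: a simulated sensor that reports contact on the fourth poll.
events = iter([False, False, False, True])
result = guarded_move(step_fn=lambda: None,
                      contact_fn=lambda: next(events),
                      max_steps=10)
```

The value of the binary sensor here is exactly what the abstract argues: within a structured task, a single state change ("contact happened") carries enough context to terminate a motion, with no need to measure the contact force itself.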

    Contemporary Robotics

    Get PDF
    This book is a collection of 18 chapters written by internationally recognized experts and well-known professionals of the field. Chapters contribute to diverse facets of contemporary robotics and autonomous systems. The volume is organized in four thematic parts according to the main subjects, regarding recent advances in contemporary robotics. The first thematic part of the book is devoted to theoretical issues. This includes development of algorithms for automatic trajectory generation using a redundancy resolution scheme, intelligent algorithms for robotic grasping, a modelling approach for reactive mode handling of flexible manufacturing, and design of an advanced controller for robot manipulators. The second part of the book deals with different aspects of robot calibration and sensing. This includes geometric and threshold calibration of a multiple robotic line-vision system, robot-based inline 2D/3D quality monitoring using picture-giving and laser triangulation, and a study on prospective polymer composite materials for flexible tactile sensors. The third part addresses issues of mobile robots and multi-agent systems, including SLAM of mobile robots based on fusion of odometry and visual data, configuration of a localization system by a team of mobile robots, development of a generic real-time motion controller for differential mobile robots, control of fuel cells of mobile robots, modelling of omni-directional wheel-based robots, building of a hunter-hybrid tracking environment, as well as design of a cooperative control in a distributed population-based multi-agent approach. The fourth part presents recent approaches and results in humanoid and bio-inspired robotics.
It deals with design of adaptive control of anthropomorphic biped gait, building of dynamics-based simulation for humanoid robot walking, building a controller for the perceptual motor control dynamics of humans, and a biomimetic approach to controlling a mechatronic structure using smart materials.