650 research outputs found

    Towards tactile sensing active capsule endoscopy

    Examination of the gastrointestinal (GI) tract has traditionally been performed using tethered endoscopy tools with limited reach, and more recently with passive untethered capsule endoscopy with limited capability. Inspection of the small intestine is only possible with the latter, a capsule endoscope carrying an on-board camera system. Limited to visual means, it cannot detect features beneath the lumen wall unless they have affected the lumen's structure or colour. This work presents an improved capsule endoscopy system with locomotion for active exploration of the small intestine and tactile sensing to detect deformation of the capsule's outer surface as it follows the intestinal wall. In laboratory conditions this system is capable of identifying sub-lumen features such as submucosal tumours. Through an extensive literature review, the current state of GI tract inspection, in particular using remotely operated miniature robotics, was investigated, concluding that no existing solution combines tactile sensing with capsule endoscopy. To achieve such a platform, further investigation was made into tactile sensing technologies, methods of locomotion through the gut, and methods to support the increased power requirement of the additional electronics and actuation. A set of detailed criteria was compiled for a soft-formed sensor and flexible-bodied locomotion system. The sensing system builds on the biomimetic tactile sensing device TacTip (Chorley 2008; Chorley 2010; Winstone 2012; Winstone 2013), which has been redesigned to fit the form of a capsule endoscope. These modifications required a 360° cylindrical sensing surface with a 360° panoramic optical system. Multi-material 3D printing has been used to build an almost complete sensor assembly from a combination of hard and soft materials, giving a soft, compliant tactile sensing system that mimics the tactile sensing methods of the human finger. The cylindrical TacTip has been validated using artificial submucosal tumours in laboratory conditions. The first experiment explored the new form factor and measured the device's ability to detect surface deformation when travelling through a pipe-like structure with varying lump obstructions; sensor data were analysed and used to reconstruct the test environment as a 3D rendered structure. A second tactile sensing experiment explored the use of classifier algorithms to discriminate between three tumour characteristics: shape, size and material hardness. Locomotion of the capsule endoscope drew further bio-inspiration from the earthworm's peristaltic locomotion, which operates in a similar environment. A soft-bodied peristaltic worm robot has been developed that uses a tuned planetary gearbox mechanism to displace tendons that contract each worm segment, and methods have been identified to optimise the gearbox parameters for a pipe-like structure of a given diameter. The locomotion system has been tested within a laboratory-constructed pipe environment, showing that with only one actuator, three independent worm segments can be controlled. This configuration achieves locomotion capabilities comparable to those of an identical robot with an actuator dedicated to each individual worm segment; with fewer parts and actuators it can be miniaturised more easily, and so is more suitable for capsule endoscopy.
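    To make the classifier-based tumour discrimination described above concrete, the sketch below shows one plausible pipeline: summarise each pass of the cylindrical TacTip over a lump as a fixed-length feature vector of marker displacements and feed it to a standard SVM. The data shapes, feature choices and placeholder data are illustrative assumptions, not the thesis's actual implementation.

```python
# Illustrative sketch only (not the thesis's pipeline): discriminating
# artificial submucosal tumour characteristics from tactile data with an SVM.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def marker_features(displacements: np.ndarray) -> np.ndarray:
    """Summarise one pass over a lump as a fixed-length feature vector.

    `displacements` is assumed to be an (n_frames, n_markers) array of radial
    marker displacements from the cylindrical sensing surface.
    """
    n_frames = displacements.shape[0]
    return np.concatenate([
        displacements.max(axis=0),                # peak deformation per marker
        displacements.mean(axis=0),               # average deformation per marker
        displacements.argmax(axis=0) / n_frames,  # normalised time of peak
    ])

# Hypothetical placeholder data: 60 passes, 40 frames, 30 markers, 4 tumour
# classes (e.g. combinations of shape/size/hardness); real data would come
# from the sensor.
rng = np.random.default_rng(0)
recorded_passes = rng.normal(size=(60, 40, 30))
labels = rng.integers(0, 4, size=60)

X = np.stack([marker_features(p) for p in recorded_passes])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```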
    Finally, these two developments have been integrated to demonstrate successful simultaneous locomotion and sensing, detecting an artificial submucosal tumour embedded within the test environment. The addition of both tactile sensing and locomotion creates a need for power beyond what current battery technology can supply. Early-stage work has reviewed wireless power transfer (WPT) as a potential solution to this problem; methods for optimising and miniaturising WPT for a capsule endoscope have been identified, and a laboratory-built system validates them. Future work would combine this with a miniaturised development of the robot presented. This thesis has developed a novel method for sub-lumen examination. With further efforts to miniaturise the robot, it could provide a comfortable, non-invasive procedure for GI tract inspection, reducing the need for surgical procedures and improving accessibility for earlier-stage examination. Furthermore, these developments have applicability in other domains such as veterinary medicine, industrial pipe inspection and exploration of hazardous environments.
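    The wireless power transfer review is only summarised here, but the trade-off it addresses can be illustrated with the standard upper bound on link efficiency for a resonant inductive pair; the coil parameters below are assumed values for illustration, not figures from the thesis.

```python
# Back-of-envelope sketch (assumed, not from the thesis): the standard upper
# bound on resonant inductive link efficiency,
#   eta_max = k^2*Q1*Q2 / (1 + sqrt(1 + k^2*Q1*Q2))^2,
# which shows why coil quality factors and coupling dominate the
# miniaturisation trade-off for an in-body receiver coil.
import math

def wpt_max_efficiency(k: float, q_tx: float, q_rx: float) -> float:
    """Maximum achievable link efficiency for coupling k and coil Q factors."""
    fom = (k ** 2) * q_tx * q_rx          # figure of merit k^2 * Q1 * Q2
    return fom / (1.0 + math.sqrt(1.0 + fom)) ** 2

# Example with assumed values: loose coupling (k = 0.05), Q = 100 transmit
# coil, Q = 30 receive coil small enough to fit a capsule.
print(f"{wpt_max_efficiency(0.05, 100, 30):.1%}")   # ~49%
```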

    Robust Learning-Based Incipient Slip Detection using the PapillArray Optical Tactile Sensor for Improved Robotic Gripping

    The ability to detect slip, particularly incipient slip, enables robotic systems to take corrective measures to prevent a grasped object from being dropped. Therefore, slip detection can enhance the overall security of robotic gripping. However, accurately detecting incipient slip remains a significant challenge. In this paper, we propose a novel learning-based approach to detect incipient slip using the PapillArray (Contactile, Australia) tactile sensor. The resulting model is highly effective in identifying patterns associated with incipient slip, achieving a detection success rate of 95.6% when tested with an offline dataset. Furthermore, we introduce several data augmentation methods to enhance the robustness of our model. When transferring the trained model to a robotic gripping environment distinct from where the training data was collected, our model maintained robust performance, with a success rate of 96.8%, providing timely feedback for stabilizing several practical gripping tasks. Our project website: https://sites.google.com/view/incipient-slip-detection
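    A minimal sketch of the kind of data augmentation described for tactile time series follows; the specific transforms (additive noise, amplitude scaling, temporal jitter) and the per-pillar channel layout are assumptions rather than the authors' exact recipe.

```python
# Illustrative sketch of tactile time-series augmentation; the transforms and
# the channel layout (9 pillars x 3 axes assumed) are generic choices, not
# the authors' exact method.
import numpy as np

rng = np.random.default_rng(0)

def augment(sequence: np.ndarray) -> np.ndarray:
    """Augment one (timesteps, channels) tactile sequence."""
    out = sequence + rng.normal(0.0, 0.01, sequence.shape)   # additive sensor noise
    out = out * rng.uniform(0.9, 1.1)                        # global amplitude scaling
    shift = int(rng.integers(-5, 6))                         # small temporal jitter
    return np.roll(out, shift, axis=0)

# Example: one 100-step sequence with 27 channels (hypothetical layout).
augmented = augment(rng.normal(size=(100, 27)))
print(augmented.shape)
```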

    Robotic simulators for tissue examination training with multimodal sensory feedback

    Tissue examination by hand remains an essential technique in clinical practice. Its effective application depends on skills in sensorimotor coordination, mainly involving haptic, visual, and auditory feedback. The skills clinicians have to learn can be as subtle as regulating finger pressure with breathing, choosing palpation actions, monitoring involuntary facial and vocal expressions in response to palpation, and using pain expressions both as a source of information and as a constraint on physical examination. Patient simulators can provide a safe learning platform for novice physicians before they examine real patients. This paper reviews, for the first time, state-of-the-art medical simulators for tissue examination training, with a consideration of providing multimodal feedback so that as many manual examination techniques as possible can be learned. The study summarizes current advances in tissue examination training devices that simulate different medical conditions and provide different types of feedback modalities. Opportunities arising from developments in pain expression, tissue modeling, actuation, and sensing are also analyzed to support the future design of effective tissue examination simulators.

    The TacTip Family: Soft Optical Tactile Sensors with 3D-Printed Biomimetic Morphologies

    The authors thank Sam Coupland, Gareth Griffiths, and Samuel Forbes for their help with 3D printing and Jason Welsby for his assistance with electronics. N.L. was supported, in part, by a Leverhulme Trust Research Leadership Award on “A biomimetic forebrain for robot touch” (RL-2016-039), and N.L. and M.E.G. were supported, in part, by an EPSRC grant on Tactile Super-resolution Sensing (EP/M02993X/1). L.C. was supported by the EPSRC Centre for Doctoral Training in Future Autonomous and Robotic Systems (FARSCOPE). Peer reviewed.

    Neuromorphic Computing Systems for Tactile Sensing Perception

    Touch sensing plays an important role in humans' daily life; tasks like exploring, grasping and manipulating objects rely deeply on it. As such, robots and hand prostheses endowed with the sense of touch can manipulate objects better and more easily, and physically collaborate with other agents. Towards this goal, information about touched objects and surfaces has to be inferred from the raw data coming from the sensors. The orientation of edges, which is employed as a pre-processing stage in both artificial vision and touch, is a key cue for object discrimination. Inspired by the encoding of edges in human first-order tactile afferents, we developed a biologically inspired spiking architecture that mimics human tactile perception with computational primitives implementable on low-power subthreshold neuromorphic hardware. The network architecture uses three layers of leaky integrate-and-fire (LIF) neurons to distinguish different edge orientations of a bar pressed on the artificial skin of the iCub robot. We demonstrated that the network can learn the appropriate connectivity through unsupervised spike-based learning, and that the number and spatial distribution of sensitive areas within receptive fields are important in edge orientation discrimination. The unconstrained and random structure of the connectivity among layers can produce unbalanced activity in the output neurons, which are driven by a variable amount of synaptic input. We explored two different mechanisms of synaptic normalization (weight normalization and homeostasis), defining how each can be useful during the learning and inference phases. With homeostasis and weight normalization, the network successfully discriminates 35 of 36 orientations (0 to 180 degrees in 5-degree increments). Besides edge orientation discrimination, we modified the network architecture to classify six different touch modalities (poke, press, grab, squeeze, push, and rolling a wheel), demonstrating the ability of the network to learn appropriate connectivity patterns for this classification and achieving a total accuracy of 88.3%. Furthermore, another application scenario, tactile object shape recognition, was considered because of its importance in robotic manipulation: a network architecture with two layers of spiking neurons discriminated the tactile object shapes with 100% accuracy after integrating an array of 160 piezoresistive tactile sensors to which the object shapes were applied.
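    A minimal sketch of the leaky integrate-and-fire dynamics underlying the described three-layer architecture is given below; the parameters, wiring and input statistics are illustrative assumptions, not the implemented network.

```python
# Minimal sketch of leaky integrate-and-fire (LIF) dynamics like those used
# in the described architecture; all values here are illustrative assumptions.
import numpy as np

def lif_layer(input_spikes, weights, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Simulate one LIF layer.

    input_spikes: (timesteps, n_in) binary array of presynaptic spikes.
    weights:      (n_in, n_out) synaptic weights.
    Returns a (timesteps, n_out) binary array of output spikes.
    """
    n_steps = input_spikes.shape[0]
    n_out = weights.shape[1]
    v = np.zeros(n_out)
    out = np.zeros((n_steps, n_out))
    for t in range(n_steps):
        i_syn = input_spikes[t] @ weights     # weighted synaptic input
        v += (dt / tau) * (-v) + i_syn        # leak towards rest, then integrate
        fired = v >= v_thresh
        out[t] = fired
        v[fired] = v_reset                    # reset membrane after a spike
    return out

# Example with assumed Poisson-like input: 200 steps, 64 inputs, 10 outputs.
rng = np.random.default_rng(0)
spikes_in = (rng.random((200, 64)) < 0.05).astype(float)
w = rng.uniform(0.0, 0.2, size=(64, 10))
print(lif_layer(spikes_in, w).sum(axis=0))    # spike count per output neuron
```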