900 research outputs found

    Autonomous Underwater Intervention: Experimental Results of the MARIS Project

    Simetti, E.; Wanderlingh, F.; Torelli, S.; Bibuli, M.; Odetti, A.; Bruzzone, G.; Lodi Rizzini, D.; Aleotti, J.; Palli, G.; Moriello, L.; Scarcia, U.

    Towards tactile sensing active capsule endoscopy

    Examination of the gastrointestinal (GI) tract has traditionally been performed using tethered endoscopy tools with limited reach, and more recently with passive untethered capsule endoscopy with limited capability. Inspection of the small intestines is only possible using the latter: capsule endoscopy with an on-board camera system. Because it is limited to visual means, it cannot detect features beneath the lumen wall if they have not affected the lumen's structure or colour. This work presents an improved capsule endoscopy system with locomotion for active exploration of the small intestines and tactile sensing to detect deformation of the capsule's outer surface as it follows the intestinal wall. In laboratory conditions this system is capable of identifying sub-lumen features such as submucosal tumours.

    Through an extensive literature review, the current state of GI tract inspection, in particular using remotely operated miniature robotics, was investigated, concluding that no existing solution combines tactile sensing with capsule endoscopy. To achieve such a platform, further investigation was made into tactile sensing technologies, methods of locomotion through the gut, and ways to support the increased power requirement of additional electronics and actuation. A set of detailed criteria was compiled for a soft-formed sensor and flexible-bodied locomotion system. The sensing system is built on the biomimetic tactile sensing device TacTip (Chorley 2008, 2010; Winstone 2012, 2013), which has been redesigned to fit the form factor of a capsule endoscope. These modifications required a 360° cylindrical sensing surface with a 360° panoramic optical system. Multi-material 3D printing has been used to build an almost complete sensor assembly from a combination of hard and soft materials, yielding a soft, compliant tactile sensing system that mimics the tactile sensing methods of the human finger.

    The cylindrical TacTip has been validated using artificial submucosal tumours in laboratory conditions. The first experiment explored the new form factor and measured the device's ability to detect surface deformation when travelling through a pipe-like structure with varying lump obstructions. Sensor data were analysed and used to reconstruct the test environment as a 3D-rendered structure. A second tactile sensing experiment explored the use of classifier algorithms to discriminate between three tumour characteristics: shape, size and material hardness. Locomotion of the capsule has drawn further bio-inspiration from the peristaltic locomotion of earthworms, which operate in a similar environment. A soft-bodied peristaltic worm robot has been developed that uses a tuned planetary gearbox mechanism to displace tendons that contract each worm segment. Methods have been identified to optimise the gearbox parameters for a pipe-like structure of a given diameter. The locomotion system has been tested in a laboratory-constructed pipe environment, showing that three independent worm segments can be controlled using only one actuator. This configuration achieves locomotion capabilities comparable to those of an identical robot with an actuator dedicated to each worm segment. Having fewer parts and actuators, this system can be miniaturised more easily and so is better suited to capsule endoscopy. Finally, these two developments have been integrated to demonstrate successful simultaneous locomotion and sensing, detecting an artificial submucosal tumour embedded within the test environment. The addition of both tactile sensing and locomotion has created a need for power beyond what current battery technology can supply. Early-stage work has reviewed wireless power transfer (WPT) as a potential solution to this problem.
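The tumour-discrimination experiment above can be illustrated with a minimal nearest-centroid sketch. The feature names, numeric values and class labels below are invented for illustration; they are not the thesis's actual data, features or classifier.

```python
# Hypothetical sketch: discriminating tumour characteristics from tactile
# features, in the spirit of the classifier experiments described above.
# All samples and labels are synthetic, for illustration only.
import math

# Each synthetic sample: (mean pin deflection, deflected-area size) -> label.
TRAIN = [
    ((0.9, 2.0), "hard-small"),
    ((1.0, 2.2), "hard-small"),
    ((0.5, 5.0), "soft-large"),
    ((0.6, 5.4), "soft-large"),
    ((0.8, 5.1), "hard-large"),
    ((0.9, 4.9), "hard-large"),
]

def centroids(samples):
    """Average the feature vectors of each class."""
    sums, counts = {}, {}
    for (x, y), label in samples:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {l: (sx / counts[l], sy / counts[l]) for l, (sx, sy) in sums.items()}

def classify(sample, cents):
    """Assign the class whose centroid is nearest in feature space."""
    x, y = sample
    return min(cents, key=lambda l: math.hypot(x - cents[l][0], y - cents[l][1]))

cents = centroids(TRAIN)
print(classify((0.95, 2.1), cents))  # a stiff, small deformation -> hard-small
```

In practice the thesis reports discriminating shape, size and hardness with classifier algorithms; a nearest-centroid rule is simply one of the smallest classifiers that makes the idea concrete.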
    Methods for optimisation and miniaturisation to implement WPT on a capsule endoscope have been identified, with a laboratory-built system that validates the methods found. Future work would see this combined with a miniaturised development of the robot presented. This thesis has developed a novel method for sub-lumen examination. With further efforts to miniaturise the robot, it could provide a comfortable and non-invasive procedure for GI tract inspection, reducing the need for surgical procedures and enabling earlier-stage examination. Furthermore, these developments have applicability in other domains such as veterinary medicine, industrial pipe inspection and the exploration of hazardous environments.

    Development and evaluation of mixed reality-enhanced robotic systems for intuitive tele-manipulation and telemanufacturing tasks in hazardous conditions

    In recent years, with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robot-assisted medical devices, there is an urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations. The value of medical telerobotic applications during the recent coronavirus pandemic has also been demonstrated and will grow in the future. This thesis investigates novel approaches to the development and evaluation of a mixed reality (MR)-enhanced telerobotic platform for intuitive remote teleoperation in dangerous and difficult working conditions, such as contaminated sites and undersea or extreme welding scenarios. This research aims to remove human workers from harmful working environments by equipping complex robotic systems with human intelligence and command/control via intuitive and natural human-robot interaction, including the implementation of MR techniques to improve the user's situational awareness, depth perception, and spatial cognition, which are fundamental to effective and efficient teleoperation.

    The proposed robotic mobile manipulation platform consists of a UR5 industrial manipulator, a 3D-printed parallel gripper, and a customized mobile base, envisaged to be controlled by non-skilled operators who are physically separated from the robot working space through an MR-based vision/motion mapping approach. The platform development process involved CAD/CAE/CAM and rapid prototyping techniques such as 3D printing and laser cutting. Robot Operating System (ROS) and Unity 3D are employed in the development process so that the embedded system can intuitively control the robotic system and deliver immersive, natural human-robot interactive teleoperation.

    This research presents an integrated motion/vision retargeting scheme based on a mixed reality subspace approach for intuitive and immersive telemanipulation. An imitation-based, velocity-centric motion mapping is implemented via the MR subspace to accurately track operator hand movements for robot motion control, enabling spatial velocity-based control of the robot tool center point (TCP). The proposed system allows precise manipulation of end-effector position and orientation and ready adjustment of the corresponding maneuvering velocity. A mixed reality-based multi-view merging framework for immersive and intuitive telemanipulation of a complex mobile manipulator with integrated 3D/2D vision is also presented. The proposed 3D immersive telerobotic schemes provide users with depth perception through the merging of multiple 3D/2D views of the remote environment via the MR subspace. The mobile manipulator platform can be effectively controlled by non-skilled operators who are physically separated from the robot working space through a velocity-based imitative motion mapping approach. Finally, this thesis presents an integrated mixed reality and haptic feedback scheme for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot working space. The proposed mixed reality virtual fixture integration approach implements hybrid haptic constraints that guide the operator's hand movements along a conical guidance path, effectively aligning the welding torch for welding and constraining the welding operation within a collision-free area. Overall, this thesis presents a complete telerobotic application that uses mixed reality and immersive elements to effectively translate the operator into the robot's space in an intuitive and natural manner. The results are thus a step forward in cost-effective and computationally efficient human-robot interaction research and technologies.
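As a rough illustration of a velocity-centric motion mapping, the sketch below scales the operator's hand offset from a reference point into a clamped TCP velocity command. The gain, deadzone and speed limit are assumed values chosen for illustration, not parameters from the thesis.

```python
# Hypothetical sketch of a velocity-centric motion mapping: the operator's
# hand displacement from a reference point in the MR subspace is scaled into
# a TCP velocity command. All constants are illustrative assumptions.

DEADZONE = 0.02   # m: ignore small hand jitter around the reference point
GAIN = 0.5        # (m/s) commanded per metre of hand offset
V_MAX = 0.25      # m/s: clamp commanded TCP speed for safety

def tcp_velocity(hand_pos, ref_pos):
    """Map a 3-D hand offset to a clamped TCP velocity command."""
    offset = [h - r for h, r in zip(hand_pos, ref_pos)]
    mag = sum(c * c for c in offset) ** 0.5
    if mag < DEADZONE:
        return [0.0, 0.0, 0.0]            # inside deadzone: hold still
    scale = min(GAIN * mag, V_MAX) / mag  # scale along the offset direction
    return [c * scale for c in offset]

print(tcp_velocity([0.1, 0.0, 0.0], [0.0, 0.0, 0.0]))  # -> [0.05, 0.0, 0.0]
```

A deadzone plus speed clamp of this kind is a common pattern in imitative teleoperation, since it suppresses hand tremor near the rest pose while bounding the robot's speed regardless of how far the hand strays.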
    The system presented is readily extensible to a range of potential applications beyond the robotic tele-welding and tele-manipulation tasks used to demonstrate, optimise, and prove the concepts.
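The conical guidance used by the virtual fixture can be made concrete with a small geometric check: a commanded torch position is admissible only if it lies inside a cone whose apex sits on the weld target. The geometry and the 20-degree half-angle below are illustrative assumptions, not values from the thesis.

```python
# Hypothetical sketch of a conical virtual fixture: accept a commanded torch
# position only if it lies inside a cone (apex at the weld target, unit axis,
# given half-angle). Constants are illustrative assumptions.
import math

def inside_cone(point, apex, axis, half_angle_deg):
    """True if `point` lies within the cone defined by apex, unit axis, angle."""
    v = [p - a for p, a in zip(point, apex)]
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0.0:
        return True                                  # at the apex itself
    along = sum(c * a for c, a in zip(v, axis))      # projection on the axis
    if along <= 0.0:
        return False                                 # behind the apex
    return along / dist >= math.cos(math.radians(half_angle_deg))

# Axis points straight up from a weld target at the origin.
print(inside_cone([0.01, 0.0, 0.10], [0, 0, 0], [0, 0, 1], 20))  # -> True
```

In a haptic implementation this boolean test would typically be replaced by a repelling force that grows as the hand approaches the cone's surface, but the admissible region is the same.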