2,502 research outputs found

    Towards tactile sensing active capsule endoscopy

    Get PDF
    Examination of the gastrointestinal (GI) tract has traditionally been performed using tethered endoscopy tools with limited reach and, more recently, with passive untethered capsule endoscopy with limited capability. Inspection of the small intestines is only possible using the latter, a capsule endoscope with an on-board camera system. Limited to visual means, it cannot detect features beneath the lumen wall unless they have affected the lumen's structure or colour. This work presents an improved capsule endoscopy system with locomotion for active exploration of the small intestines and tactile sensing to detect deformation of the capsule's outer surface as it follows the intestinal wall. In laboratory conditions this system is capable of identifying sub-lumen features such as submucosal tumours. Through an extensive literature review, the current state of GI tract inspection, in particular using remotely operated miniature robotics, was investigated, concluding that no existing solution combines tactile sensing with a capsule endoscope. In order to achieve such a platform, further investigation was made into tactile sensing technologies, methods of locomotion through the gut, and methods to support the increased power requirement of additional electronics and actuation. A set of detailed criteria was compiled for a soft-formed sensor and flexible-bodied locomotion system. The sensing system is built on the biomimetic tactile sensing device Tactip [Chorley2008, Chorley2010, Winstone2012, Winstone2013], which has been redesigned to fit the form of a capsule endoscope. These modifications required a 360° cylindrical sensing surface with a 360° panoramic optical system. Multi-material 3D printing has been used to build an almost complete sensor assembly from a combination of hard and soft materials, presenting a soft, compliant tactile sensing system that mimics the tactile sensing methods of the human finger. The cylindrical Tactip has been validated using artificial submucosal tumours in laboratory conditions. The first experiment explored the new form factor and measured the device's ability to detect surface deformation when travelling through a pipe-like structure with varying lump obstructions. Sensor data was analysed and used to reconstruct the test environment as a 3D rendered structure. A second tactile sensing experiment explored the use of classifier algorithms to successfully discriminate between three tumour characteristics: shape, size and material hardness. Locomotion of the capsule endoscope draws further bio-inspiration from the earthworm's peristaltic locomotion, as earthworms operate in a similar environment. A soft-bodied peristaltic worm robot has been developed that uses a tuned planetary gearbox mechanism to displace tendons that contract each worm segment. Methods have been identified to optimise the gearbox parameters for a pipe-like structure of a given diameter. The locomotion system has been tested within a laboratory-constructed pipe environment, showing that, using only one actuator, three independent worm segments can be controlled. This configuration achieves locomotion capabilities comparable to those of an identical robot with an actuator dedicated to each worm segment. This system can be miniaturised more easily due to its reduced part and actuator count, and so is more suitable for capsule endoscopy.
Finally, these two developments have been integrated to demonstrate successful simultaneous locomotion and sensing to detect an artificial submucosal tumour embedded within the test environment. The addition of both tactile sensing and locomotion has created a need for power beyond what is available from current battery technology. Early-stage work has reviewed wireless power transfer (WPT) as a potential solution to this problem. Methods for optimisation and miniaturisation to implement WPT on a capsule endoscope have been identified, with a laboratory-built system validating the methods found. Future work would see this combined with a miniaturised development of the robot presented. This thesis has developed a novel method for sub-lumen examination. With further efforts to miniaturise the robot, it could provide a comfortable and non-invasive GI tract inspection procedure, reducing the need for surgical procedures and improving access to earlier-stage examination. Furthermore, these developments have applicability in other domains such as veterinary medicine, industrial pipe inspection and exploration of hazardous environments.
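The abstract above describes reconstructing the test environment as a 3D rendered structure from the cylindrical sensor's deformation readings. A minimal sketch of how such a reconstruction might be assembled is given below; the sampling step, sensor radius, array layout and function names are illustrative assumptions rather than the thesis implementation.

```python
import numpy as np

def reconstruct_pipe_surface(radial_readings, step_mm=1.0, base_radius_mm=5.5):
    """Turn per-step radial deformation readings into a 3D point cloud.

    radial_readings: array of shape (n_steps, n_angles) holding the sensed
    deformation (mm) of the cylindrical surface at each angular position
    as the capsule advances through the pipe. All values are illustrative.
    """
    n_steps, n_angles = radial_readings.shape
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    points = []
    for i in range(n_steps):
        z = i * step_mm                          # distance travelled along the pipe
        r = base_radius_mm + radial_readings[i]  # deformed radius at each angle
        x, y = r * np.cos(angles), r * np.sin(angles)
        points.append(np.column_stack([x, y, np.full(n_angles, z)]))
    return np.vstack(points)                     # (n_steps * n_angles, 3) cloud

# Example: a lump appears as a local radial bump in the reconstructed cloud.
readings = np.zeros((100, 36))
readings[40:50, 8:12] = 1.2                      # simulated 1.2 mm lump
cloud = reconstruct_pipe_surface(readings)
```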

    Toward Bio-Inspired Tactile Sensing Capsule Endoscopy for Detection of Submucosal Tumors

    Get PDF
    © 2016 IEEE. Here, we present a method for lump characterization using a bio-inspired remote tactile sensing capsule endoscopy system. While current capsule endoscopy utilizes cameras to diagnose lesions on the surface of the gastrointestinal tract lumen, this proposal uses remote palpation to stimulate a bio-inspired tactile sensing surface that deforms under the impression of both hard and soft raised objects. This can enhance visual inspection of lesions and provide more information about their structure. Using classifier systems, we have shown that lumps of different sizes, shapes, and hardnesses can be distinguished in a synthetic test environment. This is a promising early step toward a remote palpation system used inside the GI tract that will utilize the clinician's sense of touch.
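    The abstract reports using classifier systems to distinguish lump size, shape, and hardness from tactile data. The sketch below shows one way such a classification pipeline could be set up; the feature set, labels, and choice of classifier are assumptions, not the published method.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical feature vectors: each row summarises one palpation pass, e.g.
# [max deformation, contact area, deformation gradient, relaxation rate].
X = np.random.default_rng(0).normal(size=(120, 4))
# Hypothetical labels for one lump characteristic, e.g. material hardness.
y = np.repeat(["soft", "medium", "hard"], 40)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)   # estimate discrimination accuracy
print(f"cross-validated accuracy: {scores.mean():.2f}")
```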

    Forefinger direction based haptic robot control for physically challenged using MEMS sensor

    Get PDF
    Haptic touch is the ability to feel the world through the tools we hold. Designing a sensory element that transforms information into experience while remotely interacting with objects is challenging. This paper deals with the design and implementation of a forefinger-direction-based robot for physically challenged people. The system design includes a microcontroller, a MEMS sensor, and RF technology. The robot receives commands from the MEMS sensor, which is placed on the forefinger at the transmitter section, and follows the direction in which the forefinger points. The path of the robot may be either point-to-point or continuous. The sensor detects the direction of the forefinger, and the output is transmitted via an RF transmitter. At the receiver section, an RF receiver picks up the corresponding signal and commands the microcontroller to move the robot in that direction. This yields a simple control mechanism for the robot. Experimental results for the forefinger-based directional robot are presented.
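    For a MEMS-accelerometer interface like the one described, finger direction is typically inferred from tilt before a command is sent over RF. The following sketch illustrates that mapping; the axis assignment, threshold, and command names are hypothetical, not taken from the paper.

```python
def direction_from_tilt(ax, ay, threshold=0.3):
    """Map MEMS accelerometer tilt (in g) of the forefinger to a drive command.

    ax and ay are accelerations along the two axes parallel to the ground when
    the finger is level; the axis assignment and threshold are illustrative.
    """
    if ay > threshold:
        return "FORWARD"
    if ay < -threshold:
        return "BACKWARD"
    if ax > threshold:
        return "RIGHT"
    if ax < -threshold:
        return "LEFT"
    return "STOP"

# Example: the finger tilted forward drives the robot forward.
print(direction_from_tilt(ax=0.05, ay=0.6))   # FORWARD
```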

    Haptic assessment of tissue stiffness in locating and identifying gynaecological cancer in human tissue

    Get PDF
    Gynaecological surgeons are not able to gather adequate tissue feedback during minimal access surgery for cancer treatment. This can result in failure to locate tumour boundaries and to ensure these are completely resected within tumour-free resection margins. Surgeons achieve significantly better surgical and oncological outcomes if they can identify the precise location of a gynaecological tumour. Indeed, the true nature of a tumour, whether benign or cancerous, is often not known prior to surgery. If more detail were available on the characteristics that differentiate cancerous gynaecological tumours, this would enable more accurate diagnosis and help in the planning of surgery. HYPOTHESIS: Haptic technology has the potential to enhance the surgeon's degree of perception during minimal access surgery. Alteration in tissue stiffness in gynaecological tumours, thought to be associated with the accelerated multiplication of cancer cells, should allow their location to be identified and help in determining the likelihood of malignancy. METHOD: Setting: (i) Guy's & St Thomas' Hospital (ii) Dept of Informatics (King's College London). Permission from the National Research Ethics Committee and Research & Development (R&D) approval were sought from the National Health Service. The Phantom Omni, capable of 3D motion tracking, attached to a Nano-17 force sensor, was used to capture real-time position and force data. Uniaxial indentation palpation behaviour was used. The indentation depth was calculated using the displacement of the probe from the surface to the deepest point for each contact. The tissue stiffness (TS) was then calculated. The haptic probe was tested first on silicone models with embedded nodules mimicking tumour(s). This was followed by assessing TS ex vivo using a haptic probe on fresh human gynaecological organs that had been removed in surgery. Tissue stiffness maps were generated in real time using the haptic device by converting stiffness values into RGB values. Surgeons also manually palpated and recorded the site of the tumour. Histology was used as the gold standard for location and cancer diagnosis. Manual palpation and haptic data were compared for accuracy of tumour location. The tissue stiffness calculated by the haptic probe was compared in cancer and control specimens. Several data analysis techniques were applied to derive results. CONTRIBUTIONS: A haptic indentation probe was tested for the first time on fresh human gynaecological organs to locate cancer in a clinical setting. We are the first to evaluate the accuracy of cancer diagnosis in human gynaecological organs with a force-sensing haptic indentation probe measuring tissue stiffness.
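    The abstract describes computing tissue stiffness from indentation depth and force, then rendering stiffness maps by converting stiffness values to RGB. A minimal sketch of both steps follows; the linear force-depth fit, calibration range, and colour scale are assumptions rather than the study's exact procedure.

```python
import numpy as np

def tissue_stiffness(forces_n, depths_mm):
    """Estimate stiffness (N/mm) for one indentation contact as the slope of
    the force-depth curve via least squares; a linear fit is an assumption."""
    slope, _ = np.polyfit(np.asarray(depths_mm), np.asarray(forces_n), 1)
    return slope

def stiffness_to_rgb(stiffness, s_min=0.1, s_max=2.0):
    """Map stiffness to an RGB colour (soft = blue, stiff = red); the scale
    limits are illustrative, not the study's calibration."""
    t = np.clip((stiffness - s_min) / (s_max - s_min), 0.0, 1.0)
    return (int(255 * t), 0, int(255 * (1.0 - t)))

# Example indentation: force rises roughly linearly with depth.
depths = np.linspace(0.0, 4.0, 20)                                     # mm
forces = 0.8 * depths + np.random.default_rng(1).normal(0, 0.02, 20)   # N
k = tissue_stiffness(forces, depths)
print(k, stiffness_to_rgb(k))
```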

    The Boston University Photonics Center annual report 2014-2015

    Full text link
    This repository item contains an annual report that summarizes activities of the Boston University Photonics Center in the 2014-2015 academic year. The report provides quantitative and descriptive information regarding photonics programs in education, interdisciplinary research, business innovation, and technology development. The Boston University Photonics Center (BUPC) is an interdisciplinary hub for education, research, scholarship, innovation, and technology development associated with practical uses of light. This has been a good year for the Photonics Center. In the following pages, you will see that the center’s faculty received prodigious honors and awards, generated more than 100 notable scholarly publications in the leading journals in our field, and attracted $18.6M in new research grants/contracts. Faculty and staff also expanded their efforts in education and training, and were awarded two new National Science Foundation–sponsored sites for Research Experiences for Undergraduates and for Teachers. As a community, we hosted a compelling series of distinguished invited speakers, and emphasized the theme of Advanced Materials by Design for the 21st Century at our annual symposium. We continued to support the National Photonics Initiative, and are a part of a New York–based consortium that won the competition for a new photonics-themed node in the National Network of Manufacturing Institutes. Highlights of our research achievements for the year include an ambitious new DoD-sponsored grant for Multi-Scale Multi-Disciplinary Modeling of Electronic Materials led by Professor Enrico Bellotti, continued support of our NIH-sponsored Center for Innovation in Point of Care Technologies for the Future of Cancer Care led by Professor Catherine Klapperich, a new award for Personalized Chemotherapy Through Rapid Monitoring with Wearable Optics led by Assistant Professor Darren Roblyer, and a new award from DARPA to conduct research on Calligraphy to Build Tunable Optical Metamaterials led by Professor Dave Bishop. We were also honored to receive an award from the Massachusetts Life Sciences Center to develop a biophotonics laboratory in our Business Innovation Center.

    Optical Fibre-based Force Sensing Needle Driver for Minimally Invasive Surgery

    Get PDF
    Minimally invasive surgery has been limited from its inception by insufficient haptic feedback to surgeons. The loss of haptic information threatens patient safety and results in longer operation times. To address this problem, various force sensing systems have been developed to provide information about tool–tissue interaction forces. However, the reported results for axial and grasping forces have been inaccurate in most of these studies due to a considerable amount of error and uncertainty in their force acquisition methods. Furthermore, sterilizability of the sensorized instruments plays a pivotal role in accurate measurement of forces inside a patient's body. Therefore, the objective of this thesis was to develop a sterilizable needle-driver type grasper using fibre Bragg gratings. In order to measure more accurate and reliable tool–tissue interaction forces, optical force sensors were integrated in the grasper jaw to measure axial and grasping forces directly at their exertion point on the tool tip. Two sets of sensor prototypes were developed to prove the feasibility of the proposed concept. Implementation of this concept in a needle-driver instrument resulted in the final proposed model of the sensorized laparoscopic instrument. Fibre Bragg gratings were used for measuring forces due to their many advantages for this application, such as small size, sterilizability, and high sensitivity. Visual force feedback was provided for users based on the acquired real-time force data. Points for improvement related to the current work were identified, and potential areas for continuing this project in the future are discussed.
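    Force sensing with fibre Bragg gratings, as described above, rests on the roughly linear relation between applied strain and the shift of the reflected Bragg wavelength. The sketch below shows how a calibrated wavelength shift could be converted to a force reading; the calibration constant and the temperature-compensation grating are assumptions, not the thesis values.

```python
def force_from_bragg_shift(lambda_meas_nm, lambda_ref_nm, k_nm_per_newton=0.012,
                           lambda_temp_nm=None, lambda_temp_ref_nm=None):
    """Convert an FBG wavelength shift to force using a linear calibration.

    Strain on the grating shifts the reflected Bragg wavelength roughly
    linearly, so force ≈ shift / k once k has been found by loading the jaw
    with known weights. The value of k and the optional temperature-
    compensation grating are illustrative, not the thesis calibration.
    """
    shift = lambda_meas_nm - lambda_ref_nm
    if lambda_temp_nm is not None and lambda_temp_ref_nm is not None:
        # Subtract the shift of an unstrained grating to remove thermal drift.
        shift -= (lambda_temp_nm - lambda_temp_ref_nm)
    return shift / k_nm_per_newton

# Example: a 0.024 nm strain-induced shift corresponds to about 2 N of force.
print(force_from_bragg_shift(1550.030, 1550.000,
                             lambda_temp_nm=1545.006, lambda_temp_ref_nm=1545.000))
```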

    Haptic interface based on tactile sensors for assistive devices

    Get PDF
    Thesis defended on 14 February 2018. Developed countries must face the growing ageing of their populations. Ageing well requires functional capacity in the activities of daily living. Assistive technologies must therefore address one of the main problems associated with age: the deterioration of mobility. Canes and walkers are prescribed for people with reduced mobility who are still able to walk. However, a considerable number of elderly people need another kind of help. In this regard, electric wheelchairs are a means of increasing the participation and activity of their users. These wheelchairs are normally driven with a joystick mounted at the end of one of the armrests. However, this device is not suitable for all types of users. Some find it difficult to use and, for others, operating it is not possible and they need the assistance of another person (for example, those suffering from certain diseases of the nervous system, spinal cord injuries, or mental disabilities). Thus, there are cases in which the help of a caregiver is required to move the chair. Pushing a wheelchair on a regular basis produces various types of injuries, so it is desirable that assistants and caregivers also benefit from the advantages of electric wheelchairs. In this case, the most common solution is another joystick located at the rear of the chair. As noted above, this is not a comfortable or intuitive device for many users. In the research literature, proposed assistive devices frequently base their interface with the assistant on force sensors. These components are expensive and therefore represent a barrier to bringing the device to market.
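    The thesis proposes tactile sensors, rather than costly force sensors, as the caregiver interface for an electric wheelchair. The sketch below illustrates how pressure readings on two push-handle grips might be turned into drive commands; the grip layout, rest pressure, gains, and sign convention are assumptions, not the thesis design.

```python
def drive_command(left_grip_kpa, right_grip_kpa, gain=0.05, rest_kpa=2.0):
    """Map tactile pressure on the wheelchair's two push handles to linear and
    angular velocity commands. Grip layout, rest pressure, and gains are
    illustrative assumptions rather than the thesis design."""
    left = max(left_grip_kpa - rest_kpa, 0.0)
    right = max(right_grip_kpa - rest_kpa, 0.0)
    linear = gain * (left + right) / 2.0    # harder overall push -> faster
    angular = gain * (left - right)         # pressure imbalance -> steering
    return linear, angular

# Example: the caregiver presses both grips, slightly harder on the left.
print(drive_command(left_grip_kpa=8.0, right_grip_kpa=6.0))
```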