
    Endoscopic Tactile Capsule for Non-Polypoid Colorectal Tumour Detection

    An endoscopic tactile robotic capsule embedding miniaturized MEMS force sensors is presented. The capsule is conceived to provide automatic palpation of non-polypoid colorectal tumours during colonoscopy, since these lesions are characterized by a high degree of dysplasia, higher invasiveness, and lower detection rates than polyps. A first test was performed on a silicone phantom embedding inclusions of variable hardness and curvature. A hardness-based classification was implemented, demonstrating detection robustness to curvature variation. By comparing a set of supervised classification algorithms, a weighted 3-nearest-neighbor classifier was selected. A bias force normalization model was introduced to make different acquisition sets consistent; its parameters were chosen through a particle swarm optimization method. Additionally, an ex-vivo test was performed to assess the capsule's detection performance when magnetically driven along colonic tissue. Lumps were identified as voltage peaks whose prominence depends on the total magnetic force applied to the capsule. An accuracy of 94% was achieved in hardness classification, and 100% accuracy was obtained for lump detection within a tolerance of 5 mm from the central path described by the capsule. In a real application scenario, we foresee our device aiding physicians in detecting tumorous tissue.
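
    A rough Python sketch of the two steps just described, lump detection by peak prominence and distance-weighted 3-nearest-neighbor hardness classification, is given below. The feature layout, threshold, and random placeholder data are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.signal import find_peaks
        from sklearn.neighbors import KNeighborsClassifier

        def detect_lumps(voltage, min_prominence):
            # Indices of voltage peaks whose prominence exceeds the threshold,
            # mirroring the peak-based lump identification described above.
            peaks, props = find_peaks(voltage, prominence=min_prominence)
            return peaks, props["prominences"]

        rng = np.random.default_rng(0)
        X_train = rng.random((60, 4))      # placeholder palpation feature vectors
        y_train = rng.integers(0, 3, 60)   # placeholder hardness class labels
        knn = KNeighborsClassifier(n_neighbors=3, weights="distance")
        knn.fit(X_train, y_train)
        print(knn.predict(rng.random((1, 4))))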

    Tactile Sensing for Assistive Robotics


    Towards tactile sensing active capsule endoscopy

    Examination of the gastrointestinal (GI) tract has traditionally been performed using tethered endoscopy tools with limited reach and, more recently, with passive untethered capsule endoscopy of limited capability. Inspection of the small intestine is only possible with the latter: a capsule endoscope carrying an on-board camera system. Limited to visual means, it cannot detect features beneath the lumen wall if they have not affected the lumen's structure or colour. This work presents an improved capsule endoscopy system with locomotion for active exploration of the small intestine and tactile sensing to detect deformation of the capsule's outer surface as it follows the intestinal wall. In laboratory conditions this system is capable of identifying sub-lumen features such as submucosal tumours. Through an extensive literature review, the current state of GI tract inspection, in particular using remotely operated miniature robotics, was investigated, concluding that no existing solution combines tactile sensing with capsule endoscopy. In order to achieve such a platform, further investigation was made into tactile sensing technologies, methods of locomotion through the gut, and methods to support an increased power requirement for additional electronics and actuation. A set of detailed criteria was compiled for a soft-formed sensor and flexible-bodied locomotion system. The sensing system builds on the biomimetic tactile sensing device, the Tactip (Chorley et al., 2008, 2010; Winstone et al., 2012, 2013), which has been redesigned to fit the form of a capsule endoscope. These modifications required a 360° cylindrical sensing surface with a 360° panoramic optical system. Multi-material 3D printing has been used to build an almost complete sensor assembly from a combination of hard and soft materials, presenting a soft, compliant tactile sensing system that mimics the tactile sensing methods of the human finger. The cylindrical Tactip has been validated using artificial submucosal tumours in laboratory conditions. The first experiment explored the new form factor and measured the device's ability to detect surface deformation when travelling through a pipe-like structure with varying lump obstructions; sensor data were analysed and used to reconstruct the test environment as a 3D-rendered structure. A second tactile sensing experiment explored the use of classifier algorithms to discriminate between three tumour characteristics: shape, size, and material hardness. Locomotion of the capsule drew further bio-inspiration from the earthworm's peristaltic locomotion, whose operating environment is similar. A soft-bodied peristaltic worm robot has been developed that uses a tuned planetary gearbox mechanism to displace tendons that contract each worm segment. Methods have been identified to optimise the gearbox parameters for a pipe-like structure of a given diameter. The locomotion system has been tested within a laboratory-constructed pipe environment, showing that three independent worm segments can be controlled using only one actuator. This configuration achieves locomotion capabilities comparable to those of an identical robot with an actuator dedicated to each worm segment, yet can be miniaturised more easily due to its reduced part count and number of actuators, and so is more suitable for capsule endoscopy.
    Finally, these two developments have been integrated to demonstrate successful simultaneous locomotion and sensing, detecting an artificial submucosal tumour embedded within the test environment. The addition of both tactile sensing and locomotion has created a need for power beyond what current battery technology can supply. Early-stage work has reviewed wireless power transfer (WPT) as a potential solution; methods to optimise and miniaturise WPT for a capsule endoscope have been identified, and a laboratory-built system validates them. Future work would see this combined with a miniaturised development of the robot presented. This thesis has developed a novel method for sub-lumen examination. With further efforts to miniaturise the robot, it could provide a comfortable, non-invasive GI tract inspection procedure, reducing the need for surgical procedures and enabling examination at an earlier stage. Furthermore, these developments are applicable in other domains such as veterinary medicine, industrial pipe inspection, and exploration of hazardous environments.
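
    As an illustration of the tumour-characteristic classification experiment mentioned above, the following Python sketch trains a standard classifier on pin-displacement features. The pin count, feature encoding, and SVM choice are assumptions made for the example, not the pipeline used in the thesis.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        N_PINS = 120  # assumed number of tracked markers on the cylindrical Tactip

        # One sample per palpation: radial displacement of every pin at peak contact.
        X = np.random.rand(90, N_PINS)   # placeholder tactile features
        y = np.repeat([0, 1, 2], 30)     # placeholder labels, e.g. three hardness bins

        model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        print(cross_val_score(model, X, y, cv=5).mean())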

    Advanced Sensing and Image Processing Techniques for Healthcare Applications

    This Special Issue aims to attract the latest research and findings in the design, development, and experimentation of healthcare-related technologies. This includes, but is not limited to, the use of novel sensing, imaging, data-processing, machine-learning, and artificial-intelligence devices and algorithms to assist and monitor the elderly, patients, and people with disabilities.

    Smart Sensors for Healthcare and Medical Applications

    This book focuses on new sensing technologies, measurement techniques, and their applications in medicine and healthcare. Specifically, it describes the potential of smart sensors in these applications, collecting 24 articles selected and published in the Special Issue “Smart Sensors for Healthcare and Medical Applications”. We proposed this topic aware of the pivotal role that smart sensors can play in improving healthcare services, both in acute and chronic conditions and in prevention for a healthy life and active aging. The articles selected for this book cover a variety of topics related to the design, validation, and application of smart sensors to healthcare.

    The Hand-Held Force Magnifier: Surgical Tools to Augment the Sense of Touch

    Modern surgeons routinely perform procedures with noisy, sub-threshold, or obscured visual and haptic feedback, either because of the necessary approach or because the systems on which they operate are exceedingly delicate. For example, in cataract extraction, ophthalmic surgeons must peel away thin membranes in order to access and replace the lens of the eye. Elsewhere, dissection is now commonly performed with energy-delivering tools rather than sharp blades, and damage to deep structures is possible if tissue contact is not well controlled. Surgeons compensate for their lack of tactile sensibility by relying solely on visual feedback, observing tissue deformation and other visual cues through surgical microscopes or cameras. Using visual information alone can make a procedure more difficult, because cognitive mediation is required to convert visual feedback into motor action. We call this the “haptic problem” in surgery because the human sensorimotor loop is deprived of critical tactile afferent information, increasing the chance of intraoperative injury and requiring extensive training before clinicians reach independent proficiency. Tools that enhance the surgeon’s direct perception of tool-tissue forces can therefore reduce the risk of iatrogenic complications and improve patient outcomes. Towards this end, we have developed and characterized a new robotic surgical tool, the Hand-Held Force Magnifier (HHFM), which amplifies forces at the tool tip so they may be readily perceived by the user, a paradigm we call “in-situ” force feedback. In this dissertation, we describe the development of successive generations of HHFM prototypes and the evaluation of a proposed human-in-the-loop control framework using the methods of psychophysics. Using these techniques, we have verified that our tool can reduce sensory perception thresholds, augmenting the user’s abilities beyond what is normally possible. Further, we have created models of human motor control in surgically relevant tasks such as membrane puncture, which have been shown to be sensitive to push-pull direction and handedness effects. Force augmentation has also demonstrated improvements to force control in isometric force-generation tasks. Finally, in support of future psychophysics work, we have developed an inexpensive, high-bandwidth, single-axis haptic renderer using a commercial audio speaker.
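
    At its core, the force-magnification paradigm described above is a proportional mapping from sensed tip force to rendered handle force. The minimal Python sketch below illustrates only that idea; the gain value and interfaces are assumptions, not the HHFM’s actual controller.

        GAIN = 10.0  # assumed amplification factor; the dissertation's value is not stated here

        def handle_force(f_tip):
            # Render an amplified copy of the sensed tool-tip force at the handle,
            # lifting sub-threshold tissue forces above the user's perception threshold.
            return GAIN * f_tip

        # Example: a 0.02 N membrane-contact force is felt as 0.2 N at the handle.
        print(handle_force(0.02))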

    Eye Tracking Methods for Analysis of Visuo-Cognitive Behavior in Medical Imaging

    Predictive modeling of human visual search behavior and the underlying metacognitive processes is now possible thanks to significant advances in bio-sensing device technology and machine intelligence. Eye-tracking bio-sensors, for example, can measure psycho-physiological response through change events in the configuration of the human eye. These events include positional changes such as visual fixations, saccadic movements, and scanpaths, and non-positional changes such as blinks and pupil dilation and constriction. Using data from eye-tracking sensors, we can model human perception, cognitive processes, and responses to external stimuli. In this study, we investigated the visuo-cognitive behavior of clinicians during the diagnostic decision process for breast cancer screening under clinically equivalent experimental conditions involving multiple monitors and breast projection views. Using a head-mounted eye-tracking device and a customized user interface, we recorded eye change events and diagnostic decisions from 10 clinicians (three breast-imaging radiologists and seven radiology residents) for a corpus of 100 screening mammograms (comprising cases of varied pathology and breast parenchyma density). We proposed novel features and gaze-analysis techniques that encode discriminative pattern changes in positional and non-positional measures of eye events. These changes were shown to correlate with individual image readers' identity and experience level, mammographic case pathology and breast parenchyma density, and diagnostic decision. Furthermore, our results suggest that a combination of machine intelligence and bio-sensing modalities can provide adequate predictive capability for characterizing a mammographic case and an image reader's diagnostic performance. Lastly, features characterizing eye movements can be utilized for biometric identification. These findings are impactful for real-time performance monitoring and for personalized intelligent training and evaluation systems in screening mammography. Further, the developed algorithms are applicable in other domains involving high-risk visual tasks.
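
    Separating the positional events mentioned above (fixations versus saccades) is commonly done with a velocity-threshold (I-VT) detector. The Python sketch below shows that standard technique on synthetic gaze data; the sampling rate and velocity threshold are assumptions, since the study's actual detection parameters are not given in the abstract.

        import numpy as np

        FS = 250.0            # assumed sampling rate in Hz
        SACCADE_DEG_S = 30.0  # assumed saccade velocity threshold in deg/s

        def classify_samples(x_deg, y_deg):
            # Label each gaze sample: 0 = fixation, 1 = saccade.
            vx = np.gradient(x_deg) * FS   # horizontal velocity, deg/s
            vy = np.gradient(y_deg) * FS   # vertical velocity, deg/s
            speed = np.hypot(vx, vy)
            return (speed > SACCADE_DEG_S).astype(int)

        rng = np.random.default_rng(1)
        x = np.cumsum(rng.standard_normal(int(FS))) * 0.05  # synthetic 1 s gaze trace
        y = np.cumsum(rng.standard_normal(int(FS))) * 0.05
        print("saccade fraction:", classify_samples(x, y).mean())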

    Health State Estimation

    Life's most valuable asset is health. Continuously understanding the state of our health and modeling how it evolves is essential if we wish to improve it. Given that people live with more data about their lives today than at any other time in history, the challenge rests in interweaving this data with the growing body of knowledge to continually compute and model the health state of an individual. This dissertation presents an approach to building a personal model and dynamically estimating the health state of an individual by fusing multi-modal data and domain knowledge. The system is stitched together from four essential abstraction elements: (1) the events in our life, (2) the layers of our biological systems (from molecular to organism), (3) the functional utilities that arise from biological underpinnings, and (4) how we interact with these utilities in the reality of daily life. Connecting these four elements via graph network blocks forms the backbone by which we instantiate a digital twin of an individual. Edges and nodes in this graph structure are then regularly updated with learning techniques as data is continuously digested. Experiments demonstrate the use of dense and heterogeneous real-world data from a variety of personal and environmental sensors to monitor individual cardiovascular health state. State estimation and individual modeling are the fundamental basis for departing from disease-oriented approaches towards a total health continuum paradigm. Precision in predicting health requires understanding state trajectory. By encasing this estimation within a navigational approach, a systematic guidance framework can plan actions to transition a current state towards a desired one. This work concludes by presenting this framework of combining the health state and personal graph model to perpetually plan and assist us in living life towards our goals.
    Comment: Ph.D. dissertation, University of California, Irvine
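
    The four-element personal graph described above can be pictured as a small directed graph whose edge weights are nudged as new sensor data is digested. The Python sketch below (using networkx) is a toy illustration under assumed node names and an assumed update rule; it is not the dissertation's implementation.

        import networkx as nx

        G = nx.DiGraph()
        # The four abstraction elements: event, biological layer, functional
        # utility, and daily-life interaction (illustrative node names).
        G.add_node("morning_run", kind="event")
        G.add_node("cardiovascular_system", kind="biology")
        G.add_node("aerobic_capacity", kind="utility")
        G.add_node("bike_commute", kind="interaction")

        G.add_edge("morning_run", "cardiovascular_system", weight=0.5)
        G.add_edge("cardiovascular_system", "aerobic_capacity", weight=0.5)
        G.add_edge("aerobic_capacity", "bike_commute", weight=0.5)

        def update_edge(g, u, v, observation, lr=0.1):
            # Nudge an edge weight toward newly digested sensor evidence.
            g[u][v]["weight"] += lr * (observation - g[u][v]["weight"])

        update_edge(G, "morning_run", "cardiovascular_system", observation=0.8)
        print(G["morning_run"]["cardiovascular_system"]["weight"])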
