
    Automatic Detection of Intestinal Content to Evaluate Visibility in Capsule Endoscopy

    In capsule endoscopy (CE), preparation of the small bowel before the procedure is believed to increase visibility of the mucosa for analysis. However, there is no consensus on the best method of preparation, and comparison is difficult due to the absence of an objective, automated evaluation method. The method presented here aims to fill this gap by automatically detecting regions in frames of CE videos where the mucosa is covered by bile, bubbles and remainders of food. We implemented two different machine learning techniques for supervised classification of patches: one based on hand-crafted feature extraction with Support Vector Machine classification, and the other based on fine-tuning different convolutional neural network (CNN) architectures, specifically VGG-16 and VGG-19. Using a data set of approximately 40,000 image patches obtained from 35 different patients, our best model achieved an average detection accuracy of 95.15% on our test patches, which is comparable to that of significantly more complex detection methods used for similar purposes. We then estimate probabilities at the pixel level by interpolating the patch probabilities and extract statistics from these on both a per-frame and a per-video basis, intended for comparison of different videos.
    This work was funded by the European Union’s H2020: MSCA: ITN program for the “Wireless In-body Environment Communication – WiBEC” project under the grant agreement no. 675353.
    Noorda, R.; Nevárez, A.; Colomer, A.; Naranjo, V.; Pons Beltrán, V. (2020). Automatic Detection of Intestinal Content to Evaluate Visibility in Capsule Endoscopy. IEEE. 163-168. https://doi.org/10.1109/ISMICT.2019.8743878
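    The abstract does not include the authors' code; purely as an illustrative sketch of the fine-tuning approach it describes, the snippet below adapts a VGG-16 backbone to binary patch classification (clean mucosa vs. intestinal content) in Keras. The patch size, class definition, optimiser settings and data variables are assumptions for illustration, not details taken from the paper.

    # Illustrative sketch only: fine-tuning a VGG-16 backbone for patch
    # classification (clean mucosa vs. intestinal content). Patch size and
    # training settings are assumptions, not the authors' configuration.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    NUM_CLASSES = 2            # assumed: clean mucosa vs. covered mucosa
    PATCH_SHAPE = (64, 64, 3)  # assumed patch size

    base = tf.keras.applications.VGG16(
        weights="imagenet", include_top=False, input_shape=PATCH_SHAPE)
    base.trainable = False     # freeze convolutional features initially

    model = models.Sequential([
        base,
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # train_patches: (N, 64, 64, 3) array preprocessed with
    # tf.keras.applications.vgg16.preprocess_input; train_labels: (N,) ints
    # model.fit(train_patches, train_labels, validation_split=0.1, epochs=10)

    A common second stage of fine-tuning is to unfreeze the last convolutional block and continue training with a lower learning rate.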

    Automatic Bleeding Frame and Region Detection for GLCM Using Artificial Neural Network

     Wireless capsule endoscopy (WCE) is a device that allows direct, non-invasive visualization of the patient’s gastrointestinal tract. Analyzing a WCE video is a time-consuming task, hence computer-aided techniques are used to reduce the burden on medical clinicians. This paper proposes a novel color feature extraction method to detect bleeding frames. First, we perform word-based histogram analysis for rapid bleeding detection in WCE images; classification of bleeding WCE frames is then performed by applying GLCM features to an Artificial Neural Network and a K-nearest-neighbour classifier. Second, we propose a two-stage saliency map extraction method: in the first stage, the images are inspected under different color components to highlight bleeding regions, and in the second stage, the red color in the frame reveals the affected region. The two saliency maps are then fused to detect the bleeding area. Experimental results show that the proposed method is very efficient in detecting both the bleeding frames and the regions.
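    As a rough illustration of the kind of pipeline the abstract describes, the sketch below computes GLCM texture features with scikit-image and feeds them to an Artificial Neural Network and a K-nearest-neighbour classifier from scikit-learn. The chosen distances, angles, feature properties and classifier settings are assumptions, not the paper's configuration.

    # Illustrative sketch: GLCM texture features from a grayscale WCE patch,
    # classified with a small neural network and a k-NN model.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.neural_network import MLPClassifier
    from sklearn.neighbors import KNeighborsClassifier

    def glcm_features(gray_patch):
        """Return a GLCM feature vector for an 8-bit grayscale patch."""
        glcm = graycomatrix(gray_patch,
                            distances=[1, 2],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)
        props = ["contrast", "homogeneity", "energy", "correlation"]
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    # X: stacked feature vectors of labelled patches; y: 1 = bleeding, 0 = normal
    # X = np.array([glcm_features(p) for p in patches]); y = labels
    ann = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
    knn = KNeighborsClassifier(n_neighbors=5)
    # ann.fit(X, y); knn.fit(X, y); ann.predict(X_new)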

    Automatic evaluation of degree of cleanliness in capsule endoscopy based on a novel CNN architecture

    Capsule endoscopy (CE) is a widely used, minimally invasive alternative to traditional endoscopy that allows visualisation of the entire small intestine. Patient preparation can help to obtain a cleaner intestine and thus better visibility in the resulting videos. However, studies on the most effective preparation method are conflicting due to the absence of objective, automatic cleanliness evaluation methods. In this work, we aim to provide such a method capable of presenting results on an intuitive scale, with a relatively lightweight novel convolutional neural network architecture at its core. We trained our model using 5-fold cross-validation on an extensive data set of over 50,000 image patches, collected from 35 different CE procedures, and compared it with state-of-the-art classification methods. From the patch classification results, we developed a method to automatically estimate pixel-level probabilities and deduce cleanliness evaluation scores through automatically learnt thresholds. We then validated our method in a clinical setting on 30 newly collected CE videos, comparing the resulting scores to those independently assigned by human specialists. We obtained the highest classification accuracy for the proposed method (95.23%), with significantly lower average prediction times than for the second-best method. In the validation of our method, we found acceptable agreement with two human specialists compared to inter-human agreement, showing its validity as an objective evaluation method.
    This work was funded by the European Union's H2020: MSCA: ITN program for the "Wireless In-body Environment Communication - WiBEC" project under the grant agreement no. 675353. Additionally, we gratefully acknowledge the support of NVIDIA Corporation with the donation of the Titan V GPU used for this research. Figures 2 and 3 were drawn by the authors.
    Noorda, R.; Nevárez, A.; Colomer, A.; Pons Beltrán, V.; Naranjo Ornedo, V. (2020). Automatic evaluation of degree of cleanliness in capsule endoscopy based on a novel CNN architecture. Scientific Reports. 10(1):1-13. https://doi.org/10.1038/s41598-020-74668-8
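    The abstract's pixel-level estimation step (interpolating patch probabilities and summarising them per frame) can be illustrated with a minimal sketch; the grid size, interpolation order and threshold below are assumptions, not the authors' implementation.

    # Illustrative sketch: bilinear interpolation of a coarse grid of per-patch
    # "dirty" probabilities up to frame resolution, plus a per-frame summary.
    import numpy as np
    from scipy.ndimage import zoom

    def pixel_probability_map(patch_probs, frame_shape):
        """patch_probs: (rows, cols) grid of patch probabilities in [0, 1]."""
        zy = frame_shape[0] / patch_probs.shape[0]
        zx = frame_shape[1] / patch_probs.shape[1]
        return zoom(patch_probs, (zy, zx), order=1)  # order=1 -> bilinear

    def frame_statistics(prob_map, threshold=0.5):
        """Per-frame summary: mean probability and fraction of covered pixels."""
        return {"mean_prob": float(prob_map.mean()),
                "covered_fraction": float((prob_map >= threshold).mean())}

    # probs = np.random.rand(8, 8)                      # assumed 8x8 patch grid
    # stats = frame_statistics(pixel_probability_map(probs, (576, 576)))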

    Hookworm and Bleeding Detection in WCE Images using Rusboost Classifier

    Nowadays, millions of individuals suffer from hookworm infection (helminthiasis), and this number is increasing day by day. Automatic hookworm detection is a challenging task in the medical field. This work proposes a novel technique for detecting hookworm in wireless capsule endoscopy (WCE) images. First, the WCE images are enhanced using a Multi-scale Dual Matched Filter (MDMF). Then, Piecewise Parallel Region Detection (PPRD) is employed to detect the parallel edges. Compared with other conventional techniques, the proposed method is well suited to hookworm detection.
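    The abstract does not define the MDMF enhancement step; as a generic sketch of the matched-filter idea for thin, elongated structures (not the paper's MDMF), the snippet below convolves a grayscale frame with oriented, multi-scale Gaussian-profile kernels using OpenCV and keeps the maximum response. All scales, kernel sizes and the file name are hypothetical.

    # Illustrative sketch of a multi-scale, multi-orientation matched filter for
    # enhancing thin dark structures; parameters are assumptions, not the MDMF.
    import cv2
    import numpy as np

    def matched_filter_response(gray, scales=(1.5, 2.5), n_angles=12, length=15):
        """Maximum response over oriented Gaussian-profile kernels at several scales."""
        gray = gray.astype(np.float32)
        response = np.zeros_like(gray)
        x = np.arange(length) - length // 2
        for sigma in scales:
            profile = -np.exp(-x ** 2 / (2 * sigma ** 2))  # dark-line profile
            profile -= profile.mean()                       # zero-mean kernel
            kernel = np.tile(profile, (length, 1)).astype(np.float32)
            for angle in np.linspace(0, 180, n_angles, endpoint=False):
                rot = cv2.getRotationMatrix2D((length / 2, length / 2), angle, 1.0)
                k = cv2.warpAffine(kernel, rot, (length, length))
                response = np.maximum(response, cv2.filter2D(gray, -1, k))
        return response

    # frame = cv2.imread("wce_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
    # enhanced = matched_filter_response(frame)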

    Frontiers of robotic endoscopic capsules: a review

    Digestive diseases are a major burden for society and healthcare systems, and with an aging population, the importance of their effective management will become critical. Healthcare systems worldwide already struggle to ensure the quality and affordability of healthcare delivery, and this will be a significant challenge in the mid-term future. Wireless capsule endoscopy (WCE), introduced in 2000 by Given Imaging Ltd., is an example of disruptive technology and represents an attractive alternative to traditional diagnostic techniques. WCE overcomes the limitations of conventional endoscopy by enabling inspection of the digestive system without discomfort or the need for sedation. Thus, it has the advantage of encouraging patients to undergo gastrointestinal (GI) tract examinations and of facilitating mass screening programmes. With the integration of further capabilities based on microrobotics, e.g. active locomotion and embedded therapeutic modules, WCE could become the key technology for GI diagnosis and treatment. This review presents a research update on WCE and describes the state of the art of current endoscopic devices, with a focus on research-oriented robotic capsule endoscopes enabled by microsystem technologies. The article also presents a visionary perspective on the potential of WCE for screening, diagnostic and therapeutic endoscopic procedures.