1,458 research outputs found

    Robust melanoma screening: in defense of enhanced mid-level descriptors

    Get PDF
    Advisors: Eduardo Alves do Valle Junior, Sandra Eliza Fontes de Avila. Master's thesis (Mestrado em Engenharia de Computação), Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação.

    Abstract: Melanoma is the type of skin cancer that leads to the most deaths, even though it is the most curable when detected early. Since the full-time presence of a dermatologist is not economically feasible for many small cities, and especially for underserved communities, computer-aided diagnosis for melanoma screening has been a topic of active research. Much of the existing art is based on the Bag-of-Visual-Words (BoVW) model, combining color and texture descriptors. However, the BoVW model has been improving, and several extensions now achieve better classification rates in general image classification tasks. These enhanced models had not yet been explored for melanoma screening, which motivates our work. Here we present a new approach for melanoma screening based upon the state-of-the-art BossaNova descriptors, showing very promising results and reaching an AUC of up to 93.7%. This work also proposes a new spatial pooling strategy specially designed for melanoma screening. Another contribution of this research is the unprecedented use of BossaNova in melanoma classification, which opens the opportunity to explore these enhanced mid-level descriptors in other medical contexts.
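
The BoVW baseline that the abstract says BossaNova extends can be sketched in a few lines. Everything below (the toy codebook, descriptor values, and function names) is an illustrative assumption, not taken from the thesis:

```python
# Minimal sketch of hard-assignment Bag-of-Visual-Words (BoVW) encoding.

def nearest_codeword(descriptor, codebook):
    """Index of the codeword closest (squared Euclidean) to a local descriptor."""
    best, best_dist = 0, float("inf")
    for i, word in enumerate(codebook):
        dist = sum((d - w) ** 2 for d, w in zip(descriptor, word))
        if dist < best_dist:
            best, best_dist = i, dist
    return best

def bovw_histogram(descriptors, codebook):
    """L1-normalised histogram of codeword counts over an image's descriptors.
    BossaNova refines this pooling step: instead of a single count per
    codeword, it keeps a histogram of descriptor-to-codeword distances."""
    counts = [0] * len(codebook)
    for d in descriptors:
        counts[nearest_codeword(d, codebook)] += 1
    total = sum(counts) or 1
    return [c / total for c in counts]

# Toy example: 2-D "descriptors" and a 2-word codebook.
codebook = [(0.0, 0.0), (10.0, 10.0)]
descriptors = [(0.1, 0.2), (9.8, 10.1), (0.3, 0.1), (10.2, 9.9)]
print(bovw_histogram(descriptors, codebook))  # → [0.5, 0.5]
```

The resulting fixed-length vector is what a classifier (and, downstream, the AUC evaluation) consumes; the thesis's spatial pooling variant would build one such vector per image region.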

    Ultrasound Guidance in Perioperative Care

    Get PDF

    From bench to bedside - current clinical and translational challenges in fibula free flap reconstruction.

    Get PDF
    Fibula free flaps (FFF) are a workhorse for different reconstructive scenarios in facial surgery. While FFF were initially established for mandible reconstruction, advancements in planning for microsurgical techniques have paved the way toward a broader spectrum of indications, including maxillary defects. Essential factors for improving patient outcomes following FFF include minimal donor-site morbidity, adequate bone length, and dual blood supply. Yet persisting clinical and translational challenges hamper the effectiveness of FFF. In the preoperative phase, virtual surgical planning and artificial intelligence tools carry untapped potential, while the intraoperative role of individualized surgical templates and bioprinted prostheses remains to be summarized. Further, the integration of novel flap-monitoring technologies into postoperative patient management has been the subject of translational and clinical research efforts. Overall, there is a paucity of studies condensing the body of knowledge on emerging technologies and techniques in FFF surgery. Herein, we review current challenges and possible solutions in FFF. This line of research may serve as a pocket guide on cutting-edge developments and facilitate future targeted research in FFF.

    Automated systems based on machine vision for inspecting citrus fruits from the field to postharvest - A review

    Full text link
    Computer vision systems are becoming a scientific but also a commercial tool for food quality assessment. In the field, these systems can be used to predict yield, for robotic harvesting, or for the early detection of potentially dangerous diseases. In postharvest handling, they are mostly used for the automated inspection of the external quality of the fruits and for sorting them into commercial categories at very high speed. More recently, hyperspectral imaging is allowing not only the detection of defects in the skin of the fruits but also their association with certain diseases of particular importance. Research using this technology has identified wavelengths that play a significant role in detecting some of these dangerous diseases, leading to the development of multispectral imaging systems that can be used in industry. This article reviews recent works that use colour and non-standard computer vision systems for the automated inspection of citrus. It explains the different technologies available to acquire the images and their use for the non-destructive inspection of internal and external features of these fruits. Particular attention is paid to inspection for the early detection of some dangerous diseases like citrus canker, black spot, decay, or citrus Huanglongbing.

    This work was supported by the Instituto Nacional de Investigación y Tecnología Agraria y Alimentaria (INIA) through projects RTA2012-00062-C04-01 and RTA2012-00062-C04-03 with the support of European FEDER funds. The authors would like to thank and acknowledge the contributions made by all the students, postdocs, technicians and visiting scholars in the Precision Agriculture Laboratory at the University of Florida and the Computer Vision Laboratory at the Agricultural Engineering Centre of IVIA.

    Cubero García, S.; Lee, W. S.; Aleixos Borrás, M. N.; Albert Gil, F. E.; Blasco Ivars, J. (2016).
    Automated systems based on machine vision for inspecting citrus fruits from the field to postharvest - A review. Food and Bioprocess Technology, 9(10), 1623-1639. https://doi.org/10.1007/s11947-016-1767-1
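
The step the review describes, finding the wavelengths that best separate healthy from diseased fruit so a cheaper multispectral system can be built around them, can be sketched with a simple Fisher-style separability score. The data, function names, and the choice of score below are illustrative assumptions, not the review's method:

```python
# Hedged sketch: rank spectral bands by a Fisher-ratio separability score.

def fisher_score(healthy, damaged):
    """(difference of class means)^2 / (sum of class variances) for one band."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return (mean(healthy) - mean(damaged)) ** 2 / (var(healthy) + var(damaged) + 1e-12)

def rank_bands(healthy_spectra, damaged_spectra):
    """Band indices sorted most-discriminative first.
    Each argument is a list of spectra, one reflectance value per band."""
    n_bands = len(healthy_spectra[0])
    scores = []
    for b in range(n_bands):
        h = [s[b] for s in healthy_spectra]
        d = [s[b] for s in damaged_spectra]
        scores.append((fisher_score(h, d), b))
    return [b for _, b in sorted(scores, reverse=True)]

# Toy 3-band spectra; band 1 separates the two classes best.
healthy = [[0.5, 0.9, 0.4], [0.5, 0.8, 0.5]]
damaged = [[0.5, 0.2, 0.4], [0.6, 0.3, 0.5]]
print(rank_bands(healthy, damaged))  # → [1, 0, 2]
```

A real system would compute such a ranking over full hyperspectral cubes and many fruit, then keep only the top few bands as filters for an industrial multispectral camera.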

    Diagnostic assistance to improve acute burn referral and triage : assessment of routine clinical tools at specialised burn centres and potential for digital health development at point of care

    Get PDF
    Background: Inappropriate referral of patients for specialised care leads to overburdened health systems and improper treatment of patients who are denied transfer due to a scarcity of resources. Burn injuries are a global health problem in which specialised care is particularly important for severe cases, while minor burns can be treated at point of care. Whether several solutions, existing or in development, could improve the diagnosis, referral and triage of acute burns at admission to specialised burn centres remains to be evaluated.

    Aim: The overarching aim of this thesis is to determine the potential of diagnostic support tools for the referral and triage of acute burn injuries. More specifically, the sub-aims include the assessment of routine and digital health tools utilised in South Africa and Sweden: referral criteria, mortality prediction scores, image-based remote consultation, and automated diagnosis.

    Methods: Studies I and II were two retrospective studies of patients admitted to the paediatric (I) and adult (II) specialised burn centres of the Western Cape province in South Africa. Study I examined adherence to referral criteria at admission of 1165 patients; logistic regression was performed to assess the associations between adherence to the referral criteria and patient management at the centre. Study II assessed mortality prediction at admission of 372 patients; logistic regression was performed to evaluate associations of patient, injury and admission-related characteristics with mortality, and the performance of an existing mortality prediction model (the ABSI score) was measured. Studies III and IV concerned two image-based digital health tools for remote diagnosis. In Study III, 26 burn experts provided a diagnosis in terms of burn size and depth for 51 images of acute burn cases using their smartphone or tablet; diagnostic accuracy was measured with the intraclass correlation coefficient. In Study IV, two deep-learning algorithms were developed using 1105 annotated acute burn images collected in South Africa and Sweden. The first algorithm separates a burn area from healthy skin, and the second classifies burn depth. Differences in performance across patients' Fitzpatrick skin types were also measured.

    Results: Study I revealed 93.4% adherence to the referral criteria at admission. Children older than two years (not fulfilling the age criterion), as well as those fulfilling the severity criterion, were more likely to undergo surgery or to stay longer than seven days at the centre. At the adult burn centre (Study II), mortality affected one in five patients and was associated with gender, burn size, and referral status after adjustment for all other variables. The ABSI score provided a good estimate of mortality. In Study III, experts were able to accurately diagnose burn size, and to a lesser extent depth, using handheld devices. A wound-identifier and a depth-classifier algorithm could be developed with relatively high accuracy (Study IV), although differences in performance were observed across patients' skin types.

    Conclusions: Altogether, the findings inform the clinical use of four different tools that could improve the accuracy of the diagnosis, referral and triage of patients with acute burns. This would reduce inequities in access to care for both paediatric and adult patient populations in settings that are resource-scarce, geographically distant, or under high clinical pressure.
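
Study III measures inter-expert agreement with an intraclass correlation coefficient. As a hedged illustration of what such a statistic computes, here is a one-way ICC(1,1); the ratings below are invented, and the thesis may well use a different ICC variant:

```python
# Hedged sketch: one-way random-effects ICC(1,1) for inter-rater agreement.

def icc_oneway(ratings):
    """ratings: one list per target (e.g. burn image), each holding k raters' scores.
    Returns (MSB - MSW) / (MSB + (k-1)*MSW), the one-way ICC(1,1)."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    target_means = [sum(r) / k for r in ratings]
    # Between-target and within-target mean squares.
    msb = k * sum((m - grand) ** 2 for m in target_means) / (n - 1)
    msw = sum((x - m) ** 2
              for r, m in zip(ratings, target_means)
              for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Three burn images, each rated (as % total body surface area) by three experts.
ratings = [[10, 12, 11], [40, 42, 41], [25, 27, 26]]
print(round(icc_oneway(ratings), 3))  # → 0.996 (raters agree strongly)
```

Values near 1 indicate that most variance comes from genuine differences between cases rather than disagreement between raters, which is the sense in which the experts' burn-size estimates were "accurate".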

    Artificial intelligence for ultrasound scanning in regional anaesthesia: a scoping review of the evidence from multiple disciplines

    Get PDF
    Background: Artificial intelligence (AI) for ultrasound scanning in regional anaesthesia is a rapidly developing interdisciplinary field. There is a risk that work could be undertaken in parallel by different elements of the community but with a lack of knowledge transfer between disciplines, leading to repetition and diverging methodologies. This scoping review aimed to identify and map the available literature on the accuracy and utility of AI systems for ultrasound scanning in regional anaesthesia.

    Methods: A literature search was conducted using Medline, Embase, CINAHL, IEEE Xplore, and ACM Digital Library. Clinical trial registries, a registry of doctoral theses, regulatory authority databases, and websites of learned societies in the field were searched. Online commercial sources were also reviewed.

    Results: In total, 13,014 sources were identified; 116 were included for full-text review. A marked change in AI techniques was noted in 2016–17, from which point on the predominant technique used was deep learning. Methods of evaluating accuracy are variable, meaning it is impossible to compare the performance of one model with another. Evaluations of utility are more comparable, but predominantly gained from the simulation setting with limited clinical data on efficacy or safety. Study methodology and reporting lack standardisation.

    Conclusions: There is a lack of structure to the evaluation of accuracy and utility of AI for ultrasound scanning in regional anaesthesia, which hinders rigorous appraisal and clinical uptake. A framework for consistent evaluation is needed to inform model evaluation, allow comparison between approaches/models, and facilitate appropriate clinical adoption.

    Segmentation and classification of skin lesions using hybrid deep learning method in the Internet of Medical Things

    Get PDF
    Introduction: Particularly within the Internet of Medical Things (IoMT) context, skin lesion analysis is critical for precise diagnosis. Computer-aided diagnosis (CAD) systems play a crucial role in improving the accuracy and efficiency of skin lesion analysis. This study focuses on using hybrid deep learning techniques to segment and classify skin lesions from dermoscopy images.

    Method: This research uses a hybrid deep learning model that combines two approaches: a Mask Region-based Convolutional Neural Network (MRCNN) for segmentation and ResNet50 for lesion classification. The MRCNN delineates the lesion border to pinpoint its precise location. A large, annotated collection of dermoscopy images was assembled for thorough model training, and the hybrid model was trained end to end on this dataset to capture subtle representations of the images.

    Results: Experiments on dermoscopy images show that the proposed hybrid method outperforms current state-of-the-art methods. A segmentation accuracy of 95.49% demonstrates the model's capacity to delineate lesions. In addition, the classification of skin lesions shows high accuracy and dependability, a notable advancement over traditional methods. Evaluated on the ISIC 2020 Challenge dataset, the model reaches 96.75% accuracy. Compared to current best practices in IoMT, the segmentation and classification models perform exceptionally well.

    Conclusion: The hybrid deep learning strategy is highly effective for skin lesion segmentation and classification. The results show that the model has the potential to improve diagnostic accuracy in the IoMT setting, outperforming current approaches, and the results obtained on the ISIC 2020 Challenge dataset further confirm the viability of the proposed methodology for skin lesion analysis.

    © 2023 The Authors. Skin Research and Technology published by John Wiley & Sons Ltd. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
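
The paper reports a segmentation accuracy of 95.49% without defining the metric; one common interpretation is pixel-level accuracy between predicted and ground-truth lesion masks. The sketch below illustrates that interpretation only, with invented toy masks; it is not the paper's evaluation code:

```python
# Hedged sketch: pixel accuracy between two binary segmentation masks.

def pixel_accuracy(predicted_mask, true_mask):
    """Fraction of pixels on which the two binary masks agree."""
    matches = sum(
        p == t
        for p_row, t_row in zip(predicted_mask, true_mask)
        for p, t in zip(p_row, t_row)
    )
    total = sum(len(row) for row in true_mask)
    return matches / total

# Toy 2x3 masks: 1 marks lesion pixels, 0 marks healthy skin.
pred = [[1, 1, 0], [0, 1, 0]]
true = [[1, 1, 0], [0, 0, 0]]
print(round(pixel_accuracy(pred, true), 3))  # → 0.833 (5 of 6 pixels agree)
```

For heavily imbalanced masks (small lesions on large images), overlap measures such as IoU or Dice are often preferred, since pixel accuracy can look high even for a poor mask.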