
    Predicting Drusen Regression from OCT in Patients with Age-Related Macular Degeneration

    Age-related macular degeneration (AMD) is a leading cause of blindness in developed countries. The presence of drusen is the hallmark of early/intermediate AMD, and their sudden regression is strongly associated with the onset of late AMD. In this work we propose a predictive model of drusen regression using optical coherence tomography (OCT) based features. First, a series of automated image analysis steps is applied to segment and characterize individual drusen and their development. Second, from a set of quantitative features, a random forest classifier is employed to predict the occurrence of individual drusen regression within the following 12 months. The predictive model is trained and evaluated on a longitudinal OCT dataset of 44 eyes from 26 patients using leave-one-patient-out cross-validation. The model achieved an area under the ROC curve of 0.81, with a sensitivity of 0.74 and a specificity of 0.73. The presence of hyperreflective foci and the mean drusen signal intensity were found to be the two most important features for the prediction. This preliminary study shows that predicting drusen regression is feasible and is a promising step toward the identification of imaging biomarkers of impending regression.
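
    As a rough illustration of the evaluation protocol described above (not the authors' code), the sketch below trains a random forest on per-drusen features and scores it with leave-one-patient-out cross-validation using scikit-learn; the feature matrix, labels and patient identifiers are assumed inputs.

```python
# Minimal sketch: random forest + leave-one-patient-out cross-validation.
# Feature names, array shapes and variable names are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import roc_auc_score

def evaluate_drusen_model(X, y, patient_ids, n_trees=100):
    """X: per-drusen feature matrix, y: regression within 12 months (0/1),
    patient_ids: one id per drusen, so no patient appears in its own training fold."""
    logo = LeaveOneGroupOut()
    scores = np.zeros(len(y), dtype=float)
    for train_idx, test_idx in logo.split(X, y, groups=patient_ids):
        clf = RandomForestClassifier(n_estimators=n_trees, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]
    return roc_auc_score(y, scores)  # pooled out-of-fold AUC
```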

    EVALUATION OF SOILS IN SPLIT-DALMATIA COUNTY FOR THE NEEDS OF IRRIGATION

    Investigation and evaluation of soils in Split-Dalmatia County are among numerous such investigations within the national strategy for irrigation planning. Increasingly frequent droughts affect agricultural production in this Mediterranean region. The principal aim of this work was to review the land resources suitable for irrigation and to single out the arable areas where this hydrotechnical measure would yield the best results. The basic evaluation method is the assessment of soil suitability for irrigation (FAO, 1976; Vidaček, 1981). The analysis was carried out with GIS technology (ArcView). Semi-detailed pedological investigations provided the following data: Split-Dalmatia County has an area of 4,539 km2, of which agricultural areas account for 2,177 km2 or 48%. Forests cover the largest area, 2,244 km2 or 49.4%. Settlements and house lots occupy 90 km2 or 2.0%, and water bodies 29 km2 or 0.6%. The most widespread soils in the county are calcocambisols (52,485 ha) and calcomelanosols (42,027 ha), which, due to their rockiness and slope, are permanently unsuitable for irrigation. The most frequent suitable soils are anthropogenic soils in various karst forms (42,258 ha). The best soils for plough-fields and gardens are hydroameliorated (3,649 ha) and alluvial (3,068 ha) soils, while the potentially most valuable soils in the region are marsh gley (amphigley) soils (3,337 ha). There are 92,003 ha of first-priority soils suitable for irrigation, requiring greater or lesser agroamelioration measures, and 6,001 ha of second-priority soils requiring hydro- and/or agroamelioration when irrigation is applied. Within the agricultural areas (primarily pastures and meadows) there are 119,663 ha of soils permanently unsuitable for irrigation.
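
    Purely as an illustration of the kind of GIS tabulation behind these figures (not the study's workflow), the sketch below sums areas per irrigation-suitability class from a soil-unit attribute table; the class assignments and table layout are hypothetical.

```python
# Hypothetical attribute table: (soil type, area in ha, suitability class
# assigned by an FAO-style evaluation). Values mirror the text, classes are
# illustrative labels only.
from collections import defaultdict

soil_units = [
    ("calcocambisol", 52485, "N (permanently unsuitable)"),
    ("calcomelanosol", 42027, "N (permanently unsuitable)"),
    ("anthropogenic karst soil", 42258, "S1 (first priority)"),
    ("hydroameliorated soil", 3649, "S1 (first priority)"),
    ("alluvial soil", 3068, "S1 (first priority)"),
]

totals = defaultdict(float)
for _, area_ha, suitability in soil_units:
    totals[suitability] += area_ha

for suitability, area_ha in sorted(totals.items()):
    print(f"{suitability}: {area_ha:,.0f} ha")
```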

    Modeling Disease Progression In Retinal OCTs With Longitudinal Self-Supervised Learning

    Longitudinal imaging captures both the static anatomical structures and the dynamic morphological changes resulting from aging or disease progression. Self-supervised learning makes it possible to learn new representations from large amounts of unlabelled data without any expert knowledge. We propose a deep learning self-supervised approach to model disease progression from longitudinal retinal optical coherence tomography (OCT). Our self-supervised model benefits from a generic time-related task: learning to estimate the time interval between pairs of scans acquired from the same patient. This task is (i) easy to implement, (ii) allows the use of irregularly sampled data, (iii) is tolerant to poor registration, and (iv) does not rely on additional annotations. This novel method learns a representation that focuses on progression-specific information only, which can be transferred to other types of longitudinal problems. We transfer the learnt representation to the clinically highly relevant task of predicting the onset of an advanced stage of age-related macular degeneration within a given time interval based on a single OCT scan. The boost in prediction accuracy, in comparison to a network trained from scratch or transferred from traditional tasks, demonstrates that our pretrained self-supervised representation learns clinically meaningful information. Comment: Accepted for publication at the MICCAI 2019 PRIME workshop.
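
    A minimal sketch of the time-interval pretext task described above, assuming a generic CNN encoder and PyTorch; the architecture, layer sizes and names are illustrative, not the authors' implementation.

```python
# Sketch of the pretext task: a shared encoder embeds two scans from the same
# patient, and a small head regresses the time elapsed between them.
import torch
import torch.nn as nn

class TimeIntervalModel(nn.Module):
    def __init__(self, encoder: nn.Module, feat_dim: int):
        super().__init__()
        self.encoder = encoder              # any network mapping a scan to feat_dim features
        self.head = nn.Sequential(
            nn.Linear(2 * feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 1),              # predicted time interval (e.g. in months)
        )

    def forward(self, scan_a, scan_b):
        za, zb = self.encoder(scan_a), self.encoder(scan_b)
        return self.head(torch.cat([za, zb], dim=1)).squeeze(1)

def train_step(model, optimizer, scan_a, scan_b, months_between):
    # The only "label" is the known acquisition gap between the two scans.
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(scan_a, scan_b), months_between)
    loss.backward()
    optimizer.step()
    return loss.item()
```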

    WATER DEFICIT ANALYSIS IN VARIOUS CROPS

    The paper deals with the soil water deficit in arable soils of the Zagreb region (soil types: calcaric fluvisol and semigley) for the most commonly grown agricultural crops (silage maize, cabbage, bell pepper, lettuce, tomato, maize, apple, etc.). Soil suitability was assessed by the FAO method. Climatological characteristics of the region relevant to irrigation were determined using 20-year series of meteorological data, pedological data, and indices of plant development stages. Reference evapotranspiration (ETo) was calculated by the Penman-Monteith method. Effective precipitation was calculated by the USBR method from the mean monthly precipitation values as well as from the monthly precipitation amounts delimited by the lower quartile. Palmer's method was used to calculate the soil water balance for each crop. The results show that the crops have different water requirements, and the total water deficit depends on the amount and seasonal distribution of precipitation and on the hydropedological characteristics of the soils. The water deficit was calculated for each crop for the long-term precipitation average as well as for droughty months, delimited by the lower quartile of the precipitation amount. It is concluded that irrigation is a necessary measure for growing the studied crops on the said soil types in the Zagreb region.
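
    For reference, the FAO-56 Penman-Monteith equation used for ETo can be written as a small helper; variable names and units follow FAO-56, and this is a generic sketch rather than the authors' implementation.

```python
# ETo = [0.408*Delta*(Rn - G) + gamma*(900/(T + 273))*u2*(es - ea)]
#       / [Delta + gamma*(1 + 0.34*u2)]          (FAO-56, mm/day)
def penman_monteith_eto(delta, rn, g, gamma, t_mean, u2, es, ea):
    """delta: slope of the saturation vapour pressure curve [kPa/degC]
    rn: net radiation [MJ m-2 day-1], g: soil heat flux [MJ m-2 day-1]
    gamma: psychrometric constant [kPa/degC], t_mean: mean air temperature [degC]
    u2: wind speed at 2 m [m/s], es/ea: saturation/actual vapour pressure [kPa]"""
    numerator = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    denominator = delta + gamma * (1.0 + 0.34 * u2)
    return numerator / denominator
```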

    Gastrointestinal parasites in owned dogs in Serbia: Prevalence and risk factors

    Dogs are the most popular pets worldwide. Close contact between dogs and people increases the risk of transmission of various zoonotic parasitic infections. Given the importance of veterinary medicine in preserving the One Health concept, the aim of this research was to identify intestinal parasites that may have zoonotic potential and to evaluate risk factors (individual and environmental). The research was conducted in Serbia in 2022 and 2023 on 382 owned dogs, using qualitative coprological methods based on the concentration of parasitic elements. The overall prevalence of intestinal parasites was 62.6%, with the following detected: protozoa: Cystoisospora spp. (9.2%), Sarcocystis spp. (4.5%), Neospora caninum/Hammondia spp. (3.7%) and Giardia intestinalis (11.8%); nematodes: Toxocara canis (11.5%), Toxascaris leonina (4.2%), family Ancylostomatidae (38.0%), Trichuris vulpis (21.5%) and Capillaria spp. (10.5%); trematodes: Alaria alata (1.6%); and cestodes of the family Taeniidae (1.3%). Factors such as age, size and coat length, as well as way of living, attitude and diet, were linked to a significantly higher (p < 0.05) prevalence of intestinal parasites. Based on the results of the coprological diagnostics, this research indicates the importance of educating dog owners, conducting routine parasitological tests on their pets and applying regular deworming strategies.
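
    A hedged sketch of the type of analysis reported above: overall prevalence plus a chi-square test of association between a candidate risk factor and infection status, using SciPy; the contingency counts are placeholders, not study data.

```python
# Prevalence and a 2x2 chi-square test of association (risk factor vs. infection).
from scipy.stats import chi2_contingency

def prevalence(n_positive, n_examined):
    """Prevalence as a percentage of examined animals."""
    return 100.0 * n_positive / n_examined

# Rows = risk factor level (e.g. outdoor vs. indoor), columns = (infected, not infected).
# The values below are illustrative placeholders only.
table = [[120, 60],
         [ 80, 122]]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")  # p < 0.05 -> factor associated with infection
```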

    REFUGE Challenge: A unified framework for evaluating automated methods for glaucoma assessment from fundus photographs

    Glaucoma is one of the leading causes of irreversible but preventable blindness in working-age populations. Color fundus photography (CFP) is the most cost-effective imaging modality to screen for retinal disorders. However, its application to glaucoma has been limited to the computation of a few related biomarkers such as the vertical cup-to-disc ratio. Deep learning approaches, although widely applied for medical image analysis, have not been extensively used for glaucoma assessment due to the limited size of the available data sets. Furthermore, the lack of a standardized benchmarking strategy makes it difficult to compare existing methods in a uniform way. In order to overcome these issues we set up the Retinal Fundus Glaucoma Challenge, REFUGE (https://refuge.grand-challenge.org), held in conjunction with MICCAI 2018. The challenge consisted of two primary tasks, namely optic disc/cup segmentation and glaucoma classification. As part of REFUGE, we have publicly released a data set of 1200 fundus images with ground truth segmentations and clinical glaucoma labels, currently the largest one available. We have also built an evaluation framework to ease and ensure fairness in the comparison of different models, encouraging the development of novel techniques in the field. Twelve teams qualified and participated in the online challenge. This paper summarizes their methods and analyzes their corresponding results. In particular, we observed that two of the top-ranked teams outperformed two human experts in the glaucoma classification task. Furthermore, the segmentation results were in general consistent with the ground truth annotations, with complementary outcomes that can be further exploited by ensembling the results.
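
    To illustrate the two REFUGE tasks, the sketch below shows a Dice overlap score for disc/cup segmentation and a vertical cup-to-disc ratio derived from binary masks; these are generic helpers, not the official challenge evaluation code.

```python
# Generic segmentation-evaluation helpers for binary disc/cup masks.
import numpy as np

def dice(pred_mask, gt_mask):
    """Dice overlap between a predicted and a ground-truth binary mask."""
    pred, gt = pred_mask.astype(bool), gt_mask.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum() + 1e-8)

def vertical_cup_to_disc_ratio(cup_mask, disc_mask):
    """Ratio of the vertical extents (in pixels) of the cup and disc segmentations."""
    def vertical_extent(mask):
        rows = np.where(mask.any(axis=1))[0]
        return (rows.max() - rows.min() + 1) if rows.size else 0
    disc_h = vertical_extent(disc_mask)
    return vertical_extent(cup_mask) / disc_h if disc_h else 0.0
```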

    Robustness of common hemodynamic indicators with respect to numerical resolution in 38 middle cerebral artery aneurysms

    Background: The use of computational fluid dynamics (CFD) to compute the hemodynamics in cerebral aneurysms has received much attention in the last decade. The usability of these methods depends on the quality of the computations, as highlighted in recent discussions. The purpose of this study is to investigate the convergence of common hemodynamic indicators with respect to numerical resolution. Methods: 38 middle cerebral artery bifurcation aneurysms were studied at two different resolutions (one comparable to most studies, and one finer). Relevant hemodynamic indicators were collected from two of the most cited studies and compared at the two resolutions. In addition, correlation with rupture was investigated. Results: Most of the hemodynamic indicators were very well resolved at the coarser resolution, correlating with the finest resolution with a correlation coefficient >0.95. The oscillatory shear index (OSI) had the lowest correlation coefficient, 0.83. A logarithmic Bland-Altman plot revealed noticeable variations in the proportion of the aneurysm under low shear, as well as in spatial and temporal gradients, that were not captured by the correlation alone. Conclusion: Statistically, the hemodynamic indicators agree well across the different resolutions studied here. However, there are clear outliers visible in several of the hemodynamic indicators, which suggests that special care should be taken when considering individual assessment.
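
    The oscillatory shear index singled out above can be computed from a time series of wall shear stress vectors at a surface point as follows; this is a generic sketch of the standard OSI definition, not the study's pipeline.

```python
# OSI = 0.5 * (1 - |integral of WSS dt| / integral of |WSS| dt) over one cardiac cycle.
import numpy as np

def oscillatory_shear_index(wss_vectors, dt):
    """wss_vectors: array of shape (n_timesteps, 3) with the WSS vector at one
    surface point over a cardiac cycle; dt: time step between samples."""
    mean_vector_mag = np.linalg.norm(np.sum(wss_vectors * dt, axis=0))   # |int WSS dt|
    mean_magnitude = np.sum(np.linalg.norm(wss_vectors, axis=1) * dt)    # int |WSS| dt
    return 0.5 * (1.0 - mean_vector_mag / mean_magnitude) if mean_magnitude > 0 else 0.0
```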