216 research outputs found

    AUTOMATIC RECOGNITION OF DENTAL PATHOLOGIES AS PART OF A CLINICAL DECISION SUPPORT PLATFORM

    The current work was carried out within the Romanian National Program II (PNII) research project "Application for Using Image Data Mining and 3D Modeling in Dental Screening" (AIMMS). The AIMMS project aims to design a program that can detect anatomical information and possible pathological formations in a collection of Digital Imaging and Communications in Medicine (DICOM) images. The main function of the AIMMS platform is to give the user an integrated dental decision support platform that combines image processing techniques and 3D modeling. The literature review shows that existing studies on the detection and classification of teeth and dental pathologies are still in their infancy, so the work reported in this article makes a scientific contribution to the field. This article presents the relevant literature review and the algorithms created for the detection of dental pathologies within the AIMMS research project.
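
    The abstract does not detail the image-processing pipeline; purely as an illustration of the kind of input handling such a platform needs (reading DICOM files and flagging intensity outliers as candidate regions), a minimal sketch is given below. The file name, threshold values, and function names are assumptions for illustration, not part of AIMMS.

```python
# Minimal sketch (assumed, not the AIMMS implementation): load a DICOM image
# with pydicom and flag unusually dark or bright pixels as candidate regions.
import numpy as np
import pydicom

def load_dicom_image(path):
    """Read a DICOM file and return its pixel data as a normalized float array."""
    ds = pydicom.dcmread(path)
    img = ds.pixel_array.astype(np.float32)
    img -= img.min()                      # normalize to [0, 1] so thresholds
    if img.max() > 0:                     # do not depend on bit depth
        img /= img.max()
    return img

def find_candidate_regions(img, low=0.2, high=0.8):
    """Return a boolean mask of unusually dark or bright pixels.

    Radiolucent lesions tend to appear darker and restorations brighter than
    surrounding bone; the cut-off values here are illustrative only.
    """
    return (img < low) | (img > high)

if __name__ == "__main__":
    image = load_dicom_image("example.dcm")   # hypothetical file name
    mask = find_candidate_regions(image)
    print("flagged pixels:", int(mask.sum()))
```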

    Initial contour generation approach in level set methods for dental image segmentation

    Segmentation is a challenging process in medical images, especially in dental x-ray images. Level set methods give effective results for medical and dental image segmentation. The Initial Contour (IC) is an essential step in level set segmentation methods, since it is the starting point of the evolution. The main issues with the IC are how to generate it automatically, so as to reduce human interaction, and how to choose a suitable IC that yields an accurate result. In this paper a new region-based technique for IC generation is proposed to overcome these issues. The goal is to generate the most suitable IC, since manual initialization of the level set surface is a well-known drawback: segmentation accuracy depends on the chosen IC, and a wrong selection degrades the result. We utilize statistical and morphological information inside and outside the contour to establish a region-based map function. This function is able to find a suitable IC in images to be processed by level set methods. Experiments on dental x-ray images demonstrate the robustness of the segmentation process using the proposed method, even on noisy images and images with weak boundaries. Furthermore, the computational cost of the segmentation process is reduced.
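
    The paper's region-based map function is not reproduced here; the sketch below only illustrates the general idea of seeding a level set from statistical and morphological information, using local-versus-global intensity statistics and scikit-image's morphological Chan-Vese evolution. The window size and iteration counts are assumed values.

```python
# Sketch of a region-based initial contour (assumed, not the paper's exact
# map function): pixels whose local mean exceeds the global mean are taken
# as the initial level set for a morphological Chan-Vese evolution.
import numpy as np
from scipy import ndimage
from skimage.segmentation import morphological_chan_vese

def region_based_initial_contour(img, window=15):
    """Build a binary initial level set from local vs. global statistics."""
    img = img.astype(np.float32)
    local_mean = ndimage.uniform_filter(img, size=window)
    init = local_mean > img.mean()                      # brighter-than-average regions
    init = ndimage.binary_opening(init, iterations=2)   # morphological clean-up
    init = ndimage.binary_fill_holes(init)
    return init.astype(np.int8)

def segment(img, iterations=100):
    """Run the level set starting from the generated contour instead of a
    manual or checkerboard initialization."""
    init = region_based_initial_contour(img)
    return morphological_chan_vese(img, iterations, init_level_set=init, smoothing=2)
```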

    Automated Teeth Extraction and Dental Caries Detection in Panoramic X-ray

    Dental caries is one of the most prevalent chronic diseases, affecting the majority of people at least once during their lifetime. This expensive disease accounts for 5-10% of the healthcare budget in developing countries. Caries lesions appear as the result of dental biofilm metabolic activity, caused by bacteria (most prominently Streptococcus mutans) feeding on uncleaned sugars and starches in the oral cavity. Also known as tooth decay, caries is primarily diagnosed by general dentists based solely on clinical assessment. Since in many cases dental problems cannot be detected by simple observation, dental x-ray imaging has become a standard tool for domain experts, i.e. dentists and radiologists, to distinguish dental diseases such as proximal caries. Among the different dental radiography methods, Panoramic or Orthopantomogram (OPG) imaging is commonly performed as the initial step of assessment. OPG images are captured with a small dose of radiation and can depict the entire patient dentition in a single image. Dental caries can nevertheless be hard to identify by general dentists relying only on visual inspection of dental radiographs. Tooth decay can easily be misinterpreted as shadows for various reasons, such as low image quality. Moreover, OPG images have poor quality, and structures are not presented with strong edges due to low contrast, uneven exposure, etc. Disease detection is therefore a very challenging task using Panoramic radiography. With the recent development of Artificial Intelligence (AI) in dentistry, and with the introduction of the Convolutional Neural Network (CNN) for image classification, developing medical decision support systems has become a topic of interest in both academia and industry. More accurate CNN-based decision support systems can enhance dentists' diagnostic performance, resulting in improved dental care for patients. In this thesis, the first automated teeth extraction system for Panoramic images based on evolutionary algorithms is proposed. In contrast to intraoral radiography methods, Panoramic images are captured with the x-ray film outside the patient's mouth, so Panoramic x-rays contain regions outside the jaw, which makes teeth segmentation extremely difficult. Since an image of each individual tooth is needed to build a caries detection model, segmentation of teeth from the OPG image is essential, and the absence of significant pixel intensity differences between regions in OPG radiography makes it hard to implement. Consequently, an automated system is introduced that takes an OPG as input and produces images of single teeth as output. Only a few research studies address this task for Panoramic radiography, so there is room for improvement. A genetic algorithm is applied along with different image processing methods to perform teeth extraction through jaw extraction, jaw separation, and teeth-gap valley detection, respectively. The proposed system is compared to the state of the art in teeth extraction on other image types. After the teeth are segmented from each image, a model based on various untrained and pretrained CNN architectures is proposed to detect dental caries for each tooth. An autoencoder-based model, along with well-known CNN architectures, is used for feature extraction, followed by capsule networks to perform classification.
The dataset of Panoramic x-rays was prepared by the authors, with labels provided by an expert radiologist. The proposed model demonstrated an acceptable detection rate of 86.05% and an increase in caries detection speed. Considering the challenges of performing such a task on low-quality OPG images, this work is a step towards a fully automated, efficient caries detection model to assist domain experts.
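
    As a rough illustration of the teeth-gap valley stage described above (without the genetic-algorithm search or the jaw extraction and separation steps), a simplified sketch based on a column-wise intensity projection might look as follows; the minimum gap separation is an assumed parameter, not a value from the thesis.

```python
# Simplified sketch (assumed): locate candidate gaps between teeth in a jaw
# strip by finding minima of the column-wise intensity projection. The
# genetic-algorithm refinement described in the thesis is not reproduced here.
import numpy as np
from scipy.signal import find_peaks

def gap_valley_candidates(jaw_strip, min_separation=30):
    """Return column indices of likely inter-tooth gaps.

    jaw_strip: 2-D grayscale array containing a single (upper or lower) jaw.
    Gaps between teeth are darker than the teeth, so they show up as valleys
    in the column-wise sum of intensities.
    """
    profile = jaw_strip.sum(axis=0).astype(np.float64)
    # Peaks of the negated profile are valleys of the original profile.
    valleys, _ = find_peaks(-profile, distance=min_separation)
    return valleys

def split_into_teeth(jaw_strip, min_separation=30):
    """Cut the jaw strip at the detected gap valleys into per-tooth images."""
    cuts = [0, *gap_valley_candidates(jaw_strip, min_separation), jaw_strip.shape[1]]
    return [jaw_strip[:, a:b] for a, b in zip(cuts[:-1], cuts[1:]) if b - a > 1]
```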

    Caries detection in panoramic dental x-ray images

    The detection of dental caries at a preliminary stage is of utmost importance. Dental caries has a long history: over a million years ago, hominids such as Australopithecus suffered from cavities, and archaeological evidence shows that tooth decay is an ancient disease dating far into prehistory. Skulls dating from a million years ago through the Neolithic period show signs of caries. The increase of caries during the Neolithic period may be attributed to the increase of plant foods containing carbohydrates, and the beginning of rice cultivation in South Asia is also believed to have caused an increase in caries. Dental caries, also known as dental decay or tooth decay, is defined as a disease of the hard tissues of the teeth caused by the action of microorganisms, found in plaque, on fermentable carbohydrates (principally sugars). At the individual level, dental caries is a preventable disease. Given its dynamic nature, the dental caries disease, once established, can be treated or reversed before significant cavitation takes place. There are three types of dental caries [59]: the first type is enamel caries, which is preceded by the formation of a microbial dental plaque; the second is dentinal caries, which begins with the natural spread of the process along great numbers of the dentinal tubules; the third is pulpal caries, which corresponds to root caries or root surface caries. Primary diagnosis involves inspection of all visible tooth surfaces using a good light source, dental mirror, and explorer. Dental radiographs (x-rays) may show dental caries before it is otherwise visible, particularly caries between the teeth. Large dental caries are often apparent to the naked eye, but smaller lesions can be difficult to identify. Visual and tactile inspection along with radiographs are employed frequently by dentists, yet at times caries may be difficult to detect. Bacteria can penetrate the enamel to reach dentin while the outer surface at first sight appears intact. For these caries, sometimes referred to as "hidden caries", x-rays are the only way of detection at the preliminary stage, even though visual examination of the tooth shows the enamel intact or only minimally perforated. Without x-rays it would not be possible to detect these problems until they had become severe and caused serious damage. [...

    ํŒŒ๋…ธ๋ผ๋งˆ๋ฐฉ์‚ฌ์„ ์˜์ƒ์—์„œ ๋”ฅ๋Ÿฌ๋‹ ์‹ ๊ฒฝ๋ง์„ ์ด์šฉํ•œ ์น˜์„ฑ ๋‚ญ๊ณผ ์ข…์–‘์˜ ์ž๋™ ์ง„๋‹จ ๋ฐฉ๋ฒ•

    Doctoral dissertation (Ph.D.) -- Seoul National University Graduate School, School of Dentistry, Department of Dentistry, February 2021. Advisor: ์ด์›์ง„. Objective: The purpose of this study was to automatically diagnose odontogenic cysts and tumors of the jaw on panoramic radiographs using a deep convolutional neural network. A novel deep convolutional neural network framework with data augmentation was proposed for detection and classification of multiple diseases. Methods: A deep convolutional neural network modified from YOLOv3 was developed for detecting and classifying odontogenic cysts and tumors of the jaw. Our dataset of 1,282 panoramic radiographs comprised 350 dentigerous cysts, 302 periapical cysts, 300 odontogenic keratocysts, 230 ameloblastomas, and 100 normal jaws with no disease. In addition, the number of radiographs was augmented 12-fold by flips, rotations, and intensity changes. An Intersection over Union (IoU) threshold of 0.5 was used to assess detection and classification performance. The classification performance of the developed convolutional neural network was evaluated by calculating sensitivity, specificity, accuracy, and AUC (area under the ROC curve) for the diseases of the jaw. Results: The overall classification performance for the diseases improved from 78.2% sensitivity, 93.9% specificity, 91.3% accuracy, and 0.86 AUC using the convolutional neural network with the unaugmented dataset to 88.9% sensitivity, 97.2% specificity, 95.6% accuracy, and 0.94 AUC using the convolutional neural network with the augmented dataset. The convolutional neural network trained on the augmented dataset had the following sensitivities, specificities, accuracies, and AUCs: 91.4%, 99.2%, 97.8%, and 0.96 for dentigerous cysts; 82.8%, 99.2%, 96.2%, and 0.92 for periapical cysts; 98.4%, 92.3%, 94.0%, and 0.97 for odontogenic keratocysts; 71.7%, 100%, 94.3%, and 0.86 for ameloblastomas; and 100.0%, 95.1%, 96.0%, and 0.94 for normal jaws, respectively. Conclusion: A novel framework convolutional neural network method was developed for automatically diagnosing odontogenic cysts and tumors of the jaw on panoramic radiographs using data augmentation. The proposed convolutional neural network model showed high sensitivity, specificity, accuracy, and AUC despite the limited number of panoramic images involved.
    1. Objective: Cysts and tumors arising in the oral and maxillofacial region are sometimes not found early, so that appropriate treatment is delayed or not performed. To address this problem, computer-aided diagnosis using a deep convolutional neural network, a machine learning technique based on artificial neural networks, can provide more accurate and faster results. In this study, a deep convolutional neural network that automatically detects and diagnoses four diseases frequently appearing in the oral and maxillofacial region (dentigerous cyst, periapical cyst, odontogenic keratocyst, and ameloblastoma) on panoramic radiographs was developed, and its accuracy was evaluated. 2. Methods: A deep convolutional neural network based on YOLOv3 was built to detect and diagnose odontogenic cysts and tumors of the jaw on panoramic radiographs. A total of 1,182 panoramic radiographs acquired between 1999 and 2017 from patients with histopathologically confirmed diagnoses at Seoul National University Dental Hospital were analyzed: 350 dentigerous cysts, 302 periapical cysts, 300 odontogenic keratocysts, and 230 ameloblastomas. In addition, 100 normal panoramic radiographs without disease were selected as a control group. The panoramic radiograph data were augmented 12-fold by gamma correction, rotation, and flipping. Of these data, 60% were used as the training set, 20% as the validation set, and 20% as the test set. The developed network was evaluated using 5-fold cross validation, and its performance was measured using accuracy, sensitivity, specificity, and the AUC (area under the curve) from ROC analysis. 3. Results: Without data augmentation the developed network showed 78.2% sensitivity, 93.9% specificity, 91.3% accuracy, and 0.86 AUC; with data augmentation it showed improved performance of 88.9% sensitivity, 97.2% specificity, 95.6% accuracy, and 0.94 AUC. Dentigerous cysts showed 91.4% sensitivity, 99.2% specificity, 97.8% accuracy, and 0.96 AUC; periapical cysts 82.8% sensitivity, 99.2% specificity, 96.2% accuracy, and 0.92 AUC; odontogenic keratocysts 98.4% sensitivity, 92.3% specificity, 94.0% accuracy, and 0.97 AUC; ameloblastomas 71.7% sensitivity, 100% specificity, 94.3% accuracy, and 0.86 AUC; and normal jaws 100% sensitivity, 95.1% specificity, 96.0% accuracy, and 0.97 AUC. 4. Conclusion: In this study a deep convolutional neural network that automatically detects and diagnoses odontogenic cysts and tumors on panoramic radiographs was developed. Despite the insufficient number of panoramic radiographs, excellent sensitivity, specificity, and accuracy were achieved by using data augmentation, and the developed system is useful for diagnosing these diseases early and treating them at an appropriate time.
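
    The study reports 12-fold augmentation by flips, rotations, and intensity changes, and evaluates detections at an IoU threshold of 0.5. The sketch below illustrates both ingredients; the specific rotation angles and intensity gains are assumptions, as the abstract does not list them.

```python
# Minimal sketch (assumed parameters): 12-fold augmentation by flips, small
# rotations and intensity scaling, plus the IoU criterion used to decide
# whether a detected box matches a ground-truth lesion.
import numpy as np
from scipy import ndimage

def augment_12x(img):
    """Return 12 transformed variants of a panoramic radiograph."""
    variants = []
    for flipped in (img, np.fliplr(img)):                 # horizontal flip
        for angle in (0, -5, 5):                          # small rotations (degrees)
            rotated = ndimage.rotate(flipped, angle, reshape=False, mode="nearest")
            for gain in (0.9, 1.1):                       # intensity change
                variants.append(np.clip(rotated * gain, 0, img.max()))
    return variants                                       # 2 * 3 * 2 = 12 variants

def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

# A detection counts as correct when iou(pred, truth) >= 0.5 and the
# predicted class matches the histopathological label.
```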

    Design An Intelligent System to Support Dental Cyst Detection Using Two Convolutional Neural Networks

    The aim of this paper is to develop a methodology, based on Computer Vision techniques, for the automatic identification of dental cysts in panoramic radiography images, providing dental professionals with an alternative aid in the interpretation of these images. In addition, segmentation techniques are applied to the inner region of the jaws, seeking to isolate the regions with a greater likelihood of containing a cyst. The objective of this work is to design an intelligent system that supports the diagnosis of dental cysts using convolutional neural networks, in order to help detect dental cysts at an early stage. The research method applied in this study consists of model design, in which two convolutional neural network architectures were built and trained on 80% of a dataset of 775 images spanning four image categories, and proposal validation, in which we work with the remaining 20% of the dataset. Our results show that the ResNet50 architecture achieved the best classification, with an accuracy of 98%.
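
    As an illustration of the reported setup (a pretrained ResNet50 classifier over four image categories with an 80/20 train/validation split), a minimal transfer-learning sketch in Keras is shown below. The input size, optimizer, directory layout, epoch count, and rescaling choice are assumptions rather than details taken from the paper.

```python
# Minimal transfer-learning sketch (assumed hyperparameters): ResNet50
# backbone with a 4-class head, trained on 80% of the images and
# evaluated on the remaining 20%.
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 4

def build_model():
    base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet")
    base.trainable = False                      # keep the pretrained features fixed
    model = tf.keras.Sequential([
        # Simple rescaling stands in for ResNet's own preprocess_input here.
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=(*IMG_SIZE, 3)),
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def load_split(data_dir="radiographs/"):        # hypothetical directory of labelled images
    """80/20 train/validation split from a folder with one subfolder per class."""
    common = dict(image_size=IMG_SIZE, validation_split=0.2, seed=42)
    train = tf.keras.utils.image_dataset_from_directory(
        data_dir, subset="training", **common)
    val = tf.keras.utils.image_dataset_from_directory(
        data_dir, subset="validation", **common)
    return train, val

if __name__ == "__main__":
    train_ds, val_ds = load_split()
    model = build_model()
    model.fit(train_ds, validation_data=val_ds, epochs=10)
```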