
    Segmentation and classification of burn images by color and texture information

    In this paper, a burn color image segmentation and classification system is proposed. The aim of the system is to separate burn wounds from healthy skin and to distinguish among the different types of burns (burn depths). Digital color photographs are used as inputs to the system. The system is based on color and texture information, since these are the characteristics physicians observe in order to form a diagnosis. A perceptually uniform color space (L*u*v*) was used, since Euclidean distances calculated in this space correspond to perceptual color differences. After the burn is segmented, a set of color and texture features is calculated and serves as the input to a Fuzzy-ARTMAP neural network. The neural network classifies burns into three burn depths: superficial dermal, deep dermal, and full thickness. Clinical effectiveness of the method was demonstrated on 62 clinical burn wound images, yielding an average classification success rate of 82%.
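    The abstract above describes segmentation driven by Euclidean distance in the perceptually uniform L*u*v* space. A minimal sketch of that idea is shown below, assuming a hypothetical reference burn colour and distance threshold; the authors' actual segmentation procedure is not reproduced here.

```python
# Illustrative sketch only: convert an RGB burn photograph to L*u*v* and keep
# pixels whose Euclidean distance to a reference burn colour is small. The
# reference colour and threshold are hypothetical placeholder values.
import numpy as np
from skimage import io, color

def segment_burn(image_path, reference_luv, max_distance=25.0):
    rgb = io.imread(image_path)                      # H x W x 3 RGB image
    luv = color.rgb2luv(rgb)                         # perceptually uniform space
    # Euclidean distance in L*u*v* approximates perceived colour difference
    dist = np.linalg.norm(luv - np.asarray(reference_luv), axis=-1)
    return dist < max_distance                       # boolean burn mask

# Hypothetical usage, with the reference colour sampled from a labelled burn region:
# mask = segment_burn("burn.jpg", reference_luv=(55.0, 45.0, 20.0))
```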

    CAD Tool for Burn Diagnosis

    In this paper a new system for burn diagnosis is proposed. The aim of the system is to separate burn wounds from healthy skin and the different types of burns (burn depths) from each other, identifying each one. The system is based on colour and texture information, as these are the characteristics observed by physicians in order to give a diagnosis. We use a perceptually uniform colour space (L*u*v*), since Euclidean distances calculated in this space correspond to perceptual colour differences. After the burn is segmented, several colour and texture descriptors are calculated and used as inputs to a Fuzzy-ARTMAP neural network. The neural network classifies them into three types of burns: superficial dermal, deep dermal and full thickness. Clinical effectiveness of the method was demonstrated on 62 clinical burn wound images obtained from digital colour photographs, yielding an average classification success rate of 82% compared to expert-classified images.
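    Complementing the segmentation sketch above, the following outlines the descriptor-and-classifier stage: mean L*u*v* colour plus grey-level co-occurrence texture statistics are one plausible instantiation of such descriptors. The paper feeds its descriptors to a Fuzzy-ARTMAP network, which is not available in scikit-learn, so a generic neural-network classifier stands in purely for illustration.

```python
# Sketch under stated assumptions: the exact descriptors and the Fuzzy-ARTMAP
# classifier of the paper are not reproduced; an MLP stands in for the network.
import numpy as np
from skimage import color
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def burn_features(rgb_patch):
    luv = color.rgb2luv(rgb_patch)
    colour_stats = luv.reshape(-1, 3).mean(axis=0)           # mean L*, u*, v*
    gray = (color.rgb2gray(rgb_patch) * 255).astype(np.uint8)
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    texture = [graycoprops(glcm, prop)[0, 0]
               for prop in ("contrast", "homogeneity", "energy", "correlation")]
    return np.concatenate([colour_stats, texture])

# Hypothetical training call on segmented burn patches with expert depth labels:
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)
# clf.fit(np.vstack([burn_features(p) for p in patches]), depth_labels)
```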

    A Comparative Study of Segmentation Algorithms in the Classification of Human Skin Burn Depth

    A correct first assessment of skin burn depth is essential, as it determines the correct first burn treatment provided to patients. The objective of this paper is to conduct a comparative study of different segmentation algorithms for the classification of different burn depths. Eight different hybrid segmentation algorithms were studied on a skin burn dataset comprising skin burn images categorized into three burn classes by medical experts: superficial partial thickness burn (SPTB), deep partial thickness burn (DPTB) and full thickness burn (FTB). Different sequences of the algorithms were experimented with, as each algorithm segments differently, leading to different segmentations in the final output. The performance of the segmentation algorithms was evaluated by calculating the number of correctly segmented images for each burn depth. The empirical results showed that the segmentation algorithm able to segment most of the burn depths achieved 40.24%, 60.42% and 6.25% of correctly segmented images for SPTB, DPTB and FTB respectively. Most of the segmentation algorithms could not segment FTB images well because of the different nature of the burn wounds, as some of the FTB images contained dark brown and black colors. It can be concluded that a good segmentation algorithm is required to ensure that the representative features of each burn depth can be extracted, contributing to higher accuracy of classification of skin burn depth.
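    The evaluation described above reduces to counting, per burn class, how many images an algorithm segments correctly. A small sketch of that bookkeeping follows; class names are taken from the abstract, while the correctness judgement itself (against expert ground truth) is assumed to be available.

```python
# Per-class success rate: percentage of correctly segmented images per burn depth.
from collections import Counter

def per_class_success(results):
    """results: iterable of (burn_class, correctly_segmented: bool) pairs."""
    totals, correct = Counter(), Counter()
    for burn_class, ok in results:
        totals[burn_class] += 1
        if ok:
            correct[burn_class] += 1
    return {c: 100.0 * correct[c] / totals[c] for c in totals}

# Example: per_class_success([("SPTB", True), ("DPTB", False), ("FTB", True)])
# -> {"SPTB": 100.0, "DPTB": 0.0, "FTB": 100.0}
```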

    Mobile Wound Assessment and 3D Modeling from a Single Image

    The prevalence of camera-enabled mobile phones has made mobile wound assessment a viable treatment option for millions of previously difficult-to-reach patients. We have designed a complete mobile wound assessment platform to ameliorate the many challenges related to chronic wound care. Chronic wounds and infections are the most severe, costly and fatal types of wounds, placing them at the center of mobile wound assessment. Wound physicians assess thousands of single-view wound images from all over the world, and it may be difficult to determine the location of the wound on the body, for example if the image is taken at close range. In our solution, end-users capture an image of the wound by taking a picture with their mobile camera. The wound image is segmented and classified using modern convolutional neural networks, and is stored securely in the cloud for remote tracking. We use an interactive semi-automated approach to allow users to specify the location of the wound on the body. To accomplish this we have created, to the best of our knowledge, the first 3D human surface anatomy labeling system, based on the current NYU and Anatomy Mapper labeling systems. To interactively view wounds in 3D, we present an efficient projective texture mapping algorithm for texturing wounds onto a 3D human anatomy model. In so doing, we have demonstrated an approach to 3D wound reconstruction that works even for a single wound image.
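    The projective texture mapping step described above amounts to projecting the vertices of a 3D body model through the camera that took the wound photograph, so image pixel coordinates can be reused as texture coordinates. The sketch below assumes a simple pinhole camera with known intrinsics K and pose [R|t]; it illustrates only the underlying geometric idea, not the authors' algorithm.

```python
# Hedged sketch: pinhole projection of 3D model vertices to normalised texture
# coordinates. K, R and t are hypothetical calibration values, not from the paper.
import numpy as np

def project_vertices(vertices, K, R, t, image_size):
    """vertices: (N, 3) model points; returns (N, 2) texture coords in [0, 1]."""
    cam = R @ vertices.T + t.reshape(3, 1)       # world -> camera frame
    pix = K @ cam                                # camera -> image plane (homogeneous)
    pix = pix[:2] / pix[2]                       # perspective divide
    w, h = image_size
    return np.stack([pix[0] / w, pix[1] / h], axis=1)
```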

    SKINCure: An Innovative Smart Phone-Based Application to Assist in Melanoma Early Detection and Prevention

    Melanoma spreads through metastasis and has therefore proven to be very fatal. Statistical evidence has revealed that the majority of deaths resulting from skin cancer are a result of melanoma. Further investigations have shown that survival rates depend on the stage of the disease; early detection and intervention of melanoma imply higher chances of cure. Clinical diagnosis and prognosis of melanoma are challenging, since the processes are prone to misdiagnosis and inaccuracies due to doctors’ subjectivity. This paper proposes an innovative and fully functional smart-phone based application to assist in melanoma early detection and prevention. The application has two major components: the first component is a real-time alert to help users prevent skin burn caused by sunlight; a novel equation to compute the time for skin to burn is thereby introduced. The second component is an automated image analysis module which contains image acquisition, hair detection and exclusion, lesion segmentation, feature extraction, and classification. The proposed system exploits the PH2 dermoscopy image database from Pedro Hispano Hospital for development and testing purposes. The image database contains a total of 200 dermoscopy images of lesions, including normal, atypical, and melanoma cases. The experimental results show that the proposed system is efficient, achieving classification of the normal, atypical and melanoma images with accuracy of 96.3%, 95.7% and 97.5%, respectively.
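    The paper's own time-to-burn equation is not reproduced in the abstract. Purely to illustrate how such a real-time alert can be computed, the sketch below combines the standard WHO definition of the UV index (one index unit corresponds to 0.025 W/m² of erythemally weighted irradiance) with rough literature estimates of the minimal erythema dose (MED) per Fitzpatrick skin type; all table values are approximations and not the paper's equation.

```python
# Illustrative time-to-burn estimate; MED values are rough approximations.
APPROX_MED_J_PER_M2 = {1: 200.0, 2: 250.0, 3: 350.0, 4: 450.0, 5: 600.0, 6: 1000.0}

def minutes_to_burn(uv_index, skin_type):
    irradiance = 0.025 * uv_index                    # W/m^2, erythemally weighted
    seconds = APPROX_MED_J_PER_M2[skin_type] / irradiance
    return seconds / 60.0

# Example: minutes_to_burn(uv_index=8, skin_type=2) -> roughly 21 minutes
```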

    Burning Skin Detection System in Human Body

    Early, accurate burn depth diagnosis is crucial for selecting appropriate clinical intervention strategies and assessing burn patient prognosis. However, with limited diagnostic accuracy, the current burn depth diagnosis approach still relies primarily on the empirical, subjective assessment of clinicians. With the rapid development of artificial intelligence technology, the integration of deep learning algorithms with image analysis can more accurately identify and evaluate the information in medical images. The objective of this work is to detect and classify burn areas in medical images using an unsupervised deep learning algorithm. The main contribution is the development of computations using one of the deep learning algorithms. To demonstrate the effectiveness of the proposed framework, experiments are performed on a benchmark to evaluate system stability. The results indicate that the proposed system is simple and suits real-life applications. The system accuracy was 75% when compared with some of the state-of-the-art techniques.
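    The abstract does not name the unsupervised algorithm, so the following is only a generic sketch of one common unsupervised route: train a small convolutional autoencoder on image patches and cluster the latent codes (for example with k-means) to separate burn-like regions from healthy skin. All layer sizes and the two-cluster assumption are illustrative choices, not the paper's method.

```python
# Generic unsupervised sketch: autoencoder on 32x32 patches + k-means on codes.
# The reconstruction-loss training loop is omitted for brevity.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class PatchAutoencoder(nn.Module):
    def __init__(self, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(), nn.Linear(32 * 8 * 8, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (32, 8, 8)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

def cluster_patches(model, patches, n_clusters=2):
    """patches: float tensor (N, 3, 32, 32) in [0, 1]; returns cluster labels."""
    with torch.no_grad():
        codes = model.encoder(patches).cpu().numpy()
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(codes)
```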

    Detection of unhealthy region of plant leaves and classification of plant leaf diseases using texture features

    Plant diseases have become a dilemma, as they can cause significant reductions in both the quality and quantity of agricultural products. Automatic detection of plant diseases is an essential research topic, as it may prove beneficial in monitoring large fields of crops and thus automatically detecting the symptoms of diseases as soon as they appear on plant leaves. The proposed system is a software solution for automatic detection and classification of plant leaf diseases. The developed processing scheme consists of four main steps: first, a color transformation structure for the input RGB image is created; then the green pixels are masked and removed using a specific threshold value, followed by a segmentation process; the texture statistics are computed for the useful segments; finally, the extracted features are passed through the classifier. The proposed algorithm can successfully detect and classify the examined diseases with an accuracy of 94%. Experimental results on a database of about 500 plant leaves confirm the robustness of the proposed approach.
    Keywords: HSI, color co-occurrence matrix, texture, SVM, plant leaf disease
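    A condensed sketch of the four-step scheme described above follows, with HSV standing in for the HSI transform and illustrative hue thresholds for masking the green (healthy) pixels; the actual threshold, texture statistics and SVM settings of the paper are not reproduced.

```python
# Sketch under stated assumptions: HSV colour transform, green-pixel masking,
# co-occurrence texture statistics on the remaining region, and an SVM classifier.
import numpy as np
from skimage import io, color
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def diseased_region_features(image_path, green_hue_range=(0.20, 0.45)):
    rgb = io.imread(image_path)
    hue = color.rgb2hsv(rgb)[..., 0]
    diseased = (hue < green_hue_range[0]) | (hue > green_hue_range[1])  # non-green pixels
    gray = (color.rgb2gray(rgb) * 255).astype(np.uint8)
    gray = np.where(diseased, gray, 0)               # keep only the non-green segment
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return np.array([graycoprops(glcm, prop)[0, 0]
                     for prop in ("contrast", "homogeneity", "energy", "correlation")])

# Hypothetical training call on a labelled leaf-image collection:
# clf = SVC(kernel="rbf")
# clf.fit(np.vstack([diseased_region_features(p) for p in image_paths]), disease_labels)
```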