2 research outputs found

    A novel approach for breast ultrasound classification using two-dimensional empirical mode decomposition and multiple features

    Aim: Breast cancer remains a leading cause of female mortality worldwide, underscoring the need for accurate and efficient diagnostic techniques. This research contributes to breast cancer classification from breast ultrasound (BUS) images by introducing a novel method based on two-dimensional empirical mode decomposition (biEMD). The study evaluates classification performance using various texture features extracted from BUS images and their corresponding biEMD sub-bands. Methods: A total of 437 benign and 210 malignant breast ultrasound images were preprocessed and decomposed into three biEMD sub-bands. Gray Level Co-occurrence Matrix (GLCM), Local Binary Pattern (LBP), and Histogram of Oriented Gradients (HOG) features were extracted, and feature selection was performed using the least absolute shrinkage and selection operator (LASSO). Machine learning techniques, including artificial neural networks (ANN), k-nearest neighbors (kNN), ensemble methods, and statistical discriminant analysis, were used to classify benign and malignant cases. Classification performance, measured by the area under the curve (AUC), accuracy, and F1 score, was evaluated with 10-fold cross-validation. Results: Using the ANN classifier with hybrid features (GLCM+LBP+HOG) extracted from the biEMD sub-bands of BUS images yielded the best performance, with an AUC of 0.9945, an accuracy of 0.9644, and an F1 score of 0.9668. Conclusion: The results demonstrate the effectiveness of the biEMD method for classifying breast tumor types from ultrasound images, achieving high classification performance with the proposed approach.
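    The pipeline described above combines texture features (GLCM, LBP, HOG) with LASSO feature selection and an ANN classifier evaluated by 10-fold cross-validation. The sketch below illustrates only that feature-extraction and classification stage with scikit-image and scikit-learn; the biEMD decomposition is not part of standard libraries and is omitted here, and the feature parameters, network size, and preprocessing are illustrative assumptions rather than the authors' exact configuration.

    ```python
    # Minimal sketch of the texture-feature + ANN classification stage.
    # Assumes a list of grayscale ultrasound images (uint8 arrays, resized to a
    # common shape) and binary labels; the biEMD sub-band step is not shown.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops, local_binary_pattern, hog
    from sklearn.feature_selection import SelectFromModel
    from sklearn.linear_model import LassoCV
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import StratifiedKFold, cross_val_score


    def texture_features(img):
        """Concatenate GLCM, LBP-histogram, and HOG descriptors for one image."""
        # GLCM statistics at one distance/angle (illustrative choice).
        glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                            symmetric=True, normed=True)
        glcm_feats = [graycoprops(glcm, p)[0, 0]
                      for p in ("contrast", "homogeneity", "energy", "correlation")]
        # Uniform LBP histogram (P=8 neighbours, radius 1).
        lbp = local_binary_pattern(img, P=8, R=1, method="uniform")
        lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
        # HOG descriptor; images must share one size so vectors align.
        hog_feats = hog(img, orientations=9, pixels_per_cell=(16, 16),
                        cells_per_block=(2, 2))
        return np.concatenate([glcm_feats, lbp_hist, hog_feats])


    def classify(images, labels):
        """LASSO-based feature selection + ANN, scored by 10-fold cross-validation."""
        X = np.vstack([texture_features(img) for img in images])
        y = np.asarray(labels)
        model = make_pipeline(
            StandardScaler(),
            SelectFromModel(LassoCV(cv=5, max_iter=10000)),          # LASSO selection
            MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000),  # ANN classifier
        )
        cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
        return cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    ```

    In the full method each image would first be decomposed into its biEMD sub-bands, with texture features computed per sub-band and concatenated before selection and classification.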

    Can Functional Connectivity at Resting Brain in ADHD Indicate the Impairments in Sensory-Motor Functions and Face/Emotion Recognition?

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder known to cause impairments in cognitive and sensory-motor functions and in face/emotion recognition. This study aimed to examine resting-state brain networks in children with ADHD using functional magnetic resonance imaging (fMRI). We performed seed-to-voxel and region of interest (ROI) analyses covering all Brodmann areas (BAs). Thirty right-handed children aged 9 to 16 years (15 with ADHD and 15 typically developing control subjects closely matched for age and gender) were included. Ninety-five brain regions, comprising 84 BAs and 11 default mode network (DMN)-related regions (rsREL), were studied using seed-based and ROI-to-ROI analyses, and connectivity measures were calculated (p < 0.001). Between-group differences were assessed using t-statistics (p < 0.05). Seed-based analysis showed connectivity differences in the sensory-motor and face/emotion recognition regions in both groups. In the between-group whole-brain comparison, ROI-to-ROI analysis showed greater magnitude of activation in children with ADHD than in control subjects in brain regions that include the face/emotion recognition system and the prefrontal cortex. This work revealed that sensory-motor regions and regions related to face/emotion recognition show atypical functional connectivity in ADHD patients compared with controls. The differences observed in these regions support previous findings from task-based fMRI studies and show that these atypical differences also occur in the resting brain. These results suggest that further investigation of sensory-motor and face/emotion recognition regions is needed to better understand ADHD.
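    The ROI-to-ROI part of such an analysis reduces to correlating mean BOLD time series between region pairs and comparing the resulting connectivity values across groups. The sketch below illustrates that general idea with NumPy and SciPy under simple assumptions (Fisher z-transformed Pearson correlations, an uncorrected two-sample t-test per edge); it is not the authors' processing pipeline, and the array shapes and threshold are illustrative.

    ```python
    # Illustrative ROI-to-ROI functional connectivity comparison between two groups.
    # Assumes each subject is an array of shape (n_timepoints, n_rois) holding
    # preprocessed, denoised mean BOLD time series per ROI (e.g. 95 regions).
    import numpy as np
    from scipy import stats


    def connectivity_matrix(timeseries):
        """Fisher z-transformed Pearson correlation between every pair of ROIs."""
        r = np.corrcoef(timeseries.T)                  # (n_rois, n_rois)
        np.fill_diagonal(r, 0.0)                       # ignore self-connections
        return np.arctanh(np.clip(r, -0.999, 0.999))   # Fisher z-transform


    def group_difference(group_a, group_b, alpha=0.05):
        """Two-sample t-test on each ROI-ROI edge; returns t-values and a mask of
        edges differing between groups at the (uncorrected) alpha level."""
        za = np.stack([connectivity_matrix(ts) for ts in group_a])  # (n_subj, R, R)
        zb = np.stack([connectivity_matrix(ts) for ts in group_b])
        t, p = stats.ttest_ind(za, zb, axis=0)
        return t, p < alpha


    # Example with random data standing in for 15 ADHD and 15 control subjects,
    # 95 ROIs, 200 time points each.
    rng = np.random.default_rng(0)
    adhd = [rng.standard_normal((200, 95)) for _ in range(15)]
    ctrl = [rng.standard_normal((200, 95)) for _ in range(15)]
    t_vals, significant = group_difference(adhd, ctrl)
    print("edges with uncorrected p < 0.05:", int(significant.sum() // 2))
    ```

    A full analysis would additionally correct for multiple comparisons across the roughly 95 x 94 / 2 edges and regress out nuisance signals before correlation.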