4 research outputs found

    Identification of Autism Disorder Spectrum Based on Facial Expressions

    The study of emotions has long been a matter for philosophers and psychologists. The purpose of this research is to identify and categorize children with autism spectrum disorder by recognizing facial emotion in eight states (neutral, anger, ridicule, hate, fear, happiness, sadness, and wonder). The method of this research is descriptive-analytic. To collect samples from 80 children with autism spectrum disorder, we captured images of each child in each of the eight states. MATLAB software was used to analyse the photos accurately. The results show that children in the first group according to DSM-5 exhibit at least six of the eight listed modes; children in the second DSM-5 category experience between three and five feelings; and children in the third DSM-5 category can experience only one or two feelings.
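The thresholds reported above (at least six states for the first group, three to five for the second, one or two for the third) can be restated as a tiny decision rule. A minimal Python sketch; the function name and the count argument are illustrative assumptions, not part of the study's MATLAB pipeline:

```python
# Hypothetical restatement of the abstract's reported pattern: the number
# of the eight emotion states a child can express vs. DSM-5 group.
EMOTIONS = ["neutral", "anger", "ridicule", "hate",
            "fear", "happiness", "sadness", "wonder"]

def dsm5_group(expressed_count: int) -> int:
    """Map how many of the eight states a child expresses to the
    DSM-5 group reported in the abstract (1, 2, or 3)."""
    if not 1 <= expressed_count <= len(EMOTIONS):
        raise ValueError("count must be between 1 and 8")
    if expressed_count >= 6:
        return 1          # first group: at least six of eight states
    if expressed_count >= 3:
        return 2          # second group: three to five states
    return 3              # third group: only one or two states

print(dsm5_group(7))  # 1
print(dsm5_group(4))  # 2
print(dsm5_group(2))  # 3
```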

    Cultivating Insight: Detecting Autism Spectrum Disorder through Residual Attention Network in Facial Image Analysis

    Revolutionizing Autism Spectrum Disorder Identification through Deep Learning: Unveiling Facial Activation Patterns. In this study, our primary objective is to harness the power of deep learning algorithms for the precise identification of individuals with autism spectrum disorder (ASD) solely from facial image datasets. Our investigation centers around the utilization of face activation patterns, aiming to uncover novel insights into the distinctive facial features of ASD patients. To accomplish this, we meticulously examined facial imaging data from a global and multidisciplinary repository known as the Autism Face Imaging Data Exchange. Autism spectrum disorder is characterized by inherent social deficits and manifests in a spectrum of diverse symptomatic scenarios. Recent data from the Centers for Disease Control (CDC) underscores the significance of this disorder, indicating that approximately 1 in 54 children are impacted by ASD, according to estimations from the CDC's Autism and Developmental Disabilities Monitoring Network (ADDM). Our research delved into the intricate functional connectivity patterns that objectively distinguish ASD participants, focusing on their facial imaging data. Through this investigation, we aimed to uncover the latent facial patterns that play a pivotal role in the classification of ASD cases. Our approach introduces a novel module that enhances the discriminative potential of standard convolutional neural networks (CNNs), such as ResNet-50, thus significantly advancing the state-of-the-art. Our model achieved an impressive accuracy rate of 99% in distinguishing between ASD patients and control subjects within the dataset. Our findings illuminate the specific facial expression domains that contribute most significantly to the differentiation of ASD cases from typically developing individuals, as inferred from our deep learning methodology. 
To validate our approach, we conducted real-time video testing on diverse children, achieving an outstanding accuracy score of 99.90% and an F1 score of 99.67%. Through this pioneering work, we not only offer a cutting-edge approach to ASD identification but also contribute to the understanding of the underlying facial activation patterns that hold the potential to transform the diagnostic landscape of autism spectrum disorder.
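The residual attention idea the title refers to (as in Residual Attention Networks) re-weights backbone features as H(x) = (1 + M(x)) · F(x): a soft mask M(x) in (0, 1) emphasizes discriminative regions while the "1 +" identity term keeps the original features and gradients flowing. A minimal NumPy sketch of that re-weighting step; the toy feature map, mask logits, and shapes are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def residual_attention(features, mask_logits):
    """Residual attention: out = (1 + M(x)) * F(x).
    The soft mask re-weights features without ever zeroing them,
    so the identity path (the '1 +' term) is always preserved."""
    mask = sigmoid(mask_logits)      # soft mask in (0, 1)
    return (1.0 + mask) * features   # residual re-weighting

# Toy example: a 4x4 feature map with zero mask logits,
# so every mask value is sigmoid(0) = 0.5.
feats = np.ones((4, 4))
logits = np.zeros((4, 4))
out = residual_attention(feats, logits)
print(out[0, 0])  # 1.5
```

In a full model this operation would sit between convolutional stages of a backbone such as ResNet-50, with M(x) produced by a learned mask branch rather than fixed logits.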

    Cognitive emotions in e-learning processes and their potential relationship with students’ academic adjustment

    In times of growing emphasis on improving academic outcomes for young people, their academic lives are increasingly central to their understanding of their own wellbeing. How they experience and perceive their academic successes or failures can influence their perceived self-efficacy and eventual academic achievement. To this end, ‘cognitive emotions’, elicited when acquiring or developing new skills and knowledge, can play a crucial role, as they indicate the state or “flow” of a student’s emotions when facing challenging tasks. Within innovative teaching models, measurement of the affective components of learning has mainly been based on self-reports and scales, which have neglected the real-time detection of emotions through, for example, recording or measuring facial expressions. The aim of the present study is to test the reliability of an ad hoc software tool trained to detect and classify cognitive emotions from facial expressions across two different environments, namely a video-lecture and a chat with the teacher, and to explore cognitive emotions in relation to academic e-self-efficacy and academic adjustment. To pursue these goals, we used video-recordings of ten psychology students from an online university engaging in online learning tasks, and employed software to automatically detect eleven cognitive emotions. Preliminary results support and extend prior studies, illustrating how exploring cognitive emotions in real time can inform the development and success of academic e-learning interventions aimed at monitoring and promoting students’ wellbeing.

    Visual analysis of faces with application in biometrics, forensics and health informatics
