204 research outputs found

    iCanLearn: A Mobile Application For Creating Flashcards And Social Stories™ For Children With Autism

    The number of children being diagnosed with Autism Spectrum Disorder (ASD) is on the rise, presenting new challenges for their parents and teachers to overcome. At the same time, mobile computing has been seeping its way into every aspect of our lives in the form of smartphones and tablet computers. It seems only natural to harness the unique medium these devices provide and use it in treatment and intervention for children with autism. This thesis discusses and evaluates iCanLearn, an iOS flashcard app with enough versatility to construct Social Stories™. iCanLearn provides an engaging, individualized learning experience to children with autism on a single device, but the most powerful way to use iCanLearn is by connecting two or more devices together in a teacher-learner relationship. The evaluation results are presented at the end of the thesis.
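    The abstract gives no implementation details; purely as an illustrative sketch (all type and field names below are hypothetical, not taken from iCanLearn), a flashcard deck flexible enough to double as a Social Story-style sequence could be modeled like this:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Card:
    """A single flashcard: an image plus an optional caption and audio prompt."""
    image_path: str
    caption: Optional[str] = None
    audio_path: Optional[str] = None


@dataclass
class Deck:
    """An ordered collection of cards; a fixed order turns it into a story-like sequence."""
    title: str
    cards: List[Card] = field(default_factory=list)
    ordered: bool = False  # True when the deck should be read in sequence, story-style

    def add_card(self, card: Card) -> None:
        self.cards.append(card)


# Example: a three-card sequence presented in a fixed order.
morning_routine = Deck(title="Getting ready for school", ordered=True)
morning_routine.add_card(Card("wake_up.png", caption="I wake up when my alarm rings."))
morning_routine.add_card(Card("brush_teeth.png", caption="I brush my teeth."))
morning_routine.add_card(Card("backpack.png", caption="I pack my backpack."))
```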

    Behavioral patterns in robotic collaborative assembly: comparing neurotypical and Autism Spectrum Disorder participants

    Introduction: In Industry 4.0, collaborative tasks often involve operators working with collaborative robots (cobots) in shared workspaces. Many aspects of the operator's well-being within this environment still need in-depth research. Moreover, these aspects are expected to differ between neurotypical (NT) and Autism Spectrum Disorder (ASD) operators. Methods: This study examines behavioral patterns in 16 participants (eight neurotypical, eight with high-functioning ASD) during an assembly task in an industry-like, lab-based robotic collaborative cell, enabling the detection of potential risks to their well-being during industrial human-robot collaboration. Each participant worked on the task for five consecutive days, 3.5 h per day. During these sessions, six video clips of 10 min each were recorded for each participant. The videos were used to extract quantitative behavioral data using the NOVA annotation tool and were analyzed qualitatively using an ad-hoc observational grid. In addition, during the work sessions, the researchers took unstructured notes of the observed behaviors, which were also analyzed qualitatively. Results: The two groups differ mainly regarding behavior (e.g., prioritizing the robot partner, gaze patterns, facial expressions, multi-tasking, and personal space), adaptation to the task over time, and the resulting overall performance. Discussion: This result confirms that NT and ASD participants in a collaborative shared workspace have different needs and that the working experience should be tailored depending on the end-user's characteristics. The findings of this study represent a starting point for further efforts to promote well-being in the workplace. To the best of our knowledge, this is the first work comparing NT and ASD participants in a collaborative industrial scenario.
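    The quantitative analysis described here starts from interval annotations exported from the NOVA tool. As a hedged sketch only (the column names, label set, and numbers below are assumptions, not the authors' actual coding scheme), per-participant behavior frequency and total duration could be summarized from such intervals like this:

```python
import pandas as pd

# Hypothetical export: one row per annotated interval, with start/end times in seconds
# and a behavior label (e.g., "gaze_robot", "multi_tasking").
annotations = pd.DataFrame({
    "participant": ["P01", "P01", "P01", "P02"],
    "label": ["gaze_robot", "multi_tasking", "gaze_robot", "gaze_robot"],
    "start_s": [12.0, 40.5, 95.0, 8.0],
    "end_s": [18.5, 52.0, 101.0, 15.0],
})

# Duration of each annotated interval.
annotations["duration_s"] = annotations["end_s"] - annotations["start_s"]

# Per participant and behavior: how often it occurred and for how long in total.
summary = (annotations
           .groupby(["participant", "label"])["duration_s"]
           .agg(count="size", total_duration_s="sum")
           .reset_index())
print(summary)
```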

    Art Therapy Interventions that Facilitate Non-Verbal Expressions and how Art Therapy Can Improve Communication for Children with Autism

    Autism prevalence in America has increased rapidly over the years, and numerous research studies have been conducted to measure the benefits of art therapy practice within this community. Advances in the mental health field have also been made to help offer improved treatment. Art therapy uses the process of engaging in artmaking with various mediums to make products that can be used in both assessment and treatment. The art can serve as a reflection of a person's development, abilities, and personality. In this paper, the literature review covers autism, communication, available treatments, and art therapy. The research methodology employed is qualitative research in the form of a single case study. The case study illustrates art therapy interventions used with a child to help increase social skills and attention span in classroom activities. This paper discusses how art therapy interventions help facilitate non-verbal expression in children with autism and how art therapy improves communication among this population.

    Bridging the gap between emotion and joint action

    Our daily human life is filled with a myriad of joint action moments, be it children playing, adults working together (e.g., team sports), or strangers navigating through a crowd. Joint action brings individuals (and the embodiment of their emotions) together, in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion, and, conversely, joint action research has not yet found a way to include emotion as one of the key parameters to model socio-motor interaction. In this review, we first identify the gap and then stockpile evidence showing strong entanglement between emotion and acting together from various branches of science. We propose an integrative approach to bridge the gap, highlight five research avenues to do so in behavioral neuroscience and digital sciences, and address some of the key challenges in the area faced by modern societies.

    Spontaneous Facial Behavior Computing in Human Machine Interaction with Applications in Autism Treatment

    Digital devices and computing machines such as computers, hand-held devices, and robots are becoming an important part of our daily life. To have affect-aware intelligent Human-Machine Interaction (HMI) systems, scientists and engineers have aimed to design interfaces which can emulate face-to-face communication. Such HMI systems are capable of detecting and responding to users' emotions and affective states. One of the main challenges in producing such an intelligent system is to design a machine that can automatically compute spontaneous behaviors of humans in real-life settings. Since humans' facial behaviors contain important non-verbal cues, this dissertation studies facial actions and behaviors in HMI systems. The two main objectives of this dissertation are: (1) capturing, annotating, and computing spontaneous facial expressions in a Human-Computer Interaction (HCI) system and releasing a database that allows researchers to study the dynamics of facial muscle movements in both posed and spontaneous data; and (2) developing and deploying a robot-based intervention protocol for autism therapeutic applications and modeling facial behaviors of children with high-functioning autism in a real-world Human-Robot Interaction (HRI) system. Because of the lack of data for analyzing the dynamics of spontaneous facial expressions, my colleagues and I introduced and released a novel database called Denver Intensity of Spontaneous Facial Actions (DISFA). DISFA describes facial expressions using the Facial Action Coding System (FACS), a gold-standard technique which annotates facial muscle movements in terms of a set of defined Action Units (AUs). This dissertation also introduces an automated system for recognizing DISFA's facial expressions and the dynamics of AUs in a single image or sequence of facial images. Results illustrate that our automated system is capable of computing AU dynamics with high accuracy (overall reliability ICC = 0.77). In addition, this dissertation investigates and computes the dynamics and temporal patterns of both spontaneous and posed facial actions, which can be used to automatically infer the meaning of facial expressions. Another objective of this dissertation is to analyze and compute facial behaviors (i.e., eye gaze and head orientation) of individuals in a real-world HRI system. Because children with Autism Spectrum Disorder (ASD) show interest in technology, we designed and conducted a set of robot-based games to study and foster the socio-behavioral responses of children diagnosed with high-functioning ASD. Computing the gaze direction and head orientation patterns illustrates how individuals with ASD regulate their facial behaviors differently (compared to typically developing children) when interacting with a robot. In addition, studying the behavioral responses of participants during different phases of this study (i.e., baseline, intervention, and follow-up) reveals that, overall, a robot-based therapy setting can be a viable approach for helping individuals with autism.
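    The reported reliability (ICC = 0.77) measures agreement between automatic and manual AU intensity codes. As a minimal sketch of how a consistency ICC of this kind, here Shrout-Fleiss ICC(3,1) for two raters, can be computed from a frames-by-raters matrix (the intensities below are invented, not DISFA data):

```python
import numpy as np


def icc_3_1(ratings: np.ndarray) -> float:
    """Two-way mixed, single-measure, consistency ICC(3,1).

    `ratings` has shape (n_targets, n_raters), e.g. frames x {manual, automatic}."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)

    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between targets (frames)
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)


# Invented AU intensities (0-5 scale) for a handful of frames.
manual = np.array([0, 1, 3, 4, 2, 0, 5])
automatic = np.array([0, 1, 2, 4, 3, 1, 5])
print(f"ICC(3,1) = {icc_3_1(np.column_stack([manual, automatic])):.2f}")
```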

    Applications of Robotics for Autism Spectrum Disorder: a Scoping Review

    Robotic therapies are receiving growing interest in the autism field, especially for improving children's social skills and enhancing traditional human interventions. In this work, we conduct a scoping review of the literature in robotics for autism, providing the largest review of this field from the last five years. Our work underlines the need to better characterize participants and to increase the sample size. It is also important to develop homogeneous training protocols to analyse and compare the results. Nevertheless, 7 out of the 10 randomized controlled trials reported a significant impact of robotic therapy. Overall, robot autonomy, adaptability, and personalization, as well as more standardized outcome measures, were identified as the most critical issues to address in future research.

    Fully robotic social environment for teaching and practicing affective interaction: Case of teaching emotion recognition skills to children with autism spectrum disorder, a pilot study

    The 21st century has brought a considerable decrease in social interactions, due to newly emerging lifestyles around the world, which has become even more noticeable with the COVID-19 pandemic. At the same time, children with autism spectrum disorder face further complications in their social interactions with other humans. In this paper, a fully Robotic Social Environment (RSE), designed to simulate the social environment needed by children, especially those with autism, is described. An RSE can be used to simulate many social situations, such as affective interpersonal interactions, in which observational learning can take place. In order to investigate the effectiveness of the proposed RSE, it was tested on a group of children with autism who had difficulties in emotion recognition, which in turn can influence social interaction. An A-B-A single case study was designed to show how the RSE can help children with autism recognize four basic facial expressions, i.e., happiness, sadness, anger, and fear, through observing the social interactions of two robots speaking about these facial expressions. The results showed that the emotion recognition skills of the participating children improved. Furthermore, the results showed that the children could maintain and generalize their emotion recognition skills after the intervention period. In conclusion, the study shows that the proposed RSE, along with other rehabilitation methods, can be effective in improving the emotion recognition skills of children with autism and preparing them to enter human social environments.
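    The A-B-A design compares emotion-recognition performance across baseline, intervention, and follow-up phases. Purely as an illustrative sketch with invented trial records (the abstract does not describe the actual scoring procedure), per-phase accuracy could be tabulated as follows; maintenance and generalization would be suggested when follow-up accuracy stays near the intervention level:

```python
from collections import defaultdict

# Invented trial records: (phase, target_emotion, child_response).
trials = [
    ("baseline", "happiness", "happiness"), ("baseline", "anger", "sadness"),
    ("intervention", "sadness", "sadness"), ("intervention", "fear", "fear"),
    ("follow_up", "anger", "anger"), ("follow_up", "fear", "fear"),
]

correct = defaultdict(int)
total = defaultdict(int)
for phase, target, response in trials:
    total[phase] += 1
    correct[phase] += int(target == response)

# Accuracy per phase of the A-B-A design.
for phase in ("baseline", "intervention", "follow_up"):
    print(f"{phase:>12}: {correct[phase] / total[phase]:.0%} correct over {total[phase]} trials")
```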

    A Study on Behavior Learning Support Based on Understanding the Interaction Characteristics of Human-Machine Systems

    This doctoral dissertation was initially released in summary form only, owing to unavoidable circumstances that made full-text publication unsuitable; as those circumstances have been resolved, the full text was made public on April 20, 2020 (Reiwa 2). University of Tsukuba