204 research outputs found
iCanLearn: A Mobile Application For Creating Flashcards And Social Stories™ For Children With Autism
The number of children being diagnosed with Autism Spectrum Disorder (ASD) is on the rise, presenting new challenges for their parents and teachers to overcome. At the same time, mobile computing has been seeping into every aspect of our lives in the form of smartphones and tablet computers. It seems only natural to harness the unique medium these devices provide and use it in treatment and intervention for children with autism.
This thesis discusses and evaluates iCanLearn, an iOS flashcard app with enough versatility to construct Social Stories™. iCanLearn provides an engaging, individualized learning experience to children with autism on a single device, but the most powerful way to use iCanLearn is by connecting two or more devices together in a teacher-learner relationship. The evaluation results are presented at the end of the thesis.
Behavioral patterns in robotic collaborative assembly: comparing neurotypical and Autism Spectrum Disorder participants
Introduction: In Industry 4.0, collaborative tasks often involve operators working with collaborative robots (cobots) in shared workspaces. Many aspects of the operator's well-being within this environment still need in-depth research. Moreover, these aspects are expected to differ between neurotypical (NT) and Autism Spectrum Disorder (ASD) operators.
Methods: This study examines behavioral patterns in 16 participants (eight neurotypical, eight with high-functioning ASD) during an assembly task in an industry-like lab-based robotic collaborative cell, enabling the detection of potential risks to their well-being during industrial human-robot collaboration. Each participant worked on the task for five consecutive days, 3.5 h per day. During these sessions, six video clips of 10 min each were recorded for each participant. The videos were used to extract quantitative behavioral data using the NOVA annotation tool and were analyzed qualitatively using an ad-hoc observational grid. In addition, during the work sessions the researchers took unstructured notes of the observed behaviors, which were analyzed qualitatively.
Results: The two groups differ mainly regarding behavior (e.g., prioritizing the robot partner, gaze patterns, facial expressions, multi-tasking, and personal space), adaptation to the task over time, and the resulting overall performance.
Discussion: These results confirm that NT and ASD participants in a collaborative shared workspace have different needs and that the working experience should be tailored to the end-user's characteristics. The findings of this study represent a starting point for further efforts to promote well-being in the workplace. To the best of our knowledge, this is the first work comparing NT and ASD participants in a collaborative industrial scenario.
Art Therapy Interventions that Facilitate Non-Verbal Expressions and how Art Therapy Can Improve Communication for Children with Autism
Autism prevalence in America has increased rapidly over the years, and numerous research studies have been conducted to measure the benefits of art therapy practice within this community. Advances in the mental health field have also been made to help offer improved treatment. Art therapy uses the process of engaging in artmaking with various mediums to make products that can be used in both assessment and treatment. The art can serve as a reflection of a person’s development, abilities, and personality. In this paper, the literature review covers autism, communication, available treatments, and art therapy. The research methodology employed is qualitative research in the form of a single case study. The case study illustrates art therapy interventions used with a child to help increase social skills and attention span in classroom activities. This paper discusses how art therapy interventions help facilitate non-verbal expression in children with autism and how art therapy improves communication among this population.
Bridging the gap between emotion and joint action
Our daily human life is filled with a myriad of joint action moments, be it children playing, adults working together (e.g., team sports), or strangers navigating through a crowd. Joint action brings individuals (and the embodiment of their emotions) together, in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion, and conversely, joint action research has not yet found a way to include emotion as one of the key parameters to model socio-motor interaction. In this review, we first identify the gap and then compile evidence showing strong entanglement between emotion and acting together from various branches of science. We propose an integrative approach to bridge the gap, highlight five research avenues to do so in behavioral neuroscience and digital sciences, and address some of the key challenges in the area faced by modern societies.
Spontaneous Facial Behavior Computing in Human Machine Interaction with Applications in Autism Treatment
Digital devices and computing machines such as computers, hand-held devices and robots are becoming an important part of our daily life. To build affect-aware intelligent Human-Machine Interaction (HMI) systems, scientists and engineers have aimed to design interfaces which can emulate face-to-face communication. Such HMI systems are capable of detecting and responding to users' emotions and affective states. One of the main challenges in producing such an intelligent system is to design a machine which can automatically compute spontaneous human behaviors in real-life settings. Since humans' facial behaviors contain important non-verbal cues, this dissertation studies facial actions and behaviors in HMI systems. The two main objectives of this dissertation are: 1- capturing, annotating and computing spontaneous facial expressions in a Human-Computer Interaction (HCI) system and releasing a database that allows researchers to study the dynamics of facial muscle movements in both posed and spontaneous data; 2- developing and deploying a robot-based intervention protocol for autism therapeutic applications and modeling facial behaviors of children with high-functioning autism in a real-world Human-Robot Interaction (HRI) system.
Because of the lack of data for analyzing the dynamics of spontaneous facial expressions, my colleagues and I introduced and released a novel database called Denver Intensity of Spontaneous Facial Actions (DISFA). DISFA describes facial expressions using the Facial Action Coding System (FACS), a gold-standard technique which annotates facial muscle movements in terms of a set of defined Action Units (AUs). This dissertation also introduces an automated system for recognizing DISFA's facial expressions and the dynamics of AUs in a single image or sequence of facial images. Results illustrate that our automated system is capable of computing AU dynamics with high accuracy (overall reliability ICC = 0.77). In addition, this dissertation investigates and computes the dynamics and temporal patterns of both spontaneous and posed facial actions, which can be used to automatically infer the meaning of facial expressions.
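The reliability figure quoted above (ICC = 0.77) is an intraclass correlation between automated and manual AU intensity codes. The abstract does not state which ICC variant was used, so the sketch below assumes the common two-way mixed, single-rater, consistency form, ICC(3,1); it is an illustration of the metric, not the DISFA authors' actual pipeline.

```python
def icc_3_1(ratings):
    """ICC(3,1): two-way mixed, single-rater, consistency.

    ratings: list of rows, one row per target (e.g., a video frame),
    one column per rater (e.g., a FACS coder or an automated system).
    Computed from the two-way ANOVA mean squares without replication.
    """
    n = len(ratings)       # number of targets
    k = len(ratings[0])    # number of raters
    grand = sum(x for row in ratings for x in row) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between targets
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

Because ICC(3,1) measures consistency, a rater with a constant additive offset still scores 1.0; only disagreement in the *pattern* of intensities lowers the value.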
Another objective of this dissertation is to analyze and compute facial behaviors (i.e., eye gaze and head orientation) of individuals in a real-world HRI system. Because children with Autism Spectrum Disorder (ASD) show interest toward technology, we designed and conducted a set of robot-based games to study and foster the socio-behavioral responses of children diagnosed with high-functioning ASD. Computing gaze direction and head orientation patterns illustrates how individuals with ASD regulate their facial behaviors differently (compared to typically developing children) when interacting with a robot. In addition, studying the behavioral responses of participants during the different phases of this study (i.e., baseline, intervention and follow-up) reveals that, overall, a robot-based therapy setting can be a viable approach for helping individuals with autism.
Applications of Robotics for Autism Spectrum Disorder: a Scoping Review
Robotic therapies are receiving growing interest in the autism field, especially for improving children's social skills and enhancing traditional human interventions. In this work, we conduct a scoping review of the literature on robotics for autism, providing the largest review of this field from the last five years. Our work underlines the need to better characterize participants and to increase sample sizes. It is also important to develop homogeneous training protocols to analyse and compare results. Nevertheless, 7 out of the 10 randomized controlled trials reported a significant impact of robotic therapy. Overall, robot autonomy, adaptability and personalization, as well as more standardized outcome measures, were identified as the most critical issues to address in future research.
Fully robotic social environment for teaching and practicing affective interaction: Case of teaching emotion recognition skills to children with autism spectrum disorder, a pilot study
The 21st century has brought a considerable decrease in social interaction, owing to newly emerging lifestyles around the world, a trend that became even more noticeable during the COVID-19 pandemic. Children with autism spectrum disorder, meanwhile, face additional complications in their social interactions with other humans. In this paper, a fully Robotic Social Environment (RSE), designed to simulate the needed social environment for children, especially those with autism, is described. An RSE can be used to simulate many social situations, such as affective interpersonal interactions, in which observational learning can take place. In order to investigate the effectiveness of the proposed RSE, it was tested on a group of children with autism who had difficulties in emotion recognition, which in turn can influence social interaction. An A-B-A single case study was designed to show how the RSE can help children with autism recognize four basic facial expressions, i.e., happiness, sadness, anger, and fear, through observing the social interactions of two robots speaking about these facial expressions. The results showed that the emotion recognition skills of the participating children improved. Furthermore, the results showed that the children could maintain and generalize their emotion recognition skills after the intervention period. In conclusion, the study shows that the proposed RSE, along with other rehabilitation methods, can be effective in improving the emotion recognition skills of children with autism and preparing them to enter human social environments.
Aspects of Joint Attention in Autism Spectrum Disorder: Links to Sensory Processing, Social Competence, Maternal Attention, and Contextual Factors
Background. Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by deficits in social interaction, communication, and restricted and repetitive behaviors (American Psychiatric Association, 2013). Given the heterogeneity of ASD, it is important to understand individual differences within the disorder that are related to cognitive and language development, and how such differences may be related to differences in caregiver behavior or aspects of the social environment. Joint attention is an important component of early social communication and is considered to be a “core deficit” of ASD (Kasari, Freeman, Paparella, Wong, Kwon, & Gulsrud, 2005). Individual differences in joint attention during infancy have been shown to relate to language and cognitive development (Mundy, Block, Delgado, Pomares, Van Hecke, & Parlade, 2007; Nichols, Martin, & Fox, 2005). Therefore, joint attention serves an essential role in the study of child behavior within ASD across development.
The present study consists of two manuscripts that explored how joint attention in children with ASD related to sensory responsiveness and social competence (Study 1), and how child joint attention related to mother attention and contextual factors (Study 2). Specifically, Study 1 investigated relations among children's sensory responses, dyadic orienting, joint attention, and their subsequent social competence with peers. Participants were 38 children (18 children with autism spectrum disorder (ASD) and 20 developmentally matched children with typical development) between the ages of 2.75 and 6.5 years. Observational coding was conducted to assess children's joint attention and dyadic orienting in a structured social communication task. Children's sensory responses and social competence were measured with parent report. Group differences were observed in children's joint attention, sensory responses, multisensory dyadic orienting, and social competence, with the ASD group showing significantly greater social impairment and sensory responses compared with their typical peers. Atypical sensory responses were negatively associated with individual differences on social competence subscales. Interaction effects were observed between diagnostic group and sensory responses with diagnostic group moderating the relation between sensory responses and both joint attention and social competence abilities.
Study 2 investigated relations between child joint attention and mother attention during three social contexts (competing demands, teaching, and free play) among 44 children with ASD between the ages of 2.5 and 5.6 years, and their mothers. Observational coding was conducted to assess children’s joint attention and mothers’ dyadic orienting. Children’s expressive and receptive language was measured by teacher report. The rate of children’s joint attention and mothers’ dyadic orienting differed depending on the context of their interaction. Children’s joint attention, expressive and receptive language, age, and ASD severity, and mothers’ dyadic orienting were related, and these relations differed by context. Child initiating joint attention (IJA) was also related to mother attention, and this relation was moderated by the child’s expressive and receptive language. A temporal contingency was revealed for the association between child IJA and mother attention, with a bi-directional pattern such that child IJA predicted subsequent mother attention, and mother attention predicted subsequent child IJA. When the sample was split by children’s language ability (i.e., minimally-verbal and verbal groups), there was a group by receptive language interaction, and a group by expressive language interaction, on the contingency between child IJA and subsequent mother attention.
Conclusion. The results from Study 1 and Study 2 suggest that individual differences in children with ASD, including their sensory responses and social competence, as well as mother attention and contextual factors, are related to children’s joint attention. When addressing theory and interventions for children with ASD, it is important to consider children’s language and sensory sensitivities, the demands of the interactive context, and factors related to mother attention and approach to her child.
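The "temporal contingency" reported in Study 2 is the kind of quantity a lag-sequential analysis yields: the probability that one partner's behavior follows the other's, compared against its base rate. A minimal sketch, assuming simplified event codes ('IJA' for child-initiated joint attention, 'MA' for mother attention) that are illustrative and not the study's actual coding scheme:

```python
def lag1_contingency(seq, antecedent, consequent):
    """Lag-1 sequential analysis on a coded event stream.

    Returns (P(consequent at t+1 | antecedent at t), base rate of consequent).
    A conditional probability well above the base rate suggests a
    temporal contingency between the two behaviors.
    """
    pairs = list(zip(seq, seq[1:]))  # consecutive (t, t+1) event pairs
    n_ant = sum(1 for a, _ in pairs if a == antecedent)
    n_both = sum(1 for a, b in pairs if a == antecedent and b == consequent)
    conditional = n_both / n_ant if n_ant else 0.0
    base = seq.count(consequent) / len(seq)
    return conditional, base
```

Running the two directions separately (child IJA → mother attention, then mother attention → child IJA) mirrors the bi-directional association the study describes.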
A Study on Behavior-Learning Support Based on Understanding the Interaction Characteristics of Human-Machine Systems
This doctoral dissertation was initially published in summary form only, owing to unavoidable circumstances that made full-text publication unsuitable; as those circumstances have been resolved, the full text was published on April 20, 2020 (Reiwa 2). University of Tsukuba 201