
    Machine Learning and Virtual Reality on Body Movements' Behaviors to Classify Children with Autism Spectrum Disorder

    [EN] Autism spectrum disorder (ASD) is mostly diagnosed according to behavioral symptoms in the sensory, social, and motor domains. During diagnosis, improper motor functioning is assessed through qualitative evaluation of stereotyped and repetitive behaviors, while quantitative methods that classify the frequency of body movements in children with ASD are less often addressed. Recent advances in neuroscience, technology, and data analysis techniques are improving quantitative and ecologically valid methods to measure specific functioning in children with ASD. On one side, cutting-edge technologies such as cameras, sensors, and virtual reality can accurately detect and classify behavioral biomarkers, such as body movements, in real-life simulations. On the other, machine-learning techniques are showing potential for identifying and classifying subgroups of patients. Starting from these premises, three real-simulated imitation tasks were implemented in a virtual reality system to investigate whether machine-learning methods applied to movement features and frequency could discriminate children with ASD from children with typical neurodevelopment (TD). In this experiment, 24 children with ASD and 25 TD children participated in a multimodal virtual reality experience, and changes in their body movements were tracked by a depth-sensor camera during the presentation of visual, auditory, and olfactory stimuli. The main results showed that children with ASD presented larger body movements than TD children, and that the combination of head, trunk, and feet yielded the maximum classification accuracy of 82.98%. Regarding stimuli, the visual condition showed the highest accuracy (89.36%), followed by the visual-auditory stimuli (74.47%) and the visual-auditory-olfactory stimuli (70.21%). Finally, the head showed the most consistent performance across stimuli, from 80.85% in the visual condition to 89.36% in the visual-auditory-olfactory condition. The findings show the feasibility of applying machine learning and virtual reality to identify body-movement biomarkers that could contribute to improving ASD diagnosis.
This work was supported by the Spanish Ministry of Economy, Industry, and Competitiveness funded project "Immersive virtual environment for the evaluation and training of children with autism spectrum disorder: T Room" (IDI-20170912) and by the Generalitat Valenciana funded project REBRAND (PROMETEO/2019/105). Furthermore, this work was co-funded by the European Union through the Operational Program of the European Regional Development Fund (ERDF) of the Valencian Community 2014-2020 (IDIFEDER/2018/029).
Alcañiz Raya, ML.; Marín-Morales, J.; Minissi, ME.; Teruel Garcia, G.; Abad, L.; Chicchi-Giglioli, IA. (2020). Machine Learning and Virtual Reality on Body Movements' Behaviors to Classify Children with Autism Spectrum Disorder. Journal of Clinical Medicine, 9(5):1-20. https://doi.org/10.3390/jcm9051260
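To make the classification setup concrete, the following is a minimal sketch of the kind of pipeline the abstract describes: per-participant movement features fed to an SVM and evaluated with cross-validation. The feature layout, synthetic data, and scikit-learn pipeline are illustrative assumptions, not the study's actual implementation.

```python
# Hedged sketch: classifying ASD vs. typically developing (TD) children from
# per-participant movement features with an SVM and leave-one-out evaluation.
# Feature counts and the synthetic data are assumptions for illustration only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_asd, n_td, n_features = 24, 25, 12      # e.g. movement magnitude per body part
X = np.vstack([rng.normal(1.2, 0.4, (n_asd, n_features)),   # larger movements (ASD)
               rng.normal(1.0, 0.4, (n_td, n_features))])   # smaller movements (TD)
y = np.array([1] * n_asd + [0] * n_td)    # 1 = ASD, 0 = TD

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2%}")
```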

    Upset or Collapse Detection System for ASD Children Using Smart Watch with Machine Learning Algorithm

    ASD is characterised by severe and violent behavioural issues, referred to as "meltdowns (upset) or tantrums (collapse)", that can include aggression, hyperactivity, intolerance, unpredictability and self-injury. This research work aims to develop and implement a non-invasive, real-time Upset or Collapse Detection System (UCDS) for people with ASD. With a suitable smart watch, non-invasive biological indicators such as Pulse Rate (PR), Skin Temperature (ST), and Galvanic Skin Response (GSR) can be captured automatically. To create the UCDS, the captured physiological signals are sent to a server, where deep learning models (a CNN, an LSTM, and a hybrid CNN-LSTM) are trained on them. Once trained, the models can recognise aberrant upset or collapse states from real-time physiological signals. The proposed UCDS was trained and tested with the CNN, LSTM, and CNN-LSTM models, and the hybrid CNN-LSTM outperformed the others, with an average training and testing accuracy of 96% and a low mean absolute error (MAE) of 0.10 for training and 0.04 for testing. Furthermore, the proposed UCDS is supported by 93% of the ASD caretakers surveyed.
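As a rough illustration of the hybrid model described above, the sketch below stacks 1D convolutions and an LSTM over windows of PR/ST/GSR samples. The window length, channel ordering, layer sizes, and synthetic data are assumptions, not the authors' architecture or dataset.

```python
# Hedged sketch of a hybrid CNN-LSTM classifier for windowed physiological
# signals (pulse rate, skin temperature, GSR). All sizes are illustrative.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

WINDOW = 128    # assumed samples per window
CHANNELS = 3    # PR, ST, GSR

model = tf.keras.Sequential([
    layers.Input(shape=(WINDOW, CHANNELS)),
    # CNN front end extracts short-term local features from each window
    layers.Conv1D(32, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # LSTM models the temporal order of the extracted feature sequence
    layers.LSTM(64),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # calm vs. upset/collapse
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.MeanAbsoluteError()])

# Synthetic stand-in data just to show the expected tensor shapes.
X = np.random.randn(256, WINDOW, CHANNELS).astype("float32")
y = np.random.randint(0, 2, size=(256, 1))
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```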

    Charlie: A New Robot Prototype for Improving Communication and Social Skills in Children with Autism and a New Single-point Infrared Sensor Technique for Detecting Breathing and Heart Rate Remotely

    This research delivers a new, interactive game-playing robot named CHARLIE and a novel technique for remotely detecting breathing and heart rate using a single-point thermal infrared (IR) sensor. The robot is equipped with a head and two arms, each with two degrees of freedom, and a camera. We trained a human-hand classifier and used it along with a standard face classifier to create two autonomous interactive games: single-player ("Imitate Me, Imitate You") and two-player ("Pass the Pose"). Further, we developed and implemented a suite of new interactive games in which the robot is teleoperated by remote control. Each of these features has been tested and validated through a field study including eight children diagnosed with autism and speech delays. Results from that study show that significant improvements in speech and social skills can be obtained when using CHARLIE with the methodology described herein. Moreover, gains in communication and social interaction are observed to generalize from child-to-robot interaction to co-present others through the scaffolding of communication skills with the systematic approach developed for the study. Additionally, we present a new IR system that continuously targets the sub-nasal region of the face and measures subtle temperature changes corresponding to breathing and cardiac pulse. This research makes four novel contributions: (1) a low-cost, field-tested robot for use in autism therapy, (2) a suite of interactive robot games, (3) a hand classifier created for performing hand detection during the interactive games, and (4) an IR sensor system which remotely collects temperatures and computes breathing and heart rate.
The interactive robot CHARLIE is physically designed to be aesthetically appealing to young children between three and six years of age. The hard, wood-and-metal robot body is covered with a bright green, fuzzy material and additional padding so that it appears toylike and soft. Additionally, several structural features were included to ensure safety during interactive play and to enhance the robustness of the robot. Because children with autism spectrum disorder (ASD) often enjoy exploring new or interesting objects with their hands, the robot must be able to withstand a moderate amount of physical manipulation without causing injury to the child or damaging the robot or its components. CHARLIE plays five distinct interactive games that are designed to be entertaining to young children, appeal to children of varying developmental ability, and promote increased speech and social skills through imitation and turn-taking.
Remote breathing and heart rate detection: Stress is a compounding factor in autism therapy which can inhibit progress toward specific therapeutic goals. The ability to non-invasively detect physical indicators of increasing stress, especially when they can be correlated to specific activities and measured in terms of length and frequency, can relay important metrics about the antecedents that cause stress for a particular child and can be used to help automate the evaluation of a child's progress between sessions. Further, collecting and measuring critical physiological indicators such as breathing and heart rate can enable robots to adjust their behavior based on the perceived emotional, psychological or physical state of their user. The utility and acceptance of robots can be further increased when they are able to learn typical physiological patterns and use these patterns as a baseline for identifying anomalies or possible warning signs of various problems in their human users. We present a new technique for remotely collecting and analyzing breathing and heart rates in real time using an autonomous, low-cost infrared (IR) sensor system. This is accomplished by continuously targeting a high-precision IR sensor, tracking changes in the sub-nasal skin surface temperature, and employing a sinusoidal curve-fitting function, a Fast Fourier Transform (FFT), and a Discrete Wavelet Transform (DWT) to extract the breathing and heart rate from the recorded temperatures.
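The spectral part of that extraction step can be sketched as follows: given a sub-nasal temperature trace, the strongest peak in the respiratory band gives the breathing rate and the strongest peak in the cardiac band gives the heart rate. The sampling rate, band limits, and synthetic trace are assumptions; the original system additionally uses sinusoidal curve fitting and a DWT, which are omitted here.

```python
# Hedged sketch: estimating breathing and heart rate from a sub-nasal
# temperature trace with an FFT, under assumed sampling rate and band limits.
import numpy as np

def dominant_rate_bpm(signal, fs, lo_hz, hi_hz):
    """Return the strongest spectral peak in [lo_hz, hi_hz], in cycles/min."""
    sig = signal - np.mean(signal)                 # remove DC offset
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0

fs = 20.0                                          # assumed IR sensor sample rate (Hz)
t = np.arange(0, 60, 1 / fs)
# Synthetic trace: slow breathing oscillation + tiny cardiac ripple + noise
temp = 34.0 + 0.3 * np.sin(2 * np.pi * 0.25 * t) \
            + 0.02 * np.sin(2 * np.pi * 1.2 * t) \
            + 0.01 * np.random.randn(len(t))

print("breathing ~", round(dominant_rate_bpm(temp, fs, 0.1, 0.7), 1), "breaths/min")
print("heart rate ~", round(dominant_rate_bpm(temp, fs, 0.8, 3.0), 1), "beats/min")
```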

    Olly: A tangible for togetherness

    This research explores how tangible interactive technology might offer opportunities for socialization and sensory regulation. We present a study carried out in an educational setting during leisure activities with a small group of children with autism who like music. We introduce ΌλοÎč (pronounced Olly), a sonic textile Tangible User Interface (TUI) designed around observations of five minimally verbal children with autism aged between 5 and 10 years. The TUI was tested for an average of 24 minutes once per week, over a period of five weeks, in a specialized school based in North-East London, UK. We propose a methodological approach that embraces diversity and promotes designs that support repetitive movements and self-regulation, providing the children with a favorable environment and tools to socialize with peers. The findings show positive outcomes with regard to spontaneous social interactions between peers, particularly when children interacted with or around Olly. These were observed in the form of eye contact, turn-taking, sharing (of the space, the object, and the experience), and more complex social play dynamics such as associative and cooperative play. We illustrate how the TUI was a positive stimulus for social behaviors and discuss design implications for novel technologies that aim to foster shared experiences between children with autism.

    Analysis and enhancement of interpersonal coordination using inertial measurement unit solutions

    Today's mobile communication technologies have increased verbal and text-based communication with other humans, social robots and intelligent virtual assistants. On the other hand, these technologies reduce face-to-face communication. This social issue is critical because decreasing direct interaction may cause difficulty in reading social and environmental cues, thereby impeding the development of overall social skills. Recently, scientists have studied the importance of nonverbal interpersonal activities for social skills by measuring human behavioral and neurophysiological patterns. These interdisciplinary approaches are in line with the European Union research project "Socializing sensorimotor contingencies" (socSMCs), which aims to improve the capability of social robots and to deal appropriately with autism spectrum disorder (ASD). Modelling and benchmarking healthy humans' social behavior is therefore fundamental to establishing a foundation for research on the emergence and enhancement of interpersonal coordination. In this research project, two different experimental settings were categorized depending on the interactants' distance: distal and proximal settings, in which the structure of the engaged cognitive systems changes and the level of socSMCs differs. As part of the project, this dissertation adopted this spatial framework. Additionally, single-sensor solutions were developed to reduce the cost and effort of measuring human behaviors, recognizing social behaviors, and enhancing interpersonal coordination. First, algorithms using a head-worn inertial measurement unit (H-IMU) were developed to measure human kinematics as a baseline for social behaviors. The results confirmed that the H-IMU can measure individual gait parameters by analyzing only head kinematics. Secondly, as a distal sensorimotor contingency, interpersonal relationship was considered with respect to a dynamic structure of three interacting components: positivity, mutual attentiveness, and coordination. The H-IMUs monitored social behavioral events relying on the kinematics of head orientation and oscillation during walking and talking, which can contribute to estimating the level of rapport. Finally, in a new collaborative task with the proposed IMU-based tablet application, the results verified the effects of different auditory-motor feedback conditions on the enhancement of interpersonal coordination in a proximal setting. This dissertation has an intensely interdisciplinary character: technological development in the areas of sensor and software engineering was required to address predefined behavioral scientific questions in two different settings (distal and proximal). The given frame served as a reference in the development of the methods and settings in this dissertation. The proposed IMU-based solutions are also promising for various future applications due to the widespread availability of wearable devices with IMUs.
European Commission/HORIZON2020-FETPROACT-2014/641321/E
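As an illustration of how gait parameters might be derived from head kinematics alone, the sketch below estimates walking cadence by band-pass filtering the vertical head acceleration from an H-IMU and counting peaks. The sampling rate, filter band, thresholds, and synthetic signal are assumptions, not the dissertation's algorithms.

```python
# Hedged sketch: estimating cadence from vertical head acceleration recorded
# by a head-worn IMU, assuming steps appear as peaks in the vertical axis.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0                                    # assumed IMU sample rate (Hz)
t = np.arange(0, 30, 1 / fs)
# Synthetic vertical acceleration: ~1.8 steps/s oscillation plus noise
acc_z = 9.81 + 1.5 * np.sin(2 * np.pi * 1.8 * t) + 0.3 * np.random.randn(len(t))

# Band-pass around typical walking frequencies to isolate step impacts
b, a = butter(2, [0.5, 3.0], btype="band", fs=fs)
filtered = filtfilt(b, a, acc_z - np.mean(acc_z))

# Each peak is treated as one step; enforce a minimum step interval of 0.3 s
peaks, _ = find_peaks(filtered, height=0.5, distance=int(0.3 * fs))
cadence = len(peaks) / (t[-1] - t[0]) * 60.0
print(f"estimated cadence: {cadence:.0f} steps/min")
```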

    Feasibility of a smartphone application to identify young children at risk for Autism Spectrum Disorder in a low-income community setting in South Africa

    Introduction and aims: More than 90% of children with Autism Spectrum Disorder (ASD) live in low- and middle-income countries (LMIC), where there is a great need for culturally appropriate, scalable and effective early identification and intervention tools. Smartphone technology and applications ('apps') may potentially play an important role in this regard. The Autism&Beyond iPhone App was designed as a potential screening tool for ASD risk in children aged 12-72 months. Here we investigated the technical feasibility and cultural acceptability of a smartphone app to determine risk for ASD in children aged 12-72 months in a naturalistic, low-income South African community setting.
Methodology: 37 typically developing African children and their parents/carers were recruited from community centres in Khayelitsha Township, Cape Town, South Africa. We implemented a mixed-methods design, collecting both quantitative and qualitative data from participants in two stages. In stage 1, we collected quantitative data. With appropriate ethics approval and consent, parents completed a short technology questionnaire about their familiarity with and access to smartphones, the internet, and apps, followed by electronic iPhone-based demographic and ASD-related questionnaires. Next, children were shown three short videos of 30 seconds each and a mirror stimulus on a study smartphone. The smartphone's front-facing ("selfie") camera recorded video of the child's facial expressions and head movement. Automated computer algorithms quantified positive emotions and time attending to the stimuli. We validated the automatic coding by (a) comparing the computer-generated analysis to human coding of facial expressions in a random sample (N=9), and (b) comparing the automated analysis of the South African data (N=33) with a matched American sample (N=33). In stage 2, a subset of families were invited to participate in focus group discussions to provide qualitative data on the accessibility, acceptability, and cultural appropriateness of the app in their local community.
Results: Most parents (64%) owned a smartphone, all (100%) of which were Android-based, and many (45%) used apps. Human-automated coding showed excellent correlation for positive emotion (ICC = 0.95, 95% CI 0.81-0.99), and no statistically significant differences were observed between the South African and American samples in % time attending to the video stimuli. South African children, however, smiled less at the Toys&Rhymes video (SA mean (SD) = 14% (24); USA mean (SD) = 31% (34); p=0.05) and the Bunny video (SA mean (SD) = 12% (17); USA mean (SD) = 30% (27); p=0.006). Analysis of the focus group data indicated that parents/carers found the App relatively easy to use and would recommend it to others in their community, provided the App and data transfer were free.
Conclusion: The results from this pilot study suggest that the App is technically accurate, accessible and culturally acceptable to families from a low-resource environment in South Africa. Given the differences in positive emotional response between the groups, careful consideration should be given to identifying suitable stimuli if % time smiling is to be used as a global marker for autism risk across cultures and environments.
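The human-versus-automated validation reported above relies on the intraclass correlation coefficient; a small sketch of a two-way random-effects ICC(2,1) computation is shown below, using made-up ratings rather than the study's data.

```python
# Hedged sketch: ICC(2,1) for agreement between a human coder and an automated
# algorithm rating % time smiling. The ratings below are hypothetical examples.
import numpy as np

def icc2_1(ratings):
    """ratings: (n_subjects, k_raters) array; returns Shrout-Fleiss ICC(2,1)."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)    # per-subject means
    col_means = ratings.mean(axis=0)    # per-rater means
    # Mean squares from a two-way ANOVA without replication
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    ss_err = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Columns: human coder, automated algorithm (hypothetical % time smiling)
scores = np.array([[12, 14], [30, 28], [5, 6], [22, 25], [18, 17],
                   [40, 38], [9, 11], [27, 26], [15, 16]], dtype=float)
print(f"ICC(2,1) = {icc2_1(scores):.2f}")
```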
    • 
