
    Master of Science

    Stroke is a leading cause of death and adult disability in the United States. Survivors lose abilities that were controlled by the affected area of the brain. Rehabilitation therapy is administered to help survivors regain control of lost functional abilities. The number of sessions that stroke survivors attend is limited by the availability of a clinic close to their residence and by the amount of time friends and family can devote to helping them commute, as most are incapable of driving. Home-based therapy using virtual reality and computer games has the potential to solve these issues, increasing the amount of independent therapy performed by patients. This thesis presents the design, development and testing of a low-cost system, potentially suitable for use in the home environment, designed for rehabilitation of the impaired upper limb of stroke survivors. A Microsoft Kinect was used to track the position of the patient's hand, and the game requires the user to move the arm over increasingly large areas by sliding it on a support. Studies were performed with six stroke survivors and five control subjects to determine the feasibility of the system. Patients played the game for 6 to 10 days, and their game scores, range of motion and Fugl-Meyer scores were recorded for analysis. Statistically significant (p < 0.05) differences were found between the game scores of the first and last day of the study. Furthermore, acceptability surveys revealed that patients enjoyed playing the game, found this kind of therapy more enjoyable than conventional therapy, and were willing to use the system in the home environment. Future work on the system will focus on larger studies, improving the comfort of patients while playing the game, and developing new games that address cognitive issues and integrate art and therapy.
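The core game mechanic described above, rewarding arm movement over increasingly large areas tracked by a depth camera, can be sketched as a simple target-reaching check on hand coordinates. This is a hypothetical illustration; the function names, thresholds, and level progression are assumptions, not the thesis's actual implementation:

```python
import math

def target_reached(hand_xy, target_xy, radius=0.05):
    """Return True if the tracked hand is within `radius` metres of the target."""
    dx = hand_xy[0] - target_xy[0]
    dy = hand_xy[1] - target_xy[1]
    return math.hypot(dx, dy) <= radius

def next_target_radius(level, base=0.10, growth=0.05):
    """Targets are placed on progressively larger arcs as the level increases."""
    return base + growth * level

# Example: hand tracked at (0.32, 0.18) m, target at (0.30, 0.20) m.
print(target_reached((0.32, 0.18), (0.30, 0.20)))  # True: within 5 cm
print(round(next_target_radius(3), 2))             # 0.25
```

A game score over a session could then be the count of reached targets, which matches the kind of per-day score the study records for analysis.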

    Vision-Based Observation Models for Lower Limb 3D Tracking with a Moving Platform

    Tracking and understanding human gait is an important step towards improving elderly mobility and safety. This thesis presents a vision-based tracking system that estimates the 3D pose of a wheeled walker user's lower limbs using cameras mounted on the moving walker. The tracker estimates 3D poses from images of the lower limbs in the coronal plane in a dynamic, uncontrolled environment. It employs a probabilistic approach based on particle filtering with three different camera setups: a monocular RGB camera, binocular RGB cameras, and a depth camera. For the RGB cameras, observation likelihoods are designed to compare the colors and gradients of each frame with initial templates that are manually extracted. Two strategies are also investigated for handling appearance changes of the tracking target: increasing the number of templates and using different representations of colors. For the depth camera, two observation likelihoods are developed: the first works directly in 3D space, while the second works in the projected image space. Experiments are conducted to evaluate the performance of the tracking system with different users for all three camera setups. The trackers with the RGB cameras produce results with higher error than the depth camera, and the strategies for handling appearance change improve tracking accuracy in general. The tracker with the depth sensor, on the other hand, successfully tracks the 3D poses of users over entire video sequences and is robust against unfavorable conditions such as partial occlusion, missing observations, and a deformable tracking target.
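The particle-filtering approach this tracker builds on follows a standard predict-weight-resample loop: propagate pose hypotheses under a motion model, score each against the current observation likelihood, and resample in proportion to the scores. A minimal sketch for a one-dimensional pose is below; the Gaussian motion and likelihood models are illustrative assumptions, not the thesis's actual observation models:

```python
import random
import math

def particle_filter_step(particles, observation, motion_std=0.05, obs_std=0.1):
    """One predict-weight-resample cycle of a basic particle filter."""
    # Predict: diffuse each particle under a random-walk motion model.
    predicted = [p + random.gauss(0.0, motion_std) for p in particles]
    # Weight: Gaussian observation likelihood centred on the measurement.
    weights = [math.exp(-((p - observation) ** 2) / (2 * obs_std ** 2))
               for p in predicted]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw particles in proportion to their weights.
    return random.choices(predicted, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(-1, 1) for _ in range(500)]
for obs in [0.2, 0.25, 0.3]:          # noisy pose measurements
    particles = particle_filter_step(particles, obs)
estimate = sum(particles) / len(particles)
print(round(estimate, 2))  # converges near the last observation (~0.3)
```

In the thesis's setting the particle state is a full 3D lower-limb pose and the weighting step is one of the camera-specific observation likelihoods (color/gradient templates for RGB, 3D or projected-space comparison for depth), but the loop structure is the same.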

    Augmented Reality

    Augmented Reality (AR) is a natural development from virtual reality (VR), which was developed several decades earlier, and it complements VR in many ways. Because the user can see real and virtual objects simultaneously, AR is far more intuitive, though it is not free of human-factors and other restrictions. AR also demands less time and effort in application development, because it does not require constructing the entire virtual scene and environment. In this book, several new and emerging application areas of AR are presented, divided into three sections. The first section contains applications in outdoor and mobile AR, such as construction, restoration, security and surveillance. The second section deals with AR in medicine, biology, and the human body. The third and final section contains a number of new and useful applications in daily living and learning.

    The feet in human--computer interaction: a survey of foot-based interaction

    Foot-operated computer interfaces have been studied since the inception of human--computer interaction. Thanks to the miniaturisation and decreasing cost of sensing technology, there is increasing interest in exploring this alternative input modality, but no comprehensive overview of its research landscape. In this survey, we review the literature on interfaces operated by the lower limbs. We investigate the characteristics of users and how they affect the design of such interfaces. Next, we describe and analyse foot-based research prototypes and commercial systems in terms of how they capture input and provide feedback. We then analyse the interactions between users and systems from the perspective of the actions performed in these interactions. Finally, we discuss our findings and use them to identify open questions and directions for future research.

    Virtual Reality Games for Motor Rehabilitation

    This paper presents a fuzzy logic based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is the key to any product's acceptance; computer applications and video games provide a unique opportunity to tailor the environment to each user to better suit their needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature, which suggests physiological measurements are needed; we show that it is possible to estimate user emotion with a software-only method.
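A fuzzy model of this kind maps crisp in-game signals through membership functions into fuzzy sets and defuzzifies them into an emotion estimate. The sketch below uses a single made-up input (a normalised score rate) and triangular memberships purely for illustration; the actual FLAME-based rules and inputs of the paper are not reproduced here:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def satisfaction(score_rate):
    """Fuzzy estimate of player satisfaction from a normalised score rate in [0, 1]."""
    low = tri(score_rate, -0.5, 0.0, 0.5)
    medium = tri(score_rate, 0.0, 0.5, 1.0)
    high = tri(score_rate, 0.5, 1.0, 1.5)
    # Defuzzify by a weighted average of rule outputs (0 = frustrated, 1 = satisfied).
    num = 0.0 * low + 0.5 * medium + 1.0 * high
    den = (low + medium + high) or 1.0
    return num / den

print(round(satisfaction(0.75), 3))  # 0.75
```

The appeal of the software-only approach the paper argues for is visible even in this toy: the entire estimate is computed from game telemetry, with no physiological sensors in the loop.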

    The development of a human-robot interface for industrial collaborative system

    Industrial robots have been identified as one of the most effective solutions for optimising output and quality within many industries. However, a number of manufacturing applications involve complex tasks and inconstant components, which prohibit fully automated solutions for the foreseeable future. Breakthroughs in robotic technologies and changes in safety legislation have supported the creation of robots that coexist with and assist humans in industrial applications. It has been broadly recognised that human-robot collaborative systems would be a realistic solution as an advanced production system with a wide range of applications and high economic impact. This type of system can utilise the best of both worlds: the robot performs simple tasks that require high repeatability, while the human performs tasks that require judgement and the dexterity of the human hands. Robots in such a system operate as "intelligent assistants". In a collaborative working environment, robot and human share the same working area and interact with each other. This level of interface requires effective ways of communicating and collaborating to avoid unwanted conflicts. This project aims to create a user interface for an industrial collaborative robot system through the integration of current robotic technologies. The robotic system is designed for seamless collaboration with a human in close proximity. It communicates with the human via the exchange of gestures, as well as visual signals which operators can observe and comprehend at a glance. The main objective of this PhD is to develop a Human-Robot Interface (HRI) for communication with an industrial collaborative robot during collaboration in close proximity. The system is developed in conjunction with a small-scale collaborative robot system which has been integrated using off-the-shelf components. The system should be capable of receiving input from the human user via an intuitive method, as well as indicating its status to the user effectively. The HRI was developed using a combination of hardware integration and software development. The software and the control framework were developed in a way that is applicable to other industrial robots in the future. The developed gesture command system is demonstrated on a heavy-duty industrial robot.
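The two-way exchange described above, operator gestures in, at-a-glance visual status out, can be sketched as a lookup from recognised gestures to robot commands paired with a status signal. The gesture vocabulary and status colours here are invented for illustration, not the thesis's actual gesture set:

```python
# Hypothetical mapping from recognised operator gestures to robot commands.
GESTURE_COMMANDS = {
    "open_palm": "pause",
    "thumbs_up": "resume",
    "wave": "handover",
}

# Visual status signal shown back to the operator for each command.
STATUS_LIGHTS = {"pause": "red", "resume": "green",
                 "handover": "yellow", "hold": "blue"}

def handle_gesture(gesture):
    """Translate a recognised gesture into a command plus a visual status signal."""
    command = GESTURE_COMMANDS.get(gesture, "hold")  # unknown gesture -> hold position
    # The robot signals its status back so the operator can comprehend it at a glance.
    return command, STATUS_LIGHTS[command]

print(handle_gesture("open_palm"))  # ('pause', 'red')
```

Defaulting unrecognised input to a safe "hold" state reflects the safety-first framing of collaborative operation in close proximity: ambiguous communication should never trigger motion.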

    State of the art of audio- and video based solutions for AAL

    Working Group 3: Audio- and Video-based AAL Applications. It is a matter of fact that Europe is facing more and more crucial challenges regarding health and social care due to demographic change and the current economic context. The recent COVID-19 pandemic has stressed this situation even further, thus highlighting the need for taking action. Active and Assisted Living (AAL) technologies come as a viable approach to help face these challenges, thanks to their high potential for enabling remote care and support. Broadly speaking, AAL can be referred to as the use of innovative and advanced Information and Communication Technologies to create supportive, inclusive and empowering applications and environments that enable older, impaired or frail people to live independently and stay active longer in society. AAL capitalizes on the growing pervasiveness and effectiveness of sensing and computing facilities to supply persons in need with smart assistance, by responding to their needs for autonomy, independence, comfort, security and safety. The application scenarios addressed by AAL are complex, due to the inherent heterogeneity of the end-user population, their living arrangements, and their physical conditions or impairments. Despite aiming at diverse goals, AAL systems should share some common characteristics. They are designed to provide support in daily life in an invisible, unobtrusive and user-friendly manner. Moreover, they are conceived to be intelligent, to be able to learn and adapt to the requirements and requests of the assisted people, and to synchronise with their specific needs. Nevertheless, to ensure the uptake of AAL in society, potential users must be willing to use AAL applications and to integrate them into their daily environments and lives. In this respect, video- and audio-based AAL applications have several advantages in terms of unobtrusiveness and information richness. Indeed, cameras and microphones are far less obtrusive than wearable sensors, which may hinder one's activities. In addition, a single camera placed in a room can record most of the activities performed in the room, thus replacing many other non-visual sensors. Currently, video-based applications are effective in recognising and monitoring the activities, the movements, and the overall condition of the assisted individuals, as well as in assessing their vital parameters (e.g., heart rate, respiratory rate). Similarly, audio sensors have the potential to become one of the most important modalities for interaction with AAL systems, as they can have a large sensing range, do not require physical presence at a particular location, and are physically intangible. Moreover, relevant information about individuals' activities and health status can be derived from processing audio signals (e.g., speech recordings). Nevertheless, as the other side of the coin, cameras and microphones are often perceived as the most intrusive technologies from the viewpoint of the privacy of the monitored individuals. This is due to the richness of the information these technologies convey and the intimate settings where they may be deployed. Solutions able to ensure privacy preservation by context and by design, as well as high legal and ethical standards, are in high demand. After the review of the current state of play and the discussion in GoodBrother, we may claim that the first solutions in this direction are starting to appear in the literature. A multidisciplinary debate among experts and stakeholders is paving the way towards AAL that ensures ergonomics, usability, acceptance and privacy preservation. The DIANA, PAAL, and VisuAAL projects are examples of this fresh approach. This report provides the reader with a review of the most recent advances in audio- and video-based monitoring technologies for AAL. It has been drafted as a collective effort of WG3 to supply an introduction to AAL, its evolution over time and its main functional and technological underpinnings. In this respect, the report contributes to the field with the outline of a new generation of ethics-aware AAL technologies and a proposal for a novel comprehensive taxonomy of AAL systems and applications. Moreover, the report allows non-technical readers to gather an overview of the main components of an AAL system and how these function and interact with the end-users. The report illustrates the state of the art of the most successful AAL applications and functions based on audio and video data, namely (i) lifelogging and self-monitoring, (ii) remote monitoring of vital signs, (iii) emotional state recognition, (iv) food intake monitoring, activity and behaviour recognition, (v) activity and personal assistance, (vi) gesture recognition, (vii) fall detection and prevention, (viii) mobility assessment and frailty recognition, and (ix) cognitive and motor rehabilitation. For these application scenarios, the report illustrates the state of play in terms of scientific advances, available products and research projects. The open challenges are also highlighted. The report ends with an overview of the challenges, the hindrances and the opportunities posed by the uptake of AAL technologies in real-world settings. In this respect, the report illustrates the current procedural and technological approaches to coping with acceptability, usability and trust in AAL technology, by surveying strategies and approaches to co-design, to privacy preservation in video and audio data, to transparency and explainability in data processing, and to data transmission and communication. User acceptance and ethical considerations are also debated. Finally, the potential of the silver economy is overviewed.

    Machine Learning and Virtual Reality on Body Movements' Behaviors to Classify Children with Autism Spectrum Disorder

    Autism spectrum disorder (ASD) is mostly diagnosed according to behavioral symptoms in the sensory, social, and motor domains. During diagnosis, improper motor functioning is assessed through qualitative evaluation of stereotyped and repetitive behaviors, while quantitative methods that classify the frequencies of body movements of children with ASD are less addressed. Recent advances in neuroscience, technology, and data analysis techniques are improving quantitative and ecologically valid methods to measure specific functioning in children with ASD. On one side, cutting-edge technologies such as cameras, sensors, and virtual reality can accurately detect and classify behavioral biomarkers, such as body movements, in real-life simulations. On the other, machine-learning techniques are showing potential for identifying and classifying patient subgroups. Starting from these premises, three real-life simulated imitation tasks were implemented in a virtual reality system to investigate whether machine-learning methods applied to movement features and frequency could be useful in discriminating children with ASD from children with typical neurodevelopment (TD). In this experiment, 24 children with ASD and 25 TD children participated in a multimodal virtual reality experience, and changes in their body movements were tracked by a depth-sensor camera during the presentation of visual, auditory, and olfactory stimuli. The main results showed that children with ASD presented larger body movements than TD children, and that the head, trunk, and feet yielded the best classification, with an accuracy of 82.98%. Regarding stimuli, the visual condition showed the highest accuracy (89.36%), followed by the visual-auditory stimuli (74.47%) and the visual-auditory-olfactory stimuli (70.21%). Finally, the head showed the most consistent performance across stimuli, from 80.85% in the visual condition to 89.36% in the visual-auditory-olfactory condition.
    The findings showed the feasibility of applying machine learning and virtual reality to identify body-movement biomarkers that could contribute to improving ASD diagnosis. This work was supported by the Spanish Ministry of Economy, Industry, and Competitiveness funded project "Immersive virtual environment for the evaluation and training of children with autism spectrum disorder: T Room" (IDI-20170912) and by the Generalitat Valenciana funded project REBRAND (PROMETEO/2019/105). Furthermore, this work was co-funded by the European Union through the Operational Program of the European Regional Development Fund (ERDF) of the Valencian Community 2014-2020 (IDIFEDER/2018/029). Alcañiz Raya, M. L., Marín-Morales, J., Minissi, M. E., Teruel Garcia, G., Abad, L., & Chicchi-Giglioli, I. A. (2020). Machine Learning and Virtual Reality on Body Movements' Behaviors to Classify Children with Autism Spectrum Disorder. Journal of Clinical Medicine, 9(5), 1260. https://doi.org/10.3390/jcm9051260
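The classification pipeline described in the abstract, per-body-part movement features in, a trained classifier and an accuracy figure out, can be sketched with a toy nearest-centroid classifier. The feature values, labels, and classifier choice here are illustrative assumptions only; the study's actual models and data are not reproduced:

```python
def centroid(rows):
    """Mean feature vector of a list of feature rows."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def classify(sample, centroids):
    """Assign the label whose centroid is closest (squared Euclidean) to the sample."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Toy movement-amplitude features per participant: [head, trunk, feet].
train = {
    "ASD": [[0.9, 0.8, 0.7], [1.0, 0.9, 0.8], [0.8, 0.7, 0.9]],
    "TD":  [[0.3, 0.2, 0.3], [0.4, 0.3, 0.2], [0.2, 0.3, 0.4]],
}
centroids = {label: centroid(rows) for label, rows in train.items()}

test_set = [([0.85, 0.75, 0.8], "ASD"), ([0.25, 0.3, 0.35], "TD")]
correct = sum(classify(x, centroids) == y for x, y in test_set)
print(f"accuracy = {correct / len(test_set):.2f}")  # accuracy = 1.00
```

In the study itself the features are movement changes tracked by a depth-sensor camera under different stimuli, and accuracy is reported per body-part combination and per stimulus condition, but the evaluation structure is the same: hold out labelled samples, predict, and count agreements.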