
    A proposed framework of an interactive semi-virtual environment for enhanced education of children with autism spectrum disorders

    Education of people with special needs has recently been recognized as a key element in the field of medical education. Recent developments in information and communication technologies may enable collaborative interactive environments that facilitate early-stage education and provide specialists with robust tools for indicating a person's autism spectrum disorder level. Toward the goal of establishing an enhanced learning environment for children with autism, this paper proposes a framework for a semi-controlled real-world environment used in the daily education of an autistic person, according to scenarios selected by specialists. The proposed framework employs both real-world objects and virtual environments equipped with humanoids able to provide emotional feedback and to demonstrate empathy. Potential examples and usage scenarios for such environments are also described.

    Medical Students' Experiences and Outcomes Using a Virtual Human Simulation to Improve Communication Skills: Mixed Methods Study

    Background: Attending to the wide range of communication behaviors that convey empathy is important for reducing errors in care, improving patient satisfaction, and improving cancer patient outcomes, but it is often underemphasized. A virtual human (VH)–based simulation, MPathic-VR, was developed to train health care providers in empathic communication with patients and in interprofessional settings, and was evaluated through a randomized controlled trial. Objective: This mixed methods study aimed to investigate the differential effects of a VH-based simulation developed to train health care providers in empathic patient-provider and interprofessional communication. Methods: We employed a mixed methods intervention design, comparing 2 quantitative measures—MPathic-VR–calculated scores and objective structured clinical exam (OSCE) scores—with qualitative reflections by medical students about their experiences. This paper is a secondary, focused analysis of intervention-arm data from the larger trial. Students at 3 medical schools in the United States (n=206) completed the simulation to improve empathic communication skills. We conducted analysis of variance, thematic text analysis, and merging mixed methods analysis. Results: OSCE scores were significantly higher for learners in the intervention group (mean 0.806, SD 0.201) than in the control group (mean 0.752, SD 0.198; F1,414=6.09; P=.01). Qualitative analysis revealed 3 major positive themes for the MPathic-VR group learners: gaining useful communication skills, learning awareness of nonverbal skills in addition to verbal skills, and feeling motivated to learn more about communication. Finally, the mixed methods analysis indicated that most of the variation among high, middle, and low performers concerned nonverbal behaviors. Medium and high OSCE scorers most often commented on the importance of nonverbal communication. Themes of motivation to learn about communication were present only in middle and high scorers. Conclusions: VHs are a promising strategy for improving empathic communication in health care. Higher performers seemed most engaged to learn, particularly nonverbal skills.
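The group comparison reported above is a one-way ANOVA on OSCE scores. The F statistic behind such a test can be sketched in a few lines of plain Python; the scores below are invented for illustration, not the study's data or analysis code:

```python
def f_oneway(*groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n  # grand mean
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical OSCE-style scores for two small groups:
intervention = [0.82, 0.79, 0.85, 0.76]
control = [0.74, 0.71, 0.78, 0.73]
f = f_oneway(intervention, control)
```

In practice the resulting F would be compared against the F distribution with the appropriate degrees of freedom (here 1 and n−2) to obtain a P value like the one reported in the abstract.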

    Anonymous Panda: preserving anonymity and expressiveness in online mental health platforms

    Digital solutions that allow people to seek treatment, such as online psychological interventions and other technology-mediated therapies, have been developed to assist individuals with mental health disorders. Such approaches may raise privacy concerns about the use of people's data and the safety of their mental health information. This work uses cutting-edge computer graphics technology to develop a novel system capable of increasing anonymity while maintaining expressiveness in computer-mediated mental health interventions. According to our preliminary findings, we were able to customize a realistic avatar using Live Link, Metahumans, and Unreal Engine 4 (UE4) with the same emotional depth as a real person. Furthermore, these findings showed that the virtual avatars' inability to express themselves through hand motion gave the impression that they were acting in an unnatural way. By adding hand tracking with the Leap Motion Controller, we improved our understanding of the prospective use of ultra-realistic virtual human avatars in video conferencing therapy; both studies helped us understand how vital facial and body expressions are and how problematic their absence is in communicating with others.

    Depression detection using virtual avatar communication and eye tracking system

    Globally, depression is one of the most common mental health issues. Therefore, finding an effective way to detect mental health problems is an important subject for study in human-machine interaction. In order to examine the potential of using a virtual avatar communication and eye-tracking system to identify people with or without depression symptoms, this study devised three research aims: 1) to understand the effect of different types of interviewers on eye gaze patterns, 2) to clarify the effect of neutral conversation topics on eye gaze, and 3) to compare eye gaze patterns between people with and without depression. Twenty-seven participants - fifteen in the control group and twelve in the depression symptoms group - were involved in this study, and they were asked to talk to both a virtual avatar and human interviewers. Gaze patterns were recorded by an eye-tracking device during both types of interaction. The experimental results indicated significant differences in eye movements between the control group and the depression symptoms group. Moreover, the identified differences were more pronounced when people in the depression symptoms group were talking about neutral conversation topics rather than negative topics.
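A between-group comparison of a gaze metric (for example, mean fixation duration) is commonly done with a two-sample test such as Welch's t. A minimal sketch with made-up numbers, offered only to illustrate the kind of comparison described, not the study's actual analysis:

```python
def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)  # sample variance of b
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical mean fixation durations (ms) per participant:
control_group = [310, 295, 330, 305, 320]
depression_group = [250, 270, 240, 265]
t = welch_t(control_group, depression_group)  # positive: control fixates longer
```

The statistic would then be referred to a t distribution with Welch-Satterthwaite degrees of freedom to judge significance.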

    Sensorimotor Oscillations During a Reciprocal Touch Paradigm With a Human or Robot Partner

    Robots provide an opportunity to extend research on the cognitive, perceptual, and neural processes involved in social interaction. This study examined how sensorimotor oscillatory electroencephalogram (EEG) activity can be influenced by the perceived nature of a task partner – human or robot – during a novel “reciprocal touch” paradigm. Twenty adult participants viewed a demonstration of a robot that could “feel” tactile stimulation through a haptic sensor on its hand and “see” changes in light through a photoreceptor at the level of the eyes; the robot responded to touch or changes in light by moving a contralateral digit. During EEG collection, participants engaged in a joint task that involved sending tactile stimulation to a partner (robot or human) and receiving tactile stimulation back. Tactile stimulation sent by the participant was initiated by a button press and was delivered 1500 ms later via an inflatable membrane on the hand of the human or on the haptic sensor of the robot partner. Stimulation to the participant’s finger (from the partner) was sent on a fixed schedule, regardless of partner type. We analyzed activity of the sensorimotor mu rhythm during anticipation of tactile stimulation to the right hand, comparing mu activity at central electrode sites when participants believed that tactile stimulation was initiated by a robot or a human, and to trials in which “nobody” received stimulation. There was a significant difference in contralateral mu rhythm activity between anticipating stimulation from a human partner and the “nobody” condition. This effect was less pronounced for anticipation of stimulation from the robot partner. Analyses also examined beta rhythm responses to the execution of the button press, comparing oscillatory activity when participants sent tactile stimulation to the robot or the human partner. 
The extent of beta rebound at frontocentral electrode sites following the button press differed between conditions, with a significantly larger increase in beta power when participants sent tactile stimulation to the robot partner compared to the human partner. This increase in beta power may reflect greater predictability in event outcomes. This new paradigm and the novel findings advance the neuroscientific study of human–robot interaction.
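Mu (roughly 8–13 Hz) and beta (roughly 15–30 Hz) activity of the kind analyzed here is typically quantified as spectral power of an EEG epoch within the band. A minimal sketch using a naive DFT, assuming a short single-channel epoch; a real pipeline would use Welch's method on filtered, artifact-rejected data:

```python
import math

def band_power(signal, fs, lo, hi):
    """Total power in the [lo, hi] Hz band via a naive DFT (fine for short epochs)."""
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n  # frequency of DFT bin k
        if lo <= freq <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / n
    return total

# A synthetic 1 s epoch at 100 Hz containing a 10 Hz (mu-band) oscillation:
epoch = [math.sin(2 * math.pi * 10 * t / 100) for t in range(100)]
mu = band_power(epoch, fs=100, lo=8, hi=13)     # large: the 10 Hz component
beta = band_power(epoch, fs=100, lo=15, hi=30)  # near zero
```

Mu suppression or beta rebound would then be expressed as a change in such band power relative to a baseline interval.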

    A multimodal dialog approach to mental state characterization in clinically depressed, anxious, and suicidal populations

    Background: The rise of depression, anxiety, and suicide rates has led to increased demand for telemedicine-based mental health screening and remote patient monitoring (RPM) solutions to alleviate the burden on, and enhance the efficiency of, mental health practitioners. Multimodal dialog systems (MDS) that conduct on-demand, structured interviews offer a scalable and cost-effective solution to address this need. Objective: This study evaluates the feasibility of a cloud-based MDS agent, Tina, for mental state characterization in participants with depression, anxiety, and suicide risk. Method: Sixty-eight participants were recruited through an online health registry and completed 73 sessions, with 15 (20.6%), 21 (28.8%), and 26 (35.6%) sessions screening positive for depression, anxiety, and suicide risk, respectively, using conventional screening instruments. Participants then interacted with Tina as they completed a structured interview designed to elicit calibrated, open-ended responses regarding their feelings and emotional state. Simultaneously, the platform streamed their speech and video recordings in real time to a HIPAA-compliant cloud server to compute speech, language, and facial movement-based biomarkers. After their sessions, participants completed user experience surveys. Machine learning models were developed using the extracted features and evaluated with the area under the receiver operating characteristic curve (AUC). Results: For both depression and suicide risk, affected individuals tended to have a higher percent pause time, while those positive for anxiety showed reduced lip movement relative to healthy controls. Among single-modality classification models, speech features performed best for depression (AUC = 0.64; 95% CI = 0.51–0.78), facial features for anxiety (AUC = 0.57; 95% CI = 0.43–0.71), and text features for suicide risk (AUC = 0.65; 95% CI = 0.52–0.78). The best overall performance was achieved by decision fusion of all models in identifying suicide risk (AUC = 0.76; 95% CI = 0.65–0.87). Participants reported that the experience was comfortable and that they felt able to share their feelings. Conclusion: MDS is a feasible, useful, effective, and interpretable solution for RPM in real-world clinically depressed, anxious, and suicidal populations. Facial information is more informative for anxiety classification, while speech and language are more discriminative of depression and suicidality. In general, combining speech, language, and facial information improved model performance on all classification tasks.
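Decision-level fusion of the kind described, combining per-model scores and evaluating with AUC, can be sketched in plain Python. The scores and labels below are invented and the averaging rule is one common choice; this is an illustrative sketch, not the study's pipeline:

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney formulation: the fraction of (positive, negative)
    pairs the model ranks correctly, counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def fuse(*model_scores):
    """Late (decision-level) fusion: average each sample's score across models."""
    return [sum(s) / len(s) for s in zip(*model_scores)]

# Hypothetical per-session risk scores from three single-modality models:
labels = [0, 0, 1, 1]
speech = [0.1, 0.4, 0.35, 0.8]
text = [0.2, 0.3, 0.60, 0.7]
face = [0.3, 0.2, 0.50, 0.6]
fused = fuse(speech, text, face)
```

Fusion helps when the models make partly uncorrelated errors, which is consistent with the abstract's finding that combining speech, language, and facial features improved every classification task.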