6 research outputs found

    Identifying game design factors for the development of educational digital games

    Designing and developing educational digital games are complicated processes. In both cases, multidisciplinary teams are required. Moreover, designing a diverse range of educational digital games from start to end can differ in each case. This paper introduces a concise model for designing digital games with the aim of assisting and supporting both educators and game designers. In addition, this model will bridge the gap between educators and game designers, so as to achieve the common goal of developing effective educational digital games. This study intends to investigate and conduct a comprehensive review of the main features and factors for developing educational digital games suitable for different educational contents and platforms. Furthermore, the proposed model is believed to bring benefits to both educators and game designers who are involved in the game design process for digital game-based learning.

    Is Your Virtual Self as Sensational as Your Real? Virtual Reality: The Effect of Body Consciousness on the Experience of Exercise Sensations

    Objectives: Past research has shown that Virtual Reality (VR) is an effective method for reducing the perception of pain and effort associated with exercise. As pain and effort are subjective feelings, they are influenced by a variety of psychological factors, including one’s awareness of internal body sensations, known as Private Body Consciousness (PBC). The goal of the present study was to investigate whether the effectiveness of VR in reducing the feeling of exercise pain and effort is moderated by PBC. Design and Methods: Eighty participants were recruited for this study and were randomly assigned to a VR or a non-VR control group. All participants were required to maintain a 20% 1RM isometric bicep curl whilst reporting ratings of pain intensity and perception of effort. Participants in the VR group completed the isometric bicep curl task whilst wearing a VR device which simulated an exercising environment. Participants in the non-VR group completed a conventional isometric bicep curl exercise without VR. Participants’ heart rate was continuously monitored, along with time to exhaustion. A questionnaire was used to assess PBC. Results: Participants in the VR group reported significantly lower pain and effort and exhibited longer time to exhaustion compared to the non-VR group. Notably, PBC had no effect on these measures and did not interact with the VR manipulation. Conclusions: Results verified that VR during exercise could reduce negative sensations associated with exercise regardless of PBC levels.

    Exploring the Touch and Motion Features in Game-Based Cognitive Assessments

    Early detection of cognitive decline is important for timely intervention and treatment strategies to prevent further deterioration or development of more severe cognitive impairment, as well as to identify at-risk individuals for research. In this paper, we explore the feasibility of using data collected from built-in sensors of mobile phones and gameplay performance in mobile-game-based cognitive assessments. Twenty-two healthy participants took part in the two-session experiment where they were asked to take a series of standard cognitive assessments followed by playing three popular mobile games in which user-game interaction data were passively collected. The results from bivariate analysis reveal correlations between our proposed features and scores obtained from paper-based cognitive assessments. Our results show that touch gestural interaction and device motion patterns can be used as supplementary features in mobile-game-based cognitive measurement. This study provides initial evidence that game-related metrics on existing off-the-shelf games have potential to be used as proxies for conventional cognitive measures, specifically for visuospatial function, visual search capability, mental flexibility, memory and attention.
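The touch features and bivariate analysis described above can be sketched in a few lines. This is a minimal illustration, not the paper's pipeline: the touch-log format (timestamp in seconds, x/y in pixels) and the choice of plain Pearson correlation are assumptions.

```python
import math

# Hypothetical touch log: (timestamp_s, x_px, y_px) samples for one swipe.
# The field names and units are assumptions; the paper does not specify its log format.
def swipe_features(samples):
    """Compute swipe length (px) and mean swipe speed (px/s) from touch samples."""
    length = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:])
    )
    duration = samples[-1][0] - samples[0][0]
    return length, (length / duration if duration > 0 else 0.0)

def pearson_r(xs, ys):
    """Plain Pearson correlation, as used in a bivariate analysis."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# One illustrative swipe: three touch samples over 0.1 s.
swipe = [(0.00, 10, 10), (0.05, 40, 50), (0.10, 80, 100)]
length, speed = swipe_features(swipe)
```

Per-participant features like these would then be correlated against the paper-based assessment scores, one feature-score pair at a time.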

    Soft, wireless periocular wearable electronics for real-time detection of eye vergence in a virtual reality toward mobile eye therapies

    Ocular disorders are currently affecting the developed world, causing loss of productivity in adults and children. While the cause of such disorders is not clear, neurological issues are often considered the most likely. Treatment of strabismus and vergence disorders requires an invasive surgery or clinic-based vision therapy that has been used for decades due to the lack of alternatives such as portable therapeutic tools. Recent advancements in electronic packaging and image processing techniques have opened the possibility for optics-based portable eye tracking approaches, but several technical and safety hurdles limit the implementation of the technology in wearable applications. Here, we introduce a fully wearable, wireless soft electronic system that offers portable, highly sensitive tracking of eye movements (vergence) via the combination of skin-conformal sensors and a virtual reality system. Advancement of material processing and printing technologies based on aerosol jet printing enables reliable manufacturing of skin-like sensors, while a flexible electronic circuit is prepared by the integration of chip components onto a soft elastomeric membrane. Analytical and computational study of a data classification algorithm provides a highly accurate tool for real-time detection and classification of ocular motions. In vivo demonstration with 14 human subjects captures the potential of the wearable electronics as a portable therapy system, which can be easily synchronized with a virtual reality headset.

    Investigation of Mobile Games for Cognitive Assessment and Screening with a Focus on Touch-based and Motion Features

    Early detection of cognitive decline is important for timely intervention and treatment strategies to prevent further deterioration or development of more severe forms of cognitive dysfunction. Therefore, many tests have been developed for screening and monitoring changes in cognitive status. However, these existing assessment and screening tools are not designed for self-administration without a trained examiner. Moreover, the lack of multiple variations of these paper-based measures and repeated exposure to such tests could reduce their sensitivity to detect cognitive changes due to practice effects. These limitations pose clinical challenges to early identification of cognitive deficits and monitoring of longitudinal changes in cognitive function, especially in resource-limited settings. To this end, a number of studies have adopted mobile technology and gamification to facilitate remote and self-administered cognitive assessment and screening in a less effortful and more engaging manner. Despite this, existing literature has so far only examined the feasibility of using gameplay performance as a means for cognitive assessment. There has not been any attempt to explore gameplay behaviours, as revealed through patterns of touch interactions and device motions, as indicative features for cognitive evaluation. Therefore, the aim of this thesis is to investigate the use of touch and motion features in game-based cognitive assessment and screening. This is achieved through two studies. The first study was carried out to examine the links between cognitive abilities and underlying patterns of user-game interaction, with a focus on touch gestures and device motions. Twenty-two healthy participants took part in the two-session experiment where they were asked to take a series of standard cognitive assessments followed by playing three casual mobile games in which user-game interaction data were passively collected. The results from bivariate analysis indicated that increases in swipe length and swipe speed, in the game context, were significantly correlated with declines in response inhibition ability but increased performance on attention. However, it remained unclear whether the device motion features alone could be used to identify cognitive ability, as the results provide only weak evidence for relationships between cognitive performance and the underlying device motion patterns while playing the games. In the second study, we evaluated the potential use of these behavioural features and mobile games as a screening tool for clinical conditions with cognitive impairment. Alcohol-related brain damage (ARBD) is often found to be associated with deficits in multiple cognitive functions in patients with alcohol dependence, the condition on which this thesis focuses. Based on findings from the preliminary study, the second experimental study was carried out to investigate the feasibility of using such user-game interaction patterns on mobile games to develop an automated screening tool for alcohol-dependent patients. The classification performance of various supervised learning algorithms was evaluated on data collected from 40 patients and 40 age-matched healthy adults. The results showed that patients with alcohol dependence could be identified automatically and accurately using the ensemble of touch, device motion, and gameplay performance features on 3-minute samples (accuracy=0.95, sensitivity=0.95, and specificity=0.95). The findings provide evidence suggesting the potential use of user-game interaction metrics on existing mobile games as discriminant features for developing an implicit measure to identify alcohol dependence conditions. In addition to supporting healthcare professionals in clinical decision-making, the game-based method could be used as a novel strategy to promote self-screening, especially outside of clinical settings. The findings from this thesis were also distilled into guidelines to aid researchers in designing game interactions that capitalise on touch and device motion features for cognitive assessment and screening.
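The reported screening figures (accuracy, sensitivity, and specificity, all 0.95) follow directly from a confusion matrix over the per-sample predictions. The sketch below shows that calculation on made-up labels, not the thesis data; the 1 = patient / 0 = control coding is an assumption.

```python
# Given predicted labels for 3-minute samples (1 = alcohol-dependent,
# 0 = healthy control), compute the three screening metrics.
def screening_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # patients correctly flagged
    specificity = tn / (tn + fp)   # controls correctly cleared
    return accuracy, sensitivity, specificity

# Illustrative example matching the cohort sizes: 40 patients, 40 controls,
# with one misclassification in each group.
y_true = [1] * 20 + [0] * 20
y_pred = [1] * 19 + [0] + [0] * 19 + [1]
acc, sens, spec = screening_metrics(y_true, y_pred)
```

With one miss per group of 20, all three metrics come out to 0.95, mirroring the balance reported in the thesis.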

    VREED: Virtual Reality Emotion Recognition Dataset using Eye Tracking & Physiological Measures

    No full text
    The paper introduces a multimodal affective dataset named VREED (VR Eyes: Emotions Dataset) in which emotions were triggered using immersive 360° Video-Based Virtual Environments (360-VEs) delivered via a Virtual Reality (VR) headset. Behavioural (eye tracking) and physiological signals (Electrocardiogram (ECG) and Galvanic Skin Response (GSR)) were captured, together with self-reported responses, from healthy participants (n=34) experiencing 360-VEs (n=12, 1-3 min each) selected through focus groups and a pilot trial. Statistical analysis confirmed the validity of the selected 360-VEs in eliciting the desired emotions. Preliminary machine learning analysis was carried out, demonstrating performance comparable to the state of the art reported in the affective computing literature using non-immersive modalities. VREED is among the first multimodal VR datasets in emotion recognition using behavioural and physiological signals. VREED is made publicly available on Kaggle. We hope that this contribution encourages other researchers to utilise VREED further to understand emotional responses in VR and ultimately enhance the design of VR experiences in applications where emotional elicitation plays a key role, e.g. healthcare, gaming, and education.
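Working with a dataset of this shape typically means aligning a continuous physiological signal with per-video start/end times and summarising each segment. The sketch below shows that alignment step only; the sampling rate, field names, and summary statistic are assumptions, not VREED's actual schema.

```python
# Align a physiological signal (e.g. GSR sampled at fs Hz) with per-video
# trial boundaries and summarise each segment with its mean value.
def segment_summary(signal, fs, trials):
    """trials: list of (start_s, end_s, label); returns per-trial (label, mean)."""
    out = []
    for start, end, label in trials:
        seg = signal[int(start * fs):int(end * fs)]
        out.append((label, sum(seg) / len(seg)))
    return out

# Illustrative 20 s recording at 4 Hz covering two hypothetical 360-VE trials.
gsr = [0.1] * 40 + [0.5] * 40
trials = [(0, 10, "calm"), (10, 20, "fear")]
summary = segment_summary(gsr, 4, trials)
```

Per-trial summaries like these, stacked across modalities (eye tracking, ECG, GSR), would form the feature rows paired with the self-reported emotion labels.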