2,024 research outputs found

    Tracking Visible Features of Speech for Computer-Based Speech Therapy for Childhood Apraxia of Speech

    At present, there are few, if any, effective computer-based speech therapy systems (CBSTs) that support the at-home component of clinical interventions for Childhood Apraxia of Speech (CAS). PROMPT, an established speech therapy intervention for CAS, has the potential to be supported via a CBST, which could increase engagement and provide valuable feedback to the child. However, the necessary computational techniques have not yet been developed and evaluated. In this thesis, I describe the development of some of the key underlying computational components required for such a system. These components perform camera-based tracking of visible features of speech related to jaw kinematics, and would also be necessary for the serious game that we have envisioned.
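    The abstract does not specify how jaw kinematics are derived from camera tracking. As a purely illustrative sketch, one simple per-frame measure is the distance between an upper-lip and a lower-lip facial landmark; the landmark names and pixel coordinates below are hypothetical, not taken from the thesis:

```python
from math import dist  # Euclidean distance, Python 3.8+

def jaw_opening(upper_lip, lower_lip):
    """Pixel distance between two (x, y) landmark points for one frame."""
    return dist(upper_lip, lower_lip)

def opening_trace(frames):
    """Per-frame jaw-opening trace from a sequence of landmark dicts."""
    return [jaw_opening(f["upper_lip"], f["lower_lip"]) for f in frames]

# Hypothetical landmark output of a face tracker for three video frames.
frames = [
    {"upper_lip": (120, 200), "lower_lip": (120, 215)},
    {"upper_lip": (121, 200), "lower_lip": (121, 230)},
    {"upper_lip": (120, 201), "lower_lip": (120, 219)},
]
print(opening_trace(frames))  # -> [15.0, 30.0, 18.0]
```

    A real system would obtain the landmarks from a face-tracking library and normalise the distance by face size to be robust to camera distance; this sketch only shows the kinematic trace itself.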

    Learning analytics and psychophysiology : understanding the learning process in a STEM game

    This study focuses on the exploration of player experience in educational games and its potential impact on predicting learning outcomes. Specifically, the research investigates the connection between psychophysiology data, obtained through a summative study involving nine participants, and the results of a learning analytics model derived from a larger field test. The study incorporates eye tracking and electrodermal activity (EDA) data to gain insights into the predictive power of this data. Through the analysis of player experience data, the study sheds light on the factors that contribute to effective educational game design. By examining the eye tracking and EDA data, the researchers explored participants' engagement levels, attention patterns, and emotional arousal during gameplay. These findings revealed a connection between spikes of visual attention and EDA during interactions with character faces as well as in-game cinematics. In conclusion, the outcomes of this study provide valuable insights for future educational game designers. By understanding the relationship between user experience indicators and learning analytics, designers can tailor game elements to enhance engagement, attention, and emotional arousal, ultimately leading to improved learning outcomes. The integration of eye tracking and EDA data in user experience studies adds a new dimension to the evaluation and design of educational games. The findings pave the way for future research in the field and highlight the importance of considering user experience as a crucial factor in educational game design and development. Includes bibliographical references.
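    The abstract reports co-occurring spikes of visual attention and EDA but gives no detection method. A minimal sketch of one common approach — flagging samples above a z-score threshold in each signal and pairing spikes that fall close in time — is shown below; the threshold, window, and all signal values are hypothetical assumptions, not the study's actual analysis:

```python
from statistics import mean, stdev

def spike_indices(series, z=2.0):
    """Indices where a sample exceeds mean + z * stdev (a simple spike rule)."""
    m, s = mean(series), stdev(series)
    return [i for i, v in enumerate(series) if v > m + z * s]

def co_occurring_spikes(attention, eda, window=2):
    """Pairs of spike indices from the two signals within `window` samples."""
    eda_spikes = spike_indices(eda)
    return [(a, e) for a in spike_indices(attention)
            for e in eda_spikes if abs(a - e) <= window]

# Synthetic, illustrative signals: both spike around sample 5-6.
attention = [0.1, 0.1, 0.2, 0.1, 0.1, 0.9, 0.1, 0.1, 0.2, 0.1]
eda       = [0.3, 0.3, 0.3, 0.3, 0.4, 0.3, 0.8, 0.3, 0.3, 0.3]
print(co_occurring_spikes(attention, eda))  # -> [(5, 6)]
```

    In practice EDA spikes are usually detected on the phasic (skin-conductance-response) component after signal decomposition; this sketch only illustrates the alignment idea.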

    The platformer experience dataset

    Player modeling and estimation of player experience have become very active research fields within affective computing, human-computer interaction, and game artificial intelligence in recent years. To advance our knowledge and understanding of player experience, this paper introduces the Platformer Experience Dataset (PED) - the first open-access game experience corpus - which contains multiple modalities of user data from Super Mario Bros players. The open-access database is intended for player experience capture through context-based (i.e. game content), behavioral and visual recordings of platform game players. In addition, the database contains demographic data on the players and self-reported annotations of experience in two forms: ratings and ranks. PED opens the way for desktop and console games that use video from webcams and visual sensors, and offers possibilities for holistic player experience modeling approaches that can, in turn, yield richer game personalization.

    Biometric analysis of behaviours in serious games

    This document is a Master's dissertation in Informatics Engineering, within the scope of data analysis for visualisation and knowledge exploration in the health area. The thesis herein described is part of the second year of the Master's in Informatics Engineering and was carried out at the University of Minho in Braga, Portugal. The main objective of this project is to combine data measuring a patient's performance while playing a specific game with data collected at the same time from biometric sensors, so that data analysis algorithms can be applied to discover new relationships between the data. It is also intended that, through adequate visualisation, knowledge can be created. Data collection was carried out in partnership with the Centro Neurosensorial de Braga. For data collection, an emotion-recognition activity, a test that measures processing speed (rapid naming), and two tests to better characterise each child's memory and attention capacity were used. Initially, a small analysis was made of the data extracted through the platform provided by the Centro Neurosensorial de Braga. After a mass data collection, the analysis was carried out. It was possible to verify, analytically, that as memory deficit increases, so do the number of fixations, the number of regressions, and the time taken to complete the rapid-naming test. Regarding the emotions expressed during the rapid-naming test, respondents who expressed happiness during the test showed better memory capacity, while children who expressed surprise or sadness tended towards memory deficit.

    Description and application of the correlation between gaze and hand for the different hand events occurring during interaction with tablets

    People’s activities naturally involve the coordination of gaze and hand. Research in Human-Computer Interaction (HCI) endeavours to enable users to exploit this multimodality for enhanced interaction. With the abundance of touch screen devices, direct manipulation of an interface has become a dominating interaction technique. Although touch-enabled devices are prolific in both public and private spaces, interactions with these devices do not fully utilise the benefits of the correlation between gaze and hand. Touch-enabled devices do not employ the richness of the continuous manual activity above their display surface for interaction, and much of the information users express through their hand movements is ignored. This thesis aims to investigate the correlation between gaze and hand during natural interaction with touch-enabled devices to address these issues. To do so, we set three objectives. Firstly, we seek to describe the correlation between gaze and hand in order to understand how they operate together: what is the spatial and temporal relationship between these modalities when users interact with touch-enabled devices? Secondly, we want to know the role of some of the inherent factors brought by interaction with touch-enabled devices on the correlation between gaze and hand, because identifying what modulates the correlation is crucial to designing more efficient applications: what are the impacts of individual differences, task characteristics and the features of on-screen targets? Thirdly, as we want to see whether additional information related to the user can be extracted from the correlation between gaze and hand, we investigate the latter for the detection of users’ cognitive state while they interact with touch-enabled devices: can the correlation reveal the users’ hesitation? To meet these objectives, we devised two data collections for gaze and hand. In the first data collection, we cover manual interaction on-screen. In the second data collection, we focus instead on manual interaction in-the-air. We dissect the correlation between gaze and hand using three common hand events users perform while interacting with touch-enabled devices: taps, stationary hand events, and the motion between taps and stationary hand events. We use a tablet as a touch-enabled device because of its medium size and the ease of integrating both eye and hand tracking sensors. We study the correlation between gaze and hand for tap events by collecting gaze estimation data and taps on a tablet in the context of Internet-related tasks, representative of typical activities executed using tablets. The correlation is described in the spatial and temporal dimensions. Individual differences and effects of task nature and target type are also investigated. To study the correlation between gaze and hand when the hand is in a stationary situation, we conducted a data collection in the context of a Memory Game, chosen to generate enough cognitive load during playing while requiring the hand to leave the tablet’s surface. We introduce and evaluate three detection algorithms, inspired by eye tracking, based on the analogy between gaze and hand patterns. Afterwards, spatial comparisons between gaze and hand are analysed to describe the correlation. We study the effects of task difficulty and how participants’ hesitation influences the correlation. Since there is no certain way of knowing when a participant hesitates, we approximate hesitation by the failure to match a pair of already-seen tiles. We study the correlation between gaze and hand during hand motion between taps and stationary hand events in the same data collection context as the case mentioned above. We first align gaze and hand data in time and report the correlation coefficients on both the X and Y axes. After considering the general case, we examine the impact of the different factors implicated in the context: participants, task difficulty, and the duration and type of the hand motion. Our results show that the correlation between gaze and hand, throughout the interaction, is stronger in the horizontal dimension of the tablet than in its vertical dimension, and that it varies widely across users, especially spatially. We also confirm that the eyes lead the hand for target acquisition. Moreover, we find that the correlation between gaze and hand when the hand is in the air above the tablet’s surface depends on where users look on the tablet. We also show that the correlation between gaze and hand during stationary hand events can indicate users’ indecision, and that while the hand is moving, the correlation depends on different factors, such as the degree of difficulty of the task performed on the tablet and the nature of the event before/after the motion.
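    The abstract describes aligning gaze and hand data in time and reporting per-axis correlation coefficients. A minimal sketch of that analysis, using a hand-rolled Pearson coefficient over entirely hypothetical, already time-aligned coordinate samples (the thesis's actual data and alignment procedure are not given here):

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(var_a * var_b)

# Hypothetical time-aligned samples: gaze point and fingertip position in
# tablet pixel coordinates, one sample per frame.
gaze_x = [100, 150, 210, 260, 330]
hand_x = [ 90, 140, 205, 270, 320]
gaze_y = [400, 380, 390, 370, 385]
hand_y = [300, 360, 310, 330, 290]

print(round(pearson(gaze_x, hand_x), 3))  # strong horizontal coupling
print(round(pearson(gaze_y, hand_y), 3))  # weaker vertical coupling
```

    Computing the coefficient separately per axis, as here, is what allows the horizontal coupling to be compared against the vertical one.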

    A low cost virtual reality interface for educational games

    Mobile virtual reality has the potential to improve learning experiences by making them more immersive and engaging for students. This type of virtual reality also aims to be more cost-effective by using a smartphone to drive the virtual reality experience. One issue with mobile virtual reality is that the screen (i.e. the main interface) of the smartphone is occluded by the virtual reality headset. To investigate solutions to this issue, this project details the development and testing of a computer vision based controller that aims to have a cheaper per-unit cost than a conventional electronic controller, by making use of 3D printing and the built-in camera of a smartphone. Reducing the cost per unit is useful for educational contexts, as solutions would need to scale to classroom sizes. The research question for this project is thus: “can a computer vision based virtual reality controller provide comparable immersion to a conventional electronic controller?” It was found that a computer vision based controller can provide comparable immersion, though it is more challenging to use. This added challenge was found to contribute to engagement, as it did not diminish users’ performance in terms of question scores.

    Opportunistic Uses of the Traditional School Day Through Student Examination of Fitbit Activity Tracker Data

    In large part due to the highly prescribed nature of the typical school day for children, efforts to design new interactions with technology have often focused on less-structured after-school clubs and other out-of-school environments. We argue that while the school day imposes serious restrictions, school routines can and should be opportunistically leveraged by designers and by youth. Specifically, wearable activity tracking devices open some new avenues for opportunistic collection of and reflection on data from the school day. To demonstrate this, we present two cases from an elementary statistics classroom unit we designed that intentionally integrated wearable activity trackers and child-created data visualizations. The first case involves a group of students comparing favored recess activities to determine which was more physically demanding. The second case is of a student who took advantage of her knowledge of teachers’ school day routines to test the reliability of a Fitbit activity tracker against a commercial mobile app.