
    Interactive sonification to assist children with autism during motor therapeutic interventions

    Interactive sonification is an effective tool for guiding individuals as they practice movements. However, little research has examined interactive sonification as a support for motor therapeutic interventions for children with autism who exhibit motor impairments. The goal of this research is to study whether children with autism understand interactive sonification during motor therapeutic interventions, the potential impact of interactive sonification on the development of their motor skills, and the feasibility of using it in specialized schools for children with autism. We conducted two deployment studies in Mexico using Go-with-the-Flow, a framework for sonifying movements originally developed for chronic pain rehabilitation. In the first study, six children with autism performed forward-reach and lateral upper-limb exercises while listening to three different sound structures (i.e., one discrete and two continuous sounds). Results showed that children with autism are aware of the sonification of their movements and engage with it. We then adapted the sonifications, based on the results of the first study, for the motor therapy of children with autism. In the second study, nine children with autism performed upper-limb lateral, cross-lateral, and push movements while listening to five different sound structures (i.e., three discrete and two continuous) designed to sonify the movements. Results showed that discrete sound structures engage the children in the performance of upper-limb movements and increase their ability to perform the movements correctly. We conclude by proposing design considerations that could guide the design of future projects on interactive sonification.
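    To make the distinction concrete, the sketch below contrasts a continuous sound structure (pitch glides with the reach) with a discrete one (a tone fires at each threshold crossing). It is a minimal Python illustration of the two mapping styles, not the actual Go-with-the-Flow implementation; all function names, frequencies, and thresholds are hypothetical.

        def continuous_pitch(extension):
            """Continuous structure: pitch glides smoothly with the reach."""
            x = max(0.0, min(1.0, extension))       # normalized arm extension
            return 220.0 * 2.0 ** x                 # one octave: 220 Hz at rest, 440 Hz extended

        def discrete_events(samples, thresholds=(0.25, 0.5, 0.75, 1.0)):
            """Discrete structure: emit one short tone per threshold crossing."""
            events, reached = [], set()
            for t, x in samples:                    # (time_s, extension) pairs
                for th in thresholds:
                    if x >= th and th not in reached:
                        reached.add(th)
                        events.append((t, 330.0 + 110.0 * th))  # tone frequency, Hz
            return events

        if __name__ == "__main__":
            reach = [(i * 0.1, i / 20.0) for i in range(21)]    # one slow forward reach
            print([round(continuous_pitch(x)) for _, x in reach[::5]])
            print(discrete_events(reach))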

    Sonification of the self vs. sonification of the other: Differences in the sonification of performed vs. observed simple hand movements

    Existing work on the interactive sonification of movements, i.e., the translation of human movement qualities from the physical to the auditory domain, usually adopts a predetermined approach: the way in which movement features modulate the characteristics of sound is fixed. In our work we go one step further and demonstrate that the user's role can influence the tuning of the mapping between movement cues and sound parameters. Here, we aim to verify whether and how the mapping changes when the user is either the performer or the observer of a series of body movements (tracing a square or an infinity shape in the air with the hand). We asked participants to tune the movement sonification either while directly performing the sonified movement or while watching another person perform the movement and listening to its sonification. Results show that the tuning of the sonification chosen by participants is influenced by three variables: the role of the user (performer vs. observer), movement quality (the amount of Smoothness and Directness in the movement), and physical parameters of the movement (velocity and acceleration). Performers focused more on the quality of their movement, while observers focused more on the sonic rendering, making it more expressive and more closely tied to low-level physical features.
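    A minimal sketch of what a tunable mapping could look like: the same movement features (here mean speed and a crude smoothness estimate) drive sound parameters through user-settable gains, so a performer and an observer can arrive at different tunings. The feature definitions, gains, and parameter ranges are assumptions for illustration, not the mapping used in the study.

        import math

        def features(points, dt=0.01):
            """Mean speed and a crude smoothness score (less jerk = smoother)."""
            vx = [(x2 - x1) / dt for (x1, _), (x2, _) in zip(points, points[1:])]
            vy = [(y2 - y1) / dt for (_, y1), (_, y2) in zip(points, points[1:])]
            speed = [math.hypot(a, b) for a, b in zip(vx, vy)]
            jerk = sum(abs(s2 - s1) for s1, s2 in zip(speed, speed[1:]))
            return sum(speed) / len(speed), 1.0 / (1.0 + jerk)

        def sonify(points, gain_pitch=1.0, gain_brightness=1.0):
            """Map mean speed to pitch and smoothness to spectral brightness."""
            mean_speed, smoothness = features(points)
            pitch_hz = 220.0 + gain_pitch * 40.0 * mean_speed
            brightness = gain_brightness * smoothness   # 0..1 filter opening
            return round(pitch_hz, 1), round(brightness, 3)

        if __name__ == "__main__":
            path = [(math.cos(t / 10), math.sin(t / 10)) for t in range(63)]  # circular hand path
            print(sonify(path, gain_pitch=0.5))   # e.g. a performer's tuning
            print(sonify(path, gain_pitch=1.5))   # e.g. an observer's tuning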

    Enhancing the use of Haptic Devices in Education and Entertainment

    This research was part of the two-year Horizon 2020 European project "weDRAW". The premise of the project was that "specific sensory systems have specific roles to learn specific concepts". This work explores the use of the haptic modality, stimulated by means of force-feedback devices, to convey abstract concepts inside virtual reality. After reviewing the current use of haptic devices in education and the available haptic software and game engines, we focus on the implementation of a haptic plugin for game engines (HPGE, based on the state-of-the-art rendering library CHAI3D) and its evaluation in experiments on human perception and multisensory integration.
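    At its core, force-feedback rendering in libraries such as CHAI3D is a high-rate loop that reads the device position, computes penetration into a virtual surface, and commands a spring force pushing back out. The Python sketch below illustrates that loop with a hypothetical stand-in device; it does not reproduce HPGE's or CHAI3D's actual APIs.

        K_WALL = 800.0   # virtual wall stiffness, N/m (illustrative value)
        WALL_Y = 0.0     # plane y = 0: free space above, wall below

        def wall_force(y_pos):
            """Hooke's-law reaction force, proportional to penetration depth."""
            penetration = WALL_Y - y_pos
            return K_WALL * penetration if penetration > 0.0 else 0.0

        class FakeDevice:
            """Hypothetical stand-in for a force-feedback device."""
            def __init__(self):
                self.y = 0.01                       # start just above the wall, in meters
            def read_position(self):
                self.y -= 0.002                     # user slowly pushes into the wall
                return self.y
            def set_force(self, f):
                print(f"y={self.y:+.3f} m  force={f:6.1f} N")

        if __name__ == "__main__":
            dev = FakeDevice()
            for _ in range(10):                     # real loops run near 1 kHz for stability
                dev.set_force(wall_force(dev.read_position()))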

    Accessibility of Health Data Representations for Older Adults: Challenges and Opportunities for Design

    Health data from consumer off-the-shelf wearable devices are often conveyed to users through visual representations and analyses. However, these are not always accessible to people with disabilities or to older people because of low vision, cognitive impairments, or literacy issues. Owing to trade-offs between aesthetics and information overload, real-time feedback from sensor devices may not be conveyed easily through visual cues such as graphs and text. These difficulties may hinder the understanding of critical data. Additional auditory and tactile feedback can provide immediate and accessible cues from these wearable devices, but the limitations of existing data representations must be understood first. To avoid increasing cognitive and visual load, auditory and haptic cues can be designed to complement, replace, or reinforce visual cues. In this paper, we outline the challenges in existing data representations and the evidence needed to enhance the accessibility of health information from personal sensing devices used to monitor parameters such as blood pressure, sleep, activity, heart rate, and more. With innovative and inclusive user feedback, users will be more likely to engage and interact with new devices and their own data.
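    As one concrete possibility, the sketch below maps wearable heart-rate samples to an auditory cue that complements a visual chart: pitch tracks the reading within a safe range, and a distinct alert tone signals readings outside it. The thresholds and mappings are illustrative assumptions, not a published design and not clinical guidance.

        SAFE_LOW, SAFE_HIGH = 50, 120   # bpm, illustrative bounds only

        def cue_for(bpm):
            """Return an (earcon_type, frequency_hz) auditory cue for one reading."""
            if bpm < SAFE_LOW or bpm > SAFE_HIGH:
                return ("alert", 880.0)             # salient high tone, out of range
            # Within range: map 50-120 bpm linearly onto 220-440 Hz.
            pitch = 220.0 + (bpm - SAFE_LOW) * (220.0 / (SAFE_HIGH - SAFE_LOW))
            return ("tone", round(pitch, 1))

        if __name__ == "__main__":
            for bpm in (58, 72, 95, 130):           # sample readings from a wearable
                print(bpm, cue_for(bpm))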