
    Smartphone application 'Practice As You Walk': mobile learning and gamification in choir rehearsal

    With the worldwide massification of mobile devices, the use of technology for pedagogical purposes in music learning has proven to be an indispensable tool for sustaining student motivation. Exploring the implementation of concepts such as gamification and mobile learning in music education, and referencing relevant case studies in this field, this dissertation culminates in the development of an Android smartphone application entitled 'Practice As You Walk'. As the name implies, this learning tool reproduces musical excerpts at the user's walking pace, a form of musical practice that stimulates the memorization of pieces and the individual's ability to synchronize. For the development of the application, this work explores innovative methods for step detection using sensors integrated in mobile devices, such as the accelerometer and the gyroscope, and presents the fundamentals of the MIDI communication protocol for the digital transmission of events related to musical performance. Two methods for smartphone-based step detection are proposed: the rule-based method attains an F1-score of 99% and the machine-learning method an F1-score of 95.84%. The development of the application, initially on the Unity platform, consists of integrating classes for MIDI file manipulation and processing with the ability to interpret and reproduce the files at the user's walking pace. Owing to faults identified in the music playback mechanism, the project was migrated to the Android Studio IDE using a third-party library that integrates the Sonivox EAS synthesizer. This abstraction of the playback mechanism allowed the core functionality developed in Unity to be incorporated directly and shifted the focus to building a captivating user interface. Finally, in keeping with the pedagogical aim of this work, the application was tested by members of a children's and youth choir. A questionnaire revealed general satisfaction with the application and allowed opinions and suggestions on potential future improvements to be collected.
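    The abstract does not reproduce the dissertation's detection rules, but a rule-based step detector of this kind typically thresholds peaks in the smoothed accelerometer magnitude and enforces a minimum interval between consecutive steps. The sketch below illustrates that pattern only; the function name and every parameter value are illustrative assumptions, not the thesis's tuned settings.

```python
import numpy as np

def detect_steps(accel, fs=50.0, threshold=1.2, min_interval=0.3):
    """Rule-based step detection from tri-axis accelerometer samples.

    accel: (N, 3) array of accelerations in g; fs: sampling rate in Hz.
    threshold (g) and min_interval (s) are illustrative values only.
    """
    # The magnitude is orientation independent, so it does not matter
    # how the phone sits in the user's pocket.
    mag = np.linalg.norm(accel, axis=1)
    # Light moving-average smoothing suppresses sensor jitter.
    smooth = np.convolve(mag, np.ones(5) / 5, mode="same")

    steps, last_step = [], -np.inf
    for i in range(1, len(smooth) - 1):
        is_peak = smooth[i - 1] < smooth[i] >= smooth[i + 1]
        refractory_over = (i - last_step) / fs >= min_interval
        if is_peak and smooth[i] > threshold and refractory_over:
            steps.append(i)  # sample index of a detected step
            last_step = i
    return steps
```

    Each detected step could then advance playback by one beat, which is one plausible way a walking pace could drive the tempo of a MIDI excerpt.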

    Ashitaka: an audiovisual instrument

    This thesis looks at how sound and visuals may be linked in a musical instrument, with a view to creating such an instrument. Though it appears to be an area of significant interest, at the time of writing there is very little existing written or theoretical research available in this domain. Therefore, based on Michel Chion's notion of synchresis in film, the concept of a fused, inseparable audiovisual material is presented. The thesis then looks at how such a material may be created and manipulated in a performance situation. A software environment named Heilan was developed to provide a base for experimenting with different approaches to the creation of audiovisual instruments. The software and a number of experimental instruments are discussed prior to a discussion and evaluation of the final 'Ashitaka' instrument. This instrument represents the culmination of the work carried out for this thesis and is intended as a first step in identifying the issues and complications involved in the creation of such an instrument.
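    The abstract is largely conceptual, but the idea of a fused audiovisual material can be made concrete with a toy sketch: a single control signal drives both the synthesis and the rendering parameters, so sound and image cannot vary independently. Everything below (the class, the mapping constants) is a hypothetical illustration, not Heilan's or Ashitaka's actual design.

```python
import math

class AudiovisualVoice:
    """Toy fused audiovisual material: one control signal drives both
    the sound and the image, so they cannot change independently.
    """

    def __init__(self, base_freq=220.0):
        self.base_freq = base_freq

    def render(self, energy, t):
        """energy in [0, 1] is the performer's input; t is time in seconds."""
        freq = self.base_freq * (1.0 + 3.0 * energy)   # more energy: higher pitch
        sample = energy * math.sin(2 * math.pi * freq * t)
        # The same energy value sets the visual state, fusing the two media.
        visual = {"radius": 0.1 + 0.9 * energy, "brightness": energy}
        return sample, visual
```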

    Body Motion Capture Using Multiple Inertial Sensors

    Near-fall detection is important for medical research, since it can help doctors diagnose fall-related diseases and alert both doctors and patients to possible falls. However, in daily life there are many similarities between near-falls and other Activities of Daily Living (ADLs), which makes near-falls particularly difficult to detect. To find the subtle differences between ADLs and near-falls and accurately identify the latter, the movement of the whole human body needs to be captured and displayed by a computer-generated avatar. In this thesis, a wireless inertial motion capture system consisting of a central control host and ten sensor nodes is used to capture human body movements. Each of the ten sensor nodes has a tri-axis accelerometer and a tri-axis gyroscope. The nodes are attached to separate locations on the body to record both angular and acceleration data, from which body movements are reconstructed using Euler angle based algorithms, specifically a single rotation order algorithm and an optimal rotation order algorithm. According to the results of experiments capturing ten ADLs, both algorithms can track normal human body movements without significant distortion, and the optimal rotation order algorithm shows higher accuracy and lower data drift. Compared to previous inertial systems with magnetometers, this system reduces hardware complexity and software computation while ensuring reasonable accuracy in capturing human body movements.
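    The thesis's algorithms are not spelled out in the abstract, but the role of rotation order is easy to demonstrate: composing the same three Euler angles in different orders yields different orientations, which is why the choice of order (a fixed order versus a per-sample optimal one) affects tracking accuracy. A minimal sketch, with all names assumed for illustration:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def euler_to_matrix(roll, pitch, yaw, order="ZYX"):
    """Compose a rotation matrix from Euler angles in the given axis order."""
    axis_rot = {"X": rot_x(roll), "Y": rot_y(pitch), "Z": rot_z(yaw)}
    m = np.eye(3)
    for axis in order:
        m = m @ axis_rot[axis]
    return m

# The same angles under two orders give different orientations.
angles = np.radians([30.0, 45.0, 60.0])
print(np.allclose(euler_to_matrix(*angles, "ZYX"),
                  euler_to_matrix(*angles, "XYZ")))  # False
```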

    Wearable Setup for Gesture and Motion based Closed Loop Audio-Haptic Interaction

    Großhauser T, Hermann T. Wearable Setup for Gesture and Motion based Closed Loop Audio-Haptic Interaction. In: Brazil E, ed. Proceedings of the 16th International Conference on Auditory Display (ICAD 2010). Washington, DC, USA: International Community for Auditory Display; 2010: 31-38.

    The wearable sensor and feedback system presented in this paper is a type of audio-haptic display which contains onboard sensors, embedded sound synthesis, external sensors, and, on the feedback side, a loudspeaker and several vibrating motors. The so-called "embedded sonification" here is an onboard IC with implemented sound synthesis, adjusted directly by the user and/or controlled in real time by the sensors, which are on the board or fixed to the human body and connected to the board via cable or radio-frequency transmission. Direct audio output and tactile feedback close the loop between the wearable board and the user. In many situations, this setup can serve as a complement to visual output, e.g. when exploring data in 3D space or learning motions and gestures in dance, sports, or outdoor and everyday activities. A new metaphor for interactive acoustical augmentation is introduced, the so-called "audio loupe": the sonification of minimal movements or state changes that can hardly be perceived visually or physically, such as small jitters or deviations from predefined ideal gestures or movements. Our system is easy to use and even allows operation without an external computer. We demonstrate and outline the benefits of our wearable interactive setup in highly skilled motion learning scenarios in dance and sports.
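    As an illustration of the "audio loupe" metaphor, the deviation between a performed and an ideal gesture can be mapped to a pitch offset so that errors too small to see become clearly audible. The mapping below is a hypothetical sketch; the paper's embedded synthesizer and its actual parameters are not described in this abstract.

```python
import numpy as np

def audio_loupe_freq(current, reference, base=440.0, gain=2000.0):
    """Map gesture deviation to a pitch offset (the "audio loupe" idea).

    current, reference: (3,) sensor vectors, e.g. accelerations in g.
    base (Hz) and gain (Hz per g) are illustrative constants only.
    """
    deviation = np.linalg.norm(np.asarray(current) - np.asarray(reference))
    return base + gain * deviation  # frequency for the synthesizer

# A 0.05 g deviation raises the tone by 100 Hz: clearly audible,
# although the error itself would be hard to see.
print(audio_loupe_freq([0.0, 0.0, 1.05], [0.0, 0.0, 1.0]))  # 540.0
```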

    The Effects of Explicit and Implicit Interaction on User Experiences in a Mixed Reality Installation: The Synthetic Oracle

    Virtual and mixed reality environments (VMRE) often imply full-body human-computer interaction scenarios. We used a public multimodal mixed reality installation, the Synthetic Oracle, and a between-groups design to study the effects of implicit (e.g., passively walking) or explicit (e.g., pointing) interaction modes on users' emotional and engagement experiences, assessed using questionnaires. Additionally, real-time arm motion data was used to categorize user behavior and to provide interaction possibilities for the explicit interaction group. The results show that the online behavior classification corresponded well to the users' interaction mode. In addition, and in contrast to the explicit interaction group, the engagement ratings of implicit users were positively correlated with valence but uncorrelated with arousal ratings. Interestingly, arousal levels were correlated with different behaviors displayed by the visitors depending on the interaction mode. Hence, this study confirms that the activity level and behavior of users modulate their experience and that, in turn, the interaction mode modulates their behavior. These results show the importance of the selected interaction mode when designing user experiences in VMRE.
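    The paper's online behavior classifier is not detailed in this abstract; a minimal sketch of the window-to-feature-to-label pattern such a system typically follows might look like the following, with the feature (mean arm speed) and the threshold chosen purely for illustration.

```python
import numpy as np

def classify_behavior(arm_samples, fs=60.0, point_thresh=0.4):
    """Toy online categorization of visitor behavior from arm motion.

    arm_samples: (N, 3) arm positions over a short recent window,
    sampled at fs Hz. Vigorous arm activity (e.g., pointing) is taken
    as explicit interaction; low activity as implicit interaction.
    """
    diffs = np.diff(arm_samples, axis=0)            # per-sample displacement
    speed = np.mean(np.linalg.norm(diffs, axis=1)) * fs  # mean arm speed
    return "explicit" if speed > point_thresh else "implicit"
```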