69 research outputs found

    Music Emotion Recognition: Intention of Composers-Performers Versus Perception of Musicians, Non-Musicians, and Listening Machines


    “Give me happy pop songs in C major and with a fast tempo”: A vocal assistant for content-based queries to online music repositories

    This paper presents an Internet of Musical Things system devised to support recreational music-making, improvisation, composition, and music learning via vocal queries to an online music repository. The system involves a commercial voice-based interface and the Jamendo cloud-based repository of Creative Commons music content. Thanks to the system, the user can query the Jamendo music repository by six content-based features and any combination thereof: mood, genre, tempo, chords, key, and tuning. Such queries differ from the conventional methods for music retrieval, which are based on the piece's title and the artist's name. These features were identified following a survey with 112 musicians, which preliminarily validated the concept underlying the proposed system. A user study with 20 musicians showed that the system was deemed usable, able to provide a satisfactory user experience, and useful in a variety of musical activities. Differences in the participants' needs were identified, which highlighted the need for personalization mechanisms based on the expertise level of the user. Importantly, the system was seen as a concrete solution to physical encumbrances that arise from the concurrent use of the instrument and devices providing interactive media resources. Finally, the system offers benefits to visually-impaired musicians.
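
    As a rough illustration of how such a vocal request could be turned into a content-based query, the sketch below maps a parsed mood/genre/tempo phrase onto the public Jamendo API v3.0 tracks endpoint. This is a minimal sketch, not the paper's implementation: the query_tracks helper, the YOUR_CLIENT_ID placeholder, and the tempo-word mapping are assumptions, and the chords, key, and tuning features would require the additional content analysis described in the paper.

```python
# Minimal sketch: content-based track search against the public Jamendo v3.0 API.
# Assumptions (not from the paper): the query_tracks helper, YOUR_CLIENT_ID,
# the tempo-word -> speed mapping, and the tag separator used in 'fuzzytags'.
import requests

JAMENDO_TRACKS = "https://api.jamendo.com/v3.0/tracks/"

def query_tracks(mood: str, genre: str, tempo: str, client_id: str = "YOUR_CLIENT_ID"):
    """Return tracks matching e.g. 'happy pop songs with a fast tempo'."""
    # Map a spoken tempo word onto the API's coarse speed categories (assumed mapping).
    speed = {"slow": "low", "medium": "medium", "fast": "high"}.get(tempo, "medium")
    params = {
        "client_id": client_id,
        "format": "json",
        "limit": 10,
        "fuzzytags": f"{mood} {genre}",  # tag-based mood/genre filtering
        "speed": speed,
    }
    response = requests.get(JAMENDO_TRACKS, params=params, timeout=10)
    response.raise_for_status()
    return response.json().get("results", [])

# Example: the query from the title, minus "C major" (key is not exposed by this endpoint).
for track in query_tracks(mood="happy", genre="pop", tempo="fast"):
    print(track.get("name"), "-", track.get("artist_name"))
```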

    Musical Haptic Wearables for Synchronisation of Visually-impaired Performers: a Co-design Approach

    The emergence of new technologies is providing opportunities to develop novel solutions that facilitate the integration of visually-impaired people in different activities of our daily life, including collective music making. This paper presents a study conducted with visually-impaired music performers, which involved a participatory approach to the design of accessible technologies for musical communication in group playing. We report on three workshops conducted together with members of an established ensemble composed entirely of visually-impaired musicians. The first workshop focused on identifying the participants' needs during group playing and how technology could satisfy such needs. The second and third workshops investigated, respectively, choir singing and ensemble instrument playing, focusing on the key issue of synchronisation that was identified in the first workshop. The workshops involved prototypes of musical haptic wearables, which were co-designed and evaluated by the participants. Overall, results indicate that wireless tactile communication represents a promising avenue to cater effectively to the needs of visually-impaired performers.
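
    The key finding above concerns wireless tactile cues for synchronisation. As a minimal sketch of how beat-synchronised pulses could be broadcast to such wearables, the example below sends OSC messages over Wi-Fi using python-osc; the /haptic/pulse address, the device IP list, and the intensity scale are hypothetical and do not describe the prototypes used in the workshops.

```python
# Minimal sketch: beat-synchronised vibration cues broadcast to haptic wearables.
# Hypothetical: the /haptic/pulse OSC address, device IPs, port, and intensity values.
import time
from pythonosc.udp_client import SimpleUDPClient

WEARABLE_IPS = ["192.168.1.21", "192.168.1.22"]  # one wearable per performer (placeholder)
clients = [SimpleUDPClient(ip, 9000) for ip in WEARABLE_IPS]

def broadcast_pulses(bpm: float, beats: int, downbeat_every: int = 4):
    """Send a vibration pulse on every beat, stronger on each downbeat."""
    period = 60.0 / bpm
    for beat in range(beats):
        intensity = 1.0 if beat % downbeat_every == 0 else 0.5
        for client in clients:
            client.send_message("/haptic/pulse", intensity)
        time.sleep(period)

broadcast_pulses(bpm=90, beats=16)
```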

    Co-design of a Smart Cajón

    The work of Luca Turchet is supported by a Marie-Curie Individual Fellowship of the European Union's Horizon 2020 research and innovation program, under grant agreement No. 749561. Mathieu Barthet also acknowledges support from the EU H2020 Audio Commons grant (688382).

    Co-Design of Musical Haptic Wearables for Electronic Music Performer's Communication


    To hear or not to hear: Sound Availability Modulates Sensory-Motor Integration

    When we walk in place with our eyes closed after a few minutes of walking on a treadmill, we experience an unintentional forward body displacement (drift), called the sensory-motor aftereffect. Initially, this effect was thought to be due to the mismatch experienced during treadmill walking between the visual information (absence of optic flow signaling body steadiness) and the proprioceptive information (muscle spindle firing signaling body displacement). Recently, the persistence of this effect has been shown even in the absence of vision, suggesting that other information, such as the sound of steps, could play a role. To test this hypothesis, six cochlear-implanted individuals were recruited and their forward drift was measured before (Control phase) and after (Post Exercise phase) walking on a treadmill with their cochlear system turned on and turned off. The relevance of testing cochlear-implanted individuals is that, when their system is turned off, they perceive total silence, as even the sounds normally obtained through bone conduction are eliminated. Results showed the absence of the aftereffect when the system was turned off, underlining the fundamental role played by sounds in the control of action and breaking new ground for the use of interactive sound feedback in motor learning and motor development.

    Embodied Interactions with E-Textiles and the Internet of Sounds for Performing Arts

    This paper presents initial steps towards the design of an embedded system for body-centric sonic performance. The proposed prototyping system allows performers to manipulate sounds through gestural interactions captured by textile wearable sensors. The e-textile sensor data control, in real time, audio synthesis algorithms that work with content from Audio Commons, a novel web-based ecosystem for repurposing crowd-sourced audio. The system enables creative embodied music interactions by combining seamless physical e-textiles with web-based digital audio technologies.
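
    As a minimal sketch of the sensing-to-sound chain described above, the example below reads a raw e-textile sensor value and maps it to a synthesis parameter sent over OSC. The serial port, value range, /synth/grainrate address, and grain-rate scaling are assumptions for illustration, not the paper's prototype.

```python
# Minimal sketch: map a raw e-textile stretch-sensor reading to a synthesis parameter.
# Assumptions: a 10-bit sensor value arrives as text over a serial port, and a
# synthesis engine (e.g. SuperCollider) listens for OSC at /synth/grainrate.
import serial
from pythonosc.udp_client import SimpleUDPClient

synth = SimpleUDPClient("127.0.0.1", 57120)     # address/port of the synthesis engine
sensor = serial.Serial("/dev/ttyUSB0", 115200)  # e-textile sensor board (placeholder port)

def scale(value, in_lo=0, in_hi=1023, out_lo=1.0, out_hi=50.0):
    """Linearly map the raw sensor range to a grain rate in Hz, clamped to the input range."""
    value = min(max(value, in_lo), in_hi)
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

while True:
    reading = sensor.readline().strip()
    if reading.isdigit():
        synth.send_message("/synth/grainrate", scale(int(reading)))
```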

    Internet of Musical Things: Vision and Challenges


    Good use of corporate e-mail.
