
    Fostering Social Interaction Through Sound Feedback: Sentire

    Sentire is a body–machine interface that sonifies motor behaviour in real time, and a participatory, interactive performance in which two people use their physical movements to collaboratively create sound while constantly being influenced by the results. Based on our informal observation that basal social behaviours emerge during Sentire performances, the present article investigates our principal hypothesis that Sentire can foster the basic mechanisms underlying non-verbal social interaction. We illustrate how coordination serves as a crucial basic mechanism for social interaction, and consider how it is addressed by various therapeutic approaches, including the therapeutic use of real-time auditory feedback. We then argue that implementing Sentire may be fruitful in healthcare contexts and in promoting general well-being. We describe how the Sentire system has been developed further within the research project 'Social interaction through sound feedback – Sentire', which combines human–computer interaction, sound design and real-world research, against the background of the relationship between sound, sociality and therapy. The question of how interaction is facilitated through Sentire is addressed through the first results of behavioural analysis using structured observation, which allows for a quasi-quantitative sequential analysis of interactive behaviour.
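    The abstract does not specify how Sentire maps movement to sound, so the following is only a minimal illustrative sketch in Python (assuming numpy): a hypothetical mapping in which the distance between two tracked participants controls the pitch and loudness of a synthesised tone, rendered offline rather than in real time.

        # Illustrative only: the actual Sentire mapping is not described here.
        # Inter-person distance (metres) -> pitch and loudness of a sine tone.
        import numpy as np

        SR = 44100          # audio sample rate (Hz)
        FRAME_RATE = 50     # assumed motion-tracking frame rate (Hz)

        def sonify_distance(distances, sr=SR, frame_rate=FRAME_RATE):
            """Render one sine-tone frame per tracked distance value."""
            samples_per_frame = sr // frame_rate
            out = np.zeros(len(distances) * samples_per_frame)
            phase = 0.0
            for i, d in enumerate(distances):
                closeness = np.clip(1.0 - d / 2.0, 0.0, 1.0)   # 0 = far, 1 = touching
                freq = 220.0 + 660.0 * closeness               # closer -> higher pitch
                amp = 0.05 + 0.95 * closeness                  # closer -> louder
                t = np.arange(samples_per_frame) / sr
                out[i * samples_per_frame:(i + 1) * samples_per_frame] = (
                    amp * np.sin(2 * np.pi * freq * t + phase))
                phase += 2 * np.pi * freq * samples_per_frame / sr  # keep phase continuous
            return out

        # Example: two participants approaching each other over four seconds.
        audio = sonify_distance(np.linspace(2.0, 0.2, 4 * FRAME_RATE))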

    Can Ubimus Technologies affect our Musicality?

    Recent work recognises that musicality is based on, and constrained by, our cognitive and biological systems. Taking into account a concept from cognitive science - cognitive offloading - as a principle for technology-supported musical activities, in this paper we discuss some principles (guidelines) to be taken into account when designing, developing and evaluating computer music technologies, especially those related to ubimus. We think that ubimus technology can shape the way we think about music and have a positive (or negative) influence on our musicality.

    Contextual Musicality: vocal modulation and its perception in human social interaction

    Music and language are both deeply rooted in our biology, but scientists have given far more attention to the neurological, biological and evolutionary roots of language than to those of music. Partly as a result of this neglect, the purpose of music, in evolutionary terms, remains a mystery. Our brain, physiology and psychology make us capable of producing and listening to music from early infancy; therefore, our biology and behaviour carry some of the clues that need to be revealed to understand what music is “for”. Furthermore, music and language have a deep relationship, particularly in terms of cognitive processing, that can provide clues about the origins of music. Non-verbal behaviours, including voice characteristics during speech, are an important form of communication that enables individual recognition and assessment of the speaker’s physical characteristics (including sex, femininity/masculinity, body size, physical strength, and attractiveness). Vocal parameters, however, can be intentionally varied, for example by altering the intensity (loudness), rhythm and pitch of speech. This is classically demonstrated in infant-directed speech (IDS), in which adults alter vocal characteristics such as pitch, cadence and intonation contours when speaking to infants. In this thesis, I analyse vocal modulation and its perception in human social interaction, in different social contexts such as courtship and authority-ranking relationships. Results show that specific vocal modulations, akin to those of IDS, and perhaps music, play a role in communicating courtship intent. Based on these results, as well as the body of current knowledge, I then propose a model for the evolution of musicality, the human capacity to process musical information, in relation to human vocal communication. I suggest that musicality may not be limited to specifically musical contexts, and can have a role in other domains such as language, which would provide further support for a common origin of language and music. This model supports the hypothesis of a stage in human evolution in which individuals communicated using a music-like protolanguage, a hypothesis first suggested by Darwin.

    The impact of the bass drum on human dance movement

    The present study aims to gain better insight into the connection between music and dance by examining the dynamic effects of the bass drum on a dancing audience in a club-like environment. One hundred adult participants moved freely in groups of five to a musical sequence that comprised six songs. Each song consisted of one section that was repeated three times, each time with a different sound pressure level of the bass drum. Hip and head movements were recorded using motion capture and motion sensing. The study demonstrates that people modify their bodily behavior according to the dynamic level of the bass drum when moving to contemporary dance music in a social context. Participants moved more actively and displayed a higher degree of tempo entrainment as the sound pressure level of the bass drum increased. These results indicate that the prominence of the bass drum in contemporary dance music serves not merely as a stylistic element; indeed, it has a strong influence on dancing itself.
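    As a rough illustration of what "tempo entrainment" can mean computationally (this is not the authors' analysis pipeline), the sketch below estimates a dancer's movement tempo from a vertical hip trajectory via autocorrelation and compares it with the song tempo; the sampling rate, tempo range and synthetic data are assumptions.

        # Hedged sketch: estimate movement tempo from hip motion, compare to song tempo.
        import numpy as np

        def movement_tempo_bpm(hip_z, fs):
            """Dominant movement tempo (BPM) from the detrended vertical hip
            trajectory hip_z, sampled at fs Hz, taken from the strongest
            autocorrelation peak in a plausible dance-tempo range (60-200 BPM)."""
            x = hip_z - np.mean(hip_z)
            ac = np.correlate(x, x, mode="full")[len(x) - 1:]
            lo, hi = int(fs * 60 / 200), int(fs * 60 / 60)
            lag = lo + np.argmax(ac[lo:hi])
            return 60.0 * fs / lag

        def entrainment_error_bpm(hip_z, fs, song_bpm):
            """Absolute difference between movement tempo and song tempo, in BPM."""
            return abs(movement_tempo_bpm(hip_z, fs) - song_bpm)

        # Example with synthetic data: hip bounce at 128 BPM, sampled at 100 Hz.
        fs, bpm = 100, 128
        t = np.arange(0, 30, 1 / fs)
        hip_z = 0.05 * np.sin(2 * np.pi * (bpm / 60) * t) + 0.005 * np.random.randn(len(t))
        print(entrainment_error_bpm(hip_z, fs, song_bpm=bpm))   # close to zero here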

    Plasticity: noise, correlation and interaction.

    This paper introduces the interactive and performative installation artwork Plasticity (Jane Grant, John Matthias, Nick Ryan and Kin) and its software engine, the Neurogranular Sampler, via a journey through the synchronised pendulum clocks of Christiaan Huygens, entrainment in dynamical systems, and correlations and noise within neuronal networks. I examine ways in which the public are 'playing with noise' in the artwork and suggest that public engagement with the work is closely connected to the fact that the dynamics of the artificial neuronal network lie at the borders between synchrony and randomness.
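    The installation itself runs a spiking neuronal network (the Neurogranular Sampler), whose details are not given here; as a stand-in, the toy Kuramoto model below illustrates the same qualitative point, namely that coupling strength and noise move a population of oscillators between synchrony and randomness.

        # Toy Kuramoto model (not the Neurogranular Sampler): order parameter r
        # near 1 means synchrony, near 0 means incoherent (random) phases.
        import numpy as np

        def kuramoto_order(n=100, coupling=1.0, noise=0.5, dt=0.01, steps=5000, seed=0):
            rng = np.random.default_rng(seed)
            omega = rng.normal(0.0, 1.0, n)           # natural frequencies
            theta = rng.uniform(0.0, 2 * np.pi, n)    # initial phases
            for _ in range(steps):
                mean_field = np.mean(np.exp(1j * theta))
                r, psi = np.abs(mean_field), np.angle(mean_field)
                theta += dt * (omega + coupling * r * np.sin(psi - theta))
                theta += np.sqrt(dt) * noise * rng.normal(0.0, 1.0, n)   # phase noise
            return np.abs(np.mean(np.exp(1j * theta)))

        print(kuramoto_order(coupling=0.2))   # weak coupling: low r, near-random phases
        print(kuramoto_order(coupling=4.0))   # strong coupling: high r, synchrony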

    Do we dance because we walk? The impact of regular vestibular experience on the early development of beat production and perception

    Movement to music is a universal human behaviour (Savage, Brown, Sakai & Currie, 2015). Whilst the strong link between music and movement is clearly bidirectional, its origins are not clear. Studying the emergence of rhythmic skills through infancy provides a window into the perceptual and physical attributes, experience and contexts necessary to attain the basics of human musicality. This thesis asks whether the human experience of bipedal locomotion, as a primary source of regular vestibular information, is crucial for sensorimotor synchronisation (SMS) and spontaneous motor tempo (SMT), and whether it shapes rhythm perception. The first experiment evidences the emergence of tempo flexibility when moving to music between 10 and 18 months of age. The following study is the first to show that experience of locomotion, including from infant carrying, predicts the temporal matching of infant movement to music. Asking whether carrying practices influence the very rhythms that we naturally produce, a large-scale correlational study finds that infant SMT is predicted by parent height, but not by the infant's own body size, such that infants with taller caregivers show a slower SMT than those with shorter caregivers. We contend that this reflects the infant's experience of being carried by their caregiver. The fourth experiment confirms that experience of being carried at a novel tempo can alter the rhythms infants spontaneously produce. Finally, we asked how information from being carried during locomotion might change rhythm perception; specifically, whether infants show greater activation of their sensorimotor system when hearing rhythms that match the tempo at which they were carried. Combined, these studies present a highly original piece of research into the ways in which early experiences of locomotion may impact fundamental musical skills.

    Comparative Study of Beat and Temporal Pattern Perception in a Songbird

    When humans listen to musical rhythms they sense a beat, the regular pulse that one might tap their foot to. Much about the functions, evolution and neural substrates of beat perception remains unclear. Research has considered whether other species perceive a beat, yet more empirical data are needed. Songbirds produce learned rhythmic vocalizations, but can they perceive a beat? To answer this question, I developed a behavioural task that tested whether humans could discriminate rhythms that contained or lacked a beat. I applied an equivalent procedure to test European starlings. I found that humans learned the task with minimal instructions, but starlings were unable to discriminate on the basis of beat presence. Additional testing revealed that the starlings used absolute timing cues and ignored global patterns in the rhythms. This work contributes a paradigm that may be adapted to study other species, and its results provide insight for designing future comparative rhythm experiments.
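    The exact stimuli are not given in the abstract; purely as an illustration of the beat-present versus beat-absent distinction, the sketch below generates inter-onset-interval sequences that either sit on a regular underlying pulse or are jittered off any pulse grid (pulse period, jitter and pattern length are assumptions).

        # Illustrative rhythm-stimulus generator (not the thesis's actual stimuli).
        import numpy as np

        def beat_rhythm(n_onsets=8, pulse_ms=250, rng=None):
            """Onset times (ms) whose intervals are 1x or 2x a regular pulse."""
            rng = np.random.default_rng() if rng is None else rng
            iois = pulse_ms * rng.choice([1, 2], size=n_onsets - 1)
            return np.concatenate(([0], np.cumsum(iois)))

        def nonbeat_rhythm(n_onsets=8, pulse_ms=250, jitter_ms=80, rng=None):
            """Same construction, but each interval is jittered off the pulse grid."""
            rng = np.random.default_rng() if rng is None else rng
            iois = pulse_ms * rng.choice([1, 2], size=n_onsets - 1)
            iois = iois + rng.uniform(-jitter_ms, jitter_ms, size=n_onsets - 1)
            return np.concatenate(([0], np.cumsum(iois)))

        print(beat_rhythm())      # onsets aligned with a 250 ms pulse
        print(nonbeat_rhythm())   # onsets with no consistent underlying pulse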
