
    Sonification and Music, Music and Sonification

    Despite more than twenty years having passed since the launch of an international conference series dedicated to its study, there is still much debate over what sonification really is, especially as regards its relationship to music. A layman’s definition of sonification might be that it is the use of non-speech audio to communicate data, the aural counterpart to visualization. Many researchers have claimed musicality for their sonifications, generally when using data-to-pitch mappings. In 2006 Bennett Hogg and I (Vickers and Hogg 2006) made a rather provocative assertion that bound music and sonification together (q.v.; further developed in Vickers (2006)), not so much to claim an ontological truth as to foreground a debate that has simmered since the first International Conference on Auditory Display (ICAD) in 1992. Since then a growing number of musical and sonic art compositions have been driven by the data of natural phenomena, some of which are claimed by their authors to be sonifications. This chapter looks at some of the issues surrounding the relationship between sonification and music, and at developments that have the potential to draw sonification and the sonic arts into closer union.
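
    By way of a concrete illustration (a sketch of my own, not code from the chapter), the kind of data-to-pitch mapping the abstract mentions can be written in a few lines of Python using only the standard library; the function names, the 220-880 Hz pitch range, and the tone length below are all assumptions made for this sketch.

        # Minimal data-to-pitch sonification sketch (illustrative, not from the chapter):
        # each data value is linearly mapped to a frequency and rendered as a short
        # sine tone; the result is written to a mono 16-bit WAV file.
        import math
        import struct
        import wave

        SAMPLE_RATE = 44100  # samples per second

        def data_to_pitch(value, lo, hi, f_min=220.0, f_max=880.0):
            # Linearly map a value in [lo, hi] to a frequency in [f_min, f_max].
            t = (value - lo) / (hi - lo) if hi != lo else 0.0
            return f_min + t * (f_max - f_min)

        def sonify(data, tone_seconds=0.25, path="sonification.wav"):
            lo, hi = min(data), max(data)
            frames = bytearray()
            for value in data:
                freq = data_to_pitch(value, lo, hi)
                for n in range(int(SAMPLE_RATE * tone_seconds)):
                    sample = 0.5 * math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
                    frames += struct.pack("<h", int(sample * 32767))
            with wave.open(path, "wb") as wav:
                wav.setnchannels(1)       # mono
                wav.setsampwidth(2)       # 16-bit samples
                wav.setframerate(SAMPLE_RATE)
                wav.writeframes(bytes(frames))

        # Example: a rising-then-falling data series becomes a rising-then-falling melody.
        sonify([1, 3, 5, 8, 6, 4, 2])

    Whether such a mapping is heard as music or merely as display is, of course, precisely the question the chapter pursues.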

    Semefulness: A social semiotics of touch

    This paper explores the multiple significances (semefulness) of touch as experienced by us as embodied subjects. Prompted by the development of a range of touch-based technologies, I consider current writing about touch across several fields and how it has contributed to contemporary understandings of the meanings of touch. I then explore a number of these meanings - connection, engagement, contiguity, differentiation, positioning - for their contribution to our understanding of the world and of our own embodied subjectivity. I also explore the deployment of these meanings by contemporary technologies.

    Lemma 4: Haptic Input + Auditory Display = Musical Instrument?

    In this paper we look at some of the design issues that affect the success of multimodal displays combining acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, focusing particularly on the roles of gesture and mimesis. Finally, some observations are made regarding issues that arise when the haptic and acoustic modalities are combined in the interface. The paper examines examples in which auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments, and some of the possible ramifications of this are raised.