    In this article, we describe and interpret a set of acoustic and linguistic features that characterise emotional/emotion-related user states, confined to the single database processed: four classes in a German corpus of children interacting with a pet robot. To this end, we collected a very large feature vector consisting of more than 4000 features extracted at different sites. We performed extensive feature selection (Sequential Forward Floating Search) for seven acoustic and four linguistic types of features, ending up with a small number of 'most important' features, which we try to interpret by discussing the impact of different feature and extraction types. We establish different measures of impact and discuss the mutual influence of acoustics and linguistics.
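    The abstract does not give the authors' SFFS configuration or scoring criterion; below is a minimal sketch of the generic Sequential Forward Floating Search algorithm, with a toy scoring function (`toy_score`, an assumption for illustration) standing in for a classifier-based evaluation over the 4000+ extracted features:

    ```python
    def sffs(features, score, k):
        """Sequential Forward Floating Search (sketch).

        features: list of candidate feature names
        score: callable mapping a feature list to a float (higher is better)
        k: target subset size
        """
        selected = []
        # Best subset (and its score) seen so far at each subset size.
        best_seen = {0: ([], float("-inf"))}
        while len(selected) < k:
            # Forward step: add the single feature that most improves the score.
            candidates = [f for f in features if f not in selected]
            best_f = max(candidates, key=lambda f: score(selected + [f]))
            selected.append(best_f)
            best_seen[len(selected)] = (list(selected), score(selected))
            # Floating (backward) step: drop a feature whenever the reduced
            # subset beats the best subset previously seen at that size.
            improved = True
            while improved and len(selected) > 2:
                improved = False
                worst = max(selected,
                            key=lambda f: score([g for g in selected if g != f]))
                reduced = [g for g in selected if g != worst]
                if score(reduced) > best_seen[len(reduced)][1]:
                    selected = reduced
                    best_seen[len(selected)] = (list(selected), score(selected))
                    improved = True
        return selected

    def toy_score(subset):
        # Hypothetical criterion: reward target features, penalise extras.
        target = {"a", "b", "c"}
        s = set(subset)
        return len(s & target) - 0.1 * len(s - target)

    chosen = sffs(["a", "b", "c", "d", "e"], toy_score, 3)
    ```

    In practice the score would be cross-validated classification accuracy on the four emotion classes, which makes each evaluation far more expensive than this toy.
    
    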

    Real-Time Sonification of Physiological Data in an Artistic Performance Context

    Presented at the 14th International Conference on Auditory Display (ICAD2008) on June 24-27, 2008 in Paris, France. This paper presents an approach for real-time sonification of physiological measurements and its extension to artistic creation. Three sensors were used to measure heart pulse, breathing, and thoracic volume expansion. A different sound process, based on sound synthesis and digital audio effects, was used for each sensor. We designed the system to produce three clearly separable streams and to allow listeners to perceive the physiological phenomena as clearly as possible. The data were measured in the context of an artistic performance. Because the first purpose of this sonification is to contribute to an artistic project, we tried to produce sound results that are interesting from an aesthetic point of view, while keeping the auditory display highly correlated to the data flows.
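    The paper does not specify its sensor-to-sound mappings; the sketch below illustrates the general idea of keeping three streams perceptually separable by giving each sensor its own synthesis parameters. All ranges, parameter names, and mappings here are hypothetical assumptions, not the authors' design:

    ```python
    def sonify_frame(pulse_bpm, breath_rate, thorax_expansion):
        """Map one frame of physiological readings to synthesis parameters.

        Each sensor drives its own stream: the pulse triggers a percussive
        burst, breathing sweeps a filter over noise, and thoracic expansion
        scales a drone's pitch. Illustrative mapping only.
        """
        return {
            "pulse": {
                "trigger_hz": pulse_bpm / 60.0,  # one burst per heartbeat
                "amplitude": 0.8,
            },
            "breath": {
                "filter_cutoff_hz": 200 + 40 * breath_rate,  # sweep with rate
                "amplitude": 0.5,
            },
            "thorax": {
                "drone_hz": 110 * (1 + thorax_expansion),  # pitch tracks expansion
                "amplitude": 0.3,
            },
        }
    ```

    A real-time implementation would feed these parameter frames to a synthesis engine (e.g. a Pure Data or SuperCollider patch) at the sensor sampling rate; keeping the three streams in distinct registers and timbres is what lets listeners attribute each sound to its physiological source.
    
    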