Intuitive control of synthesis processes is an ongoing challenge within the
domain of auditory perception and cognition. Previous work on sound modelling,
combined with psychophysical tests, has enabled our team to develop a
synthesizer that provides intuitive control of actions and objects based on
semantic descriptions of sound sources. In this demo we present an augmented
version of the synthesizer to which we added tactile stimulation to strengthen
the sensation of genuine continuous friction interactions (rubbing and scratching)
with the simulated objects. This is of interest for several reasons. Firstly,
it enables us to evaluate the realism of our sound model in the presence of
stimulations from other modalities. Secondly, it enables comparison of tactile
and auditory signal structures linked to the same evocation. Thirdly, it
provides a tool to investigate multimodal perception and how stimulations from
different modalities should be combined to provide realistic user interfaces.