Sound Navigation System Based on Kansei Interaction

Abstract

We developed a sound navigation system that responds to body movement using Kansei behavioral information. The system is based on unconscious human behavior, such as cupping the hands over the ears while listening to something carefully. We collected candid videos of people to observe such behavior; observing unconscious behavior is important when designing the flow of a tangible interactive system, because the observed data can be applied to an emotion-based interface for controlling the system. The sound navigation system was realized as sound scope headphones that focus on the sound of a target instrument in an orchestra or jazz band. Furthermore, the target sound can be switched from one instrument to another by turning the head toward the perceived direction of the new target. The headphones are equipped with three sensors: a digital compass that detects head orientation when turning left and right, an acceleration sensor that detects it when looking up and down, and a bend sensor that emphasizes the target sound when the hands are placed over the ears. We found that users ranging from young children to elderly people successfully controlled the headphones and were satisfied with the easy and novel interaction between their movements and the sound.
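As a rough illustration of the sensor-to-sound mapping described above, the following Python sketch shows one way head yaw (digital compass), head pitch (acceleration sensor), and a bend-sensor value could be converted into per-instrument gains. This is not the paper's implementation: the instrument positions, function names, and the Gaussian-falloff weighting are all our assumptions.

```python
import math

# Hypothetical instrument layout: azimuth (deg, 0 = straight ahead)
# and elevation (deg) of each instrument relative to the listener.
INSTRUMENTS = {
    "trumpet": (-40.0, 0.0),
    "piano":   (0.0, 0.0),
    "drums":   (40.0, 10.0),
}

def gains(yaw_deg, pitch_deg, bend, sharpness=0.05):
    """Return a normalized gain per instrument.

    yaw_deg   -- head yaw from the digital compass (left/right turn)
    pitch_deg -- head pitch from the acceleration sensor (up/down)
    bend      -- 0..1 from the bend sensor; 1 = hands placed over the
                 ears, which narrows the focus onto the faced instrument
    """
    # Assumption: cupping the ears sharpens the spatial focus.
    k = sharpness * (1.0 + 4.0 * bend)
    g = {}
    for name, (az, el) in INSTRUMENTS.items():
        # Squared angular distance between head direction and instrument.
        d2 = (yaw_deg - az) ** 2 + (pitch_deg - el) ** 2
        g[name] = math.exp(-k * d2)  # Gaussian falloff around the target
    total = sum(g.values()) or 1.0
    return {name: v / total for name, v in g.items()}  # normalize the mix

# Facing the drums with hands over the ears strongly emphasizes the drums:
print(gains(yaw_deg=40.0, pitch_deg=10.0, bend=1.0))
```

In this sketch the bend sensor only scales the focus sharpness; a real system would also need smoothing of the sensor readings and per-instrument audio tracks to apply the gains to.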