
Sound for enhanced experiences in mobile applications

Abstract

When visiting new places, you want information about restaurants, shopping, places of historic interest, etc. Smartphones are perfect tools for delivering such location-based information, but the risk is that users get absorbed by texts, maps, videos, etc. on the device screen and get a second-hand experience of the environment they are visiting rather than the sought-after first-hand experience. One problem is that the users' eyes are often directed to the device screen rather than to the surrounding environment. Another problem is that interpreting more or less abstract information in maps, texts, images, etc. may take up significant shares of the users' overall cognitive resources. The work presented here tried to overcome these two problems by studying design for human-computer interaction based on the users' everyday abilities, such as directional hearing and point-and-sweep gestures. Today's smartphones know where you are and in what direction you are pointing the device, and they have systems for rendering spatial audio. These readily available technologies hold the potential to make information easier to interpret and use, demand fewer cognitive resources, and free the users from having to look more or less constantly at the device screen.
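The core idea sketched in the abstract, combining the device's position and pointing direction with spatial audio, could look roughly like the minimal Kotlin sketch below. This is an illustration only, not code from the work itself: the Poi class, the example coordinates, and the crude stereo-pan mapping are all assumptions, and an actual implementation would feed the relative angle into a proper HRTF-based spatial audio renderer rather than a simple pan.

```kotlin
import kotlin.math.*

// Hypothetical point of interest: a name plus WGS-84 coordinates (assumed model).
data class Poi(val name: String, val lat: Double, val lon: Double)

// Initial great-circle bearing (degrees, 0 = north, clockwise) from the user to the POI.
fun bearingTo(userLat: Double, userLon: Double, poi: Poi): Double {
    val phi1 = Math.toRadians(userLat)
    val phi2 = Math.toRadians(poi.lat)
    val dLon = Math.toRadians(poi.lon - userLon)
    val y = sin(dLon) * cos(phi2)
    val x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon)
    return (Math.toDegrees(atan2(y, x)) + 360.0) % 360.0
}

// Angle of the POI relative to where the device is pointing, normalized to
// (-180, 180]: negative means to the left, positive to the right.
fun relativeAngle(deviceAzimuthDeg: Double, bearingDeg: Double): Double {
    var diff = (bearingDeg - deviceAzimuthDeg) % 360.0
    if (diff > 180.0) diff -= 360.0
    if (diff <= -180.0) diff += 360.0
    return diff
}

// Crude stereo pan in [-1, 1] derived from the relative angle; a real renderer
// would instead position the sound source in 3D with HRTF-based spatial audio.
fun panFor(relativeDeg: Double): Double =
    sin(Math.toRadians(relativeDeg.coerceIn(-90.0, 90.0)))

fun main() {
    val userLat = 55.6050                 // assumed user position
    val userLon = 13.0038
    val poi = Poi("Old Town Square", 55.6067, 13.0000)
    val deviceAzimuth = 290.0             // assumed compass heading of the device

    val bearing = bearingTo(userLat, userLon, poi)
    val relative = relativeAngle(deviceAzimuth, bearing)
    println("Bearing to ${poi.name}: %.1f deg, relative: %.1f deg, pan: %.2f"
        .format(bearing, relative, panFor(relative)))
}
```

In this sketch the user sweeps the device around; whenever the relative angle of a POI passes near zero, its audio cue would be rendered straight ahead, letting directional hearing rather than the screen indicate where the POI lies.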
