Increasingly, our mobile devices are acquiring the ability to be aware of their surroundings, both their physical environments and their digital neighbourhoods. Alongside this awareness of the outside world, these devices are acquiring the ability to sense what is happening to them: how they are being held and moved. The coincidence of connectedness, awareness, and richly multimodal input and output capabilities brings into the hand a device capable of supporting an entirely new class of haptic or touch-based interactions, where gestures can be captured and reactions to those gestures conveyed as haptic feedback directly into the hand. Thus one can literally shake the hand of a friend, toss a file off one's PDA, or be led by the hand to a desired location in a strange city. While this new interaction paradigm opens up a vast array of potential application domains, it also poses a number of challenges. In particular, how can such devices support interactions whose consequences play out in environments with different spatial frames of reference: the world-centred frame of reference of the location-aware application, the body-centred frame of reference of the gestural interface, and the device-centred frame of reference of a screen-based application? This paper presents some prototype applications for handheld devices that explore the implications of different frames of reference for actions in the mobile context.