1,124 research outputs found

    Show me the way to Monte Carlo: density-based trajectory navigation

    We demonstrate the use of uncertain prediction in a system for pedestrian navigation via audio, combining Global Positioning System data, a music player, inertial sensing, magnetic bearing data, and Monte Carlo sampling in a density-following task, where a listener’s music is modulated according to the changing predictions of user position with respect to a target density, in this case a trajectory or path. We show that this system enables eyes-free navigation along set trajectories or paths unfamiliar to the user, and demonstrate that it can be used effectively across varying trajectory widths and contexts.
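The density-following idea above can be sketched minimally: modulate music by the fraction of Monte Carlo position samples that fall within the target path's width. The function name, the point-to-path simplification, and the volume mapping are illustrative assumptions, not the paper's implementation.

```python
def music_volume(pred_positions, path_point, width):
    """Return a music volume in [0, 1]: the fraction of sampled predicted
    positions lying within `width` metres of the nearest path point.
    (Illustrative sketch; a real system would search along the whole path.)"""
    inside = sum(
        1
        for (x, y) in pred_positions
        if ((x - path_point[0]) ** 2 + (y - path_point[1]) ** 2) ** 0.5 <= width
    )
    return inside / len(pred_positions)

# Two of three samples lie within 2 m of the path point, so volume is 2/3:
samples = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
vol = music_volume(samples, (0.0, 0.0), 2.0)
```

As the predicted density drifts off the path, the fraction (and hence the music volume) falls, giving the listener continuous eyes-free feedback.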

    GpsTunes: controlling navigation via audio feedback

    We combine the functionality of a mobile Global Positioning System (GPS) with that of an MP3 player, implemented on a PocketPC, to produce a handheld system capable of guiding a user to their desired target location via continuously adapted music feedback. We illustrate how the presentation of the audio display can benefit from insights from control theory, such as predictive 'browsing' elements in the display and the appropriate representation of uncertainty or ambiguity. The probabilistic interpretation of the navigation task can be generalised to other context-dependent mobile applications. This is the first example of a completely handheld location-aware music player. We discuss scenarios for the use of such systems.

    Haptics for the development of fundamental rhythm skills, including multi-limb coordination

    This chapter considers the use of haptics for learning fundamental rhythm skills, including skills that depend on multi-limb coordination. Different sensory modalities have different strengths and weaknesses for the development of skills related to rhythm. For example, vision has low temporal resolution and performs poorly for tracking rhythms in real-time, whereas hearing is highly accurate. However, in the case of multi-limbed rhythms, neither hearing nor sight is particularly well suited to communicating exactly which limb does what and when, or how the limbs coordinate. By contrast, haptics can work especially well in this area, by applying haptic signals independently to each limb. We review relevant theories, including embodied interaction and biological entrainment. We present a range of applications of the Haptic Bracelets, which are computer-controlled wireless vibrotactile devices, one attached to each wrist and ankle. Haptic pulses are used to guide users in playing rhythmic patterns that require multi-limb coordination. One immediate aim of the system is to support the development of practical rhythm skills and multi-limb coordination. A longer-term goal is to aid the development of a wider range of fundamental rhythm skills including recognising, identifying, memorising, retaining, analysing, reproducing, coordinating, modifying and creating rhythms – particularly multi-stream (i.e. polyphonic) rhythmic sequences. Empirical results are presented. We reflect on related work, and discuss design issues for using haptics to support rhythm skills. Skills of this kind are essential not just to drummers and percussionists but also to keyboard players, and more generally to all musicians who need a firm grasp of rhythm.
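The per-limb guidance described above can be pictured as a polyphonic pattern flattened into a time-ordered pulse schedule, one stream per bracelet. The limb names, example pattern, and scheduling function are illustrative assumptions, not taken from the chapter.

```python
# A multi-stream (polyphonic) rhythm as per-limb pulse onset times, in beats.
# Limb names and the example pattern are hypothetical, for illustration only.
PATTERN = {
    "right_hand": [0.0, 1.0, 2.0, 3.0],   # quarter notes
    "left_hand":  [0.5, 1.5, 2.5, 3.5],   # offbeats
    "right_foot": [0.0, 2.0],             # beats 1 and 3
    "left_foot":  [1.0, 3.0],             # beats 2 and 4
}

def pulse_schedule(pattern, bpm):
    """Flatten a per-limb pattern into time-ordered (seconds, limb) events,
    one vibrotactile pulse per event, delivered to that limb's bracelet."""
    spb = 60.0 / bpm  # seconds per beat
    events = [(beat * spb, limb)
              for limb, beats in pattern.items()
              for beat in beats]
    return sorted(events)

schedule = pulse_schedule(PATTERN, 120)  # at 120 bpm, one beat = 0.5 s
```

Because each limb receives its own stream, the schedule communicates exactly which limb acts when, which audio alone cannot.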

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
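The construction parameters listed above (frequency, amplitude, duration, rhythm, location) lend themselves to a simple structured representation. The field names, value ranges, and example messages below are illustrative assumptions, not the paper's encoding.

```python
from dataclasses import dataclass

@dataclass
class Tacton:
    """A structured, abstract tactile message (hypothetical sketch)."""
    frequency_hz: float   # frequency of the tactile pulse
    amplitude: float      # normalised amplitude, 0.0 to 1.0
    duration_ms: int      # duration of each pulse
    rhythm: list          # inter-pulse gaps in ms, encoding a rhythmic pattern
    location: str         # actuator site on the body, e.g. "left_wrist"

# Two abstract messages distinguished by rhythm, duration and location:
new_message = Tacton(250.0, 0.8, 100, [100, 100, 300], "left_wrist")
low_battery = Tacton(250.0, 0.5, 200, [400], "right_wrist")
```

Varying one parameter at a time (here, rhythm and location) is one way a designer might build a distinguishable family of messages from a small parameter set.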

    It’s a long way to Monte-Carlo: probabilistic display in GPS navigation

    We present a mobile, GPS-based multimodal navigation system, equipped with inertial control, that allows users to explore and navigate through an augmented physical space, incorporating and displaying the uncertainty resulting from inaccurate sensing and unknown user intentions. The system propagates uncertainty appropriately via Monte Carlo sampling and predicts at a user-controllable time horizon. Control of the Monte Carlo exploration is entirely tilt-based. The system output is displayed both visually and in audio. Audio is rendered via granular synthesis to accurately display the probability of the user reaching targets in the space. We also demonstrate the use of uncertain prediction in a trajectory-following task, where a section of music is modulated according to the changing predictions of user position with respect to the target trajectory. We show that appropriate display of the full distribution of potential future user positions with respect to sites-of-interest can improve the quality of interaction over a simplistic interpretation of the sensed data.
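A minimal sketch of the kind of Monte Carlo prediction the abstract describes: sample plausible states from a sensor noise model, propagate each forward to the time horizon, and count how many land on a target. The constant-velocity model, noise levels, and circular target are illustrative assumptions, not the paper's implementation.

```python
import math
import random

def predict_arrival_probability(pos, vel, target, radius,
                                horizon_s=5.0, n_samples=1000,
                                pos_noise=2.0, vel_noise=0.5):
    """Estimate the probability of reaching a circular target by propagating
    position uncertainty forward via Monte Carlo sampling (illustrative)."""
    hits = 0
    for _ in range(n_samples):
        # Sample a plausible current state from a Gaussian sensor noise model.
        x = pos[0] + random.gauss(0, pos_noise)
        y = pos[1] + random.gauss(0, pos_noise)
        vx = vel[0] + random.gauss(0, vel_noise)
        vy = vel[1] + random.gauss(0, vel_noise)
        # Predict forward to the (user-controllable) time horizon.
        fx, fy = x + vx * horizon_s, y + vy * horizon_s
        if math.hypot(fx - target[0], fy - target[1]) <= radius:
            hits += 1
    return hits / n_samples

# A user walking straight at a target 10 m ahead is likely, but not certain,
# to reach it; the spread of samples captures that residual uncertainty:
p = predict_arrival_probability((0, 0), (2, 0), (10, 0), radius=5.0)
```

The resulting probability per target is exactly the kind of quantity a granular-synthesis display can render continuously, rather than committing to a single point estimate.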

    An Empirical Evaluation On Vibrotactile Feedback For Wristband System

    With the rapid development of mobile computing, wearable wrist-worn devices are becoming more and more popular, but the vibrotactile feedback patterns of most current wrist-worn devices are too simple to enable effective interaction in non-visual scenarios. In this paper, we propose a wristband system with four vibrating motors placed at different positions in the wristband, providing multiple vibration patterns that transmit multi-semantic information to users in eyes-free scenarios. After a contrastive analysis of nine patterns in a pilot experiment, we selected five vibrotactile patterns for the main experiments: positional up and down, horizontal diagonal, clockwise circular, and total vibration. Two experiments with the same 12 participants followed the same procedure, one in the lab and one outdoors. The results show that users can effectively distinguish the five patterns both in the lab and outside, with approximately 90% accuracy (except for the clockwise circular vibration in the outdoor experiment), demonstrating that these five vibration patterns can be used to convey multi-semantic information. The system can be applied to eyes-free interaction scenarios for wrist-worn devices.
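With four motors fixed in the band, each named pattern reduces to a sequence of motor activations. The motor indices, timings, and pattern encodings below are illustrative assumptions, not the paper's design.

```python
# Four motors at fixed positions in the band (hypothetical layout):
# 0 = top, 1 = bottom, 2 = left, 3 = right.
# Each pattern is a sequence of (active_motor_set, duration_ms) steps.
PATTERNS = {
    "up":        [({0}, 200)],                               # top motor only
    "down":      [({1}, 200)],                               # bottom motor only
    "diagonal":  [({2}, 150), ({3}, 150)],                   # left, then right
    "clockwise": [({0}, 120), ({3}, 120), ({1}, 120), ({2}, 120)],
    "total":     [({0, 1, 2, 3}, 300)],                      # all motors at once
}

def pattern_duration_ms(name):
    """Total playback time of a named pattern."""
    return sum(ms for _, ms in PATTERNS[name])
```

Encoding patterns as data rather than hard-wired sequences makes it straightforward to swap in the alternative patterns compared in a pilot study.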

    Multimodal Affective Feedback: Combining Thermal, Vibrotactile, Audio and Visual Signals

    In this paper we describe a demonstration of our multimodal affective feedback designs, used in research to expand the emotional expressivity of interfaces. The feedback leverages inherent associations and reactions to thermal, vibrotactile, auditory and abstract visual designs to convey a range of affective states without any need to learn a feedback encoding. All combinations of the different feedback channels can be utilised, depending on which combination best conveys a given state. All the signals are generated from a mobile phone augmented with thermal and vibrotactile stimulators, which will be available for conference visitors to see, touch, hear and, importantly, feel.