Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
We present rhythmic micro-gestures, micro-movements of the hand that are repeated in time with a rhythm. We present a user study that investigated how well users can perform rhythmic micro-gestures and whether they can use them eyes-free with non-visual feedback. We found that users could successfully use our interaction technique (97% success rate across all gestures) with short interaction times, and rated the gestures as low in difficulty. Simple audio cues that only convey the rhythm outperformed animations showing the hand movements, supporting rhythmic micro-gestures as an eyes-free input technique.
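A recognizer for gestures "repeated in time with a rhythm" could work by comparing the gaps between detected micro-movements to a target rhythm. This is a minimal hypothetical sketch, not the study's actual implementation; the function name and tolerance value are illustrative:

```python
# Hypothetical sketch: decide whether a sequence of micro-movement
# timestamps (in seconds) matches a target rhythm within a tolerance.
# The rhythm is given as its inter-onset intervals.

def matches_rhythm(timestamps, rhythm_intervals, tolerance=0.1):
    """True if consecutive gaps between timestamps match the target
    inter-onset intervals to within `tolerance` seconds."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) != len(rhythm_intervals):
        return False
    return all(abs(g - r) <= tolerance for g, r in zip(gaps, rhythm_intervals))

# A "short-short-long" rhythm performed almost exactly on time:
print(matches_rhythm([0.0, 0.25, 0.5, 1.0], [0.25, 0.25, 0.5]))  # True
```

Requiring the whole interval sequence to match is what makes the rhythm itself, rather than the hand movement alone, carry the signal.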
Do That, There: An Interaction Technique for Addressing In-Air Gesture Systems
When users want to interact with an in-air gesture system, they must first address it. This involves finding where to gesture so that their actions can be sensed, and how to direct their input towards that system so that they do not also affect others or cause unwanted effects. This is an important problem [6] which lacks a practical solution. We present an interaction technique which uses multimodal feedback to help users address in-air gesture systems. The feedback tells them how (“do that”) and where (“there”) to gesture, using light, audio and tactile displays. By doing that there, users can direct their input to the system they wish to interact with, in a place where their gestures can be sensed. We discuss the design of our technique and three experiments investigating its use, finding that users can “do that” well (93.2%–99.9%) while accurately (51mm–80mm) and quickly (3.7s) finding “there”.
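The "where" part of addressing a system amounts to telling the user whether their hand is inside the region the sensor can see. A minimal sketch of that decision, assuming a spherical sensing zone with illustrative centre, radius and thresholds (not the paper's actual system):

```python
# Hypothetical sketch: map hand position to feedback while a user
# addresses an in-air gesture sensor. The spherical sensing zone and
# its radius are illustrative assumptions.

def address_feedback(hand, centre=(0.0, 0.0, 0.0), radius=0.3):
    """Map a hand position (metres) to feedback: 'there' when inside
    the sensing zone, 'near' when close, 'far' otherwise."""
    dist = sum((h - c) ** 2 for h, c in zip(hand, centre)) ** 0.5
    if dist <= radius:
        return "there"
    return "near" if dist <= 2 * radius else "far"

print(address_feedback((0.1, 0.1, 0.1)))  # there
```

The returned state could then drive whichever display is available: light, audio or tactile.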
A Tablet-Based Assessment of Rhythmic Ability
The exponential rise in use of mobile consumer electronics has presented a great potential for research to be conducted remotely, with participants numbering several orders of magnitude greater than in a typical research paradigm. Here, we attempt to demonstrate the validity and reliability of using a consumer game engine to create software, presented on a mobile tablet, to assess sensorimotor synchronization, a proxy of rhythmic ability. Our goal was to ascertain whether previously observed research results can be replicated, rather than to assess whether a mobile tablet achieves comparable performance to a desktop computer. To achieve this, younger (aged 18-35 years) and older (aged 60-80 years) adult musicians and non-musicians were recruited to play a custom-designed sensorimotor synchronization assessment on a mobile tablet in a controlled laboratory environment. To assess reliability, participants performed the assessment twice, separated by a week, and an intra-class correlation coefficient (ICC) was calculated. Results supported the validity of this approach to assessing rhythmic abilities by replicating previously observed results. Specifically, musicians performed better than non-musicians, and younger adults performed better than older adults. Participants also performed best when the tempo was in the range of previously identified preferred tempos, when the stimuli included both audio and visual information, and when synchronizing on-beat compared to off-beat or continuation (self-paced) synchronization. Additionally, high ICC values (>0.75) suggested excellent test-retest reliability. Together, these results support the notion that consumer electronics running software built with a game engine may serve as a valuable resource for remote, mobile-based data collection of rhythmic abilities.
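Sensorimotor synchronization is typically scored from the asynchrony between taps and metronome beats. A minimal sketch of one common scoring rule, mean absolute asynchrony; the function and example values are illustrative, not the study's actual software:

```python
# Hypothetical sketch of a sensorimotor-synchronization score: pair
# each tap with its nearest metronome beat and average the absolute
# timing error. Lower values mean better synchronization.

def mean_abs_asynchrony(taps, beats):
    """Mean absolute tap-to-nearest-beat error, in seconds."""
    errors = [min(abs(t - b) for b in beats) for t in taps]
    return sum(errors) / len(errors)

beats = [0.5 * i for i in range(8)]       # 120 BPM metronome
taps = [b + 0.02 for b in beats]          # consistently 20 ms late
print(round(mean_abs_asynchrony(taps, beats), 3))  # 0.02
```

Scores collected at two sessions a week apart could then be fed into the ICC to quantify test-retest reliability, as the study does.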
Wearable Haptic Devices for Gait Re-education by Rhythmic Haptic Cueing
This research explores the development and evaluation of wearable haptic devices for gait sensing and rhythmic haptic cueing in the context of gait re-education for people with neurological and neurodegenerative conditions. Many people with long-term neurological and neurodegenerative conditions such as stroke, brain injury, multiple sclerosis or Parkinson’s disease suffer from an impaired walking gait pattern. Gait improvement can lead to better fluidity in walking, improved health outcomes, greater independence, and enhanced quality of life. Existing lab-based studies with wearable devices have shown that rhythmic haptic cueing can cause immediate improvements to gait features such as temporal symmetry, stride length, and walking speed. However, current wearable systems are unsuitable for self-managed, in-the-wild use by people with such conditions. This work investigates how wearable haptic devices can help in long-term gait re-education using rhythmic haptic cueing. A longitudinal pilot study has been conducted with a brain trauma survivor, providing rhythmic haptic cueing using a wearable haptic device as a therapeutic intervention over a two-week period. Preliminary results comparing pre- and post-intervention gait measurements have shown improvements in walking speed, temporal asymmetry, and stride length. The pilot study has raised an array of issues that require further study. This work aims to develop and evaluate prototype systems through an iterative design process to make possible the self-managed, in-the-wild use of such devices. These systems will directly provide therapeutic intervention for gait re-education, offer enhanced information for therapists, remotely monitor dosage adherence, and inform treatment and prognoses over the long term. This research will evaluate the use of technology from the perspective of multiple stakeholders, including clinicians, carers and patients. This work has the potential to impact clinical practice nationwide and worldwide in neuro-physiotherapy.
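The temporal symmetry measure mentioned above can be computed directly from heel-strike timestamps. This sketch uses one common asymmetry ratio; the formula choice and example step times are illustrative assumptions, not the study's own analysis code:

```python
# Hypothetical sketch of a gait temporal-asymmetry measure: compare
# mean left- and right-step durations. 0 means perfectly symmetric;
# the |L - R| / (L + R) ratio is one common choice.

def temporal_asymmetry(left_steps, right_steps):
    """left_steps/right_steps: step durations in seconds."""
    l = sum(left_steps) / len(left_steps)
    r = sum(right_steps) / len(right_steps)
    return abs(l - r) / (l + r)

# A survivor whose affected-side steps take noticeably longer:
print(round(temporal_asymmetry([0.60, 0.62], [0.70, 0.68]), 3))  # 0.062
```

Comparing this ratio before and after a cueing period is one way the pre/post improvement in temporal asymmetry could be quantified.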
Assessing Inconspicuous Smartphone Authentication for Blind People
As people store more personal data in their smartphones, the consequences of having it stolen or lost become an increasing concern. A typical counter-measure to avoid this risk is to set up a secret code that has to be entered to unlock the device after a period of inactivity. However, for blind users, PINs and passwords are inadequate, since entry 1) consumes a non-trivial amount of time, e.g. using screen readers, 2) is susceptible to observation, where nearby people can see or hear the secret code, and 3) might collide with social norms, e.g. disrupting personal interactions. Tap-based authentication methods have been presented that allow unlocking to be performed in a short time and support naturally occurring inconspicuous behavior (e.g. concealing the device inside a jacket) by being usable with a single hand. This paper presents a study with blind users (N = 16) in which an authentication method based on tap phrases is evaluated. Results showed the method to be usable and to support the desired inconspicuity.
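A tap-phrase secret can be modelled as a sequence of tap "chords" (how many fingers touch the screen at once), which needs no visual target and so works eyes-free and one-handed. A minimal hypothetical sketch, not the paper's actual method:

```python
# Hypothetical sketch: a tap phrase is a sequence of finger counts,
# e.g. [1, 2, 2, 1] = one-finger tap, two two-finger taps, then a
# one-finger tap. Unlocking succeeds on an exact sequence match.

def unlock(entered, secret):
    """Compare the entered tap phrase against the stored secret."""
    return entered == secret

secret = [1, 2, 2, 1]
print(unlock([1, 2, 2, 1], secret))  # True
print(unlock([1, 2, 1], secret))     # False
```

Because only finger counts matter, the phrase can be entered anywhere on the screen, including with the device concealed inside a jacket.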
RoboJam: A Musical Mixture Density Network for Collaborative Touchscreen Interaction
RoboJam is a machine-learning system for generating music that assists users of a touchscreen music app by performing responses to their short improvisations. This system uses a recurrent artificial neural network to generate sequences of touchscreen interactions and absolute timings, rather than high-level musical notes. To accomplish this, RoboJam's network uses a mixture density layer to predict appropriate touch interaction locations in space and time. In this paper, we describe the design and implementation of RoboJam's network and how it has been integrated into a touchscreen music app. A preliminary evaluation analyses the system in terms of training, musical generation and user interaction.
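A mixture density layer outputs the parameters of a Gaussian mixture, and a touch event is then sampled from it. This sketch shows only that sampling step with hard-coded illustrative weights, means and spreads; it is an assumption-laden stand-in for RoboJam's trained model, not its code:

```python
# Hypothetical sketch: sample one touch event (x, y, time offset) from
# a Gaussian mixture, as the output of a mixture density layer would
# be used. The mixture parameters below are illustrative constants.
import random

def sample_touch(components):
    """components: list of (weight, means, stddevs) for each Gaussian.
    Pick a component by weight, then sample each dimension."""
    r, acc = random.random(), 0.0
    for weight, mean, std in components:
        acc += weight
        if r <= acc:
            return tuple(random.gauss(m, s) for m, s in zip(mean, std))
    # Fallback for floating-point underflow in the weight sum:
    _, mean, std = components[-1]
    return tuple(random.gauss(m, s) for m, s in zip(mean, std))

mix = [(0.7, (0.3, 0.6, 0.1), (0.05, 0.05, 0.02)),
       (0.3, (0.8, 0.2, 0.3), (0.05, 0.05, 0.02))]
print(sample_touch(mix))
```

Sampling, rather than taking the single most likely point, is what lets such a model produce varied responses to the same improvisation.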
Haptics for the development of fundamental rhythm skills, including multi-limb coordination
This chapter considers the use of haptics for learning fundamental rhythm skills, including skills that depend on multi-limb coordination. Different sensory modalities have different strengths and weaknesses for the development of skills related to rhythm. For example, vision has low temporal resolution and performs poorly for tracking rhythms in real-time, whereas hearing is highly accurate. However, in the case of multi-limbed rhythms, neither hearing nor sight are particularly well suited to communicating exactly which limb does what and when, or how the limbs coordinate. By contrast, haptics can work especially well in this area, by applying haptic signals independently to each limb. We review relevant theories, including embodied interaction and biological entrainment. We present a range of applications of the Haptic Bracelets, which are computer-controlled wireless vibrotactile devices, one attached to each wrist and ankle. Haptic pulses are used to guide users in playing rhythmic patterns that require multi-limb coordination. One immediate aim of the system is to support the development of practical rhythm skills and multi-limb coordination. A longer-term goal is to aid the development of a wider range of fundamental rhythm skills including recognising, identifying, memorising, retaining, analysing, reproducing, coordinating, modifying and creating rhythms – particularly multi-stream (i.e. polyphonic) rhythmic sequences. Empirical results are presented. We reflect on related work, and discuss design issues for using haptics to support rhythm skills. Skills of this kind are essential not just to drummers and percussionists but also to keyboard players, and more generally to all musicians who need a firm grasp of rhythm.
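Applying haptic signals independently to each limb amounts to turning a polyphonic rhythm into per-limb pulse schedules. A minimal sketch under assumed conventions (step-grid patterns, illustrative limb names and tempo), not the Haptic Bracelets' actual firmware:

```python
# Hypothetical sketch: schedule per-limb vibrotactile pulses for a
# polyphonic rhythm, one pulse stream per limb. Patterns use
# 1 = pulse, 0 = rest per eighth-note step at an assumed tempo.

def schedule_pulses(patterns, step_seconds=0.125):
    """patterns: {limb: [0/1, ...]}. Return (time, limb) pulse events
    sorted by time, so each limb receives its own independent cue."""
    events = [(i * step_seconds, limb)
              for limb, steps in patterns.items()
              for i, hit in enumerate(steps) if hit]
    return sorted(events)

pattern = {"right_ankle": [1, 0, 0, 0, 1, 0, 0, 0],   # pulses on beats 1, 3
           "left_wrist":  [0, 0, 1, 0, 0, 0, 1, 0]}   # pulses on beats 2, 4
print(schedule_pulses(pattern))
```

Because the schedule keeps the limb label with every event, the same data can drive four bracelets at once, telling each limb exactly what to do and when.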
Crossmodal spatial location: initial experiments
This paper describes an alternative form of interaction for mobile devices using crossmodal output. The aim of our work is to investigate the equivalence of audio and tactile displays so that the same messages can be presented in one form or the other. Initial experiments show that spatial location can be perceived as equivalent in both the auditory and tactile modalities. Results show that participants are able to map presented 3D audio positions to tactile body positions on the waist most effectively when mobile, and that significantly more errors are made when using the ankle or wrist. This paper compares the results from both a static and a mobile experiment on crossmodal spatial location and outlines the most effective ways to use this crossmodal output in a mobile context.
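Mapping a 3D audio position to a tactile body position on the waist can be done by snapping the source's azimuth to the nearest of a ring of actuators. A minimal sketch; the actuator count and the angle convention (degrees, 0 = front, clockwise) are illustrative assumptions, not the paper's setup:

```python
# Hypothetical sketch of a crossmodal mapping: a 3D audio azimuth
# (degrees, 0 = front) snapped to the nearest of N tactile actuators
# spaced evenly around the waist. N = 8 is an illustrative choice.

def azimuth_to_actuator(azimuth_deg, n_actuators=8):
    """Return the index of the waist actuator nearest the source."""
    spacing = 360.0 / n_actuators
    return round((azimuth_deg % 360.0) / spacing) % n_actuators

print(azimuth_to_actuator(0))     # 0  (front)
print(azimuth_to_actuator(90))    # 2  (right side)
print(azimuth_to_actuator(350))   # 0  (just left of front, wraps)
```

The same index could equally select a 3D audio position, which is what makes the two displays interchangeable carriers of one message.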
The sound motion controller: a distributed system for interactive music performance
We developed an interactive system for music performance, able to control sound parameters in a responsive way with respect to the user’s movements. The system is conceived as a mobile application, provided with beat tracking and expressive parameter modulation, interacting with motion sensors and effector units connected to a music output such as synthesizers or sound effects. We describe the various ways our system can be used and what it achieves, aiming to increase the expressiveness of music performance and to aid music interaction. The results obtained outline a first level of integration and point to future cognitive and technological research.
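Controlling a sound parameter responsively from movement reduces, at its simplest, to normalizing a motion-sensor reading into a control value. A minimal sketch; the accelerometer input and the 0-20 m/s² scaling range are illustrative assumptions, not this system's actual mapping:

```python
# Hypothetical sketch of expressive parameter modulation: map the
# magnitude of an accelerometer reading to a synthesizer control
# value in [0, 1]. The scaling range is an illustrative assumption.

def motion_to_param(accel, lo=0.0, hi=20.0):
    """accel: (ax, ay, az) in m/s^2 -> clamped normalized value."""
    mag = sum(a * a for a in accel) ** 0.5
    return min(1.0, max(0.0, (mag - lo) / (hi - lo)))

print(motion_to_param((0.0, 0.0, 10.0)))  # 0.5
```

In a distributed setup like the one described, the resulting value would be sent on to the effector units, e.g. as a filter cutoff or effect depth.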