The sound motion controller: a distributed system for interactive music performance
We developed an interactive system for music performance, able to control sound parameters responsively with respect to the user's movements. The system is conceived as a mobile application, provided with beat tracking and expressive parameter modulation, that interacts with motion sensors and effector units connected to a music output such as synthesizers or sound effects. We describe the various usage scenarios of our system and our achievements, aimed at increasing the expressiveness of music performance and providing an aid to music interaction. The results obtained outline a first level of integration and point toward future cognitive and technological research.
Overcoming Limitations of the Trackpad for 3D Docking Operations
From notebook trackpads to mobile phones to tabletop surface computing, multitouch input surfaces have become one of the most dominant interfaces for human-computer interaction. Although these are clearly effective for interaction with 2D graphical user interfaces, we suspect that they are not as well suited for interaction requiring greater degrees of freedom (DoF). Here, we consider the possibility of exploiting two such surfaces, one for each hand, as a means of affording efficient control over higher dimensional tasks. We investigate performance on a 6 DoF task, comparing such a two-surface multitouch input device against the results obtained using a standard 2D mouse, a single multitouch surface, and a 6 DoF free-space device. Our results indicate that two multitouch surfaces significantly improve user performance compared to the mouse and to a single surface.
Vibration-induced friction control for walkway locomotion interface
Falls represent a major challenge to mobility for the elderly community, a point that has motivated various studies of balance failures. To support this work, we are interested in mechanisms for the synthesis of ground environments that can be controlled to exhibit dynamic friction characteristics. As a first step, we investigate the design and development of such a variable-friction device, a hybrid locomotion interface using a cable-driven vibrotactile mechanism. Measurements on our prototype, consisting of an aluminum tile covered with low-friction polytetrafluoroethylene (PTFE), demonstrate that it can effectively simulate a low coefficient of static friction. As part of the design, we also investigated the role that induced vibration plays in modifying the coefficient of friction. Measurements of sliding on a PTFE-covered tile in a tilted configuration showed a significant influence of normal low-frequency vibration, particularly for frequencies around 20 Hz, regardless of the user's weight.
HAPTIC: Haptic Anatomical Positioning to Improve Clinical Monitoring
Hospitals are inundated by the sounds of patient monitoring devices and alarms. These are meant to help, yet also create a stressful environment for physicians and patients. To address this issue, we consider the possibility of delivering complementary haptic alarm stimuli via a wearable tactile display. This may reduce the necessity for the plethora of audible alarms in the Intensive Care Unit and Operating Room, potentially decreasing fatigue among clinicians, and improving sleep quality for patients. The study described here sought to determine a suitable anatomical location where such a tactile display could be worn. Although the wrist is an obvious default, based on the success of smartwatches and fitness monitors, wearable devices below the elbow are disallowed in aseptic procedural environments. We hypothesized that haptic perception would be approximately equivalent at the wrist and ankle, and confirmed this experimentally. Thus, for a healthcare setting, we suggest that the ankle is a suitable alternative for the placement of a tactile display.
Large-scale mobile audio environments for collaborative musical interaction
New application spaces and artistic forms can emerge when users are freed from constraints. In the general case of human-computer interfaces, users are often confined to a fixed location, severely limiting mobility. To overcome this constraint in the context of musical interaction, we present a system to manage large-scale collaborative mobile audio environments, driven by user movement. Multiple participants navigate through physical space while sharing overlaid virtual elements. Each user is equipped with a mobile computing device, GPS receiver, orientation sensor, microphone, headphones, or various combinations of these technologies. We investigate methods of location tracking, wireless audio streaming, and state management between mobile devices and centralized servers. The result is a system that allows mobile users, with subjective 3-D audio rendering, to share virtual scenes. The audio elements of these scenes can be organized into large-scale spatial audio interfaces, thus allowing for immersive mobile performance, locative audio installations, and many new forms of collaborative sonic activity.
Design of variable-friction devices for shoe-floor contact
In rehabilitation training, high-fidelity simulation environments are needed for reproducing the effects of slippery surfaces, in which potential balance failure conditions can be reproduced on demand. Motivated by these requirements, this article considers the design of variable-friction devices for use in the context of human walking on surfaces in which the coefficient of friction can be controlled dynamically. Various designs are described, aiming at rendering low-friction shoe-floor contact, associated with slippery surfaces such as ice, as well as higher-friction values more typical of surfaces such as pebbles, sand, or snow. These designs include an array of omnidirectional rolling elements, a combination of low- and high-friction coverings whose contact pressure distribution is controlled, and modulation of low-frequency vibration normal to the surface. Our experimentation investigated the static coefficient of friction attainable with each of these designs. Rolling elements were found to be the most slippery, providing a coefficient of friction as low as 0.03, but with significant drawbacks from the perspective of our design objectives. A controlled pressure distribution of low- and high-friction coverings allowed for a minimum coefficient of friction of 0.06. The effects of vibration amplitude and frequency on sliding velocity were also explored. Increases in amplitude resulted in higher velocities, but vibration frequencies greater than 25 Hz reduced sliding velocities. To meet our design objectives, a novel approach involving a friction-variation mechanism, embedded in a shoe sole, is proposed.
Interacting in shared reality
Commercial videoconferencing products have begun to reach a level of quality acceptable for many low-intensity interactions. However, these systems fail to deliver the true "high-fidelity" communication that serves as a viable alternative to physical co-presence for more demanding interactions. The solution, we believe, lies in the synergy between high-bandwidth networks and the application of information technologies that take advantage of such networks. Specifically, computation can be employed to enrich the communication channel, exploiting an awareness of users' activity in order to better support their needs. In this manner, we are entering an era of communication in which distance need no longer dictate limitations on high-quality distributed experiences and interaction.
Did You Feel That? Developing Novel Multimodal Alarms for High Consequence Clinical Environments
Hospitals are overwhelmingly filled with sounds produced by alarms and patient monitoring devices. These sounds create a fatiguing and stressful environment for both patients and clinicians. In an attempt to attenuate this auditory sensory overload, we propose the use of a multimodal alarm system in operating rooms and intensive care units. Specifically, the system would exploit multisensory integration of the haptic and auditory channels. We hypothesize that by combining these two channels in a synchronized fashion, participants' auditory threshold of perception will be lowered, allowing for an overall reduction of volume in hospitals. The results obtained from pilot testing support this hypothesis. We conclude that further investigation of this method can prove useful in reducing sound exposure levels in hospitals, as well as personalizing the perception and type of alarm for clinicians.