294 research outputs found
Levitating Particle Displays with Interactive Voxels
Levitating objects can be used as the primitives in a new type of display. We present levitating particle displays and show how research into object levitation is enabling a new way of presenting and interacting with information. We identify novel properties of levitating particle displays and give examples of the interaction techniques and applications they allow. We then discuss design challenges for these displays, potential solutions, and promising areas for future research.
Levitate: Interaction with Floating Particle Displays
This demonstration showcases the current state of the art for the levitating particle display from the Levitate Project. In this demonstration, we show a new type of display consisting of floating voxels, small levitating particles that can be positioned and moved independently in 3D space. Phased ultrasound arrays are used to acoustically levitate the particles. Users can interact directly with each particle using pointing gestures. This allows users to walk up and interact without any user instrumentation, creating an exciting opportunity to deploy these tangible displays in public spaces in the future. This demonstration explores the design potential of floating voxels and how these may be used to create new types of user interfaces.
Floating Widgets: Interaction with Acoustically-Levitated Widgets
Acoustic levitation enables new types of human-computer interface, where the content that users interact with is made up of small objects held in mid-air. We show that acoustically-levitated objects can form mid-air widgets that respond to interaction. Users can interact with them using in-air hand gestures. Sound and widget movement are used as feedback about the interaction.
LeviSense: a platform for the multisensory integration in levitating food and insights into its effect on flavour perception
Eating is one of the most multisensory experiences in everyday life. All of our five senses (i.e. taste, smell, vision, hearing and touch) are involved, even if we are not aware of it. However, while multisensory integration has been well studied in psychology, there is no single platform for systematically testing the effects of different stimuli. This lack of a platform leaves unresolved challenges in the design of taste-based immersive experiences. Here, we present LeviSense: the first system designed for multisensory integration in gustatory experiences based on levitated food. Our system enables the systematic exploration of different sensory effects on eating experiences. It also opens up new opportunities for other professionals (e.g., molecular gastronomy chefs) looking for innovative taste-delivery platforms. We describe the design process behind LeviSense and conduct two experiments to test a subset of the crossmodal combinations (i.e., taste and vision, taste and smell). Our results show how different lighting and smell conditions affect the perceived taste intensity, pleasantness, and satisfaction. We discuss how LeviSense creates new technical, creative, and expressive possibilities in a series of emerging design spaces within Human-Food Interaction.
DataLev: Mid-air Data Physicalisation Using Acoustic Levitation
Data physicalisation is a technique that encodes data through the geometric and material properties of an artefact, allowing users to engage with data in a more immersive and multi-sensory way. However, current methods of data physicalisation are limited in terms of their reconfigurability and the types of materials that can be used. Acoustophoresis—a method of suspending and manipulating materials using sound waves—offers a promising solution to these challenges. In this paper, we present DataLev, a design space and platform for creating reconfigurable, multimodal data physicalisations with enriched materiality using acoustophoresis. We demonstrate the capabilities of DataLev through eight examples and evaluate its performance in terms of reconfigurability and materiality. Our work offers a new approach to data physicalisation, enabling designers to create more dynamic, engaging, and expressive artefacts.
Enhancing Physical Objects with Actuated Levitating Particles
We describe a novel display concept where levitating particles are used to add a dynamic display element to static physical objects. The particles are actuated using ultrasound, for expressive output without mechanical constraints. We explore novel ways of using particles to add dynamic output to other objects, for new interactive experiences. We also discuss the practical challenges of combining these. This work shows how the unique capabilities of levitation can create novel displays by enhancing another form of media.
SoundBender: dynamic acoustic control behind obstacles
Ultrasound manipulation is growing in popularity in the HCI community with applications in haptics, on-body interaction, and levitation-based displays. Most of these applications share two key limitations: a) the complexity of the sound fields that can be produced is limited by the physical size of the transducers; and b) no obstacles can be present between the transducers and the control point. We present SoundBender, a hybrid system that overcomes these limitations by combining the versatility of phased arrays of transducers (PATs) with the precision of acoustic metamaterials. In this paper, we explain our approach to design and implement such hybrid modulators (i.e. to create complex sound fields) and methods to manipulate the field dynamically (i.e. stretch, steer). We demonstrate our concept using self-bending beams enabling both levitation and tactile feedback around an obstacle, and present example applications enabled by SoundBender.
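Several of the systems above rely on phased arrays of transducers, which steer acoustic energy by driving each transducer with a phase offset so that all emitted waves arrive in phase at a chosen focal point. The following is a minimal sketch of that focusing principle, assuming ideal point sources in free air; the function name, constants, and coordinate convention are illustrative and not taken from any of the papers.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed)
FREQ = 40_000.0         # Hz, a common ultrasonic transducer frequency

def focus_phases(transducer_positions, focal_point):
    """Phase offset (radians, in [0, 2*pi)) for each transducer so that
    the waves they emit arrive in phase at the focal point.

    transducer_positions: list of (x, y, z) tuples in metres.
    focal_point: (x, y, z) tuple in metres.
    """
    wavelength = SPEED_OF_SOUND / FREQ
    fx, fy, fz = focal_point
    phases = []
    for (x, y, z) in transducer_positions:
        # Distance from this transducer to the focus.
        d = math.sqrt((x - fx) ** 2 + (y - fy) ** 2 + (z - fz) ** 2)
        # Negate the propagation phase so all contributions align at the focus.
        phases.append((-2 * math.pi * d / wavelength) % (2 * math.pi))
    return phases
```

Transducers equidistant from the focus receive identical offsets, which is why a focal point (or a levitation trap built from several such points) can be moved smoothly just by recomputing the phases.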
Point-and-shake: selecting from levitating object displays
Acoustic levitation enables a radical new type of human-computer interface composed of small levitating objects. For the first time, we investigate the selection of such objects, an important part of interaction with a levitating object display. We present Point-and-Shake, a mid-air pointing interaction for selecting levitating objects, with feedback given through object movement. We describe the implementation of this technique and present two user studies that evaluate it. The first study found that users could accurately (96%) and quickly (4.1s) select objects by pointing at them. The second study found that users were able to accurately (95%) and quickly (3s) select occluded objects. These results show that Point-and-Shake is an effective way of initiating interaction with levitating object displays.
Levitation Simulator: Prototyping Ultrasonic Levitation Interfaces in Virtual Reality
We present the Levitation Simulator, a system that enables researchers and designers to iteratively develop and prototype levitation interface ideas in Virtual Reality. This includes user tests and formal experiments. We derive a model of the movement of a levitating particle in such an interface. Based on this, we develop an interactive simulation of the levitation interface in VR, which exhibits the dynamical properties of the real interface. The results of a Fitts' Law pointing study show that the Levitation Simulator enables performance comparable to the real prototype. We developed the first two interactive games designed for levitation interfaces, LeviShooter and BeadBounce, in the Levitation Simulator, and then implemented them on the real interface. Our results indicate that participants experienced similar levels of user engagement when playing the games in the two environments. We share our Levitation Simulator as open source, thereby democratizing levitation research without the need for a levitation apparatus.
Comment: 12 pages, 14 figures, CHI'2
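A Fitts' Law pointing study like the one above compares conditions via an index of difficulty and a throughput measure. The sketch below uses the standard Shannon formulation; it is a generic illustration, not the papers' analysis code, and the example numbers are hypothetical.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), where D is movement distance and W target width."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits/s for one pointing condition:
    ID divided by the mean movement time (seconds)."""
    return index_of_difficulty(distance, width) / movement_time
```

Comparable throughput across the simulated and real interfaces, at matched distance/width conditions, is what supports the claim that the simulator predicts real pointing performance.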