Fingers of a Hand Oscillate Together: Phase Synchronisation of Tremor in Hover Touch Sensing
When using non-contact finger tracking, fingers can be classified by the hand they belong to by analysing the phase relation of physiological tremor. In this paper, we show how 3D capacitive sensors can pick up muscle tremor in fingers above a device. We develop a signal processing pipeline based on nonlinear phase synchronisation that can reliably group fingers to hands, and we experimentally validate our technique. This enables significant new gestural capabilities for 3D finger sensing without additional hardware.
PickCells: A Physically Reconfigurable Cell-composed Touchscreen
Touchscreens are the predominant medium for interactions with digital services; however, their current fixed form factor narrows the scope for rich physical interactions by limiting interaction possibilities to a single, planar surface. In this paper we introduce PickCells, a fully reconfigurable device concept composed of cells, which breaks the mould of rigid screens and explores a modular system that affords rich sets of tangible interactions and novel across-device relationships. Through a series of co-design activities, involving HCI experts and potential end-users of such systems, we synthesised a design space aimed at inspiring future research, giving researchers and designers a framework in which to explore modular screen interactions. The design space we propose unifies existing work on modular touch surfaces under a general framework and broadens horizons by opening up unexplored spaces that provide new interaction possibilities. In this paper, we present the PickCells concept, a design space of modular touch surfaces, and a toolkit for quick scenario prototyping.
Ambient Gestures
We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous "in the environment" interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition software application, a set of gestures to be processed by a scripting application, and a navigation and selection application that is controlled by the gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment, and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing.
Spatial Interaction for Immersive Mixed-Reality Visualizations
Growing amounts of data, both in personal and professional settings, have caused an increased interest in data visualization and visual analytics.
Especially for inherently three-dimensional data, immersive technologies such as virtual and augmented reality and advanced, natural interaction techniques have been shown to facilitate data analysis.
Furthermore, in such use cases, the physical environment often plays an important role, both by directly influencing the data and by serving as context for the analysis.
Therefore, there has been a trend to bring data visualization into new, immersive environments and to make use of the physical surroundings, leading to a surge in mixed-reality visualization research.
One of the resulting challenges, however, is the design of user interaction for these often complex systems.
In my thesis, I address this challenge by investigating interaction for immersive mixed-reality visualizations regarding three core research questions:
1) What are promising types of immersive mixed-reality visualizations, and how can advanced interaction concepts be applied to them?
2) How does spatial interaction benefit these visualizations and how should such interactions be designed?
3) How can spatial interaction in these immersive environments be analyzed and evaluated?
To address the first question, I examine how various visualizations such as 3D node-link diagrams and volume visualizations can be adapted for immersive mixed-reality settings and how they stand to benefit from advanced interaction concepts.
For the second question, I study how spatial interaction in particular can help to explore data in mixed reality.
There, I look into spatial device interaction in comparison to touch input, the use of additional mobile devices as input controllers, and the potential of transparent interaction panels.
Finally, to address the third question, I present my research on how user interaction in immersive mixed-reality environments can be analyzed directly in the original, real-world locations, and how this can provide new insights.
Overall, with my research, I contribute interaction and visualization concepts, software prototypes, and findings from several user studies on how spatial interaction techniques can support the exploration of immersive mixed-reality visualizations.