Estimating the Pose of Phicons for Man
Physical icons (phicons) are ordinary objects that can serve as a user interface in an intelligent environment. This article addresses the problem of recognizing the position and orientation of such objects.
Using the Concept of Augmented Reality as a Vehicle for Transcending the Desktop Tarpit
of technical substrates for mixed environments: augmenting the user, the physical object, and the environment. These strategies describe the technical locus of the interface, assuming the analytical separation of function and interaction in the computer artefact. However, the way we have employed the augmented reality principles goes far beyond their original purpose, as we have used them as a tool for divergent thinking, a kind of metaphor or springboard (refs??). We abstracted defining features from the three directions in augmented reality interfaces and applied them in different technical settings. Subsequently, the principles were further investigated by developing future scenarios of PDA support for wastewater treatment work, using the technical classification of augmented reality interfaces, transformed to small mobile interfaces, as a thinking tool.
Component-based high fidelity interactive prototyping of post-WIMP interactions
In order to support interactive high-fidelity prototyping of post-WIMP user interactions, we propose a multi-fidelity design method based on a unifying component-based model and supported by an advanced tool suite, the OpenInterface Platform Workbench. Our approach strives to support a collaborative (programmer-designer) and user-centered design activity. The workbench architecture allows exploration of novel interaction techniques through seamless integration and adaptation of heterogeneous components, high-fidelity rapid prototyping, runtime evaluation, and fine-tuning of designed systems. Through the iterative construction of a running example, this paper illustrates how OpenInterface leverages existing resources and fosters the creation of non-conventional interaction techniques.
Benchmark movement data set for trust assessment in human robot collaboration
In the Drapebot project, a worker is supposed to collaborate with a large industrial manipulator in two tasks: collaborative transport of carbon fibre patches and collaborative draping. To realize data-driven trust assessment, the worker is equipped with a motion tracking suit, and the body movement data is labeled with trust scores from a standard trust questionnaire.
Benchmark EEG data set for trust assessment for interactions with social robots
The data collection consisted of a game interaction with a small humanoid EZ-robot. The robot explains a word to the participant either through movements depicting the concept or by verbal description. Depending on their performance, participants could "earn" or lose candy as remuneration for their participation. The dataset comprises EEG (electroencephalography) recordings from 21 participants, gathered using Emotiv headsets. Each participant's EEG data includes timestamps and measurements from 14 sensors placed across different regions of the scalp. The sensor labels in the header are as follows: EEG.AF3, EEG.F7, EEG.F3, EEG.FC5, EEG.T7, EEG.P7, EEG.O1, EEG.O2, EEG.P8, EEG.T8, EEG.FC6, EEG.F4, EEG.F8, EEG.AF4, and Time
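Given the header labels listed above, a recording of this kind can be parsed as ordinary CSV: a Time column plus 14 channel columns. The sketch below is a minimal illustration, assuming (hypothetically) that each participant's file is CSV with exactly those labels in its header; the inline sample rows and their values are invented for the example, not taken from the dataset.

```python
import csv
import io

# The 14 EEG channel labels stated in the dataset description.
SENSORS = ["EEG.AF3", "EEG.F7", "EEG.F3", "EEG.FC5", "EEG.T7", "EEG.P7",
           "EEG.O1", "EEG.O2", "EEG.P8", "EEG.T8", "EEG.FC6", "EEG.F4",
           "EEG.F8", "EEG.AF4"]

# Hypothetical two-sample excerpt standing in for one participant's file.
sample = (
    "Time," + ",".join(SENSORS) + "\n"
    + "0.0," + ",".join(["4200.0"] * len(SENSORS)) + "\n"
    + "0.0078," + ",".join(["4201.5"] * len(SENSORS)) + "\n"
)

rows = list(csv.DictReader(io.StringIO(sample)))

# Split the table into a timestamp vector and one series per channel.
timestamps = [float(row["Time"]) for row in rows]
channels = {label: [float(row[label]) for row in rows] for label in SENSORS}

print(len(channels))    # 14 channel series
print(len(timestamps))  # 2 samples in this excerpt
```

In practice one would open each participant's file with `csv.DictReader` directly and keep the per-channel series aligned by the shared Time column, as done here.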
