Collaboration in Augmented Reality: How to establish coordination and joint attention?
Schnier C, Pitsch K, Dierker A, Hermann T. Collaboration in Augmented Reality: How to establish coordination and joint attention? In: Boedker S, Bouvin NO, Lutters W, Wulf V, Ciolfi L, eds. Proceedings of the 12th European Conference on Computer Supported Cooperative Work (ECSCW 2011). Springer-Verlag London; 2011: 405-416.

We present an initial investigation from a semi-experimental setting, in which
an HMD-based AR system has been used for real-time collaboration in a task-oriented scenario (the design of a museum exhibition). The analysis points out the specific conditions of interacting in an AR environment and focuses on one particular practical problem the participants face in coordinating their interaction: how to establish joint attention towards the same object or referent. The analysis offers insights into how the pair of users begins to familiarize themselves with the environment and with the limitations and opportunities of the setting, and how they establish new routines, e.g. for solving the 'joint attention' problem.
Towards Simulating Humans in Augmented Multi-party Interaction
Human-computer interaction requires modeling of the user. A user profile typically contains preferences, interests, characteristics, and interaction behavior. However, in its multimodal interaction with a smart environment, the user displays characteristics that show how the user, not necessarily consciously, verbally and nonverbally provides the smart environment with useful input and feedback. Especially in ambient intelligence environments we encounter situations where the environment supports interaction between the environment, smart objects (e.g., mobile robots, smart furniture) and human participants in the environment. Therefore, it is useful for the profile to contain a physical representation of the user obtained by multimodal capturing techniques. We discuss the modeling and simulation of interacting participants in the European AMI research project.
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are only shown a representation of their hands in VR
floating in front of the camera as seen from a first-person perspective. We
believe that representing each user as a full-body avatar controlled by the
natural movements of the person in the real world (see Figure 1d) can greatly
enhance believability and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video:
http://living.media.mit.edu/projects/metaspace-ii
H Space: Interactive Augmented Reality Art
This artwork exploits recent research into augmented reality systems, such as the HoloLens, for building creative interaction in augmented reality. The work is being conducted in the context of interactive art experiences. The first version of the audience experience of the artwork, 'H Space', was informally tested in the SIGGRAPH 2018 Art Gallery context. A later, improved version was evaluated at Tsinghua University. The latest distributed version will be shown in Sydney. The paper describes the concept, the background in both the art and the technological domain, and points to some of the key human-computer interaction art research issues that the work highlights.
An observational study of children interacting with an augmented story book
We present findings of an observational study investigating how young children interact with augmented reality story books. Children aged between 6 and 7 read and interacted with one of two story books aimed at early literacy education. The books' pages were augmented using animated virtual 3D characters, sound, and interactive tasks. Introducing novel media to young children requires system and story designers to consider not only technological issues but also questions arising from story design and the design of interactive sequences. We discuss the findings of our study and their implications for the implementation of augmented story books.
User-centred design of flexible hypermedia for a mobile guide: Reflections on the hyperaudio experience
A user-centred design approach involves end-users from the very beginning. Considering users at the early stages compels designers to think in terms of utility and usability and helps build the system on what is actually needed. This paper discusses the case of HyperAudio, a context-sensitive, adaptive and mobile guide to museums developed in the late 90s. User requirements were collected via a survey to understand visitors' profiles and visit styles in Natural Science museums. The knowledge acquired supported the specification of system requirements, helping to define the user model, data structure and adaptive behaviour of the system. User requirements guided the design decisions on what could be implemented using simple adaptable triggers and what instead needed more sophisticated adaptive techniques, a fundamental choice when all the computation must be done on a PDA. Graphical and interactive environments for developing and testing complex adaptive systems are discussed as a further step towards an iterative design that considers user interaction a central point. The paper discusses how such an environment allows designers and developers to experiment with different system behaviours and to test them widely under realistic conditions by simulating the actual context evolving over time. The understanding gained in HyperAudio is then considered in the perspective of the developments that followed that first experience: our findings still seem valid despite the time that has passed.
New technology for interactive CAL: The origami project
Origami is a three-year EPSRC project that forms part of a general research programme on human-computer interaction. The goal of this research is to investigate and implement new methods for human-computer interaction, and to apply and evaluate their use. The research centres on the DigitalDesk, an ordinary desk augmented with a computer display using projection television and a video camera to monitor inputs. The DigitalDesk allows electronic and printed documents to be combined to give richer presentation and interaction possibilities than are possible with either medium separately. This paper examines the implications of such a system for CAL and presents two prototype applications that demonstrate the possibilities.