Enabling collaboration in virtual reality navigators
In this paper we characterize a feature superset for Collaborative
Virtual Reality Environments (CVRE), and derive a component
framework to transform stand-alone VR navigators into full-fledged
multithreaded collaborative environments. The contributions of our
approach rely on a cost-effective and extensible technique for
loading software components into separate POSIX threads for
rendering, user interaction and network communications, and adding a
top layer for managing session collaboration. The framework recasts
a VR navigator under a distributed peer-to-peer topology for scene
and object sharing, using callback hooks for broadcasting remote
events and multicamera perspective sharing with avatar interaction.
We validate the framework by applying it to our own ALICE VR
Navigator. Experimental results show that our approach has good
performance in the collaborative inspection of complex models.
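The component-per-thread pattern described above, with a callback hook the session layer uses to broadcast remote events, might be sketched as follows. This is a minimal illustration using Python threads in place of POSIX threads; all names are assumptions, not ALICE's actual API.

```python
import threading
from typing import Callable, List

# Shared event log; a real session layer would broadcast over the network.
events: List[str] = []
events_lock = threading.Lock()

def broadcast(event: str) -> None:
    """Callback hook for broadcasting remote events to the session."""
    with events_lock:
        events.append(event)

def make_component(name: str, event: str) -> Callable[[], None]:
    """Build a software component that runs in its own thread."""
    def run() -> None:
        broadcast(f"{name}: {event}")
    return run

# One thread per subsystem, mirroring the rendering / interaction /
# network split described in the abstract.
components = [
    make_component("rendering", "frame ready"),
    make_component("interaction", "object picked"),
    make_component("network", "peer joined"),
]

threads = [threading.Thread(target=c) for c in components]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

In the real framework each thread would run a long-lived loop for its subsystem; here each component fires its callback once so the sketch stays self-contained.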
Mixed reality participants in smart meeting rooms and smart home environments
Human–computer interaction requires modeling of the user. A user profile typically contains preferences, interests, characteristics, and interaction behavior. However, in its multimodal interaction with a smart environment, the user displays characteristics that show how the user, not necessarily consciously, verbally and nonverbally provides the environment with useful input and feedback. Especially in ambient intelligence environments we encounter situations where the environment supports interaction between the environment, smart objects (e.g., mobile robots, smart furniture) and human participants. Therefore it is useful for the profile to contain a physical representation of the user, obtained by multimodal capturing techniques. We discuss the modeling and simulation of interacting participants in a virtual meeting room, discuss how remote meeting participants can take part in meeting activities, and offer some observations on translating research results to smart home environments.
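A user profile of the kind described above might be modeled roughly as follows. The field names are illustrative assumptions, not the authors' actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserProfile:
    """Sketch of a profile combining classic fields with a physical
    representation obtained from multimodal capture."""
    preferences: Dict[str, str] = field(default_factory=dict)
    interests: List[str] = field(default_factory=list)
    characteristics: Dict[str, str] = field(default_factory=dict)
    interaction_behavior: List[str] = field(default_factory=list)
    # Physical state from multimodal capture (e.g. head pose, gaze).
    physical_state: Dict[str, float] = field(default_factory=dict)

# Example: a participant tracked in a virtual meeting room.
p = UserProfile(interests=["smart homes"])
p.physical_state["head_yaw_deg"] = 15.0
```

The point of the sketch is that the same profile object holds both the long-term preference data and the continuously updated capture data the environment reacts to.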
Using visual analytics to develop situation awareness in astrophysics
We present a novel collaborative visual analytics application for cognitively overloaded users in the astrophysics domain. The system was developed for scientists who need to analyze heterogeneous, complex data under time pressure, and make predictions and time-critical decisions rapidly and correctly under a constant influx of changing data. The Sunfall Data Taking system utilizes several novel visualization and analysis techniques to enable a team of geographically distributed domain specialists to effectively and remotely maneuver a custom-built instrument under challenging operational conditions. Sunfall Data Taking has been in production use for 2 years by a major international astrophysics collaboration (the largest data volume supernova search currently in operation), and has substantially improved the operational efficiency of its users. We describe the system design process by an interdisciplinary team, the system architecture, and the results of an informal usability evaluation of the production system by domain experts in the context of Endsley's three levels of situation awareness.
From Big Data to Big Displays: High-Performance Visualization at Blue Brain
Blue Brain has pushed high-performance visualization (HPV) to complement its
HPC strategy since its inception in 2007. In 2011, this strategy has been
accelerated to develop innovative visualization solutions through increased
funding and strategic partnerships with other research institutions.
We present the key elements of this HPV ecosystem, which integrates C++
visualization applications with novel collaborative display systems. We
motivate how our strategy of transforming visualization engines into services
enables a variety of use cases, not only for the integration with high-fidelity
displays, but also to build service oriented architectures, to link into web
applications and to provide remote services to Python applications.
Comment: ISC 2017 Visualization at Scale workshop
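The "visualization engine as a service" strategy could be sketched as a renderer behind a small request/response protocol that remote clients call over a socket. All protocol details and names below are assumptions for illustration, not Blue Brain's actual interfaces.

```python
import json
import socket
import threading

def handle_request(conn: socket.socket) -> None:
    """Decode one JSON request; a real engine would render a frame here."""
    req = json.loads(conn.recv(4096).decode())
    resp = {"frame": req.get("frame", 0), "status": "rendered"}
    conn.sendall(json.dumps(resp).encode())
    conn.close()

def serve(server: socket.socket) -> None:
    conn, _addr = server.accept()
    handle_request(conn)

# Engine side: listen on an ephemeral local port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve, args=(server,))
t.start()

# Client side: a remote Python application requesting a frame,
# as in the service-oriented use case described above.
client = socket.socket()
client.connect(("127.0.0.1", port))
client.sendall(json.dumps({"frame": 7}).encode())
reply = json.loads(client.recv(4096).decode())
client.close()
t.join()
server.close()
```

Wrapping the engine behind such a protocol is what lets the same renderer back both high-fidelity displays and web or Python clients without changes to the engine itself.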
A Web2.0 Strategy for the Collaborative Analysis of Complex Bioimages
Loyek C, Kölling J, Langenkämper D, Niehaus K, Nattkemper TW. A Web2.0 Strategy for the Collaborative Analysis of Complex Bioimages. In: Gama J, Bradley E, Hollmén J, eds. Advances in Intelligent Data Analysis X: 10th International Symposium, IDA 2011, Porto, Portugal, October 29-31, 2011. Proceedings. Lecture Notes in Computer Science. Vol 7014. Berlin, Heidelberg: Springer; 2011: 258-269.
Out there and in here: design for blended scientific inquiry learning
One of the benefits of mobile technologies is to combine ‘the digital’ (e.g., data, information, photos) with ‘field’ experiences in novel ways that are contextualized by people’s current located activities. However, cost, mobility disabilities and time constraints often exclude students from engaging in such peripatetic experiences. The Out There and In Here (OTIH) project is exploring a combination of mobile and tabletop technologies in support of collaborative learning. A system is being developed for synchronous collaboration between geology students in the field and peers at an indoor location. The overarching goal of this research is to develop technologies that support people working together in a manner suited to their locations. There are two OTIH project research threads. The first deals with disabled-learner access issues: these complex issues are being reviewed in subsequent evaluations and publications. This paper deals with issues of technology-supported learning design for remote and co-located science learners. Several stakeholder evaluations and two field trials have addressed two research questions:
1. What will enhance the learning experience for those in the field and laboratory?
2. How can learning trajectories and appropriate technologies be designed to support equitable co-located and remote learning collaboration?
This paper focuses on describing the iterative, linked development of technologies and scientific inquiry pedagogy. Two stages within the research project are presented. The 1st stage details several pilot studies over 3 years with 21 student participants in synchronous collaborations with traditional technology and pedagogical models. Findings revealed that this was an engaging and useful experience, although issues of equity in collaboration needed further research. The 2nd stage has been to evaluate data from over 25 stakeholders (academics, learning and technology designers) to develop pervasive ambient technological solutions supporting orchestration of mixed levels of pedagogy (i.e. abstract synthesis to specific investigation). Middleware between tabletop ‘surface’ technologies and mobile devices is being designed with Microsoft and OOKL (a mobile software company) to support these developments. Initial findings reveal issues around equity, ownership and professional identity.