
    On Inter-referential Awareness in Collaborative Augmented Reality

    For successful collaboration to occur, a workspace must support inter-referential awareness: the ability for one participant to refer to a set of artifacts in the environment, and for that reference to be correctly interpreted by others. While referring to objects in our everyday environment is a straightforward task, the non-tangible nature of digital artifacts presents new interaction challenges. Augmented reality (AR) is inextricably linked to the physical world, and it is natural to believe that re-integrating physical artifacts into the workspace makes referencing tasks easier; however, we find that these environments combine the referencing challenges of several computing disciplines, which compound across scenarios. This dissertation presents our studies of this form of awareness in collaborative AR environments. It stems from our research in developing mixed reality environments for molecular modeling, where we explored spatial and multi-modal referencing techniques. To encapsulate the myriad factors found in collaborative AR, we present a generic theoretical framework and apply it to analyze this domain. Because referencing is a very human-centric activity, we present the results of an exploratory study that examines how participants behave and generate references to physical and virtual content in co-located and remote scenarios; we found that participants refer to content using both physical and virtual techniques, and that shared video is highly effective in disambiguating references in remote environments. Implementing user feedback from this study, a follow-up study explores how the environment can passively support referencing, and reveals the role that virtual referencing plays during collaboration. A third study examines the effectiveness of giving and interpreting references using a virtual pointer; the results suggest that interpreters need to be parallel with the arrow vector (strengthening the argument for shared viewpoints), and highlight the importance of shadows in non-stereoscopic environments. Our contributions include a framework for analyzing the domain of inter-referential awareness, the development of novel referencing techniques, the presentation and analysis of our findings from multiple user studies, and a set of guidelines to help designers support this form of awareness.
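    The finding that interpreters benefit from being parallel with the arrow vector suggests a simple geometric check a system could make. The following Python sketch is purely illustrative and not from the dissertation; the function name and the 20-degree threshold are our own assumptions.

    import math

    def alignment(view_dir, pointer_dir):
        """Cosine of the angle between a viewer's gaze direction and a
        virtual pointer's arrow vector; 1.0 means perfectly parallel."""
        dot = sum(v * p for v, p in zip(view_dir, pointer_dir))
        norm = (math.sqrt(sum(v * v for v in view_dir))
                * math.sqrt(sum(p * p for p in pointer_dir)))
        return dot / norm

    # Hypothetical check: flag when an interpreter's viewpoint diverges
    # too far from the arrow vector (threshold chosen for illustration).
    if alignment((0.0, 0.0, -1.0), (0.1, 0.0, -1.0)) < math.cos(math.radians(20)):
        print("Viewpoints diverge; consider sharing the referrer's view.")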

    Collaborative behavior, performance and engagement with visual analytics tasks using mobile devices

    Interactive visualizations are external tools that can support users' exploratory activities, and collaboration can bring benefits to the exploration of visual representations. This research investigates the use of co-located collaborative visualizations on mobile devices: how working with two different modes of interaction and view (Shared or Non-Shared) and being placed in various position arrangements (Corner-to-Corner, Face-to-Face, and Side-by-Side) affect users' knowledge acquisition, engagement level, and learning efficiency. A user study was conducted with 60 participants divided into 6 groups (2 modes × 3 positions) using a tool that we developed to support the exploration of 3D visual structures in a collaborative manner. Our results show that the shared control and view version in the Side-by-Side position is the most favorable and can improve task efficiency. In this paper, we present the results and a set of recommendations derived from them.
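    As a minimal sketch of the 2 × 3 between-subjects design described above (the variable names are ours, not the authors'), the six conditions and the even split of 60 participants can be enumerated as follows.

    from itertools import product

    modes = ["Shared", "Non-Shared"]          # interaction/view mode
    positions = ["Corner-to-Corner", "Face-to-Face", "Side-by-Side"]

    conditions = list(product(modes, positions))   # 2 x 3 = 6 groups
    per_group = 60 // len(conditions)              # 10 participants each

    for mode, position in conditions:
        print(f"{mode} / {position}: {per_group} participants")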

    Model-based groupware solution for distributed real-time collaborative 4D planning via teamwork

    Construction planning plays a fundamental role in construction project management and requires teamwork among planners from a diverse range of disciplines, often in geographically dispersed working situations. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork due to methodological, technological, and social challenges. Targeting this problem, this paper proposes a model-based groupware solution that enables a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. Drawing on the interactive definition method and its computer-supported cooperative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode, and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team under genuinely geographically dispersed conditions. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning and helps teams reach a robust construction plan through collaborative teamwork. The realization of this solution prompts further consideration of its enhancement for wider groupware applications.
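    The abstract mentions a data exchange protocol underpinning the real-time collaboration, but does not reproduce it; as a hedged illustration only, a message in such a protocol might resemble the Python sketch below. Every field name here is our assumption, not the paper's actual schema.

    import json
    from datetime import datetime, timezone

    # Hypothetical message: one planner reschedules a construction task,
    # and the change is broadcast to all collaborating planners.
    update = {
        "type": "task_update",             # message kind (assumed)
        "sender": "planner-03",            # originating planner (assumed)
        "task_id": "foundation-pour-B2",   # 4D model element (assumed)
        "start": "2011-05-02",             # new scheduled start
        "finish": "2011-05-09",            # new scheduled finish
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

    payload = json.dumps(update)           # serialized for the wire
    print(payload)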

    A Novel Approach For Collaborative Interaction with Mixed Reality in Value Engineering

    Design and engineering in real-world projects are often influenced by reduction of the problem definition, trade-offs during decision-making, possible loss of information, and monetary issues such as budget constraints or value-for-money problems. In many engineering projects, various stakeholders take part in the project process at different levels of communication, engineering, and decision-making. During project meetings and VE sessions between the different stakeholders, information and data are gathered and recorded in analogue and/or digital form, then stored in reports, minutes, and other modes of representation. Results and conclusions derived from these interactions are often influenced by each participant's field of experience and expertise. Personal stakes, idiosyncrasies, expectations, preferences, and interpretations of the various project parts can interfere with or delay the collaborative setting and process, or even rupture it, diminishing prospective project targets, requirements, and solutions. We present a hybrid tool that acts as a Virtual Assistant (VA) during a collaborative Value Engineering (VE) session in a real-world design and engineering case. The tool supports interaction and decision-making in conjunction with a physical workbench as the focal point, with user interfaces that guide the user through the process. The hybrid environment allows users to interact, untethered, with real-world materials, images, drawings, objects, and drawing instruments. In the course of the session, captures are made of the various topics or issues at stake and logged as iterative instances in a database. The captured instances are visualized in real time on a monitor and progressively listed in the on-screen user interface. During or after the session, the stakeholders can step through the iterative time-listing and synthesize the instances according to, for example, topic, dominance, choice, or degree of priority. After structuring and sorting the data sets, the information can be exported to a data or video file. All stakeholders receive or have access to the data files and can trace back the complete process progression. The system and the information it generates afford reflection, knowledge sharing, and cooperation, and make redistribution of data sets to other stakeholders, management, or third parties more efficient and consistent. Our approach in this experiment was to study the communication, interaction, and decision-making of the various stakeholders during the VE session. We observed behavioral aspects during the various stages of user interaction, following the decision-making process and the use of the tool over the course of the session, and we captured the complete session on video for analysis and evaluation of the VE process within a hybrid design environment.
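    To make the capture-and-synthesize workflow concrete, here is a minimal Python sketch of logging captured instances and re-sorting them by priority before export. This is our own construction under stated assumptions, not the authors' system; the record fields merely mirror the attributes the abstract mentions (topic, priority, capture order).

    import csv

    # Hypothetical log of captured instances from a VE session.
    captures = [
        {"seq": 1, "topic": "budget", "priority": 2, "note": "cost trade-off"},
        {"seq": 2, "topic": "layout", "priority": 1, "note": "workbench sketch"},
        {"seq": 3, "topic": "budget", "priority": 3, "note": "value-for-money"},
    ]

    # Synthesize: order by priority, then by original capture sequence.
    ordered = sorted(captures, key=lambda c: (c["priority"], c["seq"]))

    # Export to a data file that all stakeholders can access.
    with open("ve_session.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["seq", "topic", "priority", "note"])
        writer.writeheader()
        writer.writerows(ordered)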

    Periscope: A Robotic Camera System to Support Remote Physical Collaboration

    We investigate how robotic camera systems can offer new capabilities to computer-supported cooperative work through the design, development, and evaluation of a prototype system called Periscope. With Periscope, a local worker completes manipulation tasks with guidance from a remote helper who observes the workspace through a camera mounted on a semi-autonomous robotic arm that is co-located with the worker. Our key insight is that the helper, the worker, and the robot should all share responsibility for the camera view, an approach we call shared camera control. Using this approach, we present a set of modes that distribute control of the camera between the human collaborators and the autonomous robot depending on task needs. We demonstrate the system's utility and the promise of shared camera control through a preliminary study in which 12 dyads collaboratively worked on assembly tasks. Finally, we discuss design and research implications of our work for future robotic camera systems that facilitate remote collaboration.
    Comment: This is a pre-print of the article accepted for publication in PACM HCI and will be presented at CSCW 202
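    The shared camera control idea, in which control is distributed among helper, worker, and robot depending on task needs, can be sketched as a simple mode arbiter. The mode names and function below are illustrative assumptions, not Periscope's actual modes or API.

    from enum import Enum, auto

    class CameraMode(Enum):
        """Who currently drives the camera view (names assumed)."""
        HELPER_DRIVEN = auto()   # remote helper steers the camera
        WORKER_DRIVEN = auto()   # local worker repositions it
        AUTONOMOUS = auto()      # robot tracks the task on its own

    def camera_target(mode, helper_goal, worker_goal, tracked_object):
        """Resolve the camera's target pose for the active mode."""
        if mode is CameraMode.HELPER_DRIVEN:
            return helper_goal
        if mode is CameraMode.WORKER_DRIVEN:
            return worker_goal
        return tracked_object    # AUTONOMOUS: follow the task

    print(camera_target(CameraMode.AUTONOMOUS, (0, 0), (1, 1), (2, 2)))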