
    MAGIC: Manipulating Avatars and Gestures to Improve Remote Collaboration

    Remote collaborative work has become pervasive in many settings, from engineering to medical professions. Users are immersed in virtual environments and communicate through life-sized avatars that enable face-to-face collaboration. Within this context, users often collaboratively view and interact with virtual 3D models, for example, to assist in designing new devices such as customized prosthetics, vehicles, or buildings. However, discussing shared 3D content face-to-face has various challenges, such as ambiguities, occlusions, and differing viewpoints, all of which decrease mutual awareness, leading to lower task performance and more errors. To address this challenge, we introduce MAGIC, a novel approach for understanding pointing gestures in a face-to-face shared 3D space, improving mutual understanding and awareness. Our approach distorts the remote user's gestures to correctly reflect them in the local user's reference space when face-to-face. We introduce a novel metric called pointing agreement to measure what two users perceive in common when using pointing gestures in a shared 3D space. Results from a user study suggest that MAGIC significantly improves pointing agreement in face-to-face collaboration settings, improving co-presence and awareness of interactions performed in the shared space. We believe that MAGIC improves remote collaboration by enabling simpler communication mechanisms and better mutual awareness. Comment: Presented at IEEE VR 202
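    The core mechanism described above is to warp the remote user's pointing gesture so that, in the local user's reference space, it still indicates the same spot on the shared model. A minimal sketch of that kind of frame-to-frame ray remapping is shown below; it is not MAGIC's published algorithm, and the 4x4 model poses and NumPy representation are assumptions made for illustration.

```python
import numpy as np

def remap_pointing_ray(origin_r, direction_r, model_in_remote, model_in_local):
    """Re-express a pointing ray from the remote user's world frame so it
    indicates the same point on the shared model in the local user's frame.

    origin_r, direction_r          : (3,) ray origin and direction, remote world coords
    model_in_remote, model_in_local: (4, 4) poses of the shared model in each world
    """
    # Express the ray relative to the shared model.
    remote_to_model = np.linalg.inv(model_in_remote)
    o_m = (remote_to_model @ np.append(origin_r, 1.0))[:3]
    d_m = remote_to_model[:3, :3] @ direction_r

    # Re-embed the model-relative ray in the local user's world.
    o_l = (model_in_local @ np.append(o_m, 1.0))[:3]
    d_l = model_in_local[:3, :3] @ d_m
    return o_l, d_l / np.linalg.norm(d_l)
```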

    Concepts and methods to support the development and evaluation of remote collaboration using augmented reality

    Remote Collaboration using Augmented Reality (AR) shows great potential to establish a common ground in physically distributed scenarios where team members need to achieve a shared goal. However, most research efforts in this field have been devoted to experimenting with the enabling technology and proposing methods to support its development. As the field evolves, evaluation and characterization of the collaborative process become an essential, but difficult, endeavor for better understanding the contributions of AR. In this thesis, we conducted a critical analysis to identify the main limitations and opportunities of the field, while situating its maturity and proposing a roadmap of important research actions. Next, a human-centered design methodology was adopted, involving industrial partners to probe how AR could support their needs during remote maintenance. These outcomes were combined with methods from the literature into an AR prototype, which was evaluated through a user study. This made clear the need for deeper reflection to better understand the dimensions that influence, and should be considered in, Collaborative AR. Hence, a conceptual model and a human-centered taxonomy were proposed to foster systematization of perspectives. Based on the proposed model, an evaluation framework for contextualized data gathering and analysis was developed, supporting the design and conduct of distributed evaluations in a more informed and complete manner. To instantiate this vision, the CAPTURE toolkit was created, providing an additional perspective based on selected dimensions of collaboration and pre-defined measurements to obtain "in situ" data about them, which can be analyzed using an integrated visualization dashboard. The toolkit successfully supported evaluations of several team members during tasks of remote maintenance mediated by AR, showing its versatility and potential for eliciting a comprehensive characterization of the added value of AR in real-life situations, and establishing itself as a general-purpose solution potentially applicable to a wider range of collaborative scenarios.
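    The CAPTURE toolkit described above pairs selected dimensions of collaboration with pre-defined measurements collected in situ and fed to a visualization dashboard. A toy sketch of that kind of contextualized logging follows; the dimension and measurement names are hypothetical and do not reflect the toolkit's actual schema or API.

```python
import json
import time

# Hypothetical collaboration dimensions and the measurements attached to them.
DIMENSIONS = {
    "communication": ["utterance_count", "turn_taking"],
    "awareness": ["gaze_on_partner_s", "annotation_views"],
}

def log_measurement(log, session_id, dimension, measurement, value):
    """Append one timestamped, dimension-tagged sample for later analysis."""
    if measurement not in DIMENSIONS.get(dimension, []):
        raise ValueError(f"unknown measurement {measurement!r} for {dimension!r}")
    log.append({
        "t": time.time(),
        "session": session_id,
        "dimension": dimension,
        "measurement": measurement,
        "value": value,
    })

log = []
log_measurement(log, "remote-maintenance-01", "awareness", "annotation_views", 3)
print(json.dumps(log, indent=2))
```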

    Designing to Support Workspace Awareness in Remote Collaboration using 2D Interactive Surfaces

    Increasing distribution of the global workforce is leading to collaborative work among remote coworkers. The emergence of such remote collaboration is essentially supported by advances in screen-based devices, ranging from tablets and laptops to large displays. However, these devices, especially personal and mobile computers, still suffer from limitations caused by their form factors that hinder support for workspace awareness through non-verbal communication such as bodily gestures or gaze. This thesis thus aims to design novel interfaces and interaction techniques to improve remote coworkers' workspace awareness through such non-verbal cues using 2D interactive surfaces. The thesis starts off by exploring how visual cues support workspace awareness in facilitated brainstorming of hybrid teams of co-located and remote coworkers. Based on insights from this exploration, the thesis introduces three interfaces for mobile devices that help users maintain and convey their workspace awareness to their coworkers. The first interface is a virtual environment that allows a remote person to effectively maintain awareness of his/her co-located collaborators' activities while interacting with the shared workspace. To help a person better express hand gestures in remote collaboration using a mobile device, the second interface presents a lightweight add-on for capturing hand images on and above the device's screen and overlaying them on collaborators' devices to improve their workspace awareness. The third interface strategically leverages the entire screen space of a conventional laptop to better convey a remote person's gaze to his/her co-located collaborators. Building on top of these three interfaces, the thesis envisions an interface that supports a person using a mobile device in effectively collaborating with remote coworkers working with a large display. Together, these interfaces demonstrate the possibilities of innovating on commodity devices to offer richer non-verbal communication and better support workspace awareness in remote collaboration.
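    The second interface above captures images of a user's hand on and above the device's screen and overlays them on the collaborator's display. A minimal sketch of that compositing step is given below, assuming the captured hand image already comes with a transparency mask; the actual thesis prototype involves a hardware add-on and is not reproduced here.

```python
import numpy as np

def overlay_hand(workspace_rgb, hand_rgb, hand_alpha):
    """Alpha-blend a captured hand image over the shared workspace view.

    workspace_rgb, hand_rgb : (H, W, 3) float arrays in [0, 1]
    hand_alpha              : (H, W) float mask, 1.0 where the hand is visible
    """
    a = hand_alpha[..., None]              # broadcast the mask over colour channels
    return a * hand_rgb + (1.0 - a) * workspace_rgb
```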

    Periscope: A Robotic Camera System to Support Remote Physical Collaboration

    We investigate how robotic camera systems can offer new capabilities to computer-supported cooperative work through the design, development, and evaluation of a prototype system called Periscope. With Periscope, a local worker completes manipulation tasks with guidance from a remote helper who observes the workspace through a camera mounted on a semi-autonomous robotic arm that is co-located with the worker. Our key insight is that the helper, the worker, and the robot should all share responsibility for the camera view, an approach we call shared camera control. Using this approach, we present a set of modes that distribute control of the camera between the human collaborators and the autonomous robot depending on task needs. We demonstrate the system's utility and the promise of shared camera control through a preliminary study in which 12 dyads collaboratively worked on assembly tasks. Finally, we discuss design and research implications of our work for future robotic camera systems that facilitate remote collaboration. Comment: This is a pre-print of the article accepted for publication in PACM HCI and will be presented at CSCW 202
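    The abstract describes a set of modes that split camera control among the helper, the worker, and the robot depending on task needs. A schematic sketch of that kind of mode arbitration follows; the mode names and fallback rule are illustrative assumptions, not Periscope's actual design.

```python
from enum import Enum, auto

class CameraMode(Enum):
    HELPER_CONTROL = auto()   # remote helper steers the camera directly
    WORKER_CONTROL = auto()   # local worker repositions the arm by hand
    AUTONOMOUS = auto()       # robot frames the workspace on its own

def next_camera_pose(mode, helper_cmd, worker_pose, auto_pose):
    """Pick the camera's target pose according to who currently holds control."""
    if mode is CameraMode.HELPER_CONTROL and helper_cmd is not None:
        return helper_cmd
    if mode is CameraMode.WORKER_CONTROL and worker_pose is not None:
        return worker_pose
    return auto_pose  # fall back to the robot's autonomous framing
```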

    Exploring User Behaviour in Asymmetric Collaborative Mixed Reality

    A common issue for collaborative mixed reality is the asymmetry of interaction with the shared virtual environment. For example, an augmented reality (AR) user might use one type of head-mounted display (HMD) in a physical environment, while a virtual reality (VR) user might wear a different type of HMD and see a virtual model of that physical environment. To explore the effects of such asymmetric interfaces on collaboration, we present a study that investigates the behaviour of dyads performing a word puzzle task where one uses AR and the other VR. We examined the collaborative process through questionnaires and behavioural measures based on positional and audio data. We identified relationships between presence and co-presence, accord and co-presence, leadership and talkativeness, head rotation velocity and leadership, and head rotation velocity and talkativeness. We did not find that AR or VR biased subjective responses, though there were interesting behavioural differences: AR users spoke more words, AR users had a higher median head rotation velocity, and VR users travelled further.
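    One of the behavioural measures above, head rotation velocity, can be estimated from timestamped head orientations logged during the session. The sketch below shows one plausible way to compute it from unit quaternions; the quaternion log format and the median summary are assumptions, not details given in the paper.

```python
import numpy as np

def median_head_rotation_speed(quats, timestamps):
    """Median angular speed (degrees/second) between consecutive orientation samples.

    quats      : (N, 4) unit quaternions describing head orientation
    timestamps : (N,) sample times in seconds
    """
    speeds = []
    for q0, q1, t0, t1 in zip(quats, quats[1:], timestamps, timestamps[1:]):
        # Angle of the relative rotation between two unit quaternions.
        dot = np.clip(abs(np.dot(q0, q1)), 0.0, 1.0)
        angle_deg = np.degrees(2.0 * np.arccos(dot))
        speeds.append(angle_deg / (t1 - t0))
    return float(np.median(speeds))
```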

    Online Student Validation

    A comprehensive evaluation of La Salle University's online authentication systems and processes, with recommendations for enhancement. Under the Higher Education Opportunity Act, the U.S. Secretary of Education has instituted new requirements for recognition of students enrolled in online programs. This requirement calls for validation that the enrolled student is the same person who completes online assignments, tests, and coursework. In order to comply, programs must prove that they adhere to their university's guidelines for validating online coursework or create their own policies. This paper first describes online education, including its benefits, challenges, and problems with cheating, especially in light of federal and accreditation requirements. The paper follows the process that a university technology director would use to evaluate an opportunity for change and the benefits that the change would provide. It then examines La Salle University's online education, technologies, and online class design and support resources. The implications regarding online policies, academic integrity, and privacy are considered. Solutions to prevent online cheating are described through best-practice strategies and technology solutions. The recommendations in this document consider the current accreditation requirements issued by the U.S. Secretary of Education through the Higher Education Opportunity Act (HEOA). Several technological solutions are evaluated and compared, focusing on the features that the solutions provide, cost, and the ease of integration into La Salle University's current systems. Finally, this paper evaluates existing processes and provides recommendations for process optimization.

    A mixed reality telepresence system for collaborative space operation

    This paper presents a Mixed Reality system that results from the integration of a telepresence system and an application to improve collaborative space exploration. The system combines free-viewpoint video with immersive projection technology to support non-verbal communication, including eye gaze, inter-personal distance, and facial expression. Importantly, these can be interpreted together as people move around the simulation, maintaining natural social distance. The application is a simulation of Mars, within which the collaborators must come to agreement over, for example, where the Rover should land and where it should go. The first contribution is the creation of a Mixed Reality system supporting contextualization of non-verbal communication. Two technological contributions are the prototyping of a technique to subtract a person from a background that may contain physical objects and/or moving images, and a lightweight texturing method for multi-view rendering that balances visual and temporal quality. A practical contribution is the demonstration of pragmatic approaches to sharing space between display systems of distinct levels of immersion. A research tool contribution is a system that allows comparison of conventionally authored and video-based reconstructed avatars, within an environment that encourages exploration and social interaction. Aspects of system quality, including the communication of facial expression and end-to-end latency, are reported.
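    One of the technological contributions above is separating a person from a background that may itself contain moving imagery. As a point of reference only, the generic background-subtraction baseline below (OpenCV's MOG2 model) illustrates the kind of segmentation step involved; it is not the paper's technique, which is designed to cope with moving backgrounds that defeat a plain statistical model like this one.

```python
import cv2

# Generic foreground/background separation baseline, for illustration only.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

cap = cv2.VideoCapture(0)                      # any camera index or video file path
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)             # 255 where the model sees foreground
    person = cv2.bitwise_and(frame, frame, mask=mask)
    cv2.imshow("foreground", person)
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```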

    Real Time Cross Platform Collaboration Between Virtual Reality & Mixed Reality

    Virtual Reality (hereafter VR) and Mixed Reality (hereafter MR) have opened a new line of applications and possibilities. Amidst a vast network of potential applications, little research has been done to provide real-time collaboration capability between users of VR and MR. The idea of this thesis study is to develop and test a real-time collaboration system between VR and MR. The system works similarly to a Google document, where two or more users can see what others are doing, i.e., writing, modifying, viewing, etc. Similarly, the system developed during this study enables users in VR and MR to collaborate in real time. The study considers a scenario in which multiple device users are connected to a multiplayer network where they are guided to perform various tasks concurrently. Usability testing was conducted to evaluate participant perceptions of the system. Users were required to assemble a chair in alternating turns; thereafter, users were required to complete a survey and give an audio interview. Results collected from the participants showed positive feedback towards using VR and MR for collaboration. However, there are several limitations with the current generation of devices that hinder mass adoption. Devices with better performance factors will lead to wider adoption.
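    The "Google document" analogy above amounts to keeping a shared scene state that every connected VR or MR client can read and update. A toy sketch of such a store with last-writer-wins updates is shown below; the thesis system is built on a multiplayer networking stack, so the class and field names here are purely illustrative.

```python
import time

class SharedScene:
    """Minimal shared-state store: object id -> (timestamp, transform)."""

    def __init__(self):
        self.objects = {}

    def apply_update(self, object_id, transform, timestamp=None):
        """Accept an update only if it is newer than the state we already hold."""
        timestamp = time.time() if timestamp is None else timestamp
        current = self.objects.get(object_id)
        if current is None or timestamp > current[0]:
            self.objects[object_id] = (timestamp, transform)

    def snapshot(self):
        """State to replicate to every connected VR or MR client."""
        return {oid: transform for oid, (_, transform) in self.objects.items()}

scene = SharedScene()
scene.apply_update("chair_leg_1", {"pos": [0.1, 0.0, 0.4], "rot": [0, 0, 0, 1]})
print(scene.snapshot())
```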