4,655 research outputs found

    Co-Located Collaborative Visual Analytics around a Tabletop Display


    Collocated Collaboration Analytics: Principles and Dilemmas for Mining Multimodal Interaction Data

    Learning to collaborate effectively requires practice, awareness of group dynamics, and reflection; it often benefits from coaching by an expert facilitator. However, in physical spaces it is not always easy to provide teams with evidence to support collaboration. Emerging technology provides a promising opportunity to make collocated collaboration visible by harnessing data about interactions and then mining and visualizing it. These collocated collaboration analytics can help researchers, designers, and users to understand the complexity of collaboration and to find ways to support it. This article introduces and motivates a set of principles for mining collocated collaboration data and draws attention to trade-offs that may need to be negotiated en route. We integrate data science principles and techniques with advances in interactive surface devices and sensing technologies. We draw on a 7-year research program that has involved the analysis of six group situations in collocated settings with more than 500 users and a variety of surface technologies, tasks, grouping structures, and domains. The contribution of the article includes the key insights and themes that we have identified and summarized in a set of principles and dilemmas that can inform the design of future collocated collaboration analytics innovations.
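    The article discusses mining multimodal interaction data at the level of principles rather than a specific pipeline. Purely as a minimal illustration (the record layout, modalities, and function names below are assumptions, not taken from the article), the following Python sketch aggregates timestamped, per-person interaction events into simple participation counts of the kind such analytics might visualize.

        from collections import defaultdict

        # Hypothetical multimodal event log: (timestamp in seconds, person id, modality).
        # The modalities and record layout are illustrative assumptions.
        events = [
            (12.4, "A", "speech"),
            (13.1, "B", "touch"),
            (15.0, "A", "touch"),
            (18.7, "C", "speech"),
        ]

        def participation_summary(events):
            """Count interaction events per person and per modality."""
            counts = defaultdict(lambda: defaultdict(int))
            for _timestamp, person, modality in events:
                counts[person][modality] += 1
            return {person: dict(modalities) for person, modalities in counts.items()}

        print(participation_summary(events))
        # {'A': {'speech': 1, 'touch': 1}, 'B': {'touch': 1}, 'C': {'speech': 1}}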

    Spatial Interaction for Immersive Mixed-Reality Visualizations

    Growing amounts of data, in both personal and professional settings, have led to an increased interest in data visualization and visual analytics. Especially for inherently three-dimensional data, immersive technologies such as virtual and augmented reality, together with advanced, natural interaction techniques, have been shown to facilitate data analysis. Furthermore, in such use cases the physical environment often plays an important role, both by directly influencing the data and by serving as context for the analysis. There has therefore been a trend to bring data visualization into new, immersive environments and to make use of the physical surroundings, leading to a surge in mixed-reality visualization research. One of the resulting challenges, however, is the design of user interaction for these often complex systems. In my thesis, I address this challenge by investigating interaction for immersive mixed-reality visualizations along three core research questions: 1) What are promising types of immersive mixed-reality visualizations, and how can advanced interaction concepts be applied to them? 2) How does spatial interaction benefit these visualizations, and how should such interactions be designed? 3) How can spatial interaction in these immersive environments be analyzed and evaluated? To address the first question, I examine how visualizations such as 3D node-link diagrams and volume visualizations can be adapted for immersive mixed-reality settings and how they stand to benefit from advanced interaction concepts. For the second question, I study how spatial interaction in particular can help to explore data in mixed reality. There, I look into spatial device interaction in comparison to touch input, the use of additional mobile devices as input controllers, and the potential of transparent interaction panels. Finally, to address the third question, I present my research on how user interaction in immersive mixed-reality environments can be analyzed directly in the original, real-world locations, and how this can provide new insights. Overall, with my research, I contribute interaction and visualization concepts, software prototypes, and findings from several user studies on how spatial interaction techniques can support the exploration of immersive mixed-reality visualizations.

    A student-facing dashboard for supporting sensemaking about the brainstorm process at a multi-surface space

    We developed a student-facing dashboard tuned to support post-hoc sensemaking about participation and group effects in the context of collocated brainstorming. Building on foundations of small-group collaboration, open learner modelling, and brainstorming at large interactive displays, we designed a set of models from behavioural data that can be visually presented to students. We validated the effectiveness of our dashboard in provoking group reflection by addressing two questions: (1) What do group members gain from studying measures of egalitarian contribution? and (2) What do group members gain from modelling how they sparked ideas off each other? We report on outcomes from a study with higher education students performing brainstorming, presenting evidence from i) descriptive quantitative usage patterns and ii) qualitative experiential descriptions reported by the students. We conclude the paper with a discussion that can inform the community's design of collective reflection systems.
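    The abstract does not spell out how egalitarian contribution is quantified. As one plausible, purely illustrative measure (an assumption, not the authors' model), a Gini-style inequality index over per-member idea counts yields values near 0 for perfectly equal participation and approaches 1 when a single member dominates.

        def gini(contributions):
            """Gini coefficient of per-member contribution counts (0 = equal, toward 1 = dominated)."""
            values = sorted(contributions)
            n, total = len(values), sum(values)
            if n == 0 or total == 0:
                return 0.0
            # Standard formulation over ascending-sorted values.
            weighted = sum((i + 1) * v for i, v in enumerate(values))
            return (2 * weighted) / (n * total) - (n + 1) / n

        # Hypothetical idea counts for a four-person brainstorming group.
        print(round(gini([12, 10, 11, 9]), 3))  # 0.06  -> fairly egalitarian
        print(round(gini([30, 2, 1, 1]), 3))    # 0.647 -> one member dominates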

    Grand Challenges in Immersive Analytics

    The definitive version will be published at CHI 2021 (May 8–13, 2021, Yokohama, Japan). Immersive Analytics is a quickly evolving field that unites several areas such as visualisation, immersive environments, and human-computer interaction to support human data analysis with emerging technologies. This research has thrived over the past years, with multiple workshops, seminars, and a growing body of publications spanning several conferences. Given the rapid advancement of interaction technologies and novel application domains, this paper works toward a broader research agenda to enable widespread adoption. We present 17 key research challenges developed over multiple sessions by a diverse group of 24 international experts, initiated from a virtual scientific workshop at ACM CHI 2020. These challenges aim to coordinate future work by providing a systematic roadmap of current directions and impending hurdles to facilitate productive and effective applications for Immersive Analytics.

    Nu-view: A visualization system for collaborative co-located analysis of geospatial disease data

    Many factors contribute to the spread of diseases among populations over large geographical areas, and in practice the analysis of these factors typically requires the expertise of multidisciplinary teams. In this paper, we present a visualization system that aims to support the visual analytics process of multidisciplinary teams of analysts in co-located collaborative settings. The current prototype allows coupled and decoupled modes of interaction, using a combination of personal visualizations on private small displays and group visualizations on a shared large display. We have conducted preliminary fieldwork and a review study of this prototype with a group of medical experts, who provided feedback on the current system, suggestions for other usage scenarios, and ideas for further improvements. We found that our target user group has a generally positive attitude towards the use of a shared display with support for the suggested interaction modes, even though these modes differ substantially from the way their groups currently conduct synchronous collaboration, and that additional support for sharing image and textual data on top of the geospatial data layer may be required.
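    To make the coupled and decoupled modes concrete, here is a minimal sketch under assumed names (not the Nu-view implementation): a personal display either mirrors the shared map view or diverges from it for private exploration.

        from dataclasses import dataclass, replace

        @dataclass(frozen=True)
        class MapView:
            center_lat: float
            center_lon: float
            zoom: int

        class PersonalDisplay:
            def __init__(self, shared_view: MapView):
                self.coupled = True          # coupled: follow the shared large display
                self.local_view = shared_view

            def on_shared_view_changed(self, shared_view: MapView):
                if self.coupled:
                    self.local_view = shared_view   # mirror the group visualization

            def decouple_and_pan(self, dlat: float, dlon: float):
                self.coupled = False                # decoupled: explore privately
                self.local_view = replace(self.local_view,
                                          center_lat=self.local_view.center_lat + dlat,
                                          center_lon=self.local_view.center_lon + dlon)

        # Usage: a shared view update reaches a coupled display but not a decoupled one.
        shared = MapView(52.52, 13.40, zoom=6)
        a, b = PersonalDisplay(shared), PersonalDisplay(shared)
        b.decouple_and_pan(1.0, -0.5)
        new_shared = MapView(48.14, 11.58, zoom=8)
        a.on_shared_view_changed(new_shared)
        b.on_shared_view_changed(new_shared)
        print(a.local_view == new_shared, b.local_view == new_shared)  # True False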