2,183 research outputs found

    Spatial Interaction for Immersive Mixed-Reality Visualizations

    Growing amounts of data, both in personal and professional settings, have caused an increased interest in data visualization and visual analytics. Especially for inherently three-dimensional data, immersive technologies such as virtual and augmented reality and advanced, natural interaction techniques have been shown to facilitate data analysis. Furthermore, in such use cases, the physical environment often plays an important role, both by directly influencing the data and by serving as context for the analysis. Therefore, there has been a trend to bring data visualization into new, immersive environments and to make use of the physical surroundings, leading to a surge in mixed-reality visualization research. One of the resulting challenges, however, is the design of user interaction for these often complex systems. In my thesis, I address this challenge by investigating interaction for immersive mixed-reality visualizations regarding three core research questions: 1) What are promising types of immersive mixed-reality visualizations, and how can advanced interaction concepts be applied to them? 2) How does spatial interaction benefit these visualizations and how should such interactions be designed? 3) How can spatial interaction in these immersive environments be analyzed and evaluated? To address the first question, I examine how various visualizations such as 3D node-link diagrams and volume visualizations can be adapted for immersive mixed-reality settings and how they stand to benefit from advanced interaction concepts. For the second question, I study how spatial interaction in particular can help to explore data in mixed reality. There, I look into spatial device interaction in comparison to touch input, the use of additional mobile devices as input controllers, and the potential of transparent interaction panels. Finally, to address the third question, I present my research on how user interaction in immersive mixed-reality environments can be analyzed directly in the original, real-world locations, and how this can provide new insights. Overall, with my research, I contribute interaction and visualization concepts, software prototypes, and findings from several user studies on how spatial interaction techniques can support the exploration of immersive mixed-reality visualizations.
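
    To make the idea of spatial device interaction more concrete, the following minimal Python sketch shows one common way such input can drive an immersive visualization: while a grab button is held, the tracked 6-DoF pose deltas of a handheld device are applied directly to the visualization's transform. The class name and the 1:1 pose-to-transform mapping are illustrative assumptions, not the thesis' implementation.

    from dataclasses import dataclass
    from typing import Optional

    import numpy as np


    @dataclass
    class Pose:
        # 6-DoF pose of a tracked handheld device in world coordinates:
        # position in metres, rotation as a 3x3 matrix.
        position: np.ndarray
        rotation: np.ndarray


    class SpatialDeviceController:
        # While a grab button is held, frame-to-frame translation and rotation
        # deltas of the device are applied 1:1 to the visualization transform,
        # so the data set behaves like an object held in the hand.

        def __init__(self):
            self.grab_pose: Optional[Pose] = None
            self.model_position = np.zeros(3)
            self.model_rotation = np.eye(3)

        def on_grab(self, device: Pose):
            self.grab_pose = device

        def on_release(self):
            self.grab_pose = None

        def on_update(self, device: Pose):
            if self.grab_pose is None:
                return
            # Relative motion of the device since the previous frame.
            delta_pos = device.position - self.grab_pose.position
            delta_rot = device.rotation @ self.grab_pose.rotation.T
            # Apply the deltas to the visualization's model transform.
            self.model_position = self.model_position + delta_pos
            self.model_rotation = delta_rot @ self.model_rotation
            self.grab_pose = device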

    3D Cadastres Best Practices, Chapter 5: Visualization and New Opportunities

    This paper proposes a discussion of the opportunities offered by 3D visualization to improve the understanding and analysis of cadastre data. It first introduces the rationale for having 3D visualization functionalities in the context of cadastre applications. Second, the paper outlines some basic concepts in 3D visualization. This section specifically addresses the visualization pipeline as a classification schema for understanding the steps leading to 3D visualization, and also presents a brief review of current 3D standards and technologies. Next, a summary of the progress made in recent years in 3D cadastral visualization is proposed; user requirements, data and semiotics, and platforms are highlighted as the main areas of work in the development of 3D cadastre visualization. This review can be seen as an attempt to structure and emphasise best practices in the domain of 3D cadastre visualization and as an inventory of issues that still need to be tackled. Finally, by reviewing advances and trends in 3D visualization, the paper initiates a discussion and a critical analysis of the benefit of applying these new developments to the cadastre domain. This final section discusses enhanced 3D techniques such as dynamic transparency and cutaways, 3D generalization, 3D visibility models, 3D annotation, 3D data and web platforms, augmented reality, immersive virtual environments, 3D gaming, interaction techniques, and time.
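
    The visualization pipeline referenced above is the classic sequence of filtering, mapping and rendering stages. A rough Python sketch of those stages applied to a hypothetical 3D cadastre parcel record follows; the Parcel fields, function names and extrusion-based mapping are assumptions for illustration, not the chapter's data model.

    from dataclasses import dataclass


    @dataclass
    class Parcel:
        # Hypothetical 3D cadastre record: a legal space defined by a 2D
        # footprint polygon (x, y in metres) extruded between two heights.
        parcel_id: str
        footprint: list          # [(x, y), ...]
        z_min: float
        z_max: float
        rights_holder: str


    def polygon_area(footprint):
        # Shoelace formula for the area of a simple polygon.
        area = 0.0
        n = len(footprint)
        for i in range(n):
            x0, y0 = footprint[i]
            x1, y1 = footprint[(i + 1) % n]
            area += x0 * y1 - x1 * y0
        return abs(area) / 2.0


    def filter_stage(parcels, min_volume):
        # Filtering: keep only the legal spaces relevant to the current
        # question, here those above a minimum volume.
        return [p for p in parcels
                if polygon_area(p.footprint) * (p.z_max - p.z_min) >= min_volume]


    def mapping_stage(parcels):
        # Mapping: turn each parcel into renderable geometry plus visual
        # variables (an extruded prism and one colour label per rights holder).
        palette, scene = {}, []
        for p in parcels:
            colour = palette.setdefault(p.rights_holder, "C%d" % len(palette))
            scene.append({"id": p.parcel_id,
                          "prism": (p.footprint, p.z_min, p.z_max),
                          "colour": colour})
        return scene


    def render_stage(scene):
        # Rendering: a real system would drive a 3D/AR view; this stub only
        # prints what would be drawn.
        for item in scene:
            footprint, z_min, z_max = item["prism"]
            print("%s: extrude %d-gon from %.1f m to %.1f m in colour %s"
                  % (item["id"], len(footprint), z_min, z_max, item["colour"]))


    parcels = [Parcel("P-001", [(0, 0), (10, 0), (10, 8), (0, 8)], 0.0, 3.0, "Owner A")]
    render_stage(mapping_stage(filter_stage(parcels, min_volume=50.0)))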

    Leveraging eXtended Reality & Human-Computer Interaction for User Experience in 360° Video

    eXtended Reality systems have resurged as a medium for work and entertainment. While 360° video has been characterized as less immersive than computer-generated VR, its realism, ease of use and affordability mean it is in widespread commercial use. Based on the prevalence and potential of the 360° video format, this research focuses on improving and augmenting the user experience of watching 360° video. By leveraging knowledge from eXtended Reality (XR) systems and Human-Computer Interaction (HCI), this research addresses two issues affecting user experience in 360° video: Attention Guidance and Visually Induced Motion Sickness (VIMS). This research relies on the construction of multiple artifacts to answer the defined research questions: (1) IVRUX, a tool for analysis of immersive VR narrative experiences; (2) Cue Control, a tool for the creation of spatial audio soundtracks for 360° video that also enables the collection and analysis of metrics captured from the user experience; and (3) the VIMS mitigation pipeline, a linear sequence of modules (including optical flow and visual SLAM, among others) that control parameters for visual modifications such as a restricted Field of View (FoV). These artifacts are accompanied by evaluation studies targeting the defined research questions. Through Cue Control, this research shows that non-diegetic music can be spatialized to act as orientation for users, whereas partial spatialization of music was found ineffective for orientation. Additionally, our results demonstrate that diegetic sounds are used for notification rather than orientation. Through the VIMS mitigation pipeline, this research shows that a dynamically restricted FoV yields a statistically significant reduction in VIMS while maintaining desired levels of Presence. Both Cue Control and the VIMS mitigation pipeline emerged from a Research through Design (RtD) approach, where the IVRUX artifact is the product of design knowledge and gave direction to the research. The research presented in this thesis is of interest to practitioners and researchers working on 360° video and helps delineate future directions in making 360° video a rich design space for interaction and narrative.
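
    The VIMS mitigation pipeline is described as a linear sequence of modules whose outputs drive visual modifications such as a restricted FoV. As a hedged sketch of what its optical-flow branch might look like, the Python snippet below uses OpenCV's Farnebäck optical flow to estimate per-frame motion magnitude and maps it to a vignette-style restriction of the field of view; the thresholds, the linear mapping and the elliptical mask are illustrative assumptions, not the thesis' implementation.

    import cv2
    import numpy as np


    def motion_magnitude(prev_gray, curr_gray):
        # Mean optical-flow magnitude between two consecutive grayscale frames
        # (Farnebäck dense optical flow with typical parameters).
        flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        return float(np.linalg.norm(flow, axis=2).mean())


    def fov_scale(magnitude, low=1.0, high=8.0, min_scale=0.55):
        # Map motion magnitude to an FoV scale factor: 1.0 means unrestricted,
        # min_scale is the strongest restriction. All values are illustrative.
        t = float(np.clip((magnitude - low) / (high - low), 0.0, 1.0))
        return 1.0 - t * (1.0 - min_scale)


    def apply_fov_restriction(frame_bgr, scale):
        # Darken the periphery outside an elliptical region covering `scale`
        # of the frame, approximating a vignette-style restricted FoV.
        h, w = frame_bgr.shape[:2]
        mask = np.zeros((h, w), dtype=np.uint8)
        cv2.ellipse(mask, (w // 2, h // 2),
                    (int(w * scale / 2), int(h * scale / 2)),
                    0, 0, 360, 255, -1)
        mask = cv2.GaussianBlur(mask, (0, 0), 25)
        return (frame_bgr * (mask[..., None] / 255.0)).astype(np.uint8)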

    Earth as Interface: Exploring chemical senses with Multisensory HCI Design for Environmental Health Communication

    As environmental problems intensify, the chemical senses, that is, smell and taste, are the most relevant senses for evidencing them. The environmental exposure vectors that can reach human beings comprise air, food, soil and water [1]. Within this context, understanding the link between environmental exposures and health [2] is crucial to make informed choices, protect the environment and adapt to new environmental conditions [3]. Smell and taste therefore lead to multi-sensorial experiences which convey multi-layered information about local and global events [4]. However, these senses are usually absent when those problems are represented in digital systems. The multisensory HCI design framework investigates the inclusion of the chemical senses in digital systems [5]. Ongoing efforts tackle the digitalization of smell and taste for digital delivery, transmission or substitution [6]. Although experiments have proven technological feasibility, dissemination depends on the development of relevant applications [7]. This thesis aims to fill those gaps by demonstrating how the chemical senses provide the means to link environment and health based on scientific and geolocation narratives [8], [9], [10]. We present a Multisensory HCI design process which accomplished a symbolic display of smell and taste and led us to the new multi-sensorial interaction system presented herein. We describe the conceptualization, design and evaluation of Earthsensum, an exploratory case study project. Earthsensum offered environmental smell and taste experiences about real geolocations to the 16 participants of the study. These experiences were represented digitally using mobile virtual reality (MVR) and mobile augmented reality (MAR). These technologies bridge the real and digital worlds through digital representations in which we can reproduce the multi-sensorial experiences. Our study findings showed that the proposed interaction system is intuitive and can lead to a better understanding not only of smell and taste perception but also of environmental problems. Participants' comprehension of the link between environmental exposures and health was successful, and they would recommend this system as an educational tool. Our conceptual design approach was validated and further developments were encouraged. In this thesis, we demonstrate how to apply a Multisensory HCI methodology to design with the chemical senses. We conclude that the presented symbolic representation model of smell and taste allows these experiences to be communicated on digital platforms. Due to their context-dependency, MVR and MAR platforms are adequate technologies for this purpose. Future developments intend to explore the conceptual approach further; they are centred on using the system to induce behaviour change. This thesis opens up new application possibilities for digital chemical sense communication, Multisensory HCI Design and environmental health communication.