
    Using Augmented Reality as a Medium to Assist Teaching in Higher Education

    In this paper we describe the use of a high-level augmented reality (AR) interface for the construction of collaborative educational applications that can be used in practice to enhance current teaching methods. A combination of multimedia information, including spatial three-dimensional models, images, textual information, video, animations and sound, can be superimposed on the learning environment in a student-friendly manner. In several case studies, different learning scenarios have been carefully designed based on human-computer interaction principles so that meaningful virtual information is presented in an interactive and compelling way. Collaboration between participants is achieved through a tangible AR interface based on marker cards, as well as an immersive AR environment built on software user interfaces (UIs) and hardware devices. The interactive AR interface has been piloted in the classroom at two UK universities, in departments of Informatics and Information Science.
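
    The tangible AR interface described above associates physical marker cards with virtual multimedia (3D models, video, audio, text) to be superimposed on the learning environment. As a rough illustration of that association only, and not the authors' implementation, the Python sketch below maps hypothetical marker IDs to content descriptors and collects what should be overlaid for the markers detected in a frame; a real system would obtain the IDs from a marker-tracking library.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ContentItem:
    """One piece of virtual content tied to a marker card (asset names are illustrative)."""
    kind: str        # "model", "image", "video", "audio", "text", or "animation"
    asset_path: str  # where the asset would be loaded from
    caption: str     # short label shown next to the overlay

# Hypothetical registry: marker ID printed on a card -> content to superimpose.
MARKER_CONTENT: Dict[int, List[ContentItem]] = {
    7:  [ContentItem("model", "assets/heart.obj", "3D heart model"),
         ContentItem("audio", "assets/heartbeat.wav", "Heartbeat sound")],
    12: [ContentItem("video", "assets/circulation.mp4", "Blood circulation clip")],
}

def content_for_detected_markers(marker_ids: List[int]) -> List[ContentItem]:
    """Collect everything the renderer should overlay for the markers seen this frame."""
    overlays: List[ContentItem] = []
    for marker_id in marker_ids:
        overlays.extend(MARKER_CONTENT.get(marker_id, []))
    return overlays

if __name__ == "__main__":
    # Pretend the tracker reported cards 7 and 12 in the current camera image.
    for item in content_for_detected_markers([7, 12]):
        print(f"overlay {item.kind}: {item.caption} ({item.asset_path})")
```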

    Spatial Interaction for Immersive Mixed-Reality Visualizations

    Growing amounts of data, both in personal and professional settings, have caused an increased interest in data visualization and visual analytics. Especially for inherently three-dimensional data, immersive technologies such as virtual and augmented reality and advanced, natural interaction techniques have been shown to facilitate data analysis. Furthermore, in such use cases, the physical environment often plays an important role, both by directly influencing the data and by serving as context for the analysis. Therefore, there has been a trend to bring data visualization into new, immersive environments and to make use of the physical surroundings, leading to a surge in mixed-reality visualization research. One of the resulting challenges, however, is the design of user interaction for these often complex systems. In my thesis, I address this challenge by investigating interaction for immersive mixed-reality visualizations regarding three core research questions: 1) What are promising types of immersive mixed-reality visualizations, and how can advanced interaction concepts be applied to them? 2) How does spatial interaction benefit these visualizations and how should such interactions be designed? 3) How can spatial interaction in these immersive environments be analyzed and evaluated? To address the first question, I examine how various visualizations such as 3D node-link diagrams and volume visualizations can be adapted for immersive mixed-reality settings and how they stand to benefit from advanced interaction concepts. For the second question, I study how spatial interaction in particular can help to explore data in mixed reality. There, I look into spatial device interaction in comparison to touch input, the use of additional mobile devices as input controllers, and the potential of transparent interaction panels. Finally, to address the third question, I present my research on how user interaction in immersive mixed-reality environments can be analyzed directly in the original, real-world locations, and how this can provide new insights. Overall, with my research, I contribute interaction and visualization concepts, software prototypes, and findings from several user studies on how spatial interaction techniques can support the exploration of immersive mixed-reality visualizations.
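
    As a concrete, much-simplified illustration of the spatial device interaction studied in the thesis (not its actual prototypes), the Python sketch below applies the frame-to-frame change in a handheld device's pose to the transform of a 3D visualization, the usual "grab and move" behaviour for repositioning a node-link diagram in mixed reality; the example poses are assumed.

```python
import numpy as np

def pose_matrix(position, yaw_rad):
    """Build a simple 4x4 pose: translation plus rotation about the vertical axis."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    m = np.eye(4)
    m[:3, :3] = [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]
    m[:3, 3] = position
    return m

def grab_update(model_matrix, device_prev, device_curr):
    """While 'grabbed', the visualization follows the device's relative motion:
    M_new = P_curr @ P_prev^-1 @ M_old, so the object stays fixed relative to the device."""
    return device_curr @ np.linalg.inv(device_prev) @ model_matrix

if __name__ == "__main__":
    # Assumed poses: the device moves 10 cm to the right and twists by 15 degrees.
    prev = pose_matrix([0.0, 1.2, -0.5], 0.0)
    curr = pose_matrix([0.1, 1.2, -0.5], np.radians(15))
    model = np.eye(4)  # node-link diagram initially at the world origin
    model = grab_update(model, prev, curr)
    print(np.round(model, 3))
```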

    Visual communication in urban planning and urban design

    This report documents the current status of visual communication in urban design and planning. Visual communication is examined through discussion of standalone and network media, specifically concentrating on visualisation on the World Wide Web (WWW). Firstly, we examine the use of Solid and Geometric Modelling for visualising urban planning and urban design. This report documents and compares examples of the use of Virtual Reality Modelling Language (VRML) and proprietary WWW-based Virtual Reality modelling software. Examples include the modelling of Bath and Glasgow using both VRML 1.0 and 2.0. A review is carried out on the use of Virtual Worlds and their role in visualising urban form within multi-user environments. The use of Virtual Worlds is developed into a case study of the possibilities and limitations of Virtual Internet Design Arenas (ViDAs), an initiative undertaken at the Centre for Advanced Spatial Analysis, University College London. The use of Virtual Worlds and their development towards ViDAs is seen as one of the most important developments in visual communication for urban planning and urban design since the development plan. Secondly, photorealistic media in the process of communicating plans is examined. The process of creating photorealistic media is documented, and examples of the Virtual Streetscape and Wired Whitehall Virtual Urban Interface System are provided. The conclusion is drawn that although the use of photorealistic media on the WWW provides a way to visually communicate planning information, its use is limited. The merging of photorealistic media and solid geometric modelling is reviewed in the creation of Augmented Reality. Augmented Reality is seen to provide an important step forward in the ability to quickly and easily visualise urban planning and urban design information. Thirdly, the role of visual communication of planning data through GIS is examined in terms of desktop, three-dimensional and Internet-based GIS systems. The evolution to Internet GIS is seen as a critical component in the development of virtual cities which will allow urban planners and urban designers to visualise and model the complexity of the built environment in networked virtual reality. Finally, a viewpoint is put forward of the Virtual City, linking Internet GIS with photorealistic multi-user Virtual Worlds. At present there are constraints on how far virtual cities can be developed, but a view is provided on how these networked virtual worlds are developing to aid visual communication in urban planning and urban design.
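
    Solid and geometric modelling of urban form, as surveyed above, commonly starts from 2D building footprints extruded to a measured height. The Python sketch below is an illustration of that basic step only (it is not taken from the report or the Bath and Glasgow models): it extrudes an assumed footprint polygon into the vertices and faces of a prism that a VRML or WebGL scene could render.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def extrude_footprint(footprint: List[Point], height: float):
    """Turn a 2D building footprint into prism vertices, quad wall faces,
    and a roof polygon (faces are given as tuples of vertex indices)."""
    n = len(footprint)
    base = [(x, y, 0.0) for x, y in footprint]     # ground-level ring
    top = [(x, y, height) for x, y in footprint]   # roof-level ring
    vertices = base + top
    walls = []
    for i in range(n):
        j = (i + 1) % n
        # Quad between base edge (i, j) and the matching top edge.
        walls.append((i, j, n + j, n + i))
    roof = tuple(range(n, 2 * n))
    return vertices, walls, roof

if __name__ == "__main__":
    # Made-up rectangular footprint (metres) and height, purely for illustration.
    footprint = [(0.0, 0.0), (20.0, 0.0), (20.0, 12.0), (0.0, 12.0)]
    vertices, walls, roof = extrude_footprint(footprint, height=9.0)
    print(len(vertices), "vertices,", len(walls), "wall quads, roof face:", roof)
```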

    Tangible user interfaces : past, present and future directions

    In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from cognitive sciences, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.

    A Study of Interaction, Visual Canvas, and Immersion in AR Design: A DSR Approach

    Augmented reality (AR) as an innovative technology has changed the way people use technology for interaction and communication. While researchers have studied the application of AR, research on AR as a communication medium remains scant. In this study, we investigate the effect of AR factors (namely, interaction, visual canvas/cues, and immersion) on AR-mediated communication. We apply design science research (DSR) guidelines to design, develop, and evaluate an AR artifact. We derive the design elements based on interactivity, media naturalness, and immersion theories and develop the AR artifact as a mobile app in an iterative manner. We evaluate the design product through the informed arguments and scenarios method, and the design process by assessing its conformance to DSR principles. We show that the design elements of the AR factors, namely interaction (user controls, contextual tasks, and ergonomics), visual canvas/cues (realistic 3D models, visual and audio cues, and aesthetics), and immersion (diverse components), play a critical role in AR-mediated communication. Furthermore, high-quality product visuals and interactive user controls give users a good AR experience. From a practice perspective, AR app designers may adopt the design process we used in our study to generate AR experiences that fully exploit AR media's communication affordances. We contribute to knowledge by using DSR guidelines for designing and developing AR as a communication medium.
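
    To make the factor/element structure above more tangible, the Python sketch below arranges the study's three AR factors and their design elements into a simple coverage checklist that could accompany iterative development of an AR artifact; the factor and element names come from the abstract, while the tick marks and the checklist idea itself are illustrative assumptions rather than the study's method.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ARFactor:
    """One AR factor and the design elements that realise it."""
    name: str
    elements: List[str]
    implemented: Dict[str, bool] = field(default_factory=dict)  # hypothetical status

    def coverage(self) -> float:
        """Fraction of this factor's design elements the prototype currently covers."""
        if not self.elements:
            return 0.0
        return sum(self.implemented.get(e, False) for e in self.elements) / len(self.elements)

# Factors and elements as named in the study; the implementation status is made up.
FACTORS = [
    ARFactor("interaction",
             ["user controls", "contextual tasks", "ergonomics"],
             {"user controls": True, "contextual tasks": True, "ergonomics": False}),
    ARFactor("visual canvas/cues",
             ["realistic 3D models", "visual and audio cues", "aesthetics"],
             {"realistic 3D models": True, "visual and audio cues": True, "aesthetics": True}),
    ARFactor("immersion",
             ["diverse components"],
             {"diverse components": False}),
]

if __name__ == "__main__":
    for factor in FACTORS:
        print(f"{factor.name}: {factor.coverage():.0%} of design elements covered")
```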

    PainDroid: An android-based virtual reality application for pain assessment

    Earlier studies in the field of pain research suggest that few efficient interventions currently exist in response to the exponential increase in the prevalence of pain. In this paper, we present an Android application (PainDroid) with multimodal functionality that could be enhanced with Virtual Reality (VR) technology, designed to improve the assessment of this notoriously difficult medical concern. PainDroid has been evaluated for its usability and acceptability with a pilot group of potential users and clinicians, with initial results suggesting that it can be an effective and usable tool for improving the assessment of pain. Participant experiences indicated that the application was easy to use, and its potential was similarly appreciated by the clinicians involved in the evaluation. Our findings may be of considerable interest to healthcare providers, policy makers, and other parties actively involved in the area of pain and VR research.