
    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when it is accurately developed, including technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect and discipline of the Metaverse and lack a holistic view of the entire process. To this end, a more holistic, multi-disciplinary, in-depth, academic- and industry-oriented review is required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications, and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. For each of these components, we also examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies in supporting decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date and allows users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and identify their opportunities for contribution.

    Merging the Real and the Virtual: An Exploration of Interaction Methods to Blend Realities

    We investigate, build, and design interaction methods to merge the real with the virtual. An initial investigation looks at spatial augmented reality (SAR) and its effects on pointing with a real mobile phone. A study reveals a set of trade-offs between the raycast, viewport, and direct pointing techniques. To further investigate the manipulation of virtual content within a SAR environment, we design an interaction technique that utilizes the distance at which a user holds a mobile phone away from their body. Our technique enables users to push virtual content from a mobile phone to an external SAR environment, interact with that content, rotate, scale, and translate it, and pull it back into the mobile phone. This is all done in a way that ensures seamless transitions between the real environment of the mobile phone and the virtual SAR environment. To investigate the issues that occur when the physical environment is hidden by a fully immersive virtual reality (VR) HMD, we design and investigate a system that merges a real-time 3D reconstruction of the real world with a virtual environment. This allows users to freely move, manipulate, observe, and communicate with people and objects situated in their physical reality without losing their sense of immersion or presence inside the virtual world. A study with VR users demonstrates the affordances provided by the system and how it can be used to enhance current VR experiences. We then move to AR, to investigate the limitations of optical see-through HMDs and the problem of communicating the internal state of the virtual world to unaugmented users. To address these issues and enable new ways to visualize, manipulate, and share virtual content, we propose a system that combines a wearable SAR projector with an optical see-through HMD. Demonstrations showcase ways to utilize the projected and head-mounted displays together, such as expanding the field of view, distributing content across depth surfaces, and enabling bystander collaboration.
We then turn to videogames to investigate how spectatorship of these virtual environments can be enhanced through expanded video rendering techniques. We extract and combine additional data to form a cumulative 3D representation of the live game environment for spectators, which enables each spectator to individually control a personal view into the stream while in VR. A study shows that users prefer spectating in VR compared with a comparable desktop rendering.
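The raycast technique compared in the pointing study reduces to intersecting a pointing ray with a display surface. A minimal sketch of that geometry in Python, as a generic illustration rather than the thesis's actual implementation:

```python
import numpy as np

def raycast_to_surface(origin, direction, surface_point, surface_normal):
    """Intersect a pointing ray (e.g. cast from a phone's tracked pose) with
    a planar projection surface. Returns the hit point, or None when the ray
    is parallel to the surface or the surface lies behind the pointer."""
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    n = np.asarray(surface_normal, dtype=float)
    denom = d @ n
    if abs(denom) < 1e-9:          # ray parallel to the surface
        return None
    t = ((np.asarray(surface_point, dtype=float) - o) @ n) / denom
    if t < 0:                      # intersection behind the ray origin
        return None
    return o + t * d
```

A pointer at the origin aiming down -z hits a wall at z = -2 exactly at (0, 0, -2); rays aimed away from or parallel to the wall report no hit.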

    Modelling, Test and Practice of Steel Structures

    This reprint provides an international forum for the presentation and discussion of the latest developments in structural-steel research and its applications. The topics of this reprint include the modelling, testing, and practice of steel structures and steel-based composite structures. A total of 17 high-quality, original papers dealing with all aspects of steel-structures research, including modelling, testing, and construction research on material properties, components, assemblages, connections, and structural behavior, are included.

    Immersive Participation: Futuring, Training Simulation and Dance and Virtual Reality

    Dance knowledge can inform the development of scenario design in immersive digital simulation environments by strengthening a participant’s capacity to learn through the body. This study engages with processes of participatory practice that question how the transmission and transfer of dance knowledge/embodied knowledge in immersive digital environments is activated and applied in new contexts. These questions are relevant in both arts and industry and have the potential to add value and knowledge through cross-disciplinary collaboration and exchange. This thesis consists of three different research projects, all focused on observation, participation, and interviews with experts on embodiment in digital simulation. The projects were chosen to provide a range of perspectives across dance, industry, and futures studies. Theories of embodied cognition, in particular the notions of the extended body, distributed cognition, enactment, and mindfulness, offer critical lenses through which to explore the relationship of embodied integration and participation within immersive digital environments. These areas of inquiry lead to the consideration of how language from the field of computer science can assist in describing somatic experience in digital worlds, through a discussion of the emerging concepts of mindfulness, wayfinding, guided movement, and digital kinship. These terms serve as an example of how the mutability of language became part of the process, as terms applied in disparate disciplines were understood within varying contexts. The analytic tools focus on applying a posthuman view, speculation through a futures ethnography, and a cognitive ethnographical approach to my research project. These approaches allowed me to examine an ecology of practices in order to identify methods and processes that can facilitate the transmission and transfer of embodied knowledge within a community of practice.
The ecological components include dance, healthcare, transport, education, and human/computer interaction. These fields drove the data collection from a range of sources, including academic papers, texts, specialists’ reports, scientific papers, interviews, and conversations with experts and artists. The aim of my research is to contribute both a theoretical and a speculative understanding of processes, as well as tools applicable in the transmission of embodied knowledge in virtual dance and arts environments, as well as in digital simulation across industry. Processes were understood theoretically through established studies in embodied cognition applied to work-based training, reinterpreted through my own movement study. Futures methodologies paved the way for speculative processes and analysis. Tools to choreograph scenario design in immersive digital environments were identified through the recognition of cross-purpose language such as mindfulness, wayfinding, guided movement, and digital kinship. Put together, the major contribution of this research is a greater understanding of the value of dance knowledge applied to simulation, developed through theoretical and transformational processes and creative tools.

    Interaction Illustration Taxonomy: Classification of Styles and Techniques for Visually Representing Interaction Scenarios

    Static illustrations are ubiquitous means of representing interaction scenarios. Across papers and reports, these visuals demonstrate people's use of devices, explain systems, or show design spaces. Creating such figures is challenging, and very little is known about the overarching strategies for visually representing interaction scenarios. To address this, we contribute a unified taxonomy of design elements that compose such figures. In particular, we provide a detailed classification of Structural and Interaction strategies, such as composition, visual techniques, dynamics, representation of users, and many others, all in the context of the type of scenario. This taxonomy can inform researchers' choices when creating new figures by providing a concise synthesis of visual strategies and revealing approaches they were not aware of before. Furthermore, to support the community in creating further taxonomies, we also provide three open-source software tools facilitating the coding process and the visual exploration of the coding scheme.

    Enhanced Virtuality: Increasing the Usability and Productivity of Virtual Environments

    With steadily increasing display resolution, more accurate tracking, and falling prices, virtual reality (VR) systems are on the verge of establishing themselves in the market. Various tools help developers create complex interactions with multiple users within adaptive virtual environments. However, the spread of VR systems also brings additional challenges: diverse input devices with unfamiliar shapes and button layouts hinder intuitive interaction. Moreover, the limited feature set of existing software forces users to fall back on conventional PC- or touch-based systems. Collaborating with other users at the same location also poses challenges regarding the calibration of different tracking systems and collision avoidance. In remote collaboration, interaction is further affected by latency and connection loss. Finally, users have different requirements for the visualization of content, e.g. size, orientation, color, or contrast, within virtual worlds. Strictly replicating real environments in VR wastes potential and cannot accommodate users' individual needs. To tackle these problems, this thesis presents solutions in the areas of input, collaboration, and augmentation of virtual worlds and users, aimed at increasing the usability and productivity of VR. First, PC-based hardware and software are carried over into the virtual world to preserve the familiarity and feature set of existing applications in VR. Virtual stand-ins for physical devices, e.g. keyboard and tablet, and a VR mode for applications allow users to transfer real-world skills into the virtual world.
Furthermore, an algorithm is presented that enables the calibration of multiple co-located VR devices with high accuracy, low hardware requirements, and little effort. Since VR headsets hide the user's real surroundings, the relevance of a full-body avatar visualization for collision avoidance and remote collaboration is demonstrated. In addition, personalized spatial and temporal modifications are presented that increase users' usability, work performance, and social presence. Discrepancies between the virtual worlds that arise from personal adaptations are compensated for by avatar redirection methods. Finally, some of the methods and findings are integrated into an example application to illustrate their practical applicability. This thesis shows that virtual environments can build on real-world skills and experiences to ensure familiar and easy interaction and collaboration between users. Moreover, individual augmentations of virtual content and avatars make it possible to overcome real-world limitations and enhance the experience of VR environments.
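The co-located calibration problem mentioned above, aligning the coordinate frames of several tracking systems, is commonly posed as rigid point-set registration. A minimal sketch using the classic Kabsch algorithm (a standard technique; the thesis's actual algorithm and its accuracy tricks may differ):

```python
import numpy as np

def align_tracking_spaces(src_pts, dst_pts):
    """Estimate the rigid transform (R, t) that maps corresponding points
    measured in one tracking system's frame onto another's, via the Kabsch
    algorithm: SVD of the cross-covariance of the centered point sets."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

Given a handful of shared reference points (e.g. a controller shown to both systems), `R` and `t` then map every pose from one system's space into the other's.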

    Promoting Reality Awareness in Virtual Reality through Proxemics

    Head-mounted virtual reality (VR) systems provide fully immersive experiences and completely isolate users from the outside world, which can place them in unsafe situations. Existing research has proposed different alert-based solutions to address this. Our work builds on these studies of notification systems for VR environments from a different perspective. We focus on: (i) exploring alert systems that notify VR users about non-immersed bystanders in socially related, non-critical interaction contexts; (ii) understanding how best to provide awareness of non-immersed bystanders while maintaining presence and immersion within the virtual environment (VE). To this end, we developed single and combined alert cues, leveraging proxemics, perception channels, and push/pull approaches, and evaluated them via two user studies. Our findings indicate a strong preference for maintaining immersion and for combined audio and visual cues, and for push and pull notification techniques that evolve dynamically with proximity.
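Cues that "evolve dynamically with proximity" can be approximated by mapping bystander distance onto Hall's proxemic zones. The thresholds and cue names below are illustrative assumptions, not values reported in the paper:

```python
def alert_level(distance_m):
    """Map a bystander's distance (meters) to an escalating alert cue,
    loosely following Hall's proxemic zones. Thresholds and cue names are
    illustrative assumptions for the sketch, not the study's parameters."""
    if distance_m <= 0.45:   # intimate zone: strongest, combined cue
        return "audio+visual"
    if distance_m <= 1.2:    # personal zone: visual indicator
        return "visual"
    if distance_m <= 3.6:    # social zone: subtle ambient hint
        return "ambient"
    return "none"            # public zone: no interruption
```

Polling this per frame against the tracked bystander position would let a cue fade in gradually as someone approaches, rather than interrupting immersion all at once.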

    Development of an Augmented Reality musical instrument

    Nowadays, Augmented Reality (AR) and Virtual Reality (VR) are concepts of which people are becoming more and more aware, due to their application in the video-game industry (especially in the case of VR). This rise is partly due to a decrease in the cost of Head-Mounted Displays (HMDs), which are consequently becoming more accessible to the public and to developers worldwide. These novelties, along with the frenetic development of Information Technologies applied to essentially all markets, have also made digital artists and manufacturers aware of the never-ending interaction possibilities these paradigms provide, and a variety of systems offering innovative creative capabilities have appeared. Owing to the author's personal interest in music and the technologies surrounding its creation by digital means, this document covers the application of the Virtuality-Reality Continuum paradigms (VR and AR) to the field of interfaces for musical expression. More precisely, it covers the development of an electronic drum kit that integrates Arduino-compatible hardware with a 3D visualization application (developed on top of Unity) to create a complete, functioning musical instrument. The system presented in this document leverages three-dimensional visual feedback together with tangible interaction based on hitting, which is directly translated into sound and visuals in the sound-generation application. Furthermore, the present paper provides an in-depth study of the multiple technologies and areas that are ultimately applied to the target system itself. Hardware concerns, timing requirements, approaches to the creation of NIMEs (New Interfaces for Musical Expression), Virtual Musical Instrument (VMI) design, musical-data transmission protocols (MIDI and OSC), and 3D modelling constitute the fundamental topics discussed throughout the document.
At the end of this paper, the conclusions reflect on the difficulties encountered during the project, the unfulfilled objectives, and the deviations from the initial concept that occurred during the development process. Future work paths are also listed and briefly described, and personal comments are included, as well as modest advice aimed at readers interested in tackling an ambitious project on their own. Nowadays, the concepts of Augmented Reality (AR) and Virtual Reality (VR) are increasingly familiar to the general public, largely owing to their application in video games, where development for HMDs is booming. This popularity is due in large part to the falling cost of these devices, which are becoming ever more accessible to the public and to developers around the world. These developments, together with the frenetic growth of the IT industry, have attracted the attention of artists and companies, who have seen in these paradigms (VR and AR) an opportunity to provide new and unlimited forms of interaction and artistic creation. Owing to the author's personal interest in music and the technologies that enable musical creation by digital means, this document explores the application of the paradigms of Milgram's Virtuality-Reality Continuum (AR and VR) to the field of interfaces for musical creation. Specifically, this thesis details the development of an electronic drum kit that combines a tangible interface built with Arduino-compatible hardware with a sound-generation and visualization application developed using Unity as its base.
This system pursues natural user interaction by integrating the hardware into a pair of drumsticks, which detect strikes against any kind of surface and convert them into MIDI messages that the sound-generation system uses to provide both visual and auditory feedback to the user; the system is thus distinguished by favoring an interaction that allows physically striking objects (e.g. a bed), whereas other similar systems base their interaction on "air-drumming". In addition, the system seeks to address some of the main drawbacks associated with drummers and their often troublesome instrument, such as space limitations, the lack of flexibility in the sounds that can be generated, and the high cost of the equipment. The document also details various aspects of the described system, giving the reader a complete overview of systems similar to the one proposed. Likewise, the most important aspects of the project's development are described, such as musical-data transmission protocols (MIDI and OSC), control algorithms, design guidelines for interfaces for musical creation (NIMEs), and 3D modelling. A full software engineering process is included to maintain formality and ensure a more organized development, and the methodology used for this process is discussed. Finally, the document reflects on the difficulties encountered, lists possibilities for future work, and closes with some personal conclusions derived from this research.
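The strike-to-MIDI step described above can be sketched as follows. The velocity scaling and channel choice are assumptions for illustration; the 3-byte Note On layout (status, note number, velocity) is standard MIDI:

```python
def hit_to_midi(note, strike_magnitude, channel=9, full_scale=4.0):
    """Convert a detected stick strike into a 3-byte MIDI Note On message.
    `strike_magnitude` is an arbitrary sensor reading (e.g. accelerometer
    peak); `full_scale` is an assumed full-scale value used to map it into
    MIDI's 1-127 velocity range. Channel index 9 (MIDI channel 10) is the
    General MIDI percussion channel."""
    velocity = max(1, min(127, round(127 * strike_magnitude / full_scale)))
    status = 0x90 | (channel & 0x0F)   # Note On status + channel nibble
    return bytes([status, note & 0x7F, velocity])
```

For example, a full-strength strike on note 38 (General MIDI acoustic snare) yields the bytes `0x99 0x26 0x7F`, which can be written directly to a MIDI output port or wrapped in an OSC message.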