
    Navigating Immersive and Interactive VR Environments With Connected 360° Panoramas

    Emerging research is expanding the use of 360-degree spherical panoramas of real-world environments in 360 VR experiences beyond video and image viewing. However, most of these experiences are strictly guided, with few opportunities for interaction or exploration. There is a desire to develop cohesive virtual environments created with 360 VR that allow choice in navigation, rather than scripted experiences with limited interaction. Unlike standard VR, with the freedom of synthetic graphics, 360 VR poses challenges in designing appropriate user interfaces (UIs) for navigation within the limitations of fixed assets. To tackle this gap, we designed RealNodes, a software system that presents an interactive and explorable 360 VR environment. We also developed four visual guidance UIs for 360 VR navigation. The results of a pilot study showed that the choice of UI had a significant effect on task completion times, with one of our methods, Arrow, performing best. Arrow also exhibited positive but non-significant trends in average measures of preference, user engagement, and simulator sickness. RealNodes, the UI designs, and the pilot study results contribute preliminary information that inspires future investigation of how to design effective explorable scenarios in 360 VR and visual guidance metaphors for navigation in applications using 360 VR environments.
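A minimal sketch of the kind of structure such a system implies, not the actual RealNodes implementation: the explorable environment can be modeled as a graph of 360° panorama nodes, with navigation limited to explicitly connected capture points. All class names, fields, and hotspot angles below are illustrative assumptions.

```python
# Sketch (assumed design, not RealNodes itself): a 360 VR scene as a graph of
# panorama nodes, where navigation is restricted to connected capture points.
from dataclasses import dataclass, field


@dataclass
class PanoramaNode:
    name: str                      # identifier for this 360-degree capture point
    image_path: str                # equirectangular image or video for the node
    # neighbor name -> yaw angle (degrees) at which its waypoint hotspot
    # (e.g. an Arrow cue) would be rendered inside the panorama
    connections: dict[str, float] = field(default_factory=dict)


class PanoramaGraph:
    def __init__(self) -> None:
        self.nodes: dict[str, PanoramaNode] = {}

    def add_node(self, node: PanoramaNode) -> None:
        self.nodes[node.name] = node

    def connect(self, a: str, b: str, yaw_a_to_b: float, yaw_b_to_a: float) -> None:
        """Create a bidirectional walkable link between two capture points."""
        self.nodes[a].connections[b] = yaw_a_to_b
        self.nodes[b].connections[a] = yaw_b_to_a

    def navigate(self, current: str, target: str) -> PanoramaNode:
        """Move to a neighboring panorama; only connected nodes are reachable."""
        if target not in self.nodes[current].connections:
            raise ValueError(f"{target} is not reachable from {current}")
        return self.nodes[target]


# Example: two rooms joined by a hallway, explorable in either order.
graph = PanoramaGraph()
for name in ("lobby", "hallway", "lab"):
    graph.add_node(PanoramaNode(name, f"{name}.jpg"))
graph.connect("lobby", "hallway", yaw_a_to_b=90.0, yaw_b_to_a=270.0)
graph.connect("hallway", "lab", yaw_a_to_b=0.0, yaw_b_to_a=180.0)
print(graph.navigate("lobby", "hallway").name)   # -> hallway
```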

    Virtual transcendence experiences: Exploring technical and design challenges in multi-sensory environments

    In this paper, we introduce the concept of the Virtual Transcendence Experience (VTE) as a response to the interactions of several users sharing several immersive experiences through different media channels. To that end, we review the current body of knowledge that has led to the development of a VTE system. This is followed by a discussion of current technical and design challenges that could support the implementation of this concept. This discussion has informed the VTE framework (VTEf), which integrates different layers of experiences, including the role of each user and the technical challenges involved. We conclude this paper with suggestions for two scenarios and recommendations for the implementation of a system that could support VTEs.

    Design methodology for 360-degree immersive video applications

    360-degree immersive video applications for Head Mounted Display (HMD) devices offer great potential in providing engaging forms of experiential media solutions. However, design challenges emerge with this new kind of immersive media due to the 2D form of the resources used for their construction, the lack of depth, the limited interaction, and the need to address the sense of presence. In addition, the use of Virtual Reality (VR) is associated with cybersickness effects, which impose further implications for moderating motion in design tasks. This research project provides a systematic methodological approach to addressing those challenges and implications in the design of 360-degree immersive video applications. By studying and analysing methods and techniques used effectively in the areas of VR and games design, a rigorous methodological design process is proposed. This process is introduced through the specification of the iVID (Immersive Video Interaction Design) framework. The efficiency of the iVID framework and the design methods and techniques it proposes is evaluated through two phases of user studies. Two different 360-degree immersive video prototypes were created to serve the studies' purposes. The analysis of the studies led to the definition of a set of design guidelines to be followed along with the iVID framework for designing 360-degree video-based experiences that are engaging and immersive.

    Leveraging eXtended Reality & Human-Computer Interaction for User Experience in 360° Video

    Extended Reality systems have resurged as a medium for work and entertainment. While 360° video has been characterized as less immersive than computer-generated VR, its realism, ease of use, and affordability mean it is in widespread commercial use. Based on the prevalence and potential of the 360° video format, this research is focused on improving and augmenting the user experience of watching 360° video. By leveraging knowledge from Extended Reality (XR) systems and Human-Computer Interaction (HCI), this research addresses two issues affecting user experience in 360° video: Attention Guidance and Visually Induced Motion Sickness (VIMS). This research work relies on the construction of multiple artifacts to answer the defined research questions: (1) IVRUX, a tool for analysis of immersive VR narrative experiences; (2) Cue Control, a tool for creation of spatial audio soundtracks for 360° video, which also enables the collection and analysis of metrics captured from the user experience; and (3) the VIMS mitigation pipeline, a linear sequence of modules (including optical flow and visual SLAM, among others) that control parameters for visual modifications such as a restricted Field of View (FoV). These artifacts are accompanied by evaluation studies targeting the defined research questions. Through Cue Control, this research shows that non-diegetic music can be spatialized to act as orientation for users; a partial spatialization of music was deemed ineffective when used for orientation. Additionally, our results also demonstrate that diegetic sounds are used for notification rather than orientation. Through the VIMS mitigation pipeline, this research shows that a dynamic restricted FoV is statistically significant in mitigating VIMS, while maintaining desired levels of Presence. Both Cue Control and the VIMS mitigation pipeline emerged from a Research through Design (RtD) approach, where the IVRUX artifact is the product of design knowledge and gave direction to research. The research presented in this thesis is of interest to practitioners and researchers working on 360° video and helps delineate future directions in making 360° video a rich design space for interaction and narrative.
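A minimal sketch of the dynamic restricted-FoV idea described above, not the thesis's actual pipeline: an estimated motion magnitude (which the work derives from optical flow and visual SLAM) drives how far the rendered field of view is narrowed each frame. Module names, thresholds, and the smoothing constant are illustrative assumptions.

```python
# Sketch (assumed parameters): a linear module chain where per-frame motion
# magnitude narrows the rendered FoV, easing back out when motion subsides.
from dataclasses import dataclass


@dataclass
class FovState:
    fov_degrees: float     # currently rendered FoV (vignette aperture)


class MotionEstimator:
    """Stand-in for the optical-flow / visual-SLAM stage: motion per frame."""
    def estimate(self, frame_motion: float) -> float:
        return max(0.0, frame_motion)


class FovRestrictor:
    """Maps motion magnitude to a target FoV and eases toward it each frame."""
    def __init__(self, full_fov: float = 100.0, min_fov: float = 60.0,
                 sensitivity: float = 4.0, smoothing: float = 0.2) -> None:
        self.full_fov = full_fov
        self.min_fov = min_fov
        self.sensitivity = sensitivity
        self.smoothing = smoothing

    def step(self, state: FovState, motion: float) -> FovState:
        target = max(self.min_fov, self.full_fov - self.sensitivity * motion)
        eased = state.fov_degrees + self.smoothing * (target - state.fov_degrees)
        return FovState(fov_degrees=eased)


# Example: FoV narrows while camera motion is high, then recovers.
estimator, restrictor = MotionEstimator(), FovRestrictor()
state = FovState(fov_degrees=100.0)
for raw_motion in [0.0, 2.0, 8.0, 8.0, 1.0, 0.0]:
    state = restrictor.step(state, estimator.estimate(raw_motion))
    print(f"motion={raw_motion:4.1f}  fov={state.fov_degrees:5.1f}")
```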

    Effects of Character Guide in Immersive Virtual Reality Stories

    Bringing cinematic experiences from traditional film screens into Virtual Reality (VR) has become an increasingly popular form of entertainment in recent years. VR provides viewers with an unprecedented film experience that allows them to freely explore the environment and even interact with virtual props and characters. For the audience, this kind of experience raises their sense of presence in a different world, and may even stimulate full immersion in the story scenario. However, unlike traditional film-making, where the audience is completely passive in following the director's storytelling decisions, the greater freedom in VR might cause viewers to get lost halfway through watching the series of events that build up a story. Therefore, striking a balance between user interaction and narrative progression is a big challenge for filmmakers. To assist in organizing the research space, we presented a media review and the resulting framework to characterize the primary differences among different variations of film, media, games, and VR storytelling. The evaluation in particular provided us with knowledge closely associated with story-progression strategies and gaze redirection methods for interactive content in the commercial domain. Following the existing VR storytelling framework, we then approached the problem of guiding the audience through the major events of a story by introducing a virtual character as a travel companion who provides assistance in directing the viewer's focus to the target scenes. The presented research explored a new technique that allowed a separate virtual character to be overlaid on top of an existing 360-degree video such that the added character reacts based on head-tracking data to help indicate to the viewer the core focal content of the story. The motivation behind this research is to assist directors in using a virtual guiding character to increase the effectiveness of VR storytelling, ensuring that viewers fully understand the story by following the complete sequence of events, and possibly realize a rich literary experience. To assess the effectiveness of this technique, we performed a controlled experiment by applying the method in three immersive narrative experiences, each with a control condition that was free from guidance. The experiment compared three variations of the character guide: 1) no guide; 2) a guide with an art style similar to the style of the video design; and 3) a character guide with a dissimilar style. All participants viewed the narrative experiences to test whether a similar art style led to better gaze behaviors, with a higher likelihood of falling on the intended focus regions of the 360-degree range of the Virtual Environment (VE). By the end of the experiment, we concluded that adding a virtual character that was independent from the narrative had limited effects on users' gaze performance when watching an interactive story in VR. Furthermore, the implemented character's art style made very little difference to users' gaze performance as well as their level of viewing satisfaction. The primary reason could be a limitation of the implementation design. Besides this, the guiding body language designed for an animal character caused some confusion for numerous participants viewing the stories.
In the end, the character guide approach still provided insights for future directors and designers into how to draw the viewers' attention to a target point within a narrative VE, including what can work well and what should be avoided.
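A minimal sketch of the guidance logic implied above, under the assumption that the guide's behavior is driven by the angular offset between the viewer's head yaw and the story's focal region; the threshold and gesture names are illustrative, not the study's design.

```python
# Sketch (assumed logic): pick a guide-character gesture from head-tracking
# samples by comparing the viewer's yaw with the target region's yaw.
def shortest_yaw_offset(head_yaw_deg: float, target_yaw_deg: float) -> float:
    """Signed shortest angular difference in degrees, in (-180, 180]."""
    diff = (target_yaw_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    return diff if diff != -180.0 else 180.0


def guide_gesture(head_yaw_deg: float, target_yaw_deg: float,
                  attention_threshold_deg: float = 20.0) -> str:
    """Choose the guide character's action for the current head-tracking sample."""
    offset = shortest_yaw_offset(head_yaw_deg, target_yaw_deg)
    if abs(offset) <= attention_threshold_deg:
        return "idle"                    # viewer is already looking at the event
    return "point_right" if offset > 0 else "point_left"


# Example head-tracking samples: the viewer drifts away from a target at 90 deg.
for head_yaw in [85.0, 40.0, -10.0, 200.0]:
    print(head_yaw, "->", guide_gesture(head_yaw, target_yaw_deg=90.0))
```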

    Immersive 360° video for forensic education

    Across the globe, training in forensic crime scene investigation is a vital part of the overall training process within police academies and forensic programs. However, the exposure of trainee forensic officers to real-life scenes by instructors is minimal, due to the delicate nature of the information presented within them and the overall difficulty of forensic investigations. Virtual Reality (VR) is a computer technology utilising headsets to produce lifelike images, sounds and perceptions simulating physical presence inside a virtual setting for the user. The user is able to look around the virtual world and often interact with virtual landscapes or objects. VR headsets are head-mounted goggles with a screen in front of the eyes (Burdea & Coiffet 2003). The use of VR varies widely, from personal gaming to classroom learning; uses also include computerised tools that are used solely online. Within forensic science, VR is currently used in several capacities, including the training and examination of new forensic officers. However, there has been minimal review and validation of the effectiveness of VR for teaching forensic investigation. This is surprising, as VR has seen rapid expansion in education across many fields over the past few years. Even though VR could enhance forensic training by offering another, perhaps more versatile and engaging, way of learning, no dedicated VR application has yet been commercially implemented for forensic examination education. Research into VR is a fairly young field; however, the technology and its use are still growing rapidly, and the improvement of interactive tools is inevitably having an impact on all facets of learning and teaching.

    The power of immersive technologies: a sociopsychological analysis of the relationship between immersive environments, storytelling, sentiment, and the impact on user experience

    This dissertation initially focused on exploring the potential of immersive technologies for the distant future. However, the emergence of the COVID-19 virus in late 2019 disrupted the world, causing a pause in many areas. Nevertheless, the butterfly effect of the pandemic spurred the development of immersive technologies, resulting in the rise of the metaverse, web3, non-fungible tokens (NFTs), and avatars, which are gaining increasing popularity. The excitement for the metaverse is growing in both academia and industry, leading to new avenues in research, digital marketing, video games, tourism, and social media. This dissertation explores this rapidly emerging technological revolution and its effects on user experience (UX).

    Collaborative interaction in immersive 360º experiences

    Video playback systems have become more common and widely used every day. Consequently, extensions of this technology have been created to allow multi-user collaboration, so that people can watch remotely and synchronously. Well-known examples are Watch2gether, Sync Video and Netflix Party, which let us watch videos synchronously and remotely with friends. These co-viewing applications, although well developed, are limited to the typical video format and do not extend to 360° videos. The main objective of this project is therefore to expand research in this area by developing a collaborative system for 360° videos. Several efforts have already been directed at the area of 360° video, one of them being the AV360 project, an application that allows the user to view and edit this type of video with annotations and guides. The system to be integrated is a follow-up to AV360, reusing part of the technologies already implemented. To compartmentalize and facilitate the research, the following topics are considered individually: the viewing of 360° videos, collaborative systems in general, collaboration applied to virtual environments, and collaborative video systems. It is important to be aware of the advantages and disadvantages of watching a 360° video, to capture what is essential in these videos and preserve it, while also integrating the inclusion of other users. When choosing the collaborative activities to apply, it is essential to analyse the current state of collaborative systems and then narrow the research to collaboration in virtual environments and in videos. Of all the methods analysed, only those adaptable to immersive environments and to videos are chosen and developed in this project. Based on in-depth research on the subject, a collaboration system for 360° videos was created. The software allows users to watch a video simultaneously while communicating verbally and non-verbally to express themselves and share the experience of the moment. This work keeps in mind that part of the implemented ideas may be reusable for other projects involving immersive experiences.
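A minimal sketch of the synchronous co-viewing state implied by the abstract, not the project's implementation: one participant's play, pause, or seek action is applied to every client, while each client keeps (and can share) its own view orientation within the 360° frame. Class and field names are illustrative assumptions.

```python
# Sketch (assumed design): shared playback state plus per-user view orientation
# for synchronous co-viewing of a 360-degree video.
import time
from dataclasses import dataclass, field


@dataclass
class PlaybackState:
    playing: bool = False
    position_s: float = 0.0        # video position when last updated
    updated_at: float = field(default_factory=time.monotonic)

    def current_position(self) -> float:
        """Position now, extrapolated while playing so late joiners stay in sync."""
        if not self.playing:
            return self.position_s
        return self.position_s + (time.monotonic() - self.updated_at)


class SharedSession:
    def __init__(self) -> None:
        self.state = PlaybackState()
        self.orientations: dict[str, float] = {}   # user -> yaw in degrees

    def apply(self, action: str, position_s: float | None = None) -> None:
        """Apply a play, pause, or seek action issued by any participant."""
        now = time.monotonic()
        playing = self.state.playing if action == "seek" else (action == "play")
        if position_s is None:
            position_s = self.state.current_position()
        self.state = PlaybackState(playing=playing, position_s=position_s,
                                   updated_at=now)

    def update_orientation(self, user: str, yaw_deg: float) -> None:
        """Non-verbal cue: share where each participant is looking."""
        self.orientations[user] = yaw_deg % 360.0


# Example: two actions from one viewer keep the whole session aligned.
session = SharedSession()
session.apply("play")
session.update_orientation("ana", 120.0)
session.apply("seek", position_s=42.0)
print(round(session.state.current_position(), 1), session.orientations)
```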

    Disruptive approaches for subtitling in immersive environments

    The Immersive Accessibility Project (ImAc) explores how accessibility services can be integrated with 360° video, as well as new methods for enabling universal access to immersive content. ImAc is focused on inclusivity and addresses the needs of all users, including those with sensory or learning disabilities, of all ages, and takes language and user preferences into account. The project focuses on moving away from the constraints of existing technologies and explores new methods for creating a personal experience for each consumer. It is not good enough to simply retrofit subtitles into immersive content: this paper attempts to disrupt the industry with new and often controversial methods. This paper provides an overview of the ImAc project and proposes guiding methods for subtitling in immersive environments. We discuss the current state of the art for subtitling in immersive environments and the rendering of subtitles in the user interface within the ImAc project. We then discuss new experimental rendering modes that have been implemented, including a responsive subtitle approach, which dynamically re-blocks subtitles to fit the available space, and explore alternative rendering techniques where the subtitles are attached to the scene.
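A minimal sketch of the responsive re-blocking idea, assuming a simple per-line character budget rather than ImAc's actual layout rules: the same subtitle cue is re-wrapped for whatever safe area is currently available, and overflow is flagged rather than silently dropped.

```python
# Sketch (assumed rules): re-wrap a subtitle cue to the currently available
# width, capping the block at a fixed number of lines.
import textwrap


def reblock_subtitle(text: str, max_chars_per_line: int, max_lines: int = 2) -> list[str]:
    """Re-block a cue for the current safe area; mark overflow for re-timing."""
    lines = textwrap.wrap(text, width=max_chars_per_line)
    if len(lines) <= max_lines:
        return lines
    # Too much text for this layout: keep what fits and flag the remainder,
    # which a real player would push into the next cue instead of truncating.
    kept = lines[:max_lines]
    kept[-1] = kept[-1] + " …"
    return kept


cue = "The lighthouse keeper waves at the drone as it circles the island twice."
for width in (42, 28, 18):                       # wide, medium, narrow safe areas
    print(f"width={width}: {reblock_subtitle(cue, width)}")
```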

    Using Immersive Virtual Reality for Student Learning: A Qualitative Case Study

    The prominence of virtual reality (VR) in the educational field has grown in recent years due to increased availability and lower costs. I conducted a global study regarding how pioneering K-12 teachers use VR to engage students in learning activities. The purpose of this qualitative case study was to identify how and why teachers used VR for student learning. Fifteen educators from five continents participated in the study. They described their initial VR experiences and how these experiences motivated them to pursue ways to implement VR in their disciplinary fields. I used the video conference tool "Zoom" to conduct interviews. Participants described the "spark" of discovery and recognition of VR for learning. They explained measures to obtain permission, approaches to funding, and the implementation process. Participants developed structures for student learning, transformed physical spaces, and invented pedagogies to ensure positive learning experiences. Participants provided optimal immersive experiences by repurposing content and adopting other applications to achieve learning goals. Three levels of incorporating VR for student learning were identified: (1) exploration; (2) acquiring and applying disciplinary knowledge; and (3) content creation and interactive problem solving. The quality of headsets dictated the level(s) of implementation. Dewey's (1923) experiential learning theories as well as the Technology, Pedagogy, and Content Knowledge framework (TPACK; Mishra & Koehler, 2006) helped to interpret the data. Successful implementation requires collaboration, pedagogical modifications, and administrative support. This study highlights successful methods and practices for others considering the implementation of VR for K-12 student learning. Keywords: TPACK, Dewey, Virtual Reality (VR), Innovation, Experiential Learning