
    Designing and evaluating avatar biosignal visualization techniques in social virtual reality

    Social VR is the application of virtual reality that supports remote social interaction in virtual spaces. Users communicate and interact with others in the social VR environment through avatars: virtual anthropomorphic characters that represent humans in virtual worlds. In addition, the development of HMDs and commercially available motion capture systems enables avatars in the virtual environment to reflect users' real-time motions, and even their facial expressions. However, avatars still lack an indication of biofeedback (e.g., body temperature, breathing, heart rate, muscle contraction), which serves as a social cue for communication in reality. While some features, such as emojis, support users in expressing their feelings or emotions for richer communication, the missing information often results in miscommunication in the virtual space and remains a barrier to a fully immersive experience in social VR. This project proposes a concept for visualizing avatars' biosignals in social virtual reality to enable richer interaction. With technologies available to capture and reflect accurate biofeedback in real time, we explore ways to map users' real bio states onto their avatars in the virtual world. The project starts with user research to understand current user behaviors in social VR spaces and users' perspectives on sharing biosignals. Based on the requirements gathered from this study, the scope of the project is narrowed down to a 'watching entertainment' scenario, and ways to visualize biosignals on avatars are explored in a co-design session with designers. After that, four visualization techniques for two biosignals (heart rate and breathing rate) are prototyped in a VR jazz bar setting.
A user study is then conducted with 16 pairs (32 participants in total) to test and compare the effects of each biosignal visualization technique while watching entertainment with a companion. The results show that embodied visualizations are the most understandable and least distracting of the four methods. Finally, we discuss the limitations of the research and provide recommendations on biosignal visualization and on conducting design research.
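The abstract describes mapping a user's real-time biosignals onto visual properties of an avatar. A minimal sketch of one such mapping, assuming a hypothetical rendering loop that samples a heart-rate sensor each frame (the function name and the choice of a raised-cosine pulse are illustrative, not taken from the paper):

```python
import math

def pulse_intensity(heart_rate_bpm: float, t: float) -> float:
    """Map a heart rate to a 0..1 glow intensity that pulses at the same rate.

    heart_rate_bpm: measured beats per minute (e.g. from a chest strap).
    t: elapsed time in seconds since rendering started.
    """
    beats_per_second = heart_rate_bpm / 60.0
    # A raised cosine gives a smooth pulse peaking once per heartbeat;
    # the renderer could use this value to scale an avatar's chest glow.
    return 0.5 * (1.0 - math.cos(2.0 * math.pi * beats_per_second * t))
```

At 60 bpm the intensity completes one full pulse per second, so an observer can read a companion's heart rate directly off the avatar.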

    VRMoViAn - An Immersive Data Annotation Tool for Visual Analysis of Human Interactions in VR

    Understanding human behavior in virtual reality (VR) is a key component of developing intelligent systems that enhance human-focused VR experiences. The ability to annotate human motion data is a very useful way to analyze and understand human behavior. However, due to the complexity and multi-dimensionality of human activity data, software is needed that can display the data comprehensibly and support intuitive annotation for developing machine learning models able to recognize and assist human motion in VR (e.g., remote physical therapy). Although past research has improved VR data visualization, little emphasis has been placed on VR data annotation specifically for future machine learning applications. To fill this gap, we have developed a data annotation tool capable of displaying complex VR data in an expressive 3D animated format, with an easily understandable user interface that allows users to annotate and label human activity efficiently. Specifically, it converts multiple motion data files into a watchable 3D video and effectively demonstrates body motion, including animated eye tracking of the player in VR, and showcases hand-object interactions with level-of-detail visualization features. The graphical user interface allows users to interact with and annotate VR data just as they do with other video playback tools. Our next step is to develop and integrate machine-learning-based clustering to automate data annotation. A user study is planned to evaluate the tool's user-friendliness, its effectiveness in assisting with visualizing and analyzing human behavior, and its ability to easily and accurately annotate real-world datasets
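The core data structure behind such a tool is a recording with time-interval labels attached to it. A minimal sketch, assuming a simple interval-based annotation model (the class and method names are hypothetical, not from the tool described above):

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    start: float  # seconds into the recording
    end: float
    label: str    # e.g. "reach", "grasp"

@dataclass
class MotionRecording:
    frames: list                                   # per-frame body/eye/hand samples
    annotations: list = field(default_factory=list)

    def annotate(self, start: float, end: float, label: str) -> None:
        """Attach a label to a time interval, as a user would from the playback UI."""
        if end <= start:
            raise ValueError("annotation must span a positive interval")
        self.annotations.append(Annotation(start, end, label))

    def labels_at(self, t: float) -> list:
        """All labels whose interval covers time t (intervals may overlap)."""
        return [a.label for a in self.annotations if a.start <= t <= a.end]
```

Labeled intervals exported this way pair each motion segment with a class name, which is exactly the supervised training format a downstream activity-recognition model needs.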

    Tadpole VR: virtual reality visualization of a simulated tadpole spinal cord

    Recent advances in the “developmental” approach to neural networks (combining experimental study with computational modelling) produce data sets that are increasingly large in both complexity and size. This poses a significant challenge in analyzing, visualizing and understanding not only the spatial structure but also the behavior of such networks. This paper describes a virtual reality application for visualizing two biologically accurate computational models of the anatomical structure of a neural network comprising 1,500 neurons and over 80,000 connections. The visualization enables a user to observe the complex spatio-temporal interplay between seven distinct types of neurons culminating in an observable swimming pattern. We present a detailed description of the design approach for the virtual environment, based on a set of initial requirements, followed by the implementation and optimization steps. Lastly, we present the results of a pilot usability study on how confident participants are in their understanding of how the alternating firing pattern between the two sides of the tadpole’s body generates swimming motion
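The swimming pattern the visualization conveys comes from the two sides of the body bursting in antiphase. A toy sketch of that alternation, assuming a simple half-centre model with a fixed cycle period (this is an illustration of the concept, not the paper's biologically accurate model):

```python
def alternating_pattern(n_cycles: int, period: float, dt: float) -> list:
    """Return which side of the body is firing at each time step.

    Half-centre assumption: the left and right populations burst in
    antiphase, each active for half of every cycle of length `period`.
    """
    steps = int(n_cycles * period / dt)
    pattern = []
    for i in range(steps):
        phase = (i * dt) % period
        pattern.append("left" if phase < period / 2 else "right")
    return pattern
```

Animating each neuron's color from a schedule like this is what lets a viewer in VR see the left-right alternation propagate into a swimming motion.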

    Impact of model fidelity in factory layout assessment using immersive discrete event simulation

    Discrete Event Simulation (DES) can help speed up the layout design process, and it offers further benefits when combined with Virtual Reality (VR). The latest technology, Immersive Virtual Reality (IVR), immerses users in virtual prototypes of their future manufacturing plants, potentially aiding decision-making. This work evaluates the impact of visual fidelity, the degree to which objects in VR conform to the real world, using an IVR visualisation of the DES model of an actual shop floor. User studies were performed using scenarios populated with low- and high-fidelity models. Study participants carried out four tasks representative of layout decision-making. Limitations of existing IVR technology were found to cause motion sickness. The results indicate that, for the particular group of naïve modellers studied, there is no significant difference in benefits between low and high fidelity, suggesting that low-fidelity VR models may be more cost-effective for this group
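At its core, the DES engine driving such a visualisation is a priority queue of timestamped events: pop the earliest event, advance the simulation clock, and let the event's handler schedule follow-up events. A minimal sketch of that loop (the function names and event shape are illustrative, not from the system described above):

```python
import heapq

def run_des(initial_events, horizon):
    """Minimal discrete event simulation loop.

    initial_events: list of (time, name, handler) tuples, where
    handler(time) returns a list of new (time, name, handler) events.
    Returns the log of (time, name) pairs processed up to `horizon`.
    """
    queue = list(initial_events)
    heapq.heapify(queue)          # always pop the earliest event first
    log = []
    while queue:
        time, name, handler = heapq.heappop(queue)
        if time > horizon:
            break                 # simulation clock passed the horizon
        log.append((time, name))
        for event in handler(time):
            heapq.heappush(queue, event)
    return log
```

In an IVR front end, each logged event (a part arriving at a station, a machine finishing a job) would trigger the corresponding animation on the virtual shop floor.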

    The Analysis of design and manufacturing tasks using haptic and immersive VR - Some case studies

    The use of virtual reality in interactive design and manufacture has been researched extensively, but the practical application of this technology in industry is still very much in its infancy. This is surprising: one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. One of the major but less well-known advantages of VR technology is that logging the user yields a great deal of rich data which can be used to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge. The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community and, with the advent of cheaper PC-based VR solutions, a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. With this in mind, this paper describes in detail applications of haptics in assembly, demonstrating how user task logging enables the analysis of design and manufacturing tasks at a level of detail not previously possible, as well as giving usable engineering outputs. The haptic 3D VR study uses a Phantom device and a 3D system to analyse and compare this technology against real-world user performance. This work demonstrates that detailed logging of tasks in a virtual environment offers considerable potential for understanding how virtual tasks map onto their real-world equivalents, and shows how haptic process plans can be generated in a similar manner to the conduit design and assembly planning HMD VR tool reported in Part A.
The paper concludes with the authors' view of how the use of VR systems in product design and manufacturing should evolve to enable the industrial adoption of this technology in the future
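The user task logging the paper builds on amounts to recording timestamped (task, action) events during a virtual session and analyzing them afterwards. A minimal sketch, assuming a hypothetical start/end event convention (the class and its API are illustrative, not the authors' logging system):

```python
import time

class TaskLogger:
    """Record timestamped user events during a virtual assembly task,
    then compute per-task durations for later analysis."""

    def __init__(self):
        self.events = []  # list of (timestamp, task, action) tuples

    def log(self, task: str, action: str, timestamp: float = None) -> None:
        # Default to wall-clock time; an explicit timestamp eases replay/testing.
        self.events.append(
            (timestamp if timestamp is not None else time.time(), task, action)
        )

    def duration(self, task: str) -> float:
        """Time between the first 'start' and the last 'end' for a task."""
        starts = [t for t, tk, a in self.events if tk == task and a == "start"]
        ends = [t for t, tk, a in self.events if tk == task and a == "end"]
        if not starts or not ends:
            raise ValueError(f"incomplete log for task {task!r}")
        return max(ends) - min(starts)
```

Aggregating such durations across users is one way a log like this supports the task analysis and process mapping described above.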

    Exploring the Potential of 3D Visualization Techniques for Usage in Collaborative Design

    Best practice for collaborative design demands good interaction between collaborators: the capacity to share common knowledge about the design models at hand is a basic requirement. With advancing technologies, gathering collective knowledge becomes more straightforward, as the dialogue between experts can be better supported. We explore the potential of 3D visualization techniques to become the right support tool for collaborative design, with special attention to their possible use in remote collaboration. We survey the opportunities offered by current state-of-the-art visualization techniques, from stereoscopic vision to holographic displays, and explore a classification of the various systems with respect to their tangible use in augmented reality. Appropriate interaction methods can then be selected based on the usage scenario

    Improving Big Data Visual Analytics with Interactive Virtual Reality

    For decades, the growth and volume of digital data collection have made it challenging to digest large volumes of information and extract underlying structure. Coined 'Big Data', these massive amounts of information have quite often been gathered inconsistently (e.g. from many sources, in various forms, at different rates, etc.). These factors impede not only processing the data, but also analyzing and displaying it to the user in an efficient manner. Many efforts have been made in the data mining and visual analytics communities to create effective ways to further improve analysis and extract the knowledge desired for better understanding. Our approach to improved big data visual analytics is two-fold, focusing on both visualization and interaction. Given geo-tagged information, we explore the benefits of visualizing datasets in their original geospatial domain using a virtual reality platform. After running proven analytics on the data, we represent the information in a more realistic 3D setting, where analysts can achieve enhanced situational awareness and rely on familiar perceptions to draw in-depth conclusions about the dataset. In addition, a human-computer interface that responds to natural user actions and inputs creates a more intuitive environment. Tasks can be performed to manipulate the dataset and allow users to dive deeper on request, adhering to their demands and intentions. Given the volume and popularity of social media, we developed a 3D tool visualizing Twitter on MIT's campus for analysis. Utilizing today's emerging technologies to create a fully immersive tool that promotes visualization and interaction can help ease the process of understanding and representing big data.
    Comment: 6 pages, 8 figures, 2015 IEEE High Performance Extreme Computing Conference (HPEC '15); corrected typo
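A common first step for visualizing geo-tagged posts in a 3D geospatial scene is to aggregate them into a latitude/longitude grid, with each cell's count driving the height of a 3D bar. A minimal sketch of that binning step (the function name and grid scheme are illustrative, not taken from the tool described above):

```python
def bin_geotags(points, cell_deg):
    """Aggregate geo-tagged posts into a lat/lon grid.

    points: iterable of (lat, lon) pairs in decimal degrees.
    cell_deg: cell size in degrees.
    Returns {(lat_index, lon_index): count}; counts can be rendered as
    bar heights over a map in the virtual scene.
    """
    counts = {}
    for lat, lon in points:
        # Floor-divide to a grid index so nearby posts share a cell.
        cell = (int(lat // cell_deg), int(lon // cell_deg))
        counts[cell] = counts.get(cell, 0) + 1
    return counts
```

Binning keeps the render workload bounded by the number of occupied cells rather than the number of raw posts, which matters when streaming a high-volume source such as Twitter.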