
    Hyperion: A 3D Visualization Platform for Optical Design of Folded Systems

    Hyperion is a 3D visualization platform for optical design. It provides a fully immersive, intuitive, and interactive 3D user experience by leveraging existing AR/VR technologies, and it enables the visualization of models of folded freeform optical systems in a dynamic 3D environment. The frontend user experience is supported by the computational ray-tracing engine of Eikonal+, an optical design research software currently under development. We have built a cross-platform, lightweight version of Eikonal+ that can communicate with any user interface or other scientific software. We have also demonstrated a prototype of the Hyperion 3D user experience using a HoloLens AR display.

    Presence and rehabilitation: toward second-generation virtual reality applications in neuropsychology

    Virtual Reality (VR) offers a blend of attractive attributes for rehabilitation. The most exploited is its ability to create a 3D simulation of reality that can be explored by patients under the supervision of a therapist. In fact, VR can be defined as an advanced communication interface based on interactive 3D visualization, able to collect and integrate different inputs and data sets into a single real-like experience. However, "treatment is not just fixing what is broken; it is nurturing what is best" (Seligman & Csikszentmihalyi). For rehabilitators, this statement supports the growing interest in the influence of positive psychological states on objective health care outcomes. This paper introduces a bio-cultural theory of presence linking the state of optimal experience defined as "flow" to the virtual reality experience. This suggests the possibility of using VR for a new breed of rehabilitative applications focused on a strategy defined as transformation of flow. In this view, VR can be used to trigger a broad empowerment process within the flow experience induced by a high sense of presence. The link between its experiential and simulative capabilities may transform VR into the ultimate rehabilitative device. Nevertheless, further research is required to explore in more depth the link between cognitive processes, motor activities, presence, and flow.

    Evaluation of haptic virtual reality user interfaces for medical marking on 3D models

    Three-dimensional (3D) visualization has been widely used in computer-aided medical diagnosis and planning. To interact with 3D models, current user interfaces in medical systems rely mainly on traditional 2D interaction techniques employing a mouse and a 2D display. Haptic virtual reality (VR) interfaces are a promising alternative, enabling intuitive and realistic 3D interaction through VR equipment and haptic devices. However, the practical usability of haptic VR interfaces in this medical field remains unexplored. In this study, we propose two haptic VR interfaces, a vibrotactile VR interface and a kinesthetic VR interface, for medical diagnosis and planning on volumetric medical images. The vibrotactile VR interface used a head-mounted VR display as the visual output channel and a VR controller with vibrotactile feedback as the manipulation tool. Similarly, the kinesthetic VR interface used a head-mounted VR display as the visual output channel and a kinesthetic force-feedback device as the manipulation tool. We evaluated these two VR interfaces in an experiment involving medical marking on 3D models, comparing them with the present state-of-the-art 2D interface as the baseline. The results showed that the kinesthetic VR interface performed best in terms of marking accuracy, whereas the vibrotactile VR interface performed best in terms of task completion time. Overall, the participants preferred the kinesthetic VR interface for the medical task.
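
    The abstract compares the interfaces on marking accuracy without stating the exact metric. A plausible measure, sketched here as an assumption rather than the paper's method, is the Euclidean distance between a marked 3D point and a ground-truth target:

```python
def marking_error(marked, target):
    # Euclidean distance between a marked 3D point and the ground-truth
    # target, both given as (x, y, z) tuples in the same unit (e.g. mm).
    # Illustrative accuracy measure; the paper's exact metric is not
    # given in the abstract.
    return sum((a - b) ** 2 for a, b in zip(marked, target)) ** 0.5
```

Averaging this error over all markings made with each interface would yield a per-interface accuracy score for the comparison described above.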

    VRMoViAn - An Immersive Data Annotation Tool for Visual Analysis of Human Interactions in VR

    Understanding human behavior in virtual reality (VR) is a key component of developing intelligent systems that enhance human-focused VR experiences. The ability to annotate human motion data proves to be a very useful way to analyze and understand human behavior. However, due to the complexity and multi-dimensionality of human activity data, it is necessary to develop software that can display the data in a comprehensible way and can support intuitive data annotation for developing machine learning models able to recognize and assist human motions in VR (e.g., remote physical therapy). Although past research has improved VR data visualization, no emphasis has been put on VR data annotation specifically for future machine learning applications. To fill this gap, we have developed a data annotation tool capable of displaying complex VR data in an expressive 3D animated format, with an easily understandable user interface that allows users to annotate and label human activity efficiently. Specifically, it can convert multiple motion data files into a watchable 3D video and effectively demonstrate body motion, including eye tracking of the player in VR using animations, as well as showcasing hand-object interactions with level-of-detail visualization features. The graphical user interface allows the user to interact with and annotate VR data just as they do with other video playback tools. Our next step is to develop and integrate machine-learning-based clusters to automate data annotation. A user study is being planned to evaluate the tool in terms of user-friendliness and effectiveness in assisting with visualizing and analyzing human behavior, along with the ability to easily and accurately annotate real-world datasets.

    Virtual reality for 3D histology: multi-scale visualization of organs with interactive feature exploration

    Virtual reality (VR) enables data visualization in an immersive and engaging manner, and it can be used to create new ways to explore scientific data. Here, we use VR for visualization of 3D histology data, creating a novel interface for digital pathology. Our contribution includes 3D modeling of a whole organ and embedded objects of interest, fusing the models with associated quantitative features and full-resolution serial section patches, and implementing the virtual reality application. Our VR application is multi-scale in nature, covering two object levels representing different ranges of detail, namely the organ level and the sub-organ level. In addition, the application includes several data layers, including the measured histology image layer and multiple representations of quantitative features computed from the histology. In this interactive VR application, the user can set visualization properties, select different samples and features, and interact with various objects. In this work, we used whole mouse prostates (organ level) with prostate cancer tumors (sub-organ objects of interest) as example cases, and included quantitative histological features relevant to tumor biology in the VR model. Due to automated processing of the histology data, our application can easily be adapted to visualize other organs and pathologies from various origins. Our application enables a novel way of exploring high-resolution, multidimensional data for biomedical research purposes, and it can also be used in teaching and researcher training.
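
    The two object levels described above (organ and sub-organ) suggest a level-of-detail switch driven by the viewer's position. The abstract does not describe the switching rule, so the following is only a minimal sketch, with a hypothetical distance threshold, of how such a selection could work:

```python
def select_level(viewer_dist, organ_radius, threshold=2.0):
    # Pick which of the two object levels to render, based on how far the
    # viewer is from the organ model. The cutoff of 2 organ radii is an
    # illustrative assumption, not a value from the paper.
    if viewer_dist > threshold * organ_radius:
        return "organ"      # coarse whole-organ model
    return "sub-organ"      # embedded tumors and full-resolution patches
```

In practice the switch would likely be hysteretic (different thresholds for zooming in and out) to avoid flicker at the boundary.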

    Stereoscopic bimanual interaction for 3D visualization

    Virtual Environments (VE) have been widely used for several decades in research fields such as 3D visualization, education, training, and games. VEs have the potential to enhance visualization and to act as a general medium for human-computer interaction (HCI). However, limited research has evaluated virtual reality (VR) display technologies, and their monocular and binocular depth cues, for human depth perception of volumetric (non-polygonal) datasets. In addition, a lack of standardization of three-dimensional (3D) user interfaces (UI) makes it challenging to interact with many VE systems. To address these issues, this dissertation focuses on evaluating the effects of stereoscopic and head-coupled displays on depth judgment of volumetric datasets. It also evaluates two-handed view manipulation techniques that support simultaneous 7 degree-of-freedom (DOF) navigation (x, y, z + yaw, pitch, roll + scale) in a multi-scale virtual environment (MSVE). Furthermore, this dissertation evaluates techniques for auto-adjusting stereo view parameters to address stereoscopic fusion problems in an MSVE. Next, it presents a bimanual, hybrid user interface that combines traditional tracking devices with computer-vision-based "natural" 3D inputs for multi-dimensional visualization in a semi-immersive desktop VR system. In conclusion, this dissertation provides a guideline for research design when evaluating UIs and interaction techniques.
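
    The 7-DOF navigation state described above (x, y, z + yaw, pitch, roll + scale) reduces, in the common formulation, to a single 4x4 homogeneous transform. As a rough illustration, and not the dissertation's actual implementation, the composition of uniform scale, Euler rotation, and translation might look like:

```python
import math

def rot_zyx(yaw, pitch, roll):
    # 3x3 rotation from Z-Y-X Euler angles (yaw about z, pitch about y,
    # roll about x). This angle convention is an assumption for the sketch.
    cz, sz = math.cos(yaw), math.sin(yaw)
    cy, sy = math.cos(pitch), math.sin(pitch)
    cx, sx = math.cos(roll), math.sin(roll)
    return [
        [cz * cy, cz * sy * sx - sz * cx, cz * sy * cx + sz * sx],
        [sz * cy, sz * sy * sx + cz * cx, sz * sy * cx - cz * sx],
        [-sy,     cy * sx,                cy * cx],
    ]

def seven_dof(x, y, z, yaw, pitch, roll, scale):
    # 4x4 homogeneous matrix combining all seven DOFs:
    # uniform scale, then rotation, then translation.
    r = rot_zyx(yaw, pitch, roll)
    return [
        [scale * r[0][0], scale * r[0][1], scale * r[0][2], x],
        [scale * r[1][0], scale * r[1][1], scale * r[1][2], y],
        [scale * r[2][0], scale * r[2][1], scale * r[2][2], z],
        [0.0, 0.0, 0.0, 1.0],
    ]
```

A bimanual technique would typically derive the translation, rotation, and scale deltas from the relative motion of the two tracked hands and fold them into this one matrix per frame.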

    A Low Cost Virtual Reality Human Computer Interface for CAD Model Manipulation

    Interactions with high-volume, complex three-dimensional data using traditional two-dimensional computer interfaces have, historically, been inefficient and restrictive. During the past decade, however, virtual reality (VR) has presented a new paradigm for human-computer interaction. This paper presents a VR human-computer interface system that aims to provide a solution to the human-computer interaction problems present in today's computer-aided design (CAD) software applications. A data glove device is used as a 3D interface for CAD model manipulation in a virtual design space. To make the visualization more realistic, real-time active stereo vision is provided using LCD shutter glasses. To determine the ease of use and intuitiveness of the interface, a human subject study was conducted on standard CAD manipulation tasks. Analysis results and technical issues are also presented and discussed.

    Implementation of Tactile Sensors on a 3-Finger Robotiq Adaptive Gripper and Visualization in VR Using an Arduino Controller

    Tactile sensors are essential components for implementing complex manipulation tasks with robot grippers, allowing the grasping force to be controlled directly according to the object's properties. Virtual reality is an effective tool for visualizing complex systems in full detail and with a high level of interactivity. After implementing cost-effective tactile arrays on a 3-finger Robotiq® gripper using an Arduino controller, an innovative VR interface is presented that visualizes the pressure values at the fingertips in a 3D environment, providing an effective tool to support the programming and visualization of the gripper in VR.
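
    The abstract does not describe how fingertip pressure is rendered in the 3D environment. One common approach, sketched here purely as an assumption (including the 10-bit ADC range typical of Arduino analog inputs), is to map each raw sensor reading to a cold-to-hot false color:

```python
def pressure_to_color(raw, adc_max=1023):
    # Map a raw ADC reading to an (R, G, B) triple: blue for no contact,
    # through green, to red at maximum pressure. adc_max=1023 assumes a
    # 10-bit ADC, as on standard Arduino analog pins; this mapping is an
    # illustrative assumption, not the paper's implementation.
    t = max(0.0, min(1.0, raw / adc_max))
    if t < 0.5:
        return (0, int(510 * t), int(255 * (1 - 2 * t)))   # blue -> green
    return (int(510 * (t - 0.5)), int(255 * (2 - 2 * t)), 0)  # green -> red
```

Each fingertip patch in the VR scene would then be tinted with the color of its latest reading, updated as frames of sensor data arrive from the controller.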

    Simulation and Visualization of Thermal Metaphor in a Virtual Environment for Thermal Building Assessment

    The current application of energy-efficient design processes in virtual reality (VR) systems is limited mostly to building performance predictions, due to issues with the data formats and workflows used for 3D modeling, thermal calculation, and VR visualization. The importance of energy efficiency, and of integrating advances in building design and VR technology, has led this research to focus on thermal simulation results visualized in a virtual environment to optimize building design, particularly for heritage buildings. The emphasis is on representing the thermal data of a simulated room in a virtual environment (VE) in order to improve the way thermal analysis data are presented to building stakeholders, with the aim of increasing accuracy and efficiency. The approach is to present a more immersive thermal simulation and to project the calculation results on projective displays, particularly in an immersion room (CAVE-like). The main idea of the experiment is to provide an instrument for visualizing and interacting with the thermal conditions in a virtual building, so that the user can immerse themselves, interact, and perceive the impact of the modifications generated by the system on the thermal simulation results. The research has demonstrated that it is possible to improve the representation and interpretation of building performance data, particularly thermal results, using visualization techniques.
    Funding: Direktorat Riset dan Pengabdian Masyarakat (DRPM) Universitas Indonesia Research Grant No. 2191/H2.R12/HKP.05.00/201