93,384 research outputs found

    Integrated Data Visualization and Virtual Reality Tool

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

    Visualization of Post-Processed CFD Data in a Virtual Environment

    This paper discusses the development of a virtual reality (VR) interface for the visualization of Computational Fluid Dynamics (CFD) data. The application, VR-CFD, provides an immersive and interactive graphical environment in which users can examine the results of a CFD analysis of a flow field in three-dimensional space. It has been tested and implemented with virtual reality devices such as the C2, head-mounted display (HMD), and desktop VR. The application is designed to read PLOT3D structured grid data and to display the flow-field parameters using features such as streamlines, cutting planes, iso-surfaces, rakes, vector fields, and scalar fields. The Visualization Toolkit (VTK), a data visualization library, is used along with OpenGL and the C2 VR interface libraries to develop the application. Analysts and designers have used VR-CFD to visualize and understand complex three-dimensional fluid-flow phenomena. The combination of three-dimensional interaction capability and the C2 virtual reality environment has been shown to facilitate collaborative discussions between analysts and engineers concerning the appropriateness of the CFD model and the characteristics of the fluid flow.
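    The streamlines such a tool draws are obtained by integrating particle positions through the velocity field. A minimal pure-Python sketch of that core step (the analytic vortex field and forward-Euler integrator here are illustrative assumptions, not the VR-CFD implementation, which operates on PLOT3D grids via VTK):

```python
# Trace a streamline by forward-Euler integration through a 2D velocity
# field -- the core operation behind streamline rendering in CFD viewers.
# The circular vortex field below is a stand-in for real CFD output.

def velocity(x, y):
    """Analytic vortex: rotation about the origin."""
    return -y, x

def trace_streamline(seed, step=0.01, n_steps=100):
    """Return the list of points visited from a seed position."""
    x, y = seed
    points = [(x, y)]
    for _ in range(n_steps):
        u, v = velocity(x, y)
        x, y = x + step * u, y + step * v
        points.append((x, y))
    return points

path = trace_streamline((1.0, 0.0))
```

    In practice a higher-order integrator (e.g. Runge-Kutta) replaces the Euler step to keep the traced line from drifting off the true flow path.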

    Deep Learning Development Environment in Virtual Reality

    Full text link
    Virtual reality (VR) offers immersive visualization and intuitive interaction. We leverage VR to enable any biomedical professional to deploy a deep learning (DL) model for image classification. While DL models can be powerful tools for data analysis, they are also challenging to understand and develop. To make deep learning more accessible and intuitive, we have built a virtual reality-based DL development environment. Within our environment, the user can move tangible objects to construct a neural network using only their hands. Our software automatically translates these configurations into a trainable model and then reports its resulting accuracy on a test dataset in real time. Furthermore, we have enriched the virtual objects with visualizations of the model's components so that users can gain insight into the DL models they are developing. With this approach, we bridge the gap between professionals in different fields of expertise while offering a novel perspective for model analysis and data interaction. We further suggest that techniques of development and visualization in deep learning can benefit from integrating virtual reality.
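    The configuration-to-model translation step described above can be sketched as a function that turns a list of placed layer objects into weight shapes and a parameter count. The object format and layer kinds below are hypothetical illustrations, not the paper's actual representation:

```python
# Translate a list of "virtual objects" (layer blocks a user places in
# VR) into a model description: weight shapes and a parameter count.
# The object schema here is an assumed example, not the paper's format.

def build_model(objects, input_size):
    """Turn placed layer objects into a describable model."""
    layers = []
    size = input_size
    for obj in objects:
        if obj["kind"] == "dense":
            layers.append({"type": "dense", "shape": (size, obj["units"])})
            size = obj["units"]
        elif obj["kind"] == "relu":
            layers.append({"type": "relu"})
    n_params = sum(s[0] * s[1] + s[1]          # weights + biases
                   for s in (l["shape"] for l in layers
                             if l["type"] == "dense"))
    return {"layers": layers, "output_size": size, "n_params": n_params}

model = build_model(
    [{"kind": "dense", "units": 16}, {"kind": "relu"},
     {"kind": "dense", "units": 3}],
    input_size=8)
```

    A real system would hand such a description to a training backend; the point is that a spatial arrangement of objects is enough to determine a full model specification.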

    Simulation and Visualization of Thermal Metaphor in a Virtual Environment for Thermal Building Assessment

    The current application of the design process through energy efficiency in virtual reality (VR) systems is limited mostly to building performance predictions, owing to issues with the data formats and workflows used for 3D modeling, thermal calculation, and VR visualization. The importance of energy efficiency and the integration of advances in building design and VR technology have led this research to focus on thermal simulation results visualized in a virtual environment to optimize building design, particularly for heritage buildings. The emphasis is on the representation of the thermal data of a simulated room in a virtual environment (VE), in order to improve the ways in which thermal analysis data are presented to building stakeholders, with the aim of increasing accuracy and efficiency. The approach is to present a more immersive thermal simulation and to project the calculation results on projective displays, particularly in an immersion room (CAVE-like). The main idea of the experiment is to provide an instrument for visualizing and interacting with the thermal conditions in a virtual building. The user can thus immerse, interact, and perceive the impact of the modifications generated by the system on the thermal simulation results. The research has demonstrated that it is possible to improve the representation and interpretation of building performance data, particularly thermal results, using visualization techniques. Direktorat Riset dan Pengabdian Masyarakat (DRPM) Universitas Indonesia Research Grant No. 2191/H2.R12/HKP.05.00/201
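    At the core of such a thermal metaphor is a mapping from simulated temperatures to display colors painted onto the virtual room. A minimal sketch (the blue-to-red ramp and the temperature range are illustrative assumptions, not the paper's calibration):

```python
# Map simulated room temperatures to display colors -- the kind of
# "thermal metaphor" used to paint surfaces in a virtual environment.
# The linear blue-to-red ramp and 15-30 C range are assumed values.

def thermal_color(temp_c, t_min=15.0, t_max=30.0):
    """Linear blue (cold) to red (hot) ramp; returns an (r, g, b) tuple."""
    t = (temp_c - t_min) / (t_max - t_min)
    t = max(0.0, min(1.0, t))       # clamp out-of-range readings
    return (int(255 * t), 0, int(255 * (1 - t)))

cold, hot = thermal_color(15.0), thermal_color(30.0)
```

    Production tools typically use perceptually uniform color maps rather than a raw blue-red ramp, but the principle, one color per simulated value per surface, is the same.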

    Gene Expression Prospective Simulation and Analysis Using Data Mining and Immersive Virtual Reality Visualization

    Biological exploration of genetic expression and protein synthesis in living organisms is used to discover causal and interactive relationships in biological processes. Current GeneChip microarray technology provides a platform to analyze up to 500,000 molecular reactions on a single chip, providing thousands of genetic and protein expression results per test. Using visualization tools and a priori knowledge of genetic and protein interactions, visual networks are used to model and analyze the results. The virtual reality environment designed and implemented for this project provides visualization and data-modeling tools commonly used in genetic expression data analysis. The software processes normalized genetic profile data from microarray testing results and association information from protein-to-protein databases. The data is modeled using a network of nodes to represent data points and edges to show relationships. This information is visualized in virtual reality and modeled using force-directed networking algorithms in a fully explorable environment.
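    A force-directed layout, as used above to arrange the gene/protein network, repeatedly applies pairwise repulsion between all nodes and spring-like attraction along edges. A minimal single-step sketch (the constants and the toy graph are illustrative; real tools run many damped iterations):

```python
# One pass of a Fruchterman-Reingold-style force-directed layout:
# all node pairs repel, connected nodes attract. The three-node
# gene/protein graph below is a toy example.
import math

def layout_step(pos, edges, k=1.0, step=0.05):
    """Return new positions after one repulsion/attraction pass."""
    force = {n: [0.0, 0.0] for n in pos}
    nodes = list(pos)
    for i, a in enumerate(nodes):           # repulsion between all pairs
        for b in nodes[i + 1:]:
            dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
            d = math.hypot(dx, dy) or 1e-9
            f = k * k / d
            force[a][0] += f * dx / d; force[a][1] += f * dy / d
            force[b][0] -= f * dx / d; force[b][1] -= f * dy / d
    for a, b in edges:                      # attraction along edges
        dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
        d = math.hypot(dx, dy) or 1e-9
        f = d * d / k
        force[a][0] -= f * dx / d; force[a][1] -= f * dy / d
        force[b][0] += f * dx / d; force[b][1] += f * dy / d
    return {n: (pos[n][0] + step * fx, pos[n][1] + step * fy)
            for n, (fx, fy) in force.items()}

pos = {"geneA": (0.0, 0.0), "geneB": (3.0, 0.0), "prot1": (0.0, 3.0)}
pos = layout_step(pos, edges=[("geneA", "prot1")])
```

    After one step the connected pair has moved closer while the unconnected node has drifted away, which is exactly the clustering behavior that makes related genes group together visually.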

    A comparative study using an autostereoscopic display with augmented and virtual reality

    Full text link
    Advances in display devices are facilitating the integration of stereoscopic visualization into our daily lives. However, autostereoscopic visualization has not been extensively exploited. In this paper, we present a system that combines Augmented Reality (AR) and autostereoscopic visualization. We also present the first study that compares different aspects of using an autostereoscopic display with AR and VR, in which 39 children from 8 to 10 years old participated. In our study, no statistically significant differences were found between AR and VR. However, the scores were very high on nearly all of the questions, and the children also scored the AR version higher in all cases. Moreover, the children explicitly preferred the AR version (81%). For the AR version, a strong and significant correlation was found between the use of the autostereoscopic screen in games and seeing the virtual object on the marker. For the VR version, two strong and significant correlations were found: the first between the ease of play and the use of the rotatory controller, and the second between depth perception and the global game score. Therefore, combinations of AR and VR with autostereoscopic visualization are possibilities for developing edutainment systems for children. This work was funded by the Spanish APRENDRA project (TIN2009-14319-C02); the autostereoscopic display was provided by the ALF3D project (TIN2009-14103-03).
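    The reported correlations are presumably Pearson coefficients over questionnaire scores. A minimal sketch of that computation (the two score lists are invented for illustration, not the study's data):

```python
# Pearson correlation between two questionnaire score lists -- the
# statistic behind "strong and significant correlation" findings.
# The sample scores below are made up.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ease_of_play = [5, 4, 5, 3, 2, 4]       # hypothetical 1-5 ratings
controller_use = [5, 5, 4, 3, 2, 5]
r = pearson_r(ease_of_play, controller_use)
```

    Significance testing would additionally compare r against a t-distribution threshold for the sample size; with n = 39 children even moderate r values can reach significance.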

    Using a Visualization Tool for Studying the Effects of Virtual Environments on Assembly Training

    The objective of this research is to build and demonstrate the design of a Virtual Reality (VR) environment to aid in understanding assembly and assembly planning for complex systems. To support this work, a VR environment was created for the assembly simulation of treadmill parts used in a spacecraft. This environment accounts for space-related constraints such as gravity. The study helps in understanding the assembly of the VR model, which may help an assembler optimize design-related cost. Unity 3D is used to create the VR environment, and SolidWorks to model the treadmill parts. Text and image cues are offered to assist users while performing manual assembly. The assembly sequence followed by users is compared with an optimized path sequence computed using a Genetic Algorithm for a collision-free layout. To validate the work, the virtual assembly simulation was run on four distinct setups using immersive (VIVE headset) and non-immersive (desktop) virtual reality systems. A series of user studies was conducted with two primary objectives: (1) evaluation of the simulated representations with respect to different analyses, and (2) identification of the best layout in terms of correctness and elapsed time for a virtual environment. An interactive visualization tool based on a Tree-Map technique was developed to analyze the data collected from the diverse setups and to reveal hidden patterns in users' behavior as they performed different tasks in the virtual environment (using head-mounted and monitor-based VR); it also supports the conclusion that immersive virtual reality with image cues is the best setup in terms of correctness and time. In addition, using this interactive visualization tool, we show the intrinsic statistical relationships between and within diverse groups of participants in the form of chord diagrams.
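    A genetic algorithm over assembly orderings, as used above to compute the reference sequence, can be sketched in miniature. The part names, precedence constraints, and the violation-count fitness are illustrative stand-ins for the paper's collision-aware evaluation:

```python
# A miniature genetic algorithm searching over assembly orderings.
# Precedence violations stand in for the real collision/feasibility
# checks; parts and constraints are hypothetical.
import random

PARTS = ["base", "belt", "motor", "rail", "cover"]
# part -> parts that must already be installed (assumed precedences)
PRECEDES = {"belt": {"base"}, "motor": {"base"}, "cover": {"belt", "motor"}}

def cost(seq):
    """Count precedence violations in an assembly order (0 = feasible)."""
    placed, bad = set(), 0
    for part in seq:
        bad += len(PRECEDES.get(part, set()) - placed)
        placed.add(part)
    return bad

def evolve(generations=60, pop_size=30, seed=0):
    """Elitist GA with swap mutation over part permutations."""
    rng = random.Random(seed)
    pop = [rng.sample(PARTS, len(PARTS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]     # keep the fitter half
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)   # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

    A full implementation would also use order-preserving crossover and a cost term for motion length, but elitism plus mutation is already enough to drive the violation count toward zero on a toy instance.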

    A taxonomy of tasks in dam cracks surveillance for augmented reality application

    Full text link
    Augmented reality is an advanced computational visualization technology that changes how users in the real world perceive virtual information. The use of this technology for EAC/FM is being widely investigated. In the scope of dam safety, constant analysis of the concrete's behavior is mandatory, searching for clues of pathologies such as cracks. Cracks are relatively common in concrete structures; nevertheless, they need to be surveilled because of the risks they pose. The surveillance of cracks involves exhaustive tasks, and for dams it consists of executing a set of complex tasks that demand access to accumulated data and information. Augmented reality can support the visualization of this information, reducing the mental workload demanded. This paper defines a hierarchical taxonomy of the tasks needed in this domain, using Berliner's taxonomy to classify them and enhancing the understanding of the points where augmented reality can be used with the best results.
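    A hierarchical task taxonomy of this kind can be represented as a tree and queried for the tasks under any branch. The categories and crack-surveillance entries below are illustrative assumptions, not the paper's actual taxonomy:

```python
# A hierarchical task taxonomy held as a nested dict; leaves are task
# lists. Categories and entries are invented for illustration.

TAXONOMY = {
    "perceptual": {
        "searching": ["locate crack on dam face", "detect new cracks"],
        "inspecting": ["measure crack width", "compare with past surveys"],
    },
    "cognitive": {
        "analyzing": ["assess crack risk", "relate crack to load history"],
    },
}

def tasks_under(node):
    """Flatten all leaf tasks beneath a taxonomy node."""
    if isinstance(node, list):
        return list(node)
    return [t for child in node.values() for t in tasks_under(child)]

all_tasks = tasks_under(TAXONOMY)
```

    Walking the tree this way lets a designer ask, branch by branch, which tasks an AR overlay could support best.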

    Activity approach in design of specialized visualization systems

    Full text link
    The article discusses the application of the activity approach to designing specialized interfaces and visualization systems. The activity approach is a psychological theory, developed by Russian academics of the 20th century, which analyzes professional work as a type of activity. Activity presupposes consciousness, purposefulness, and the setting of tasks whose accomplishment is aimed at achieving a goal. An activity can be broken down into actions serving to accomplish the tasks, and actions, in turn, are broken down into operations. The same activity can be carried out through different operations, and the same operations can be combined into different types of activity. In interface design, the activity approach is applied to mass and professional instrumental interfaces. The article provides examples of activity analysis in terms of the instrumental interfaces used; it describes approaches to designing real interfaces for medical purposes and considers design tasks for specialized visualization systems. For this purpose, the phenomenon of insight can be used as one criterion of visualization quality. The article also raises the issues of using virtual reality in scientific visualization. It provides the results of an experiment analyzing the influence of the presence phenomenon in virtual reality on the solution of intellectual tasks, as well as the basics of visualization-system user activity. The article discusses the analysis of specialized (both professional and mass) interfaces serving as instruments in purposeful and productive activity. The analysis is carried out from the perspective of activity theory and several topics in the fields of psychology and physiology. It is generally believed that the history of interface design clarifies some subtle aspects of modern interactive systems. Further, examples of prototype implementations of service interfaces are provided. Future possibilities for introducing the activity approach into the practical design of specialized interactive systems are also reviewed. © 2017 Lavoisier. All rights reserved.
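    The activity, actions, operations decomposition described above can be made concrete as a small data model. The radiology example is hypothetical, chosen only because the article mentions medical interfaces:

```python
# The activity -> actions -> operations hierarchy of activity theory,
# modeled as plain dataclasses. The medical example is invented.
from dataclasses import dataclass, field

@dataclass
class Action:
    task: str
    operations: list = field(default_factory=list)

@dataclass
class Activity:
    goal: str
    actions: list = field(default_factory=list)

diagnosis = Activity(
    goal="diagnose patient from CT scan",
    actions=[
        Action("inspect suspicious region",
               operations=["zoom", "adjust contrast", "rotate volume"]),
        Action("compare with prior study",
               operations=["load prior scan", "align slices"]),
    ])

n_operations = sum(len(a.operations) for a in diagnosis.actions)
```

    The point of the decomposition is that the same goal could be served by different operation sets, so an interface designer can choose which operations to make cheap without losing sight of the governing goal.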