
    Accelerating Virtual Walkthrough with Visual Culling Techniques

    Abstract: A virtual walkthrough application allows users to navigate and immerse themselves in a generated 3D environment with the assistance of computer graphics. The 3D environment requires a large amount of geometry to look realistic, but as the amount of geometry grows, the application slows down, creating a conflict between the need for realism and the need for real-time performance. In this paper, we discuss the implementation of visual culling techniques such as view frustum culling, back-face culling and occlusion culling in a virtual walkthrough application. We render only what can be seen at runtime and cull away unnecessary geometry, which accelerates the performance of the system. Without culling techniques, a virtual reality application such as a virtual walkthrough has to allocate a large amount of memory to store the geometry data. We have tested these techniques on the Ancient Malacca data. With the visual culling techniques implemented, the virtual walkthrough system can run in real time without sacrificing realism.
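    The abstract lists three visibility-culling techniques. As a hedged illustration, not taken from the paper, the C++ sketch below shows the geometric tests commonly used for two of them: a back-face test based on the triangle's winding and normal, and a view-frustum test of a bounding sphere against six inward-facing planes. The Vec3 and Plane types and the plane-orientation convention are assumptions for illustration; occlusion culling is omitted because it normally relies on scene-dependent machinery such as hardware occlusion queries.

        // Minimal sketch (not from the paper) of two of the culling tests the
        // abstract mentions: back-face culling and view-frustum culling.
        #include <array>

        struct Vec3 { float x, y, z; };

        static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
        static Vec3  cross(Vec3 a, Vec3 b) {
            return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
        }

        // Back-face culling: a triangle (v0, v1, v2 in counter-clockwise winding)
        // is culled when its normal does not point toward the eye.
        bool isBackFacing(Vec3 v0, Vec3 v1, Vec3 v2, Vec3 eye) {
            Vec3 normal = cross(sub(v1, v0), sub(v2, v0));
            return dot(normal, sub(eye, v0)) <= 0.0f;
        }

        // View-frustum culling: a bounding sphere is outside the frustum if it
        // lies entirely on the negative side of any of the six planes
        // (n·p + d = 0, with normals pointing into the frustum).
        struct Plane { Vec3 n; float d; };

        bool sphereOutsideFrustum(const std::array<Plane, 6>& frustum,
                                  Vec3 center, float radius) {
            for (const Plane& p : frustum) {
                if (dot(p.n, center) + p.d < -radius)
                    return true;   // completely outside this plane: cull the object
            }
            return false;          // intersects or is inside the frustum: keep it
        }

    In a walkthrough renderer, tests like these would typically run each frame before draw submission, so that culled geometry never reaches the GPU or occupies rendering memory.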

    Addressing the problem of Interaction in fully immersive Virtual Environments: from raw sensor data to effective devices

    Immersion in Virtual Reality is the perception of being physically present in a non-physical world. The perception is created by surrounding the user of the VR system with images, sound or other stimuli that provide an engrossing total environment. The use of technological devices such as stereoscopic cameras, head-mounted displays, tracking systems and haptic interfaces allows for user experiences that provide a physical feeling of being in a realistic world, and the term “immersion” is a metaphoric use of the experience of submersion applied to representation, fiction or simulation. One of the main peculiarities of fully immersive virtual reality is that it enhances the simple passive viewing of a virtual environment with the ability to manipulate virtual objects inside it. This thesis project investigates such interfaces and metaphors for interaction and manipulation tasks. In particular, the research activity conducted led to the design of a thimble-like interface that recognizes the orientation of the human hand in real time and infers a simplified but effective model of the relative hand motion and gestures. Inside the virtual environment, users equipped with the developed system will therefore be able to operate with natural hand gestures in order to interact with the scene; for example, they could perform positioning tasks by moving, rotating and resizing existing objects, or create new ones from scratch. This approach is particularly suitable when the user needs to operate in a natural way, performing smooth and precise movements. Possible industrial applications of the system are immersive design, in which the user can perform Computer-Aided Design (CAD) while totally immersed in a virtual environment, and operator training, in which the user can be trained on a 3D model in assembling or disassembling complex mechanical machinery, following predefined sequences. The thesis has been organized around the following project plan:
    - Collection of the relevant state of the art
    - Evaluation of design choices and alternatives for the interaction hardware
    - Development of the necessary embedded firmware
    - Integration of the resulting devices in a complex interaction test-bed
    - Development of demonstrative applications implementing the device
    - Implementation of advanced haptic feedback
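    As a rough illustration of the kind of real-time hand-orientation tracking described in the abstract above, the C++ sketch below shows a generic complementary filter that fuses gyroscope rates with accelerometer gravity readings. This is an assumption offered only for illustration; the thesis' actual device, sensors and firmware are not detailed here, and the function name, axis conventions and blend factor are hypothetical.

        // Generic complementary filter (an assumed sketch, not the thesis firmware):
        // blends integrated gyroscope rates (responsive but drifting) with pitch/roll
        // derived from the accelerometer's gravity vector (noisy but drift-free).
        #include <cmath>

        struct Attitude { float pitch = 0.0f; float roll = 0.0f; };  // radians

        // pitchRate, rollRate: gyro angular rates in rad/s; ax, ay, az: accelerometer
        // readings in any consistent unit; dt: time step in seconds.
        Attitude updateOrientation(Attitude a,
                                   float pitchRate, float rollRate,
                                   float ax, float ay, float az,
                                   float dt, float alpha = 0.98f) {
            // Gravity-referenced angles from the accelerometer.
            float accPitch = std::atan2(-ax, std::sqrt(ay * ay + az * az));
            float accRoll  = std::atan2(ay, az);

            // Blend short-term gyro integration with the long-term gravity reference.
            a.pitch = alpha * (a.pitch + pitchRate * dt) + (1.0f - alpha) * accPitch;
            a.roll  = alpha * (a.roll  + rollRate  * dt) + (1.0f - alpha) * accRoll;
            return a;
        }

    The blend factor trades off gyro drift against accelerometer noise: values near 1 follow fast hand motion closely, while smaller values correct drift more aggressively.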