56 research outputs found
Towards Predictive Rendering in Virtual Reality
The quest to generate predictive images, i.e., images representing radiometrically correct renditions of reality, has been a longstanding problem in computer graphics. The exactness of such images is extremely important for Virtual Reality applications like Virtual Prototyping, where users need to make decisions impacting large investments based on the simulated images. Unfortunately, the generation of predictive imagery remains an unsolved problem for manifold reasons, especially if real-time restrictions apply. First, existing scenes used for rendering are not modeled accurately enough to create predictive images. Second, even with huge computational effort, existing rendering algorithms are not able to produce radiometrically correct images. Third, current display devices need to convert rendered images into some low-dimensional color space, which prohibits the display of radiometrically correct images. Overcoming these limitations is the focus of current state-of-the-art research, and this thesis contributes to this task. First, it briefly introduces the necessary background and identifies the steps required for real-time predictive image generation. Then, existing techniques targeting these steps are presented and their limitations are pointed out. To solve some of the remaining problems, novel techniques are proposed. They cover various steps in the predictive image generation process, ranging from accurate scene modeling over efficient data representation to high-quality, real-time rendering. A special focus of this thesis lies on the real-time generation of predictive images using bidirectional texture functions (BTFs), i.e., very accurate representations of spatially varying surface materials.
The techniques proposed in this thesis enable efficient handling of BTFs by compressing the huge amount of data contained in this material representation, applying them to geometric surfaces using texture and BTF synthesis techniques, and rendering BTF-covered objects in real time. Further approaches proposed in this thesis target the inclusion of real-time global illumination effects and more efficient rendering using novel level-of-detail representations for geometric objects. Finally, this thesis assesses the rendering quality achievable with BTF materials, indicating a significant increase in realism but also confirming the problems that remain to be solved to achieve truly predictive image generation.
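BTF compression of the kind described above is often approached via matrix factorization. A minimal sketch, assuming a truncated SVD/PCA factorization (a common choice in the BTF compression literature, not necessarily the exact method of this thesis) applied to a toy random data matrix:

```python
import numpy as np

# A BTF samples reflectance per texel over view/light direction pairs:
# rows = texels, columns = (view, light) combinations. Sizes here are
# toy values; real BTF datasets use thousands of texels and e.g. 81x81
# direction pairs, which is what makes compression necessary.
rng = np.random.default_rng(0)
n_texels, n_dirs = 256, 64
btf = rng.random((n_texels, n_dirs))

k = 8  # retained components: quality vs. memory trade-off
u, s, vt = np.linalg.svd(btf, full_matrices=False)
compressed = (u[:, :k] * s[:k], vt[:k])       # factored representation
reconstructed = compressed[0] @ compressed[1]

# Compression ratio and relative reconstruction error
ratio = btf.size / (compressed[0].size + compressed[1].size)
err = np.linalg.norm(btf - reconstructed) / np.linalg.norm(btf)
```

At render time, only the small factors need to be stored on the GPU; a texel's reflectance for a given direction pair is reconstructed as a short dot product, which is what makes real-time BTF rendering feasible.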
Immersion design and worldbuilding for the development of introspective and alterbiographical narratives in First Person Exploration video games
Bachelor's thesis in Video Game Design and Development. Code: VJ1241. Academic year: 2019/2020.
This document is the Final Report for a bachelor's thesis in Video Game Design and Development. The work consists of the design and development of Oceans of Reflection, a video game based on introspective and alterbiographical narratives that allows players to approach their own being through mechanics and environment. More specifically, the aim is to enable the player to construct the narrative jointly with us, the developers, moving away from traditional narratives and adapting the experience to each user so that it is completely personalized. The video game can be classified in the First Person Exploration and Puzzle Game genre and will be developed in Unity 3D for PC using, among other technologies, Unity Shader Graph and voice recognition.
Real-time simulation and visualisation of cloth using edge-based adaptive meshes
Real-time rendering and animation of realistic virtual environments and characters have progressed at a great pace, following advances in computer graphics hardware in the last decade. The role of cloth simulation is becoming ever more important in the quest to improve the realism of virtual environments. The real-time simulation of cloth and clothing is important for many applications such as virtual reality, crowd simulation, games and software for online clothes shopping. A large number of polygons is necessary to depict the highly flexible nature of cloth, with wrinkling and frequent changes in its curvature. In combination with the physical calculations that model the deformations, simulating cloth in detail is very computationally expensive, resulting in much difficulty for its realistic simulation at interactive frame rates. Real-time cloth simulations can lack quality and realism compared to their offline counterparts, since coarse meshes must often be employed for performance reasons.
The focus of this thesis is to develop techniques that allow the real-time simulation of realistic cloth and clothing. Adaptive meshes have previously been developed to act as a bridge between low- and high-polygon meshes, aiming to adaptively exploit variations in the shape of the cloth. The mesh complexity is dynamically increased or refined to balance quality against computational cost during a simulation. A limitation of many approaches is that they often do not consider the decimation or coarsening of previously refined areas, or are otherwise not fast enough for real-time applications. A novel edge-based adaptive mesh is developed for the fast incremental refinement and coarsening of a triangular mesh. A mass-spring network is integrated into the mesh, permitting the real-time adaptive simulation of cloth, and techniques are developed for the simulation of clothing on an animated character.
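The mass-spring model underlying such cloth simulators can be sketched in a few lines. A minimal illustration of a pinned chain of particles under gravity, using explicit Euler integration; all constants (stiffness, damping, time step) and the unit particle masses are illustrative assumptions, not the thesis's actual integrator or parameters:

```python
import numpy as np

n = 5                                     # small chain of cloth particles
rest_len, k_s, damping = 0.1, 50.0, 0.98  # spring rest length / stiffness
dt, gravity = 0.01, np.array([0.0, -9.81])

pos = np.array([[i * rest_len, 0.0] for i in range(n)])
vel = np.zeros_like(pos)

for _ in range(200):                      # explicit Euler time stepping
    forces = np.tile(gravity, (n, 1))     # gravity acts on every particle
    for i in range(n - 1):                # structural springs between neighbours
        d = pos[i + 1] - pos[i]
        length = np.linalg.norm(d)
        f = k_s * (length - rest_len) * d / length  # Hooke's law
        forces[i] += f
        forces[i + 1] -= f
    vel = (vel + dt * forces) * damping   # integrate velocity, apply damping
    pos = pos + dt * vel                  # integrate position
    pos[0] = [0.0, 0.0]                   # pin one end of the chain
    vel[0] = 0.0
```

A full cloth mesh adds shear and bend springs and, in an adaptive setting, must redistribute masses and rest lengths whenever an edge is split or collapsed, which is the coupling the thesis addresses.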
Real-time fur modeling with simulation of physical effects
Ankara: The Department of Computer Engineering and the Graduate School of Engineering and Science of Bilkent University, 2012. Thesis (Master's) -- Bilkent University, 2012. Includes bibliographical references (leaves 51-54).
Fur is one of the important visual aspects of animals, and it is quite challenging to model in computer graphics, because rendering and animating large amounts of geometry takes excessive time on personal computers. Thus, in computer games most animals are either shown without fur or covered with a single layer of texture. These current methods do not provide realism: even if the rendering in the game is otherwise realistic, the fur is omitted. Several models exist for rendering fur, but the methods with high-quality rendering do not run in real time, while most real-time methods omit many natural aspects, such as texture, lighting, shadows and animation. Thus the outcome is not sufficient for a realistic gaming experience. In this thesis we propose a real-time fur representation that can be used on 3D objects. Moreover, we demonstrate how to render, animate and burn this real-time fur.
Arıyürek, Sinan. M.S.
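Shell texturing is one common real-time fur technique consistent with the layered-texture approach mentioned above, though the thesis's actual representation may differ. A minimal geometry sketch, with all names and constants illustrative: copies of the base surface are extruded along the vertex normals, and an alpha value fades toward the fur tips so the stacked layers read as strands:

```python
import numpy as np

# One triangle standing in for the base mesh; a real model would supply
# per-vertex positions and normals from the animated character.
base_verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0]] * 3)
n_shells, fur_len = 8, 0.05   # shell count and fur length are tunable

# Each shell: (offset vertices, alpha). Alpha fades from 1 at the skin
# to 0 at the tips, so outer shells render as sparse strand cross-sections.
shells = [(base_verts + normals * fur_len * (i / (n_shells - 1)),
           1.0 - i / (n_shells - 1))
          for i in range(n_shells)]
```

In a renderer, each shell is drawn with an alpha-tested noise texture; animating the offset direction (e.g. bending it by wind or inertia) yields the physical fur motion the thesis targets.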