6,392 research outputs found

    Shape Animation with Combined Captured and Simulated Dynamics

    We present a novel volumetric animation generation framework that creates new types of animations from raw 3D surface or point cloud sequences of captured real performances. The framework takes as input time-incoherent 3D observations of a moving shape, and is thus particularly suited to the output of performance capture platforms. Our system builds, from real captures, a virtual representation of the actor that allows seamless combination and simulation with virtual external forces and objects, so that the original captured actor can be reshaped, disassembled or reassembled by user-specified virtual physics. Instead of the dominant surface-based geometric representation of the capture, which is less suitable for volumetric effects, our pipeline exploits Centroidal Voronoi tessellation decompositions as a unified volumetric representation of the real captured actor, which we show can be used seamlessly as a building block for all processing stages, from capture and tracking to virtual physics simulation. The representation makes no human-specific assumptions and can be used to capture and re-simulate the actor with props or other moving scenery elements. We demonstrate the potential of this pipeline for virtual reanimation of a real captured event with various unprecedented volumetric visual effects, such as volumetric distortion, erosion, morphing, gravity pull, or collisions.
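    The Centroidal Voronoi tessellation the abstract relies on is commonly approximated by Lloyd relaxation: alternate between assigning samples to their nearest site and moving each site to the centroid of its cell. A minimal sketch of that idea on a sampled point set (the function name `lloyd_cvt` and all parameters are illustrative, not from the paper):

```python
import numpy as np

def lloyd_cvt(points, n_sites, iters=20, seed=0):
    """Approximate a centroidal Voronoi tessellation of a point set
    by Lloyd relaxation: assign each sample to its nearest site,
    move every site to the centroid of its cell, and repeat."""
    rng = np.random.default_rng(seed)
    # initialise sites from a random subset of the samples
    sites = points[rng.choice(len(points), n_sites, replace=False)]
    for _ in range(iters):
        # nearest-site assignment (Voronoi cells restricted to the samples)
        d = np.linalg.norm(points[:, None, :] - sites[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_sites):
            cell = points[labels == k]
            if len(cell):
                sites[k] = cell.mean(axis=0)  # centroid update
    return sites, labels
```

For a volumetric actor representation, `points` would be samples of the captured shape's interior rather than a 2D set, but the relaxation loop is the same.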

    ADAPT: The Agent Development and Prototyping Testbed

    We present ADAPT, a flexible platform for designing and authoring functional, purposeful human characters in a rich virtual environment. Our framework incorporates character animation, navigation, and behavior with modular, interchangeable components to produce narrative scenes. Our animation system provides locomotion, reaching, gaze tracking, gesturing, sitting, and reactions to external physical forces, and can easily be extended with more functionality thanks to its decoupled, modular structure. Additionally, our navigation component allows characters to maneuver through a complex environment with predictive steering for dynamic obstacle avoidance. Finally, our behavior framework allows a user to fully leverage a character’s animation and navigation capabilities when authoring both individual decision-making and complex interactions between actors using a centralized, event-driven model.
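    The "centralized, event-driven model" mentioned above can be pictured as a bus that narrative events flow through, with each character's behavior subscribed to the events it cares about. A minimal sketch of the pattern (the `EventBus` class and the sample characters are hypothetical, not ADAPT's actual API):

```python
class EventBus:
    """Minimal centralized, event-driven coordinator: behaviors
    subscribe to named events; the bus dispatches to all handlers."""
    def __init__(self):
        self._handlers = {}

    def subscribe(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)

    def publish(self, event, **data):
        for handler in self._handlers.get(event, []):
            handler(**data)

# two hypothetical characters reacting to one narrative event
lines = []
bus = EventBus()
bus.subscribe("door_opened", lambda who: lines.append(f"guard turns toward {who}"))
bus.subscribe("door_opened", lambda who: lines.append(f"merchant greets {who}"))
bus.publish("door_opened", who="player")
```

Because characters only see events, new behaviors can be added or swapped without touching the animation or navigation layers, which is the decoupling the abstract emphasizes.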

    A 3D immersive discrete event simulator for enabling prototyping of factory layouts

    There is an increasing need to eliminate wasted time and money during factory layout design and subsequent construction. It is presently difficult for engineers to foresee whether a certain layout is optimal for work and material flows. By exploiting modelling, simulation and visualisation techniques, this paper presents a tool concept called immersive WITNESS that combines the modelling strengths of Discrete Event Simulation (DES) with the 3D visualisation strengths of recent low-cost 3D gaming technology, enabling decision makers to make informed design choices for future factory layouts. The tool enables engineers to receive immediate feedback on their design choices. Our results show that this tool has the potential to reduce rework as well as the associated costs of making physical prototypes.
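    At its core, the DES half of such a tool is an event loop over a time-ordered priority queue. A minimal sketch of that mechanism with a toy factory line (the `Simulator` class and the arrival/machining times are illustrative assumptions, not the WITNESS model):

```python
import heapq

class Simulator:
    """Minimal discrete event simulator: a priority queue of
    (time, seq, callback) entries processed in time order."""
    def __init__(self):
        self.now = 0.0
        self._seq = 0      # tie-breaker so callbacks are never compared
        self._queue = []

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, callback = heapq.heappop(self._queue)
            callback()

# toy line: a part arrives every 2 time units, machining takes 3
log = []
sim = Simulator()

def arrive(i):
    log.append((sim.now, f"part{i} arrives"))
    sim.schedule(3.0, lambda: log.append((sim.now, f"part{i} done")))
    if i < 2:
        sim.schedule(2.0, lambda: arrive(i + 1))

arrive(0)
sim.run()
```

The 3D visualisation layer would then replay `log` as animated machine and part movements, which is where the gaming technology comes in.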

    A Framework for Designing 3D Virtual Environments

    The process of designing and developing virtual environments can be supported by tools and frameworks that save time on technical aspects and let developers focus on content. In this paper we present an academic framework which provides several levels of abstraction to ease this work. It includes state-of-the-art components that we devised or integrated from open-source solutions to address specific problems. Its architecture is modular and customizable, and the code is open-source.

    Towards music-driven procedural animation

    We present our approach towards the development of a framework for the creation of music-driven procedural animations. We intend to explore the potential that elementary musical features hold for driving engaging audio-visual animations. To do so, we bring forward an integrated environment where real-time musical information is available and may be flexibly used for manipulating different aspects of a dynamic animation. In general terms, our approach consists of developing a virtual scene, populated by controllable entities, termed actors, and using scripting to define how these actors' behaviour or appearance changes in response to musical information. Scripting operates by establishing associations, or mappings, between musical events, such as the ringing of notes or chords, or sound information, such as the frequency spectrum, and changes in the animation. The scenario we chose to explore comprises two main actors: trees and wind. Trees grow in an iterative process, and may develop leaves, while swaying in response to the wind field. The wind is represented as a vector field whose configuration and strength can be altered in real time. Scripting then allows for synchronising these changes with musical events, providing a natural sense of harmony with the accompanying music. By having real-time access to musical information, as well as control over a reactive animation, we believe we have taken a first step towards exploring a novel interdisciplinary concept with vast expressive potential. This work has been supported by national funds through FCT – Fundação para a Ciência e Tecnologia within the Project Scope UID/CEC/00319/2019.
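    One way to picture the mappings described above is a small function that turns the currently sounding notes into a wind vector for the trees to sway against. A minimal sketch under assumed conventions (MIDI note numbers 0–127; the function name `wind_from_notes` and the `base`/`gain` parameters are hypothetical, not from the paper):

```python
import math

def wind_from_notes(active_notes, t, base=0.2, gain=0.05):
    """Hypothetical mapping: wind strength grows with the number and
    pitch of currently sounding MIDI notes, while the wind direction
    oscillates slowly over time t to suggest gusting."""
    strength = base + gain * sum(n / 127.0 for n in active_notes)
    angle = 0.3 * math.sin(0.5 * t)  # slow directional sway
    return (strength * math.cos(angle), strength * math.sin(angle))
```

A scripting layer would call such a mapping every frame with the live note list, so a loud chord immediately strengthens the vector field the trees respond to.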