    Neural Smoke Stylization with Color Transfer

    Artistically controlling fluid simulations requires a large amount of manual work by an artist. The recently presented transport-based neural style transfer approach simplifies workflows, as it transfers the style of arbitrary input images onto 3D smoke simulations. However, the method only modifies the shape of the fluid and omits color information. In this work, we therefore extend the previous approach to obtain a complete pipeline for transferring both shape and color information onto 2D and 3D smoke simulations with neural networks. Our results demonstrate that our method successfully transfers colored style features, consistently in space and time, to smoke data for different input textures. Comment: Submitted to Eurographics202
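
    For background, this transport-based stylization family builds on standard image-space neural style transfer. Below is a hedged sketch of the usual Gatys-style objective; the layer set, the weights $\lambda_c$ and $\lambda_s$, and the way color is coupled to the rendered density field are assumptions here, not this paper's exact formulation.

    ```latex
    % Generic neural style transfer objective (background sketch, not the paper's exact losses).
    % F^l(I): feature maps of a pretrained CNN at layer l for image I; G^l = F^l (F^l)^T is the Gram matrix.
    \[
    \mathcal{L}(I) \;=\;
      \lambda_{c} \sum_{l} \bigl\lVert F^{l}(I) - F^{l}(I_{\mathrm{content}}) \bigr\rVert_{2}^{2}
      \;+\;
      \lambda_{s} \sum_{l} \bigl\lVert G^{l}(I) - G^{l}(I_{\mathrm{style}}) \bigr\rVert_{F}^{2}
    \]
    ```

    In the smoke setting, $I$ would be a rendering of the (colored) density field, and the gradients of such a loss would be propagated back to the simulation quantities being stylized.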

    Transport-Based Neural Style Transfer for Smoke Simulations

    Artistically controlling fluids has always been a challenging task. Optimization techniques rely on approximating simulation states towards target velocity or density field configurations, which are often handcrafted by artists to indirectly control smoke dynamics. Patch synthesis techniques transfer image textures or simulation features to a target flow field. However, these are either limited to adding structural patterns or augmenting coarse flows with turbulent structures, and hence cannot capture the full spectrum of different styles and semantically complex structures. In this paper, we propose the first Transport-based Neural Style Transfer (TNST) algorithm for volumetric smoke data. Our method is able to transfer features from natural images to smoke simulations, enabling general content-aware manipulations ranging from simple patterns to intricate motifs. The proposed algorithm is physically inspired, since it computes the density transport from a source input smoke to a desired target configuration. Our transport-based approach allows direct control over the divergence of the stylization velocity field by optimizing incompressible and irrotational potentials that transport smoke towards stylization. Temporal consistency is ensured by transporting and aligning subsequent stylized velocities, and 3D reconstructions are computed by seamlessly merging stylizations from different camera viewpoints. Comment: ACM Transactions on Graphics (SIGGRAPH ASIA 2019), additional materials: http://www.byungsoo.me/project/neural-flow-styl
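
    As a rough illustration of the potential-based formulation mentioned above (a standard Helmholtz split; the exact parameterization optimized in the paper may differ), the stylization velocity can be written as:

    ```latex
    % Stylization velocity as the sum of an incompressible (curl of a vector potential)
    % and an irrotational (gradient of a scalar potential) part:
    \[
    v \;=\; \nabla \times \Psi \;+\; \nabla \phi ,
    \qquad
    \nabla \cdot v \;=\; \nabla^{2} \phi ,
    \]
    ```

    so the divergence of the stylization velocity is controlled entirely through the scalar potential $\phi$, while $\Psi$ contributes only divergence-free motion; the stylized density is then obtained by advecting the input smoke with $v$.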

    An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass-moving energy functional is modified by adding an intensity-penalizing term in order to reduce the undesired double-exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. The proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. DOI: 10.1109/TIP.2007.896637
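
    For reference, here is a hedged sketch of the L2 Monge-Kantorovich (mass-moving) energy with a generic intensity penalty; the weight $\lambda$ and the exact form of the penalty term are assumptions, not necessarily the paper's:

    ```latex
    % L2 mass-moving energy over warps u, with mass preservation as a constraint,
    % plus a generic intensity penalty discouraging double exposure:
    \[
    M(u) \;=\; \int_{\Omega} \mu_{0}(x)\,\lVert u(x) - x \rVert^{2}\,dx
    \;+\; \lambda \int_{\Omega} \bigl( \mu_{1}(u(x)) - \mu_{0}(x) \bigr)^{2}\,dx ,
    \qquad
    \mu_{0}(x) \;=\; \mu_{1}\bigl(u(x)\bigr)\,\bigl\lvert \det \nabla u(x) \bigr\rvert .
    \]
    ```

    The in-between frames then follow the interpolated map $X_{t}(x) = (1-t)\,x + t\,u(x)$ for $t \in [0,1]$.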

    Reviews on Physically Based Controllable Fluid Animation

    In computer graphics animation, tools are required for fluid-like motions that can be controlled by users or animators, since such techniques are applied in commercial productions such as advertisements and films. Many approaches have been proposed to model controllable fluid simulation, driven by the need for realistic motion, robustness, adaptivity, and richer control models. Physically based models for different states of matter are generally applied so that animators can almost effortlessly create interesting, realistic, and plausible animations of natural phenomena such as water flow and smoke spread. In this paper, we introduce methods for physically based simulation and techniques for controlling fluid flow, with a particular focus on particle-based methods. We then discuss the existing control methods in terms of three criteria: controllability, realism, and computation time. Finally, we give a brief overview of the current state and trends of these research areas.

    Space-time editing of elastic motion through material optimization and reduction

    We present a novel method for elastic animation editing with space-time constraints. In a sharp departure from previous approaches, we not only optimize control forces added to a linearized dynamic model, but also optimize material properties to better match user constraints and provide plausible and consistent motion. Our approach achieves efficiency and scalability by performing all computations in a reduced rotation-strain (RS) space constructed with both cubature and geometric reduction, leading to two orders of magnitude improvement over the original RS method. We demonstrate the utility and versatility of our method in various applications, including motion editing, pose interpolation, and estimation of material parameters from existing animation sequences.
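
    Schematically, the joint optimization can be pictured as below; the reduced coordinates $q_t$, step operator $S$, weights, and constraint form are assumptions for illustration, not the paper's exact formulation:

    ```latex
    % Jointly optimize reduced control forces f_t and material parameters m
    % subject to the reduced dynamics and user-specified space-time constraints:
    \[
    \min_{\{f_{t}\},\, m} \;\; \sum_{t} \lVert f_{t} \rVert^{2} \;+\; \gamma\,\lVert m - m_{0} \rVert^{2}
    \quad \text{s.t.} \quad
    q_{t+1} = S\bigl(q_{t}, q_{t-1}, f_{t}; m\bigr),
    \qquad
    C\bigl(q_{t_{i}}\bigr) = 0 \;\; \text{at user keyframes } t_{i} ,
    \]
    ```

    where the $q_t$ live in the reduced rotation-strain space built with cubature and geometric reduction.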

    Lagrangian Neural Style Transfer for Fluids

    Artistically controlling the shape, motion, and appearance of fluid simulations poses major challenges in visual effects production. In this paper, we present a neural style transfer approach from images to 3D fluids formulated in a Lagrangian viewpoint. Using particles for style transfer has unique benefits compared to grid-based techniques. Attributes are stored on the particles and hence are trivially transported by the particle motion. This intrinsically ensures temporal consistency of the optimized stylized structure and notably improves the resulting quality. At the same time, the expensive, recursive alignment of stylization velocity fields required by grid approaches becomes unnecessary, reducing the computation time to less than an hour and making neural flow stylization practical in production settings. Moreover, the Lagrangian representation improves artistic control, as it allows for multi-fluid stylization and consistent color transfer from images, and the generality of the method enables stylization of smoke and liquids alike. Comment: ACM Transactions on Graphics (SIGGRAPH 2020), additional materials: http://www.byungsoo.me/project/lnst/index.htm
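
    A minimal sketch of the Lagrangian idea described above, in Python with NumPy; the attribute layout and the explicit Euler update are assumptions for illustration, not the paper's implementation:

    ```python
    import numpy as np

    def advect_particles(positions, velocities, dt):
        """Explicit Euler particle advection; per-particle attributes ride along for free."""
        return positions + dt * velocities

    class StylizedParticles:
        """Particles carrying stylization attributes (hypothetical layout)."""
        def __init__(self, positions, colors, densities):
            self.x = positions          # (N, 3) particle positions
            self.color = colors         # (N, 3) per-particle stylization color
            self.density = densities    # (N,)  per-particle density

        def step(self, velocities, dt):
            # Moving the particles implicitly transports color and density with them,
            # which is what gives the Lagrangian formulation its temporal consistency.
            self.x = advect_particles(self.x, velocities, dt)
    ```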

    Detail-preserving fluid control

    Editing smoke animation using a deforming grid

    We present a new method for editing smoke animations by directly deforming the grid used for simulation. We introduce a modification to the widely used semi-Lagrangian advection operator and use it to transfer the deformation from the grid to the smoke body. Our modified operator bends the smoke particle streamlines according to the deformation gradient. We demonstrate that the controlled smoke animation preserves the fine-grained vortical velocity components and incompressibility constraints while conforming to the deformed grid. Moreover, our approach enables interactive 3D smoke animation editing by using a reduced-dimensional subspace. Overall, our method makes it possible to use current mesh editing tools to control the smoke body.
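
    A hedged sketch of how a deformation-aware semi-Lagrangian step could look (assumed form for illustration only; the paper's operator and the precise way the deformation gradient enters may differ):

    ```python
    import numpy as np

    def semi_lagrangian_deformed(sample_q, sample_v, deformation_grad, x, dt):
        """One advection step at positions x with shape (..., dim).

        sample_q, sample_v -- callables interpolating the advected quantity / velocity
        deformation_grad   -- callable returning the local grid-deformation gradient F(x),
                              shape (..., dim, dim)
        """
        v = sample_v(x)                                   # velocity at the grid point
        F = deformation_grad(x)                           # local deformation gradient
        bent_v = np.einsum('...ij,...j->...i', F, v)      # bend the streamline by F
        x_back = x - dt * bent_v                          # backtrace along the bent streamline
        return sample_q(x_back)                           # interpolate at the departure point
    ```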

    Fluid control using the adjoint method
