
    Rigidity controllable as-rigid-as-possible shape deformations

    Shape deformation is one of the fundamental techniques in geometric processing. One principle of deformation is to preserve geometric details while distributing the necessary distortion uniformly. To achieve this, state-of-the-art techniques deform shapes in a locally as-rigid-as-possible (ARAP) manner. Existing ARAP deformation methods optimize rigid transformations in 1-ring neighborhoods and maintain consistency between adjacent pairs of rigid transformations only through single overlapping edges. In this paper, we go one step further and propose to use larger local neighborhoods to enhance the consistency of adjacent rigid transformations. This helps preserve geometric details better and distribute the distortion more uniformly. Moreover, the size of the expanded local neighborhoods provides an intuitive parameter for adjusting physical stiffness: the larger the neighborhood, the more rigid the material. Building on this, we propose a novel rigidity-controllable mesh deformation method in which shape rigidity can be flexibly adjusted. The size of the local neighborhoods can be learned automatically from datasets of deforming objects or specified by the user, and may vary over the surface to simulate shapes composed of mixed materials. Various examples are provided to demonstrate the effectiveness of our method.
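    For context, the baseline this abstract builds on is the standard ARAP local/global iteration, whose local step fits a best rotation per 1-ring neighborhood. The sketch below shows only that generic local step; it is not the paper's extended-neighborhood method, and the names (rest_pos, def_pos, neighbors, weights) are illustrative assumptions.

```python
import numpy as np

def fit_cell_rotation(rest_pos, def_pos, neighbors, weights, i):
    """Best-fit rotation for vertex i's 1-ring (classic ARAP local step)."""
    S = np.zeros((3, 3))
    for j in neighbors[i]:
        e_rest = rest_pos[i] - rest_pos[j]      # edge in the rest pose
        e_def = def_pos[i] - def_pos[j]         # edge in the deformed pose
        S += weights[i, j] * np.outer(e_rest, e_def)
    U, _, Vt = np.linalg.svd(S)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # reject reflections
        Vt[2, :] *= -1
        R = Vt.T @ U.T
    return R
```

    The paper's key change, per the abstract, is to evaluate such rotations over larger (and possibly spatially varying) neighborhoods so that adjacent rotations agree over more shared elements than a single edge.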

    Data-driven weight optimization for real-time mesh deformation

    3D model deformation has been an active research topic in geometric processing. Due to its efficiency, linear blend skinning (LBS) and its follow-up methods are widely used in practical applications for deforming vector images, geometric models and animated characters. LBS requires determining the control handles and specifying their influence weights, which demands expertise and is time-consuming. Further studies have proposed a method for efficiently calculating bounded biharmonic weights for given control handles, which reduces user effort and produces smooth deformation results. That algorithm defines a high-order shape-aware smoothness function that tends to produce smooth deformation results, but it fails to generate locally rigid deformations. To address this, we propose a novel data-driven approach to producing improved handle weights that makes full use of available 3D model data by optimizing an energy consisting of data-driven, rigidity and sparsity terms, while retaining the advantage of allowing handles of various forms. We further devise an efficient iterative optimization scheme. Comparative experiments clearly show that linear blend skinning based on our optimized weights better reflects the deformation characteristics of the model, leading to more accurate deformation results and outperforming existing methods. The method also retains real-time performance even with a large number of deformation examples. Our ablation experiments show that each energy term is essential.
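    As a point of reference, the LBS baseline that the optimized weights feed into is a simple weighted blend of handle transforms; a minimal sketch is given below, assuming per-handle 4x4 transforms and per-vertex weights (array names are illustrative, not from the paper).

```python
import numpy as np

def linear_blend_skinning(vertices, weights, transforms):
    """vertices: (n, 3); weights: (n, h); transforms: (h, 4, 4) handle transforms."""
    n = vertices.shape[0]
    homog = np.hstack([vertices, np.ones((n, 1))])               # homogeneous coordinates
    deformed = np.zeros((n, 3))
    for k, T in enumerate(transforms):
        deformed += weights[:, k:k + 1] * (homog @ T.T)[:, :3]   # blend each handle's transform
    return deformed
```

    The paper's contribution lies in how the weights array is computed (data-driven, rigidity and sparsity terms), not in this blending step itself.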

    Sparse data driven mesh deformation

    Example-based mesh deformation methods are powerful tools for realistic shape editing. However, existing techniques typically combine all the example deformation modes, which can lead to overfitting, i.e. using an overly complicated model to explain the user-specified deformation. This leads to implausible or unstable deformation results, including unexpected global changes outside the region of interest. To address this fundamental limitation, we propose a sparse blending method that automatically selects a small number of deformation modes to compactly describe the desired deformation. This, along with a suitably chosen deformation basis that includes spatially localized deformation modes, leads to significant advantages, including more meaningful, reliable and efficient deformations, because fewer, localized deformation modes are applied. To cope with large rotations, we develop a simple but effective representation based on polar decomposition of deformation gradients, which resolves the ambiguity of large global rotations using an as-consistent-as-possible global optimization. This representation has a closed-form solution for derivatives, making it efficient for our sparse localized representation and thus ensuring interactive performance. Experimental results show that our method outperforms state-of-the-art data-driven mesh deformation methods in both quality of results and efficiency.
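    The large-rotation representation mentioned above rests on the polar decomposition F = R S of a deformation gradient into a rotation and a symmetric stretch. A minimal SVD-based sketch of that generic decomposition is shown below; it does not include the paper's as-consistent-as-possible global optimization.

```python
import numpy as np

def polar_decompose(F):
    """Split a 3x3 deformation gradient into rotation R and symmetric stretch S, with F = R @ S."""
    U, sigma, Vt = np.linalg.svd(F)
    if np.linalg.det(U @ Vt) < 0:       # keep R a proper rotation, not a reflection
        U[:, -1] *= -1
        sigma[-1] *= -1
    R = U @ Vt
    S = Vt.T @ np.diag(sigma) @ Vt      # symmetric (stretch) factor
    return R, S
```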

    Efficient facial animation integrating Euclidean and geodesic distance-based algorithms into radial basis function interpolation.

    Facial animation has been an important research topic during the last decades. Considerable effort has gone into the efficient creation of realistic and believable facial expressions. Among the various approaches for creating believable movements in human facial features, one of the most common utilises motion capture. This thesis explores current approaches to facial animation with this technology together with Radial Basis Function (RBF) interpolation, covering a review of Euclidean and geodesic distance-based algorithms, and proposing a hybrid approach that combines the advantages of the two previous methods, aided by pre-processed distance data to speed up the computations. Using motion-capture performances based on the Facial Action Coding System (FACS), the results are evaluated on a wide range of facial expressions for both a realistic and a stylised facial model. The findings of this thesis show the advantage of the proposed hybrid RBF approach, which, combined with pre-processed distance data, results in a more efficient and more accurate process for generating high-detail facial animation from motion capture.
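    To make the interpolation step concrete, a minimal RBF fit/evaluate sketch with a Gaussian kernel on plain Euclidean distances is given below; the thesis's geodesic and hybrid variants would swap out the distance computation, and the names and kernel width are illustrative assumptions.

```python
import numpy as np

def rbf_fit(centres, displacements, sigma=1.0):
    """Solve Phi @ w = displacements for RBF weights (one weight row per marker/centre)."""
    r = np.linalg.norm(centres[:, None, :] - centres[None, :, :], axis=-1)
    phi = np.exp(-(r / sigma) ** 2)                 # Gaussian kernel on Euclidean distance
    return np.linalg.solve(phi, displacements)      # (m, 3) weights

def rbf_eval(points, centres, w, sigma=1.0):
    """Interpolate displacements at arbitrary mesh points."""
    r = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=-1)
    return np.exp(-(r / sigma) ** 2) @ w
```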

    Dynamic skin deformation simulation using musculoskeletal model and soft tissue dynamics

    Deformation of skin and muscle is essential for bringing an animated character to life. This deformation is difficult to animate realistically using traditional techniques because of the subtlety of the skin deformations, which must move appropriately for the character design. In this paper, we present an algorithm that generates natural, dynamic and detailed skin deformation (movement and jiggle) from joint-angle data sequences. The algorithm has two steps: identification of parameters for a quasi-static muscle deformation model, and simulation of skin deformation. In the identification step, we identify the model parameters using a musculoskeletal model and a short sequence of skin deformation data captured via a dense marker set. The simulation step first uses the quasi-static muscle deformation model to obtain the quasi-static muscle shape at each frame of the given motion sequence (slow jump). Dynamic skin deformation is then computed by simulating the passive muscle and soft tissue dynamics modeled as a mass–spring–damper system. Having obtained the model parameters, we can simulate dynamic skin deformations for subjects with similar body types from new motion data. We demonstrate our method by creating skin deformations for muscle co-contraction and external impacts from four different behaviors captured as skeletal motion capture data. Experimental results show that the simulated skin deformations are quantitatively and qualitatively similar to measured actual skin deformations.
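    To illustrate the soft-tissue dynamics stage, a minimal per-vertex mass–spring–damper update of the kind that could drive jiggle around a quasi-static target shape is sketched below; the parameter names and values are illustrative assumptions, not the paper's identified parameters.

```python
import numpy as np

def step_jiggle(x, v, target, dt, mass=1.0, stiffness=200.0, damping=5.0):
    """One semi-implicit Euler step pulling dynamic positions x toward the quasi-static target."""
    force = stiffness * (target - x) - damping * v   # spring toward target plus damping
    v = v + dt * force / mass
    x = x + dt * v
    return x, v
```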