
    Multi-character Motion Retargeting for Large-Scale Transformations

    Unlike single-character motion retargeting, multi-character motion retargeting (MCMR) algorithms must retarget each character’s motion correctly while maintaining the interaction between the characters. Existing MCMR solutions mainly focus on small-scale changes between interacting characters. However, many retargeting applications require large-scale transformations. In this paper, we propose a new algorithm for large-scale MCMR. We build on the idea of interaction meshes, which are structures representing the spatial relationship among characters. We introduce a new distance-based interaction mesh that embodies the relationship between characters more accurately by prioritizing local connections over global ones. We also introduce a stiffness weight for each skeletal joint in our mesh deformation term, which defines how undesirable it is for the interaction mesh to deform around that joint. This parameter increases the adaptability of our algorithm for large-scale transformations and considerably reduces optimization time. We compare the performance of our algorithm with the current state-of-the-art MCMR solution for several motion sequences under four different scenarios. Our results show that our method not only improves the quality of retargeting but also significantly reduces computation time. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992.
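    The two ideas in the abstract — a distance-based interaction mesh that favors local connections, and a per-joint stiffness weight in the deformation term — can be illustrated with a minimal sketch. This is not the paper's implementation; the k-nearest-neighbor connectivity, the squared edge-length energy, and all function names are illustrative assumptions.

    ```python
    import numpy as np

    def build_interaction_mesh(joints_a, joints_b, k=3):
        """Connect each joint of character A to its k nearest joints of
        character B, so local (short-distance) connections are preferred
        over global ones.  Returns a list of (i, j) index pairs."""
        edges = []
        for i, p in enumerate(joints_a):
            dists = np.linalg.norm(joints_b - p, axis=1)
            for j in np.argsort(dists)[:k]:
                edges.append((i, int(j)))
        return edges

    def deformation_energy(joints_a, joints_b, rest_lengths, edges, stiffness):
        """Weighted sum of squared edge-length changes.  A high stiffness
        weight on a joint makes deforming the interaction mesh around that
        joint more costly, which an optimizer would then avoid."""
        energy = 0.0
        for (i, j), l0 in zip(edges, rest_lengths):
            l = np.linalg.norm(joints_a[i] - joints_b[j])
            energy += stiffness[i] * (l - l0) ** 2
        return energy
    ```

    In a full retargeting pipeline this energy would be one term of an optimization over the retargeted joint positions; here it only shows how the stiffness weights enter the deformation term.
    
    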

    Application of Interactive Deformation to Assembled Mesh Models for CAE Analysis


    Simple and Efficient Mesh Editing with Consistent Local Frames


    Assessing Facial Symmetry and Attractiveness using Augmented Reality

    Facial symmetry is a key component in quantifying the perception of beauty. In this paper, we propose a set of facial features computed from facial landmarks that can be extracted at low computational cost. We quantitatively evaluated our proposed features for predicting perceived attractiveness from human portraits on four benchmark datasets (SCUT-FBP, SCUT-FBP5500, FACES and Chicago Face Database). Experimental results showed that the performance of our features is comparable to that of features extracted from a much denser set of facial landmarks. The computation of facial features was also implemented as an Augmented Reality (AR) app developed on Android OS. The app overlays four types of measurements and guide lines over a live video stream, while the facial measurements are computed from the tracked facial landmarks at run-time. The developed app can be used to assist plastic surgeons in assessing facial symmetry when planning reconstructive facial surgeries.
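    A landmark-based symmetry measure of the kind the abstract describes can be sketched as follows. This is an illustrative assumption, not the paper's actual feature set: it estimates a vertical midline from midline landmarks, mirrors the right-side landmarks across it, and reports the mean distance to their left-side counterparts.

    ```python
    import numpy as np

    def symmetry_score(left_pts, right_pts, midline_pts):
        """Mean distance between left-side landmarks and the reflections of
        their right-side counterparts across the estimated facial midline.
        A score of 0 means perfect left-right symmetry (in 2D image space)."""
        # Estimate the vertical midline as the mean x of midline landmarks
        # (e.g. nose bridge, nose tip, chin in a typical landmark scheme).
        axis_x = np.mean([p[0] for p in midline_pts])
        # Reflect each right-side point across the vertical line x = axis_x.
        mirrored = np.array([[2 * axis_x - x, y] for x, y in right_pts])
        return float(np.mean(np.linalg.norm(np.asarray(left_pts, dtype=float)
                                            - mirrored, axis=1)))
    ```

    Left/right landmark pairs must be given in corresponding order; in practice the pairing comes from the landmark detector's fixed index scheme.
    
    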