In this work, we propose a novel clothed human reconstruction method called
GaussianBody, based on 3D Gaussian Splatting. Compared with costly neural-radiance-field-based models, 3D Gaussian Splatting has recently demonstrated strong
performance in terms of training time and rendering quality. However, applying
the static 3D Gaussian Splatting model to the dynamic human reconstruction
problem is non-trivial due to complicated non-rigid deformations and rich cloth
details. To address these challenges, our method employs explicit pose-guided
deformation to associate dynamic Gaussians across the canonical space and the
observation space, and introduces a physically-based prior with regularized
transformations to help mitigate the ambiguity between the two spaces. During the
training process, we further propose a pose refinement strategy that updates the
regressed pose to compensate for inaccurate initial estimation, and a
split-with-scale mechanism to enhance the density of the regressed point clouds.
Experiments validate that our method achieves state-of-the-art
photorealistic novel-view rendering results with high-quality details for
dynamic clothed human bodies, along with explicit geometry reconstruction.
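As a rough illustration of the pose-guided deformation described above, the sketch below warps canonical Gaussian centers and orientation frames into the observation space with linear blend skinning. The function name, array shapes, and the source of the skinning weights and bone transforms are assumptions made for illustration only, not the paper's actual implementation.

```python
# Minimal sketch (assumed names/shapes): warp canonical 3D Gaussians into the
# observation space with linear blend skinning (LBS), SMPL-style.
import numpy as np

def lbs_warp_gaussians(mu_c, R_c, skin_weights, bone_transforms):
    """
    mu_c:            (N, 3)    canonical Gaussian centers
    R_c:             (N, 3, 3) canonical Gaussian orientation frames
    skin_weights:    (N, J)    per-Gaussian skinning weights (rows sum to 1)
    bone_transforms: (J, 4, 4) per-joint rigid transforms for the target pose
    Returns observation-space centers (N, 3) and orientations (N, 3, 3).
    """
    # Blend the per-joint transforms into one 4x4 matrix per Gaussian.
    T = np.einsum('nj,jab->nab', skin_weights, bone_transforms)      # (N, 4, 4)

    # Warp the centers with the blended transform (homogeneous coordinates).
    mu_h = np.concatenate([mu_c, np.ones((mu_c.shape[0], 1))], axis=1)
    mu_o = np.einsum('nab,nb->na', T, mu_h)[:, :3]

    # Rotate the Gaussian frames by the 3x3 part of the blended transform.
    # Because a blend of rotations is not itself a rotation in general, a
    # physically-based regularizer could penalize how far T[:, :3, :3]
    # deviates from a pure rotation.
    R_o = np.einsum('nab,nbc->nac', T[:, :3, :3], R_c)
    return mu_o, R_o

# Toy usage with random inputs.
if __name__ == "__main__":
    N, J = 5, 24
    mu_c = np.random.randn(N, 3)
    R_c = np.tile(np.eye(3), (N, 1, 1))
    w = np.random.rand(N, J)
    w /= w.sum(axis=1, keepdims=True)
    bones = np.tile(np.eye(4), (J, 1, 1))
    mu_o, R_o = lbs_warp_gaussians(mu_c, R_c, w, bones)
    print(mu_o.shape, R_o.shape)
```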