
    Thermal-Kinect Fusion Scanning System for Bodyshape Inpainting and Estimation under Clothing

    In today's interactive world, 3D body scanning is needed for creating virtual avatars, in the apparel industry, for physical health assessment, and so on. The 3D scanners used in this process are very costly and also require the subject to be nearly naked or to wear special tight-fitting clothes. A cost-effective 3D body scanning system that can estimate body parameters under clothing would be the best solution in this regard. In our experiment, we build such a body scanning system by fusing a Kinect depth sensor and a thermal camera. The Kinect senses the depth of the subject and creates a 3D point cloud from it, while the thermal camera senses a person's body heat under clothing. Fusing these two sensors' images produces a thermal-mapped 3D point cloud of the subject, from which body parameters can be estimated even under various clothes. Moreover, this fusion system is also cost-effective. We introduce a new pipeline for working with our fusion scanning system to estimate and recover body shape under clothing. We capture Thermal-Kinect fusion images of subjects in different clothing and produce both full and partial 3D point clouds. To recover the missing parts of our low-resolution scans, we fit a parametric human model to our images and perform boolean operations with our scan data. Finally, we measure our final 3D point cloud to estimate the body parameters and compare them with the ground truth. We achieve a minimum average error of 0.75 cm compared to other approaches.
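    The abstract does not detail how the thermal-mapped point cloud is produced, but the standard approach to this kind of sensor fusion is to project each Kinect 3D point into the thermal camera's image plane using calibrated intrinsics and extrinsics, then sample a temperature per point. The sketch below illustrates that idea with NumPy; the function name, the pinhole model, and the identity extrinsics in the toy example are illustrative assumptions, not the paper's actual implementation.

    ```python
    import numpy as np

    def thermal_map_point_cloud(points, thermal_img, K, R, t):
        """Assign a thermal value to each 3-D point (illustrative sketch).

        points      -- (N, 3) array in the Kinect depth-camera frame
        thermal_img -- (H, W) array of per-pixel temperature readings
        K           -- (3, 3) thermal-camera intrinsic matrix
        R, t        -- rotation/translation from Kinect to thermal frame
                       (assumed known from a prior stereo calibration)
        Returns an (M, 4) array of [x, y, z, temperature]; points that
        project outside the thermal image are dropped.
        """
        cam = points @ R.T + t            # Kinect frame -> thermal camera frame
        in_front = cam[:, 2] > 0          # keep only points in front of the camera
        cam = cam[in_front]
        pix = cam @ K.T                   # apply pinhole intrinsics
        u = (pix[:, 0] / pix[:, 2]).round().astype(int)   # column index
        v = (pix[:, 1] / pix[:, 2]).round().astype(int)   # row index
        h, w = thermal_img.shape
        ok = (0 <= u) & (u < w) & (0 <= v) & (v < h)      # inside the image?
        temps = thermal_img[v[ok], u[ok]]                 # sample temperatures
        return np.column_stack([points[in_front][ok], temps])

    # Toy example: identity extrinsics, a simple pinhole intrinsic matrix,
    # and a synthetic 4x4 "thermal image" (values 0..15, hypothetical units).
    K = np.array([[2.0, 0.0, 2.0],
                  [0.0, 2.0, 2.0],
                  [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.zeros(3)
    thermal = np.arange(16, dtype=float).reshape(4, 4)
    pts = np.array([[0.0, 0.0, 1.0],     # projects to pixel (u=2, v=2)
                    [-1.0, 0.0, 1.0]])   # projects to pixel (u=0, v=2)
    fused = thermal_map_point_cloud(pts, thermal, K, R, t)
    # fused -> [[ 0., 0., 1., 10.], [-1., 0., 1., 8.]]
    ```

    In practice the two sensors would be calibrated jointly (e.g. with a heated checkerboard visible to both), and occlusion handling would be needed so that points hidden from the thermal camera are not assigned a wrong temperature.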