MOVIN: Real-time Motion Capture using a Single LiDAR
Recent advancements in technology have brought forth new forms of interactive
applications, such as the social metaverse, where end users interact with each
other through their virtual avatars. In such applications, precise full-body
tracking is essential for an immersive experience and a sense of embodiment
with the virtual avatar. However, current motion capture systems are not easily
accessible to end users due to their high cost, the requirement for special
skills to operate them, or the discomfort associated with wearable devices. In
this paper, we present MOVIN, a data-driven generative method for real-time
motion capture with global tracking using a single LiDAR sensor. Our
autoregressive conditional variational autoencoder (CVAE) model learns the
distribution of pose variations conditioned on the given 3D point cloud from
LiDAR. As a central factor for high-accuracy motion capture, we propose a novel
feature encoder that learns the correlation between historical 3D point cloud
data and global and local pose features, resulting in effective learning of the
pose prior. Global pose features include root translation, rotation, and foot
contacts, while local features comprise joint positions and rotations.
Subsequently, a pose generator takes into account the sampled latent variable
along with the features from the previous frame to generate a plausible current
pose. Our framework accurately predicts the performer's 3D global information
and local joint details while effectively considering temporally coherent
movements across frames. We demonstrate the effectiveness of our architecture
through quantitative and qualitative evaluations, comparing it against
state-of-the-art methods. Additionally, we implement a real-time application to
showcase our method in real-world scenarios. The MOVIN dataset is available at
\url{https://movin3d.github.io/movin_pg2023/}
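The autoregressive generation described above (sample a latent variable, condition on the previous frame's features, emit the current pose) can be sketched schematically. Everything below is a toy stand-in with random linear maps and made-up dimensions, not the trained MOVIN networks; it only illustrates the frame-by-frame conditioning structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only: e.g. 24 joints x 6D rotation per pose.
POSE_DIM = 24 * 6
LATENT_DIM = 32
FEAT_DIM = 64

# Toy stand-ins for the learned networks (random linear maps).
W_feat = rng.standard_normal((FEAT_DIM, POSE_DIM)) * 0.01
W_gen = rng.standard_normal((POSE_DIM, LATENT_DIM + FEAT_DIM)) * 0.01

def encode_prev_frame(prev_pose):
    """Stand-in for the feature encoder over previous-frame pose features."""
    return np.tanh(W_feat @ prev_pose)

def generate_pose(z, prev_feat):
    """Stand-in for the CVAE pose generator: sampled latent + previous-frame
    features -> current pose."""
    return W_gen @ np.concatenate([z, prev_feat])

# Autoregressive rollout: each frame conditions on the frame before it,
# which is what gives temporally coherent motion in this family of models.
pose = np.zeros(POSE_DIM)
poses = []
for _ in range(10):
    z = rng.standard_normal(LATENT_DIM)  # sampled latent variable
    pose = generate_pose(z, encode_prev_frame(pose))
    poses.append(pose)

poses = np.stack(poses)
print(poses.shape)  # (10, 144)
```

In the actual method the conditioning also includes features from the LiDAR point cloud at each frame; here that input is omitted for brevity.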
Distance and Reddening of the Isolated Dwarf Irregular Galaxy NGC 1156
We present a photometric estimation of the distance and reddening toward
the dwarf irregular galaxy NGC 1156, one of the best targets for studying
isolated dwarf galaxies in the nearby universe. We have used the imaging
data sets of the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS)
High Resolution Channel (HRC) of the central region of NGC 1156 (26" X 29")
available in the HST archive for this study. From the (U-B, B-V) color-color
diagram, we first estimate the total (foreground + internal) reddening toward
NGC 1156 of E(B-V) =0.35 +/- 0.05 mag, whereas only the foreground reddening
was previously known to be E(B-V)=0.16 mag (Burstein & Heiles) or 0.24 mag
(Schlegel, Finkbeiner, & Davis). Based on the brightest-stars method, selecting
the three brightest blue supergiant (BSG) stars with a mean B magnitude of
<B> = 21.94 mag and the three brightest red supergiant (RSG) stars with a
mean V magnitude of <V> = 22.76 mag, we derive the distance modulus to NGC
1156 to be (m-M)_{0,BSG} = 29.55 mag and (m-M)_{0,RSG} = 29.16 mag. By using
weights of 1 and 1.5 for the distance moduli from the BSGs and the RSGs,
respectively, we finally obtain the weighted mean distance modulus to NGC 1156
(m-M)_0 = 29.39 +/- 0.20 mag (d = 7.6 +/- 0.7 Mpc), which is in very good
agreement with previous estimates. Combining the photometry data of this
study with those of Karachentsev et al. gives a smaller distance to NGC 1156,
which is discussed together with the limits of the data.
Comment: 18 pages, 8 figures. Accepted by PASJ (2012 Apr issue).
- …