Constrained inverse volume rendering for planetary nebulae
Journal Article
Determining the three-dimensional structure of distant astronomical objects is a challenging task, given that terrestrial observations provide only one viewpoint. For this task, bipolar planetary nebulae are interesting objects of study because of their pronounced axial symmetry, which arises from fundamental physical processes. Making use of this symmetry constraint, we present a technique to automatically recover the axisymmetric structure of bipolar planetary nebulae from two-dimensional images. With GPU-based volume rendering driving a non-linear optimization, we estimate the nebula's local emission density as a function of its radial and axial coordinates, and we recover the orientation of the nebula relative to Earth. The optimization refines the nebula model and its orientation by minimizing the differences between the rendered image and the original astronomical image. The resulting model enables realistic 3D visualizations of planetary nebulae, e.g. for educational purposes in planetarium shows. In addition, the recovered spatial distribution of the emissive gas allows validating computer simulation results of the astrophysical formation processes of planetary nebulae.
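The core loop the abstract describes — an axisymmetric emission model, a renderer, and an optimizer that minimizes the difference between rendered and observed images — can be illustrated with a minimal, hypothetical sketch. All names, grid sizes, and the toy orthographic renderer below are illustrative assumptions, not the paper's GPU implementation; orientation recovery is omitted for brevity.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch (not the paper's code): an axisymmetric emission
# density rho(r, z) on a small grid, rendered orthographically by summing
# emission along parallel view rays, and fit to a target image by least
# squares. The real method uses GPU volume rendering and also recovers
# the nebula's orientation relative to Earth.

NR, NZ = 6, 6    # radial / axial resolution of the emission grid
IMG = 12         # side length of the rendered image

def render(params):
    """Integrate the axisymmetric emission along parallel rays (y-axis)."""
    rho = params.reshape(NR, NZ)
    img = np.zeros((IMG, IMG))
    xs = np.linspace(-1, 1, IMG)
    ys = np.linspace(-1, 1, IMG)            # sample positions along each ray
    for i, x in enumerate(xs):
        for j, z in enumerate(xs):
            r = np.sqrt(x**2 + ys**2)       # radius of each ray sample
            ir = np.clip((r * (NR - 1)).astype(int), 0, NR - 1)
            iz = int(round((z + 1) / 2 * (NZ - 1)))
            img[j, i] = rho[ir, iz].sum()   # emission integral for this pixel
    return img

def loss(params, target):
    """Sum of squared differences between rendered and observed image."""
    return np.sum((render(params) - target) ** 2)

# Fit the emission grid to a synthetic target rendered from known densities.
rng = np.random.default_rng(0)
true_params = rng.random(NR * NZ)
target = render(true_params)
res = minimize(loss, np.zeros(NR * NZ), args=(target,),
               method="L-BFGS-B", options={"maxiter": 100})
```

Because the toy image is linear in the emission densities, the least-squares objective is convex and the optimizer reliably reduces the residual; the actual inverse-rendering problem is harder because the orientation parameters enter non-linearly.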
Tex2Shape: Detailed Full Human Body Geometry From a Single Image
We present a simple yet effective method to infer detailed full human body
shape from only a single photograph. Our model can infer full-body shape
including face, hair, and clothing with wrinkles at interactive
frame rates. Results feature details even on parts that are occluded in the
input image. Our main idea is to turn shape regression into an aligned
image-to-image translation problem. The input to our method is a partial
texture map of the visible region obtained from off-the-shelf methods. From a
partial texture, we estimate detailed normal and vector displacement maps,
which can be applied to a low-resolution smooth body model to add detail and
clothing. Despite being trained purely with synthetic data, our model
generalizes well to real-world photographs. Numerous results demonstrate the
versatility and robustness of our method.
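The final step the abstract describes — applying an estimated vector displacement map to a low-resolution smooth body model — can be sketched as follows. The function name, nearest-texel UV lookup, and array shapes are illustrative assumptions, not Tex2Shape's actual pipeline.

```python
import numpy as np

# Hypothetical sketch: offset each mesh vertex by the 3D displacement
# stored at its UV coordinate in a vector displacement map. Tex2Shape
# estimates such maps from a partial texture; here we just apply one.

def apply_displacement(vertices, uvs, disp_map):
    """Offset each vertex by the 3D vector stored at its UV location."""
    h, w, _ = disp_map.shape
    # Convert [0, 1] UV coordinates to nearest integer texel indices.
    u = np.clip((uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    v = np.clip((uvs[:, 1] * (h - 1)).astype(int), 0, h - 1)
    return vertices + disp_map[v, u]        # per-vertex 3D offset

# Toy usage: a flat four-vertex 'mesh' displaced by a constant map.
verts = np.zeros((4, 3))
uvs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
disp = np.full((8, 8, 3), 0.1)              # uniform offset everywhere
displaced = apply_displacement(verts, uvs, disp)
```

In practice the displacement would vary per texel (encoding wrinkles and hair), and a production implementation would interpolate between texels rather than snapping to the nearest one.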
Reconstruction and visualization of planetary nebulae
Journal Article
From our terrestrially confined viewpoint, the actual three-dimensional shape of distant astronomical objects is, in general, very challenging to determine. For one class of astronomical objects, however, spatial structure can be recovered from conventional 2D images alone. So-called planetary nebulae (PNe) exhibit pronounced symmetry characteristics that come about due to fundamental physical processes. Making use of this symmetry constraint, we present a technique to automatically recover the axisymmetric structure of many planetary nebulae from photographs. With GPU-based volume rendering driving a nonlinear optimization, we estimate the nebula's local emission density as a function of its radial and axial coordinates and we recover the orientation of the nebula relative to Earth. The optimization refines the nebula model and its orientation by minimizing the differences between the rendered image and the original astronomical image. The resulting model allows creating realistic 3D visualizations of these nebulae, for example, for planetarium shows and other educational purposes. In addition, the recovered spatial distribution of the emissive gas can help astrophysicists gain deeper insight into the formation processes of planetary nebulae.
Shape: A 3D Modeling Tool for Astrophysics
We present a flexible interactive 3D morpho-kinematical modeling application
for astrophysics. Compared to other systems, our application reduces the
restrictions on the physical assumptions, data type and amount that is required
for a reconstruction of an object's morphology. It is one of the first publicly
available tools to apply interactive graphics to astrophysical modeling. The
tool allows astrophysicists to provide a-priori knowledge about the object by
interactively defining 3D structural elements. By direct comparison of model
prediction with observational data, model parameters can then be automatically
optimized to fit the observation. The tool has already been successfully used
in a number of astrophysical research projects.
Comment: 13 pages, 11 figures, accepted for publication in IEEE Transactions on Visualization and Computer Graphics.
10411 Abstracts Collection -- Computational Video
From 10.10.2010 to 15.10.2010, the Dagstuhl Seminar 10411 "Computational Video" was held in Schloss Dagstuhl – Leibniz Center for Informatics.
During the seminar, several participants presented their current
research, and ongoing work and open problems were discussed. Abstracts of
the presentations given during the seminar as well as abstracts of
seminar results and ideas are put together in this paper. The first section
describes the seminar topics and goals in general.
Links to extended abstracts or full papers are provided, if available.
ZIPMAPS: Zoom-Into-Parts Texture Maps
In this technical report, we propose a method for rendering highly detailed close-up views of arbitrary textured surfaces. To augment the texture map locally with high-resolution information, we describe how to automatically and seamlessly merge unregistered images of different scales. Our hierarchical texture representation can easily be rendered in real time, enabling zooming into specific texture regions at almost arbitrary magnification. Our method is useful wherever close-up renderings of specific regions are needed, without requiring excessively large texture maps.
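The hierarchical texture idea the abstract describes — a coarse base texture augmented with high-resolution patches registered to sub-regions — can be illustrated with a minimal, hypothetical lookup. The class name, region encoding, and nearest-texel sampling are assumptions for illustration; the report's real-time renderer would use GPU texture hardware and filtered sampling.

```python
import numpy as np

# Hypothetical sketch: a base texture plus high-resolution patches, each
# registered to a (u0, v0, u1, v1) sub-region of base UV space. Sampling
# returns the texel from the finest patch covering the query point, so
# zooming into a patched region reveals the high-resolution detail.

class ZoomTexture:
    def __init__(self, base):
        # Each level stores (region in base UV space, texture array).
        self.levels = [((0.0, 0.0, 1.0, 1.0), base)]

    def add_patch(self, region, texture):
        """Register a high-res patch covering region = (u0, v0, u1, v1)."""
        self.levels.append((region, texture))

    def sample(self, u, v):
        """Return the texel at (u, v) from the finest covering patch."""
        best = None
        for (u0, v0, u1, v1), tex in self.levels:
            if u0 <= u <= u1 and v0 <= v <= v1:
                best = ((u0, v0, u1, v1), tex)  # later patches are finer
        (u0, v0, u1, v1), tex = best
        h, w = tex.shape[:2]
        # Nearest-texel lookup in the patch's local coordinate frame.
        x = min(int((u - u0) / (u1 - u0) * (w - 1) + 0.5), w - 1)
        y = min(int((v - v0) / (v1 - v0) * (h - 1) + 0.5), h - 1)
        return tex[y, x]

# Toy usage: a 4x4 base of zeros with an 8x8 patch of ones over one region.
zt = ZoomTexture(np.zeros((4, 4)))
zt.add_patch((0.25, 0.25, 0.5, 0.5), np.ones((8, 8)))
inside = zt.sample(0.3, 0.3)    # covered by the patch
outside = zt.sample(0.8, 0.8)   # only the base covers this point
```

A production version would blend across patch boundaries to hide seams, which is the part of the report that handles merging unregistered images of different scales.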