Perceptually Uniform Construction of Illustrative Textures
Illustrative textures, such as stippling or hatching, were predominantly used
as an alternative to conventional Phong rendering. Recently, the potential of
encoding information on surfaces or maps using different densities has also
been recognized. This has the significant advantage that color remains
available as an additional visual channel, over which the illustrative textures
can be overlaid. It is thus effectively possible to display multiple pieces of
information, such as two different scalar fields, on a surface simultaneously.
In previous work, these textures were generated manually and the choice of
densities was not determined empirically. Here, we first want to determine and
understand the
perceptual space of illustrative textures. We chose a succession of simplices
with increasing dimensions as primitives for our textures: dots, lines, and
triangles. Thus, we explore the texture types of stippling, hatching, and
triangles. We create a range of textures by sampling the density space
uniformly. Then, we conduct three perceptual studies in which participants
perform pairwise comparisons for each texture type. We use multidimensional
scaling (MDS) to analyze the perceptual spaces per category. The perception of
stippling and triangles seems relatively similar. Both are adequately described
by a 1D manifold in 2D space. The perceptual space of hatching consists of two
main clusters: crosshatched textures and textures with a single hatching
direction. The perception of single-direction hatching, however, resembles that
of stippling and triangles. Based on our
findings, we construct perceptually uniform illustrative textures. Afterwards,
we provide concrete application examples for the constructed textures.
Comment: 11 pages, 15 figures, to be published in IEEE Transactions on
Visualization and Computer Graphics
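The multidimensional scaling step described above can be sketched with classical (Torgerson) MDS in plain NumPy. Note that the dissimilarity matrix below is a hypothetical stand-in for the paper's pairwise-comparison data, not its actual measurements:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed n items in k dimensions so that
    pairwise Euclidean distances approximate the dissimilarity matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]    # take the top-k eigenpairs
    scale = np.sqrt(np.maximum(vals[order], 0.0))
    return vecs[:, order] * scale         # n x k embedding coordinates

# Hypothetical dissimilarities for four textures of increasing density
# (illustrative values only, not the study's measured judgments).
D = np.array([[0., 1., 2., 3.],
              [1., 0., 1., 2.],
              [2., 1., 0., 1.],
              [3., 2., 1., 0.]])
X = classical_mds(D, k=2)
```

For this toy matrix the items lie on a 1D manifold, so the second embedding dimension collapses to (near) zero, mirroring the paper's finding that stippling and triangle textures are adequately described by a 1D manifold in 2D space.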
Weighted Linde-Buzo-Gray Stippling
We propose an adaptive version of Lloyd's optimization method that distributes points based on Voronoi diagrams. Our inspiration is the Linde-Buzo-Gray algorithm from vector quantization, which dynamically splits Voronoi cells until a desired number of representative vectors is reached. We reformulate this algorithm by splitting and merging Voronoi cells based on their size, greyscale level, or the variance of an underlying input image. The proposed method automatically adapts to various constraints and, in contrast to previous work, requires no good initial point distribution or prior knowledge about the final number of points. Compared to weighted Voronoi stippling, the convergence rate is much higher and the spectral and spatial properties are superior. Further, because points are created by local operations, coherent stipple animations can be produced. Our method can also produce good-quality point sets in other fields, such as remeshing of geometry, based on local geometric features such as curvature.
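The split-and-merge loop described in this abstract can be sketched as a toy version in NumPy. This is an illustrative simplification, not the authors' implementation: the thresholds, the jitter used when splitting, and the brute-force discrete Voronoi assignment are all assumptions made for the sketch.

```python
import numpy as np

def lbg_stipple(img, iters=30, t_split=8.0, t_merge=1.0, rng=None):
    """Toy Linde-Buzo-Gray-style stippling (illustrative sketch only).
    img: 2D array of darkness values in [0, 1]; returns stipple points (x, y)."""
    rng = np.random.default_rng(0) if rng is None else rng
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs.ravel() + 0.5, ys.ravel() + 0.5], axis=1)  # pixel centers
    dens = img.ravel()
    pts = rng.uniform([0, 0], [w, h], size=(4, 2))  # tiny initial point set
    for _ in range(iters):
        # assign each pixel to its nearest point (discrete Voronoi cells)
        d2 = ((pix[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
        lab = d2.argmin(1)
        new_pts = []
        for i in range(len(pts)):
            mask = lab == i
            wgt = dens[mask].sum()          # total darkness in this cell
            if wgt < t_merge:
                continue                    # "merge": drop underweight cell
            c = (pix[mask] * dens[mask, None]).sum(0) / wgt  # weighted centroid
            if wgt > t_split:
                # "split": replace the point by two, jittered around the centroid
                j = rng.normal(scale=0.5, size=2)
                new_pts += [c + j, c - j]
            else:
                new_pts.append(c)           # relax toward the weighted centroid
        pts = np.array(new_pts)
    return pts

# toy input: a dark square (1.0 = fully dark) on a white background
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
pts = lbg_stipple(img)
```

Because points are created and removed purely by these local per-cell operations, the point count emerges from the thresholds rather than being fixed in advance, which is the property the abstract highlights over standard weighted Voronoi stippling.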