60 research outputs found
Extraction of topological structures in 2D and 3D vector fields
Keywords: feature extraction, feature tracking, vector field visualization. Magdeburg, Univ., Fak. für Informatik, Diss., 2008, by Tino Weinkauf. Abstract in German.
A Low-Dimensional Representation for Robust Partial Isometric Correspondences Computation
Intrinsic isometric shape matching has become the standard approach for pose
invariant correspondence estimation among deformable shapes. Most existing
approaches assume global consistency, i.e., the metric structure of the whole
manifold must not change significantly. While global isometric matching is well
understood, only a few heuristic solutions are known for partial matching.
Partial matching is particularly important for robustness to topological noise
(incomplete data and contacts), which is a common problem in real-world 3D
scanner data. In this paper, we introduce a new approach to partial, intrinsic
isometric matching. Our method is based on the observation that isometries are
fully determined by purely local information: a map of a single point and its
tangent space fixes an isometry for both global and partial maps. From this
idea, we develop a new representation for partial isometric maps based on
equivalence classes of correspondences between pairs of points and their
tangent spaces. From this, we derive a local propagation algorithm that finds
such mappings efficiently. In contrast to previous heuristics based on RANSAC
or expectation maximization, our method rests on a simple and sound
theoretical model and is fully deterministic. We apply our approach to register
partial point clouds and compare it to state-of-the-art methods, where we
obtain significant improvements over global methods for real-world data and
stronger guarantees than previous heuristic partial matching algorithms.
Comment: 17 pages, 12 figures
Combinatorial Gradient Fields for 2D Images with Empirically Convergent Separatrices
This paper proposes an efficient probabilistic method that computes
combinatorial gradient fields for two-dimensional image data. In contrast to
existing algorithms, this approach yields a geometric Morse-Smale complex that
converges almost surely to its continuous counterpart when the image resolution
is increased. This approach is motivated using basic ideas from probability
theory and builds upon an algorithm from discrete Morse theory with a strong
mathematical foundation. While a formal proof is only hinted at, we do provide
a thorough numerical evaluation of our method and compare it to established
algorithms.
Comment: 17 pages, 7 figures
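The basic object here, a combinatorial (discrete) gradient field, can be illustrated with a deliberately simplified sketch that is not the paper's probabilistic algorithm: each pixel is paired with the edge to its steepest lower neighbor, and pixels without a lower neighbor remain critical (local minima). The function name and the test image are illustrative assumptions.

```python
# Simplified illustration of a combinatorial gradient field on a 2D image:
# every pixel is paired with the edge to its steepest lower neighbor; pixels
# with no lower neighbor stay unpaired and are critical (local minima).
# This is a toy sketch, not the probabilistic method of the paper.

def discrete_gradient(image):
    """image: list of rows of scalar values. Returns (pairs, minima)."""
    rows, cols = len(image), len(image[0])
    pairs, minima = {}, []
    for i in range(rows):
        for j in range(cols):
            best, best_val = None, image[i][j]
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols and image[ni][nj] < best_val:
                    best, best_val = (ni, nj), image[ni][nj]
            if best is None:
                minima.append((i, j))    # critical pixel: local minimum
            else:
                pairs[(i, j)] = best     # gradient arrow to the lower neighbor
    return pairs, minima

img = [[3, 2, 3],
       [2, 1, 2],
       [3, 2, 3]]
pairs, minima = discrete_gradient(img)
print(minima)   # the single minimum at the center: [(1, 1)]
```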
A network-based detection scheme for the jet stream core
The polar and subtropical jet streams are strong upper-level winds with a crucial influence on weather
throughout the Northern Hemisphere midlatitudes. In particular, the polar jet is located between cold arctic air
to the north and warmer subtropical air to the south. Strongly meandering states therefore often lead to extreme
surface weather.
Several existing algorithms can detect the 2-D (latitude and longitude) jet cores around the hemisphere,
but all of them rely on a minimum wind-speed threshold to identify the subtropical and polar jet streams. This is
particularly problematic for the polar jet stream, whose wind velocities can change rapidly from very weak to
very high values and back.
We develop a network-based scheme using Dijkstra’s shortest-path algorithm to detect the polar and subtropical
jet stream core. This algorithm not only considers the commonly used wind strength for core detection
but also takes wind direction and climatological latitudinal position into account. Furthermore, it
distinguishes between the polar and subtropical jets, and between separate and merged jet states.
The parameter values of the detection scheme are optimized using simulated annealing and a skill function
that accounts for the zonal-mean jet stream position (Rikus, 2015). After the successful optimization process,
we apply our scheme to reanalysis data covering 1979–2015 and calculate seasonal-mean probabilistic maps and
trends in wind strength and position of jet streams.
We present longitudinally defined probability distributions of the positions of both jets for all Northern
Hemisphere seasons. These show that winter is characterized by two well-separated jets over Europe and Asia
(ca. 20° W to 140° E). In contrast, summer normally has a single merged jet over the western hemisphere but can
have both merged and separated jet states in the eastern hemisphere.
This algorithm makes it possible to investigate the position of the jet cores around the hemisphere; it is
therefore well suited to analyzing jet stream patterns in observations and models, enabling more advanced
model validation.
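The core-detection step described above can be sketched as a shortest-path search over a wind-speed grid. The toy grid, cost function, and helper name below are illustrative assumptions: only wind strength enters the cost here, whereas the actual scheme additionally weights wind direction and climatological latitudinal position.

```python
import heapq

# Toy sketch of the network-based jet-core detection idea: treat each
# (latitude, longitude) grid cell as a graph node and run Dijkstra's
# algorithm from a virtual source (all latitudes at the first longitude)
# to a virtual sink, with edge costs that reward strong winds.

def jet_core_path(wind):
    """wind[lat][lon]: wind speed. Returns one core latitude per longitude."""
    nlat, nlon = len(wind), len(wind[0])
    SRC, SNK = (-1, -1), (-2, -2)            # virtual source and sink
    dist, prev = {SRC: 0.0}, {}
    pq = [(0.0, SRC)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                          # stale heap entry
        if u == SNK:
            break
        if u == SRC:
            nbrs = [(lat, 0) for lat in range(nlat)]
        else:
            lat, lon = u
            if lon == nlon - 1:
                nbrs = [SNK]
            else:
                nbrs = [(lat + dl, lon + 1) for dl in (-1, 0, 1)
                        if 0 <= lat + dl < nlat]
        for v in nbrs:
            # strong wind -> cheap edge; edges into the sink are free
            cost = 0.0 if v == SNK else 1.0 / (wind[v[0]][v[1]] + 1e-9)
            if d + cost < dist.get(v, float("inf")):
                dist[v], prev[v] = d + cost, u
                heapq.heappush(pq, (d + cost, v))
    path, u = [], prev[SNK]                   # backtrack to recover the core
    while u != SRC:
        path.append(u[0])
        u = prev[u]
    return path[::-1]

wind = [[10,  9,  8,  9],
        [30, 35, 12, 10],
        [12, 11, 34, 33]]
print(jet_core_path(wind))   # hugs the strongest winds: [1, 1, 2, 2]
```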
Impact of in vitro Tear Film Composition on Lysozyme Deposition and Denaturation
Purpose
To study the impact of lactoferrin and lipids on the kinetic deposition and denaturation of lysozyme on contact lens materials.
Methods
The contact lenses investigated in this thesis included two silicone hydrogel lenses [AIR OPTIX AQUA; lotrafilcon B and ACUVUE OASYS; senofilcon A] and two conventional hydrogel lenses [ACUVUE 2; etafilcon A and PROCLEAR; omafilcon A]. All lenses were incubated in four solutions: a complex artificial tear solution (ATS); an ATS without lactoferrin; an ATS without lipids; and an ATS without lactoferrin and lipids. At various time points, all lenses were prepared for lysozyme analysis using the methods below:
• To quantify the kinetic uptake of lysozyme to different contact lens materials, 125I-radiolabelled lysozyme was added to each incubation solution. Total lysozyme deposition was quantified using a gamma counter.
• To study the activity of lysozyme deposited to contact lenses, a fluorescence-based lysozyme activity assay was compared to a turbidity assay. Potential interactions with lens materials and extraction solvents were evaluated.
• To investigate the kinetic denaturation of lysozyme deposited to different contact lens materials, the fluorescence-based activity assay and the enzyme-linked immunosorbent assay were used.
Results
The presence of lactoferrin and lipids decreased lysozyme uptake to lotrafilcon B. Lysozyme deposition on senofilcon A was greater in the absence of lipids after day 21; however, the opposite was seen with etafilcon A, where lysozyme uptake was lower without lipids in the ATS. Lactoferrin and/or lipids had no effect on lysozyme adsorption to omafilcon A.
The fluorescence-based lysozyme activity assay demonstrated high sensitivity and a wide linear range of detection, which covers the amount of lysozyme typically extracted from contact lenses. Using this assay, lysozyme activity on both silicone hydrogel materials was lower in the presence of lipids in the ATS. In addition, lactoferrin had a protective effect on lysozyme activity for lysozyme sorbed to senofilcon A. Moreover, the presence of lactoferrin and/or lipids did not exhibit any effect on lysozyme denaturation with conventional hydrogel lenses.
Conclusions
The presence of lactoferrin and lipids in an artificial tear solution impacted the deposition and denaturation of lysozyme on various contact lenses. When developing tear film models for in vitro studies, it is therefore important to consider the effects of tear film components when investigating protein deposition and denaturation on contact lenses.
Flow maps - Benefits, Problems, Future Research
The flow map has become a standard tool for the
analysis and visualization of unsteady flows. In simple
terms, it maps the start point of a particle integration
to its end point. Flow maps are used to compute
Finite Time Lyapunov Exponents (FTLE), Streak Line
Vector Fields, or to speed up other methods in flow
visualization. However, they are very costly in terms
of both computation time and storage.
In this talk, I will give an overview of the latest
developments in flow visualization, review the theoretical and practical benefits of flow
maps, discuss issues of accuracy and complexity, and pose open questions for future
research in this area.
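In code, the flow map is simply the start-to-end mapping of a numerical particle integration. A minimal sketch, assuming an analytic toy field (a rigid rotation, chosen purely for illustration) and a fixed-step RK4 integrator:

```python
import math

# Minimal flow-map sketch: phi maps a particle's start point to its end
# point after advection for time T through the velocity field. The field
# and step count here are illustrative choices, not from the talk.

def velocity(x, y):
    return -y, x   # rigid rotation about the origin

def flow_map(x, y, T, steps=1000):
    """Integrate the particle with classic fixed-step RK4."""
    h = T / steps
    for _ in range(steps):
        k1 = velocity(x, y)
        k2 = velocity(x + 0.5 * h * k1[0], y + 0.5 * h * k1[1])
        k3 = velocity(x + 0.5 * h * k2[0], y + 0.5 * h * k2[1])
        k4 = velocity(x + h * k3[0], y + h * k3[1])
        x += h * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        y += h * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
    return x, y

# After time pi, a rigid rotation carries (1, 0) to (-1, 0).
fx, fy = flow_map(1.0, 0.0, math.pi)
print(round(fx, 6), round(fy, 6))
```

Quantities such as the FTLE are then derivatives of this map with respect to the start point, which is one reason dense flow maps are so expensive to compute and store.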
Temporal Merge Tree Maps: A Topology-Based Static Visualization for Temporal Scalar Data
Creating a static visualization for a time-dependent scalar field is a non-trivial task, yet very insightful as it shows the dynamics in one picture. Existing approaches are based on a linearization of the domain or on feature tracking. Domain linearizations use space-filling curves to place all sample points into a 1D domain, thereby breaking up individual features. Feature tracking methods explicitly respect feature continuity in space and time, but generally neglect the data context in which those features live. We present a feature-based linearization of the spatial domain that keeps features together and preserves their context by involving all data samples. We use augmented merge trees to linearize the domain and show that our linearized function has the same merge tree as the original data. A greedy optimization scheme aligns the trees over time providing temporal continuity. This leads to a static 2D visualization with one temporal dimension, and all spatial dimensions compressed into one. We compare our method against other domain linearizations as well as feature-tracking approaches, and apply it to several real-world data sets.
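The merge tree underlying this construction can be computed, in the 1D case, with a short union-find sweep over the vertices in ascending order: leaves are local minima, and interior nodes record where two components join. This is only a toy sketch of the data structure; the paper's augmented trees and temporal alignment are not shown.

```python
# Toy join-tree (merge-tree) sketch for a 1D scalar function via union-find:
# sweep vertices from lowest to highest value; a vertex with no processed
# neighbor starts a component (leaf / local minimum), and a vertex joining
# two components is a merge (saddle) node.

def merge_tree_1d(values):
    n = len(values)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    order = sorted(range(n), key=lambda i: values[i])
    seen, minima, merges = set(), [], []
    for v in order:
        comps = {find(u) for u in (v - 1, v + 1) if u in seen}
        if not comps:
            minima.append(v)                # new component: a leaf
        elif len(comps) == 2:
            merges.append(v)                # two components merge here
        for r in comps:
            parent[r] = v
        seen.add(v)
    return minima, merges

values = [5, 1, 4, 6, 2, 3, 7]
minima, merges = merge_tree_1d(values)
print(minima, merges)   # minima at indices 1 and 4 merge at index 3 (value 6)
```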
Temporal Treemaps: Static Visualization of Evolving Trees
We consider temporally evolving trees with changing topology and data: tree nodes may persist for a time range, merge or split, and the associated data may change. Essentially, one can think of this as a time series of trees with a node correspondence per hierarchy level between consecutive time steps. Existing visualization approaches for such data include animated 2D treemaps, where the dynamically changing layout makes it difficult to observe the data in its entirety. We present a method to visualize this dynamic data in a static, nested, and space-filling visualization. This is based on two major contributions: First, the layout constitutes a graph drawing problem. We approach it for the entire time span at once using a combination of a heuristic and simulated annealing. Second, we propose a rendering that emphasizes the hierarchy through an adaptation of the classic cushion treemaps. We showcase the wide range of applicability using data from feature tracking in time-dependent scalar fields, evolution of file system hierarchies, and world population.
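The simulated-annealing step can be sketched on a toy stand-in for the layout objective: ordering sibling nodes so that heavily related ones sit close together. The weight structure, cost function, and schedule below are illustrative assumptions, not the paper's actual crossing objective.

```python
import math
import random

# Toy simulated-annealing sketch for an ordering problem: minimize the sum
# of weight * distance over related node pairs, using random swap moves and
# a linear cooling schedule. A stand-in for the paper's layout objective.

def cost(order, w):
    pos = {node: p for p, node in enumerate(order)}
    return sum(w[i][j] * abs(pos[i] - pos[j]) for i in w for j in w[i])

def anneal(w, iters=20000, t0=2.0, seed=0):
    rng = random.Random(seed)                # seeded for reproducibility
    order = sorted(w)                        # deterministic start ordering
    best, best_c = order[:], cost(order, w)
    cur_c = best_c
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9      # linear cooling
        i, j = rng.randrange(len(order)), rng.randrange(len(order))
        order[i], order[j] = order[j], order[i]      # propose a swap
        c = cost(order, w)
        if c <= cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur_c = c                                # accept move
            if c < best_c:
                best, best_c = order[:], c
        else:
            order[i], order[j] = order[j], order[i]  # reject: undo swap
    return best, best_c

# A chain 4-2-0-3 of strongly related nodes, plus an unrelated node 1.
w = {0: {3: 4}, 1: {}, 2: {0: 4}, 3: {}, 4: {2: 4}}
best, best_c = anneal(w)
print(best, best_c)
```

Accepting occasional uphill moves (the `math.exp` branch) is what lets annealing escape local minima that a pure swap-based hill climber would get stuck in.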
Critical Points of the Electric Field from a Collection of Point Charges
The electric field around a molecule is generated by the charge distribution of its constituents: positively charged atomic nuclei, which are well approximated by point charges, and negatively charged electrons, whose probability density distribution can be computed from quantum mechanics. For the purposes of molecular mechanics or dynamics, the charge distribution is often approximated by a collection of point charges, with either a single partial charge at each atomic nucleus position, representing both the nucleus and the electrons near it, or as several different point charges per atom. The critical points in the electric field are useful in visualizing its geometrical and topological structure, and can help in understanding the forces and motion it induces on a charged ion or neutral dipole. Most visualization tools for vector fields use only samples of the field on the vertices of a regular grid, and some sort of interpolation, for example, trilinear, on the grid cells. There is less risk of missing or misinterpreting topological features if they can be derived directly from the analytic formula for the field, rather than from its samples. This work presents a method which is guaranteed to find all the critical points of the electric field from a finite set of point charges. To visualize the field topology, we have modified the saddle connector method to use the analytic formula for the field.
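For intuition, the analytic field of a set of point charges, E(x) = Σᵢ qᵢ (x − pᵢ)/|x − pᵢ|³, and a numerical search for one critical point (E = 0) can be sketched as follows. This is not the paper's method, which finds all critical points from the analytic formula with guarantees; here a simple finite-difference descent on |E|² merely polishes one root from a nearby guess.

```python
# Electric field of point charges in 2D and a naive critical-point search:
# descend on f = |E|^2 with finite-difference gradients. A toy sketch only;
# it finds one critical point near the starting guess, with no guarantees.

def field(x, y, charges):
    """charges: list of (q, (px, py)). Returns (Ex, Ey) at (x, y)."""
    ex = ey = 0.0
    for q, (px, py) in charges:
        dx, dy = x - px, y - py
        r3 = (dx * dx + dy * dy) ** 1.5
        ex += q * dx / r3
        ey += q * dy / r3
    return ex, ey

def find_critical_point(x, y, charges, step=1e-2, iters=5000):
    h = 1e-6                                  # finite-difference width

    def f(x, y):
        ex, ey = field(x, y, charges)
        return ex * ex + ey * ey              # squared field magnitude

    for _ in range(iters):
        gx = (f(x + h, y) - f(x - h, y)) / (2 * h)
        gy = (f(x, y + h) - f(x, y - h)) / (2 * h)
        x, y = x - step * gx, y - step * gy   # gradient-descent step
    return x, y

# Two equal positive charges: by symmetry the field vanishes at the midpoint.
charges = [(1.0, (-1.0, 0.0)), (1.0, (1.0, 0.0))]
cx, cy = find_critical_point(0.2, 0.1, charges)
print(round(cx, 4), round(cy, 4))   # near the midpoint (0, 0)
```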