A Note on the Unsolvability of the Weighted Region Shortest Path Problem
Let S be a subdivision of the plane into polygonal regions, where each region
has an associated positive weight. The weighted region shortest path problem is
to determine a shortest path in S between two points s, t in R^2, where the
distances are measured according to the weighted Euclidean metric: the length
of a path is defined to be the weighted sum of the (Euclidean) lengths of the
sub-paths within each region. We show that this problem cannot be solved in the
Algebraic Computation Model over the Rational Numbers (ACMQ). In the ACMQ, one
can compute exactly any number that can be obtained from the rationals Q by
applying a finite number of operations from +, -, \times, \div, \sqrt[k]{}, for
any integer k >= 2. Our proof uses Galois theory and is based on Bajaj's
technique.
Comment: 6 pages, 1 figure
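The weighted Euclidean metric described above can be illustrated with a minimal sketch; the segment coordinates and region weights below are hypothetical, chosen only to show how a path's length is accumulated region by region:

```python
import math

def weighted_length(subpaths, weights):
    """Weighted Euclidean length of a polygonal path through a subdivision.

    subpaths: one polyline (list of 2D points) per region the path crosses.
    weights:  the positive weight of each corresponding region.
    The total length is the weighted sum of Euclidean sub-path lengths.
    """
    total = 0.0
    for pts, w in zip(subpaths, weights):
        for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
            total += w * math.hypot(x2 - x1, y2 - y1)
    return total

# A path crossing two regions with weights 1 and 3:
# unit segment in each region, so the weighted length is 1*1 + 3*1 = 4.
print(weighted_length([[(0, 0), (1, 0)], [(1, 0), (2, 0)]], [1.0, 3.0]))
```

The unsolvability result concerns finding the *optimal* such path exactly in the ACMQ; evaluating the metric itself, as above, is straightforward.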
Applied Similarity Problems Using Frechet Distance
In the first part of this thesis, we consider an instance of Frechet distance
problem in which the speed of traversal along each segment of the curves is
restricted to lie within a specified range. This setting is more realistic than
the classical Frechet distance setting, especially in GIS applications. We also
study this problem in the setting where the polygonal curves are inside a
simple polygon.
In the second part of this thesis, we present a data structure, called the
free-space map, that enables us to solve several variants of the Frechet
distance problem efficiently. Our data structure encapsulates all the
information available in the free-space diagram, yet it is capable of answering
more general types of queries efficiently. Given that the free-space map has the
same size and construction time as the standard free-space diagram, it can be
viewed as a powerful alternative to it. As part of the results in Part II of
the thesis, we exploit the free-space map to improve the long-standing bound
for computing the partial Frechet distance and obtain improved algorithms for
computing the Frechet distance between two closed curves, and the so-called
minimum/maximum walk problem. We also improve the map matching algorithm for
the case when the map is a directed acyclic graph.
As the last part of this thesis, given a point set S and a polygonal curve P
in R^d, we study the problem of finding a polygonal curve Q through S, which
has a minimum Frechet distance to P. Furthermore, if the problem requires that
curve Q visits every point in S, we show that it is NP-complete.
Comment: arXiv admin note: text overlap with arXiv:1003.0460 by other authors
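The continuous Frechet distance underlying these problems is commonly approximated by its discrete counterpart over the curves' vertices. As a point of reference (not the thesis's own algorithms), a minimal sketch of the standard Eiter-Mannila dynamic program:

```python
import math

def discrete_frechet(p, q):
    """Discrete Frechet distance between polygonal curves p and q,
    computed with the Eiter-Mannila dynamic program: the coupling
    measure over vertex pairs, a standard approximation of the
    continuous Frechet distance."""
    n, m = len(p), len(q)
    ca = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            cost = math.dist(p[i], q[j])
            if i == 0 and j == 0:
                ca[i][j] = cost
            elif i == 0:
                ca[i][j] = max(ca[i][j - 1], cost)
            elif j == 0:
                ca[i][j] = max(ca[i - 1][j], cost)
            else:
                ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1],
                                   ca[i][j - 1]), cost)
    return ca[n - 1][m - 1]

# Two parallel horizontal curves at vertical distance 1: distance is 1.0.
print(discrete_frechet([(0, 0), (1, 0), (2, 0)], [(0, 1), (1, 1), (2, 1)]))
```

The speed-constrained and free-space-map variants studied in the thesis refine this basic notion rather than replace it.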
Automated Segmentation of Cells with IHC Membrane Staining
This study presents a fully automated membrane segmentation technique for immunohistochemical tissue images with membrane staining, which is a critical task in computerized immunohistochemistry (IHC). Membrane segmentation is particularly tricky in immunohistochemical tissue images because the cellular membranes are visible only in the stained tracts of the cell, while the unstained tracts are not visible. Our automated method provides accurate segmentation of the cellular membranes in the stained tracts and reconstructs the approximate location of the unstained tracts using nuclear membranes as a spatial reference. Accurate cell-by-cell membrane segmentation allows per-cell morphological analysis and quantification of the target membrane proteins, which is fundamental in several medical applications such as cancer characterization and classification, personalized therapy design, and any other application requiring cell morphology characterization. Experimental results on real datasets from different anatomical locations demonstrate the wide applicability and high accuracy of our approach in the context of IHC analysis.
Path Similarity Analysis: a Method for Quantifying Macromolecular Pathways
Diverse classes of proteins function through large-scale conformational
changes; sophisticated enhanced sampling methods have been proposed to generate
these macromolecular transition paths. As such paths are curves in a
high-dimensional space, they have been difficult to compare quantitatively, a
prerequisite to, for instance, assess the quality of different sampling
algorithms. The Path Similarity Analysis (PSA) approach alleviates these
difficulties by utilizing the full information in 3N-dimensional trajectories
in configuration space. PSA employs the Hausdorff or Fr\'echet path
metrics---adopted from computational geometry---enabling us to quantify path
(dis)similarity, while the new concept of a Hausdorff-pair map permits the
extraction of atomic-scale determinants responsible for path differences.
Combined with clustering techniques, PSA facilitates the comparison of many
paths, including collections of transition ensembles. We use the closed-to-open
transition of the enzyme adenylate kinase (AdK)---a commonly used testbed for
assessing enhanced sampling algorithms---to examine multiple microsecond
equilibrium molecular dynamics (MD) transitions of AdK in its substrate-free
form alongside transition ensembles from the MD-based dynamic importance
sampling (DIMS-MD) and targeted MD (TMD) methods, and a geometrical targeting
algorithm (FRODA). A Hausdorff pairs analysis of these ensembles revealed, for
instance, that differences in DIMS-MD and FRODA paths were mediated by a set of
conserved salt bridges whose charge-charge interactions are fully modeled in
DIMS-MD but not in FRODA. We also demonstrate how existing trajectory analysis
methods relying on pre-defined collective variables, such as native contacts or
geometric quantities, can be used synergistically with PSA, as well as the
application of PSA to more complex systems such as membrane transporter
proteins.
Comment: 9 figures, 3 tables in the main manuscript; supplementary information
includes 7 texts (S1 Text - S7 Text) and 11 figures (S1 Fig - S11 Fig) (also
available from the journal site)
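The Hausdorff path metric that PSA adopts can be sketched in a few lines; the example below uses 2D points for brevity (PSA operates on 3N-dimensional configuration-space frames), and the two toy paths are hypothetical:

```python
import math

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two paths, each given as a
    list of points: the largest distance from any point on one path to
    its nearest point on the other. This brute-force version is O(|A||B|)."""
    def directed(X, Y):
        return max(min(math.dist(x, y) for y in Y) for x in X)
    return max(directed(A, B), directed(B, A))

path1 = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
path2 = [(0.0, 1.0), (1.0, 1.0), (2.0, 2.0)]
# The point (2, 2) lies 2 units from its nearest neighbor (2, 0),
# which dominates, so the distance is 2.0.
print(hausdorff(path1, path2))
```

The Hausdorff-pair map mentioned above records which pair of points realizes this maximum, pinpointing where two transition paths diverge most.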
Partial shape matching using CCP map and weighted graph transformation matching
Matching and detecting similarity or dissimilarity between images is a fundamental problem in image processing.
Different matching algorithms have been proposed in the literature to solve this fundamental problem. Despite their novelty, these algorithms are mostly inefficient and cannot perform properly in noisy situations. In this thesis, we address most of the problems of previous methods by using a reliable algorithm for segmenting the image contour map, called the CCP Map, and a new matching method. In our algorithm, we use a local shape descriptor that is very fast to compute, invariant to affine transformations, and robust for dealing with non-rigid objects and occlusion. After finding the best match for the contours, we need to verify whether they are correctly matched. To this end, we use the Weighted Graph Transformation Matching (WGTM) approach, which is capable of removing outliers based on their adjacency and geometrical relationships. WGTM works properly for both rigid and non-rigid objects and is robust to high-order distortions. To evaluate our method, the ETHZ dataset, comprising five diverse classes of objects (bottles, swans, mugs, giraffes, apple-logos), is used. Finally, our method is compared to several well-known methods proposed by other researchers in the literature. While our method shows results comparable to other benchmarks in terms of recall and the precision of boundary localization, it significantly improves the average precision for all of the categories in the ETHZ dataset.