112 research outputs found

    Computing the Fréchet distance between uncertain curves in one dimension.

    We consider the problem of computing the Fréchet distance between two curves for which the exact locations of the vertices are unknown. Each vertex may be placed anywhere within a given uncertainty region for that vertex, and the objective is to place the vertices so as to minimise the Fréchet distance. This problem was recently shown to be NP-hard in 2D, and it is unclear how to compute an optimal vertex placement at all. We present the first general algorithmic framework for this problem and prove that it yields a polynomial-time algorithm for curves in 1D with intervals as uncertainty regions. In contrast, we show that the problem is NP-hard in 1D when the vertices are placed to maximise the Fréchet distance. We also study the weak Fréchet distance between uncertain curves. While finding the optimal placement of vertices seems more difficult than for the regular Fréchet distance (indeed, we can easily prove that the problem is NP-hard in 2D), the optimal placement of vertices in 1D can be computed in polynomial time. Finally, we investigate the discrete weak Fréchet distance, for which, somewhat surprisingly, the problem is NP-hard already in 1D.
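    As background for the certain-curve setting, the discrete Fréchet distance between two fixed polylines admits a simple dynamic program (due to Eiter and Mannila). The following is a minimal sketch in 1D, where the ground distance is |p − q|; it is illustration only, not the paper's uncertain-curve algorithm:

    ```python
    from math import inf

    def discrete_frechet(P, Q):
        """Discrete Fréchet distance between 1D polylines P and Q (lists of
        numbers), via the classic Eiter-Mannila dynamic program."""
        n, m = len(P), len(Q)
        # ca[i][j] = discrete Fréchet distance between P[:i+1] and Q[:j+1]
        ca = [[inf] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                d = abs(P[i] - Q[j])  # in 1D the ground distance is |p - q|
                if i == 0 and j == 0:
                    ca[i][j] = d
                elif i == 0:
                    ca[i][j] = max(ca[i][j - 1], d)
                elif j == 0:
                    ca[i][j] = max(ca[i - 1][j], d)
                else:
                    ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1], ca[i - 1][j - 1]), d)
        return ca[n - 1][m - 1]
    ```

    The uncertain version studied in the paper must additionally choose a placement inside each vertex's interval before (or while) running such a recurrence, which is what makes the minimisation variant non-trivial.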

    Fréchet Distance for Uncertain Curves

    In this article, we study a wide range of variants for computing the (discrete and continuous) Fréchet distance between uncertain curves. An uncertain curve is a sequence of uncertainty regions, where each region is a disk, a line segment, or a set of points. A realisation of a curve is a polyline connecting one point from each region. Given an uncertain curve and a second (certain or uncertain) curve, we seek to compute the lower and upper bound Fréchet distance, which are the minimum and maximum Fréchet distance over all realisations of the curves. We prove that both problems are NP-hard for the Fréchet distance in several uncertainty models, and that the upper bound problem remains hard for the discrete Fréchet distance. In contrast, the lower bound (discrete [5] and continuous) Fréchet distance can be computed in polynomial time in some models. Furthermore, we show that computing the expected (discrete and continuous) Fréchet distance is #P-hard in some models. On the positive side, we present an FPTAS in constant dimension for the lower bound problem when Δ/δ is polynomially bounded, where δ is the Fréchet distance and Δ bounds the diameter of the regions. We also show a near-linear-time 3-approximation for the decision problem on roughly δ-separated convex regions. Finally, we study the setting with Sakoe–Chiba time bands, where we restrict the alignment between the curves, and give polynomial-time algorithms for the upper bound and expected discrete and continuous Fréchet distance for uncertainty modelled as point sets.

    Fréchet Distance for Uncertain Curves

    In this paper we study a wide range of variants for computing the (discrete and continuous) Fréchet distance between uncertain curves. We define an uncertain curve as a sequence of uncertainty regions, where each region is a disk, a line segment, or a set of points. A realisation of a curve is a polyline connecting one point from each region. Given an uncertain curve and a second (certain or uncertain) curve, we seek to compute the lower and upper bound Fréchet distance, which are the minimum and maximum Fréchet distance over all realisations of the curves. We prove that both the upper and lower bound problems are NP-hard for the continuous Fréchet distance in several uncertainty models, and that the upper bound problem remains hard for the discrete Fréchet distance. In contrast, the lower bound (discrete and continuous) Fréchet distance can be computed in polynomial time. Furthermore, we show that computing the expected discrete Fréchet distance is #P-hard when the uncertainty regions are modelled as point sets or line segments. The construction also extends to show #P-hardness for computing the continuous Fréchet distance when regions are modelled as point sets. On the positive side, we argue that in any constant dimension there is an FPTAS for the lower bound problem when Δ/δ is polynomially bounded, where δ is the Fréchet distance and Δ bounds the diameter of the regions. We then argue there is a near-linear-time 3-approximation for the decision problem when the regions are convex and roughly δ-separated. Finally, we also study the setting with Sakoe–Chiba time bands, where we restrict the alignment between the two curves, and give polynomial-time algorithms for the upper bound and expected discrete and continuous Fréchet distance for uncertainty regions modelled as point sets. Comment: 48 pages, 11 figures. This is the full version of the paper to be published in ICALP 202
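    The lower and upper bound definitions above can be made concrete by brute force when every uncertainty region is a small point set: enumerate all realisations of both curves, compute the discrete Fréchet distance of each pair, and take the minimum and maximum. This is exponential and purely illustrative of the definitions; the paper's polynomial-time and hardness results concern far more sophisticated formulations. Function names are mine:

    ```python
    from itertools import product
    from math import dist, inf

    def discrete_frechet(P, Q):
        """Classic O(nm) DP for the discrete Fréchet distance between point sequences."""
        n, m = len(P), len(Q)
        ca = [[inf] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                d = dist(P[i], Q[j])
                if i == 0 and j == 0:
                    ca[i][j] = d
                elif i == 0:
                    ca[i][j] = max(ca[i][j - 1], d)
                elif j == 0:
                    ca[i][j] = max(ca[i - 1][j], d)
                else:
                    ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1], ca[i - 1][j - 1]), d)
        return ca[-1][-1]

    def bound_frechet(U, V):
        """Lower and upper bound discrete Fréchet distance between uncertain
        curves U and V, each given as a sequence of point-set uncertainty
        regions. Enumerates every realisation, so only feasible for tiny inputs."""
        values = [discrete_frechet(P, Q)
                  for P in product(*U) for Q in product(*V)]
        return min(values), max(values)
    ```

    Note that a realisation fixes one point per region globally, so one cannot simply minimise each coupled pair independently; this consistency requirement is exactly what the enumeration respects and what efficient algorithms must handle carefully.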

    Algorithms for Context-Aware Trajectory Analysis


    09111 Abstracts Collection -- Computational Geometry

    From March 8 to March 13, 2009, the Dagstuhl Seminar 09111 "Computational Geometry" was held in Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    Algorithms for map-aided autonomous indoor pedestrian positioning and navigation

    Personal positioning and navigation have become challenging topics in our dynamic time. Urban canyons, and indoor environments in particular, are the most difficult areas for personal navigation: disturbed satellite signals make positioning indoors impossible. Recently developed systems for indoor positioning either do not provide the necessary accuracy or are very expensive. Our concept is a fully autonomous positioning and navigation process, that is, a method that does not rely on the reception of external information such as satellite or terrestrial signals. This research is therefore based on inertial measurements of the human walk and on a map database containing a graph representation of the elements of the building, created by applying the link-node model. Using this reduced set of information, the task is to develop a methodology, based on the interaction of the data from both sources, that assures a reliable positioning and navigation process. The research is divided into three parts. The first part develops a methodology for the initial localization of a person indoors. Consider a person equipped with a system that contains a set of inertial sensors and a map database of the building. Speed, turn rate, and barometric altitude are measured and time-stamped at each step. A pre-processing phase uses these raw measurements to construct a polyline representing the user's trajectory. Central to the localization approach is the association of the user's trajectory with the graph representation of the building, a process known as map-matching. The solution is based on a statistical method in which the user's position is represented entirely by its probability density function (PDF) in the frame of Bayesian inference.
    Initial localization determines the edge of the graph occupied by the person. The second part aims at continuous localization, where the user's position is estimated at every step. Besides applying classical map-matching techniques, two new methods are developed, both relying on the similarity between the geometry of the trajectory and the elements of the graph. The first is based on Bayesian inference, where the estimate is computed from the walked distance and azimuth. The second method is a new application of the Fréchet distance as a degree of similarity between two polylines. The third part addresses pedestrian guidance. Once the user's position is known, it is easy to compute the path to the destination and give directions. The problem is to assure continuity of the navigation process when the person has lost the path; in that case, the solution consists either in instructing the user to go back to the path or in computing a new path from the user's actual position to the destination. Based on this methodology, algorithms for initial localization, continuous localization, and guidance were created. Numerous tests involving several persons were conducted to validate the algorithms and to show their performance, robustness, and limits.
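    The Fréchet-distance-based map-matching idea can be sketched as follows: score each candidate graph path (a polyline of node coordinates) against the walked trajectory and keep the most similar one. This hedged sketch uses the discrete Fréchet distance on 2D coordinates rather than the thesis's continuous formulation, assumes candidate paths are extracted from the building graph beforehand, and all names are illustrative:

    ```python
    from math import dist, inf

    def discrete_frechet(P, Q):
        """Classic O(nm) DP for the discrete Fréchet distance between 2D polylines."""
        n, m = len(P), len(Q)
        ca = [[inf] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                d = dist(P[i], Q[j])
                if i == 0 and j == 0:
                    ca[i][j] = d
                elif i == 0:
                    ca[i][j] = max(ca[i][j - 1], d)
                elif j == 0:
                    ca[i][j] = max(ca[i - 1][j], d)
                else:
                    ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1], ca[i - 1][j - 1]), d)
        return ca[-1][-1]

    def best_matching_path(trajectory, candidate_paths):
        """Map-matching sketch: return the candidate path whose polyline is most
        similar (lowest Fréchet distance) to the reconstructed trajectory."""
        return min(candidate_paths, key=lambda path: discrete_frechet(trajectory, path))
    ```

    A lower distance to a corridor's polyline indicates the person is more likely walking along that corridor; the real system combines such geometric evidence with the Bayesian machinery described above.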

    SENS: Sketch-based Implicit Neural Shape Modeling

    We present SENS, a novel method for generating and editing 3D models from hand-drawn sketches, including those of an abstract nature. Our method allows users to quickly and easily sketch a shape, and then maps the sketch into the latent space of a part-aware neural implicit shape architecture. SENS analyzes the sketch and encodes its parts into ViT patch encodings, then feeds them into a transformer decoder that converts them to shape embeddings suitable for editing 3D neural implicit shapes. SENS not only provides intuitive sketch-based generation and editing, but also excels in capturing the intent of the user's sketch to generate a variety of novel and expressive 3D shapes, even from abstract sketches. We demonstrate the effectiveness of our model compared to the state of the art using objective metric evaluation criteria and a decisive user study, both indicating strong performance on sketches with a medium level of abstraction. Furthermore, we showcase its intuitive sketch-based shape editing capabilities. Comment: 18 pages, 18 figures.
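    The "ViT patch encoding" step mentioned above refers to the standard Vision-Transformer tokenisation of an image. A minimal numpy sketch of that step for a rasterised drawing: the patch size, embedding width, and the random projection (standing in for SENS's learned weights) are all assumptions of mine, not details from the paper.

    ```python
    import numpy as np

    def patch_embed(image, patch=16, d_model=64, rng=np.random.default_rng(0)):
        """ViT-style patch embedding sketch: split an (H, W) image into
        non-overlapping patch x patch tiles, flatten each tile, and apply a
        random linear projection standing in for a learned one."""
        H, W = image.shape
        assert H % patch == 0 and W % patch == 0
        tiles = (image.reshape(H // patch, patch, W // patch, patch)
                      .transpose(0, 2, 1, 3)          # group tiles by grid position
                      .reshape(-1, patch * patch))    # (num_patches, patch*patch)
        W_proj = rng.standard_normal((patch * patch, d_model))
        return tiles @ W_proj                         # (num_patches, d_model)
    ```

    In SENS these patch tokens are then consumed by a transformer decoder that maps them to part-aware shape embeddings; that stage involves learned attention weights and is not reproduced here.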

    Registration of histology and magnetic resonance imaging of the brain

    Combining histology and non-invasive imaging has long attracted the attention of the medical imaging community, due to its potential to correlate macroscopic information with the underlying microscopic properties of tissues. Histology is an invasive procedure that disrupts the spatial arrangement of the tissue components but enables visualisation and characterisation at a cellular level. In contrast, macroscopic imaging allows non-invasive acquisition of volumetric information but does not provide any microscopic detail. Through the establishment of spatial correspondences obtained via image registration, it is possible to compare micro- and macroscopic information and to recover the original histological arrangement in three dimensions. In this thesis, I present: (i) a survey of the literature on methods for histology reconstruction with and without the help of 3D medical imaging; (ii) a graph-theoretic method for histology volume reconstruction from sets of 2D sections, without external information; and (iii) a method for multimodal 2D linear registration between histology and MRI based on partial matching of shape-informative boundaries.

    Topology-preserving vessel registration: application to interventional cardiology

    In percutaneous coronary interventions, integrating into the live fluoroscopic image the vessel calcifications and occlusion information revealed by pre-operative Computed Tomography Angiography can greatly improve guidance for the clinician. Fusing pre- and intra-operative information into a single space takes advantage of two complementary modalities and requires a registration step that must provide good alignment and relevant correspondences between them. Most existing 3D/2D vessel registration algorithms do not take into account the particular topology of the vasculature to be matched, resulting in pairings that may be topologically inconsistent along the vasculature.
    A first contribution is a registration framework dedicated to curve matching, denoted the Iterative Closest Curve (ICC). Its main feature is to preserve topological consistency along curves by taking advantage of the Fréchet distance, which not only computes the distance between two curves but also builds ordered pairings along them. A second contribution is a pairing procedure designed for matching a vascular tree structure, which respects its particular topology and can easily take advantage of the ICC framework. Centerlines of the 3D tree are matched to curves extracted from the 2D vascular graph while preserving the connectivity at 3D bifurcations. The matching criterion used to build the pairings takes into account both the geometric distance and the resemblance between curves, in a global formulation using the Fréchet distance.
    To evaluate our approach, we ran experiments on a database of 63 clinical cases, measuring accuracy under real conditions and robustness with respect to a simulated displacement. Quantitative results were obtained using two complementary measures that assess the results both geometrically and topologically, quantifying the resulting alignment error as well as the pairing error. The proposed method exhibits good results in terms of both pairing and alignment, and proves to be only weakly sensitive to the rotations to be compensated (up to 30 degrees). We attribute this robustness to the quality of the pairings built over the iterations.
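    The abstract notes that the Fréchet distance yields not just a distance value but ordered pairings along the two curves. For the discrete variant, such a coupling can be recovered by backtracking through the dynamic program. This is a hedged sketch of that idea (not the thesis's continuous-Fréchet formulation, and names are mine):

    ```python
    from math import dist, inf

    def frechet_with_pairing(P, Q):
        """Discrete Fréchet DP that also recovers an optimal coupling: an ordered,
        monotone sequence of index pairs (i, j) covering both curves, of the kind
        that can drive iterative closest-curve style registration."""
        n, m = len(P), len(Q)
        ca = [[inf] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                d = dist(P[i], Q[j])
                if i == 0 and j == 0:
                    ca[i][j] = d
                elif i == 0:
                    ca[i][j] = max(ca[i][j - 1], d)
                elif j == 0:
                    ca[i][j] = max(ca[i - 1][j], d)
                else:
                    ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1], ca[i - 1][j - 1]), d)
        # Backtrack along the cheapest predecessor; every cell on this path has
        # value <= ca[-1][-1], so the coupling it induces is optimal.
        i, j, pairs = n - 1, m - 1, []
        while True:
            pairs.append((i, j))
            if i == 0 and j == 0:
                break
            moves = []
            if i and j:
                moves.append((ca[i - 1][j - 1], i - 1, j - 1))
            if i:
                moves.append((ca[i - 1][j], i - 1, j))
            if j:
                moves.append((ca[i][j - 1], i, j - 1))
            _, i, j = min(moves)
        return ca[-1][-1], pairs[::-1]
    ```

    Because the recovered pairs are monotone in both indices, the correspondences never cross, which is the ordering property the ICC framework exploits to keep pairings topologically consistent along each curve.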

    Fundamentals

    Volume 1 establishes the foundations of this new field. It goes through all the steps from data collection, through summarisation and clustering, to different aspects of resource-aware learning, i.e., hardware, memory, energy, and communication awareness. Machine learning methods are inspected with respect to their resource requirements and to how scalability can be enhanced on diverse computing architectures, ranging from embedded systems to large computing clusters.