1,755 research outputs found

    Speeding up Simplification of Polygonal Curves using Nested Approximations

    Full text link
    We develop a multiresolution approach to the problem of polygonal curve approximation. We show theoretically and experimentally that, if the simplification algorithm A used between any two successive levels of resolution satisfies some conditions, the multiresolution algorithm MR will have a complexity lower than the complexity of A. In particular, we show that if A has O(N²/K) complexity (the complexity of a reduced-search dynamic programming approach), where N and K are respectively the initial and the final number of segments, the complexity of MR is in O(N). We experimentally compare the outcomes of MR with those of the optimal "full search" dynamic programming solution and of classical merge and split approaches. The experimental evaluations confirm the theoretical derivations and show that the proposed approach, evaluated on 2D coastal maps, either has a lower complexity or provides polygonal approximations closer to the initial curves.
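    As a rough illustration of the nested idea described above (not the authors' reduced-search dynamic programming algorithm), the sketch below simplifies a curve level by level, feeding each coarse result into the next pass. The classic Douglas-Peucker routine merely stands in for the per-level simplifier A, and all names and tolerance values are made up for the example.

```python
import math

def point_segment_distance(p, a, b):
    """Perpendicular distance from point p to segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def douglas_peucker(points, tol):
    """Classic split-based simplifier, used here only as a stand-in for A."""
    if len(points) <= 2:
        return list(points)
    i, d = max(((k, point_segment_distance(points[k], points[0], points[-1]))
                for k in range(1, len(points) - 1)), key=lambda kv: kv[1])
    if d <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:i + 1], tol)
    return left[:-1] + douglas_peucker(points[i:], tol)

def multiresolution_simplify(points, tolerances):
    """Run the per-level simplifier between successive resolution levels:
    each pass works on the (much shorter) output of the previous pass."""
    current = list(points)
    for tol in sorted(tolerances):          # coarser and coarser levels
        current = douglas_peucker(current, tol)
    return current

# e.g. multiresolution_simplify(contour_points, tolerances=[0.5, 1.0, 2.0])
```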

    Polygonal Representation of Digital Curves

    Get PDF

    Probabilistic convexity measure

    Full text link

    Thinning-free Polygonal Approximation of Thick Digital Curves Using Cellular Envelope

    Get PDF
    Since the inception of successful rasterization of curves and objects in the digital space, several algorithms have been proposed for approximating a given digital curve. All these algorithms, however, resort to thinning as a preprocessing step before approximating a digital curve with changing thickness. Described in this paper is a novel thinning-free algorithm for polygonal approximation of an arbitrarily thick digital curve, using the concept of the "cellular envelope", which is newly introduced in this paper. The cellular envelope, defined as the smallest set of cells containing the given curve, and hence bounded by two tightest (inner and outer) isothetic polygons, is constructed using a combinatorial technique. This envelope, in turn, is analyzed to determine a polygonal approximation of the curve as a sequence of cells, using certain attributes of digital straightness. Since a real-world curve, i.e., a curve-shaped object with varying thickness, unexpected disconnectedness, noisy information, etc., is unsuitable for the existing algorithms on polygonal approximation, the curve is encapsulated by the cellular envelope to enable the polygonal approximation. Owing to the implicit Euclidean-free metrics and combinatorial properties prevailing in the cellular plane, implementation of the proposed algorithm involves primitive integer operations only, leading to fast execution. Experimental results, including output polygons for different values of the approximation parameter on several real-world digital curves, a couple of measures of the quality of approximation, comparative results for two other well-referred algorithms, and CPU times, are presented to demonstrate the elegance and efficacy of the proposed algorithm.
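    The combinatorial construction of the envelope and the digital-straightness analysis are the substance of the paper and are not reproduced here; the sketch below only illustrates, under assumed names and an assumed cell size, what a cellular envelope of a thick digital curve could look like when taken as the set of grid cells touched by the curve's pixels, using integer operations only.

```python
# Hypothetical sketch of the "cellular envelope" idea only: collect the grid
# cells (of side g) that contain at least one pixel of the thick digital curve.
# The paper additionally bounds this set by inner/outer isothetic polygons and
# analyses it with digital-straightness attributes; that part is not shown.
def cellular_envelope(pixels, g):
    """pixels: iterable of (x, y) integer coordinates of the (thick) curve.
    Returns the set of cells, each identified by its lower-left corner."""
    cells = set()
    for x, y in pixels:
        cells.add(((x // g) * g, (y // g) * g))   # integer operations only
    return cells

# Example: a 2-pixel-thick diagonal stroke, cell size 4
curve = [(x, x) for x in range(20)] + [(x, x + 1) for x in range(20)]
print(sorted(cellular_envelope(curve, 4)))
```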

    Computational advances in gravitational microlensing: a comparison of CPU, GPU, and parallel, large data codes

    Full text link
    To assess how future progress in gravitational microlensing computation at high optical depth will rely on both hardware and software solutions, we compare a direct inverse ray-shooting code implemented on a graphics processing unit (GPU) with both a widely used hierarchical tree code on a single-core CPU and a recent implementation of a parallel tree code suitable for a CPU-based cluster supercomputer. We examine the accuracy of the tree codes through comparison with a direct code over a much wider range of parameter space than has been feasible before. We demonstrate that all three codes present comparable accuracy, and the choice of approach depends on considerations relating to the scale and nature of the microlensing problem under investigation. On current hardware, there is little difference in processing speed between the single-core CPU tree code and the GPU direct code; however, the recent plateau in single-core CPU speeds means the existing tree code is no longer able to take advantage of Moore's-law-like increases in processing speed. Instead, we anticipate a rapid increase in GPU capabilities in the next few years, which is advantageous to the direct code. We suggest that progress in other areas of astrophysical computation may benefit from a transition to GPUs through the use of "brute force" algorithms, rather than attempting to port the current best solution directly to a GPU language; for certain classes of problems, the simple implementation on GPUs may already be no worse than an optimised single-core CPU version.
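    For readers unfamiliar with the "brute force" approach mentioned above, the following is a minimal NumPy sketch of direct inverse ray shooting for point-mass lenses. It is not any of the three codes compared in the paper and omits external shear and smooth matter: every image-plane ray is deflected by every lens and binned into a source-plane map whose counts trace the magnification. Lens numbers, positions, masses and grid sizes are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_lenses = 200
lens_pos = rng.uniform(-10.0, 10.0, size=(n_lenses, 2))   # in Einstein radii
lens_mass = np.ones(n_lenses)

# regular grid of rays fired backwards from the image plane
xs = np.linspace(-12.0, 12.0, 400)
X, Y = np.meshgrid(xs, xs)
rays = np.column_stack([X.ravel(), Y.ravel()])

# total deflection alpha(x) = sum_i m_i (x - x_i) / |x - x_i|^2
alpha = np.zeros_like(rays)
for (lx, ly), m in zip(lens_pos, lens_mass):
    dx, dy = rays[:, 0] - lx, rays[:, 1] - ly
    r2 = dx * dx + dy * dy
    alpha[:, 0] += m * dx / r2
    alpha[:, 1] += m * dy / r2

source = rays - alpha        # lens equation: y = x - alpha(x)

# counts per source-plane pixel are proportional to the magnification
mag_map, _, _ = np.histogram2d(source[:, 0], source[:, 1],
                               bins=200, range=[[-10, 10], [-10, 10]])
```

    Every ray interacts with every lens, so the cost grows as (number of rays) x (number of lenses); this is exactly the embarrassingly parallel workload that maps well onto a GPU.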

    A new thresholding approach for automatic generation of polygonal approximations

    Get PDF
    The present paper proposes a new algorithm for the automatic generation of polygonal approximations of 2D closed contours, based on a new thresholding method. The new proposal computes the significance level of the contour points using a new symmetric version of the well-known Ramer-Douglas-Peucker method, and then a new adaptive method is applied to threshold the normalized significance level of the contour points and generate the polygonal approximation. The experiments have shown that the new algorithm performs well for generating polygonal approximations of 2D closed contours. Furthermore, the new algorithm does not require any parameter to be tuned.
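    The paper's symmetric significance measure and adaptive threshold are not spelled out in the abstract, so the sketch below only illustrates the general pipeline under stated assumptions: a standard Ramer-Douglas-Peucker recursion records a significance (split distance) for each point, the values are normalized, and points above a fixed, hand-picked threshold are kept. The contour is treated as an open polyline for brevity.

```python
import math

def _dist(p, a, b):
    """Perpendicular distance from point p to segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def rdp_significance(points):
    """Significance of each point = distance at which the RDP recursion splits there."""
    sig = [0.0] * len(points)

    def split(i, j):
        if j - i < 2:
            return
        k, d = max(((k, _dist(points[k], points[i], points[j]))
                    for k in range(i + 1, j)), key=lambda kv: kv[1])
        sig[k] = d
        split(i, k)
        split(k, j)

    split(0, len(points) - 1)
    sig[0] = sig[-1] = max(sig)          # endpoints are always kept
    return sig

def approximate(points, threshold=0.1):
    """Keep points whose normalized significance exceeds the threshold.
    Degenerate (perfectly straight) contours are not handled."""
    sig = rdp_significance(points)
    m = max(sig) or 1.0
    return [p for p, s in zip(points, sig) if s / m >= threshold]
```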

    Contributions on optimal and suboptimal methods for polygonal approximation of 2D curves

    Get PDF
    This thesis focuses on the analysis of the shape of 2D objects. In computer vision there are many visual cues from which information can be extracted, and one of the most widely used is the shape, or contour, of an object. With suitable processing, this characteristic makes it possible to extract information about objects, analyse scenes, etc. However, the contour of an object contains redundant information. This excess data adds no new knowledge and should be removed, in order to speed up subsequent processing and to minimize the size of the contour representation for storage or transmission. The reduction must be carried out without losing information that is important for representing the original contour. A reduced version of a contour can be obtained by deleting intermediate points and linking the remaining points with line segments. This reduced representation is known as a polygonal approximation, and it constitutes a compressed version of the original information. The main use of polygonal approximations is to reduce the amount of information needed to represent the contour of an object; in recent years, however, they have also been used for object recognition, with polygonal approximation algorithms applied directly to extract the feature vectors used in the learning stage.
    The contributions of this thesis therefore address several aspects of polygonal approximations. The first contribution improves several polygonal approximation algorithms by adding a preprocessing stage that accelerates them, even improving the quality of the solutions in less time. The second contribution proposes a new algorithm that obtains optimal polygonal approximations in less time than the other methods reported in the literature. The third contribution proposes an approximation algorithm that, in most cases, reaches the optimal solution in a few iterations. Finally, an improved version of the optimal polygonal approximation algorithm is proposed to solve an alternative optimization problem.
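    As context for the optimization problems mentioned above, optimal polygonal approximation methods in this literature commonly minimize, for a fixed number of vertices, the integral squared error (ISE) between the contour and the approximating polygon. The helper below merely evaluates that objective for a given choice of vertices; it is not one of the thesis' algorithms, and the distance-to-chord formulation is an assumption of this sketch.

```python
def integral_squared_error(contour, vertex_indices):
    """ISE: sum of squared perpendicular distances from each skipped contour
    point to the chord joining its two enclosing polygon vertices.
    contour: list of (x, y); vertex_indices: increasing indices into contour."""
    ise = 0.0
    for i, j in zip(vertex_indices, vertex_indices[1:]):
        (ax, ay), (bx, by) = contour[i], contour[j]
        dx, dy = bx - ax, by - ay
        chord_len2 = dx * dx + dy * dy
        for k in range(i + 1, j):
            px, py = contour[k]
            ise += ((px - ax) * dy - (py - ay) * dx) ** 2 / chord_len2
    return ise

# e.g. integral_squared_error(contour, [0, 12, 30, len(contour) - 1])
```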

    Multiorder polygonal approximation of digital curves

    Get PDF
    In this paper, we propose a fast, threshold-free algorithm that computes the angular shape of a 2D object from the points of its contour. To do so, we extend the method defined in [4, 5] to a multiorder analysis. It is based on the arithmetical definition of discrete lines [11] with variable thickness. We provide a framework to analyse a digital curve at different levels of thickness. The extremities of a segment provided at a high resolution are tracked at lower resolution in order to refine their location. The method is threshold-free and automatically provides a partitioning of a digital curve into its meaningful parts.
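    The arithmetical definition of discrete lines that underlies this multiorder analysis can be stated compactly: a pixel (x, y) belongs to the discrete line D(a, b, mu, omega) iff mu <= a*x - b*y < mu + omega, where omega controls the thickness. The snippet below only checks membership for a run of pixels under example parameters; the recognition of maximal segments and the tracking of their extremities across thickness levels is not reproduced.

```python
# Larger omega means a thicker discrete line, i.e. a coarser level of analysis.
def in_discrete_line(x, y, a, b, mu, omega):
    return mu <= a * x - b * y < mu + omega

def covered_by_line(points, a, b, mu, omega):
    """True if every pixel of the run fits the discrete line D(a, b, mu, omega)."""
    return all(in_discrete_line(x, y, a, b, mu, omega) for x, y in points)

# This pixel run fits the naive line (omega = max(|a|, |b|)) of slope 1/2,
# and would still fit any thicker line with a larger omega.
run = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2)]
print(covered_by_line(run, a=1, b=2, mu=0, omega=2))   # True
```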