
    Fast B-spline Curve Fitting by L-BFGS

    We propose a novel method for fitting planar B-spline curves to unorganized data points. In traditional methods, the optimization of control points and foot points is performed in two time-consuming steps in each iteration: 1) control points are updated by setting up and solving a linear system of equations; and 2) foot points are computed by projecting each data point onto the B-spline curve. Our method uses the L-BFGS optimization method to optimize control points and foot points simultaneously, so it needs neither matrix computation nor foot point projection in each iteration. As a result, our method is much faster than existing methods.
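    The joint optimization described above can be sketched with SciPy's L-BFGS implementation: control points and foot-point parameters are packed into one variable vector and minimized together. The knot vector, degree, and initialization below are illustrative assumptions, not the authors' exact setup.

```python
# A minimal sketch of the joint optimization idea: control points and
# foot-point parameters are optimized together with L-BFGS-B, instead of
# alternating linear solves and foot-point projections.
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

def fit_bspline(data, n_ctrl=6, degree=3):
    n = len(data)
    # Clamped uniform knot vector (length n_ctrl + degree + 1).
    knots = np.concatenate(([0.0] * degree,
                            np.linspace(0, 1, n_ctrl - degree + 1),
                            [1.0] * degree))
    # Initialization: control points sampled from the data,
    # foot-point parameters spread uniformly over the domain.
    ctrl0 = data[np.linspace(0, n - 1, n_ctrl).astype(int)]
    t0 = np.linspace(0.01, 0.99, n)

    def objective(x):
        ctrl = x[:2 * n_ctrl].reshape(n_ctrl, 2)
        t = np.clip(x[2 * n_ctrl:], 0.0, 1.0)
        resid = BSpline(knots, ctrl, degree)(t) - data
        return np.sum(resid ** 2)   # sum of squared point-to-curve residuals

    res = minimize(objective, np.concatenate([ctrl0.ravel(), t0]),
                   method="L-BFGS-B")
    ctrl = res.x[:2 * n_ctrl].reshape(n_ctrl, 2)
    return BSpline(knots, ctrl, degree), res.fun

# Fit points sampled from a quarter circle.
rng = np.random.default_rng(0)
angles = np.sort(rng.uniform(0, np.pi / 2, 40))
pts = np.c_[np.cos(angles), np.sin(angles)]
curve, err = fit_bspline(pts)
print(f"final sum of squared residuals: {err:.2e}")
```

    Note that every L-BFGS iteration here touches only the objective, whereas the traditional alternation would solve a linear system and run one projection per data point.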

    Reconstruction of moving surfaces of revolution from sparse 3-D measurements using a stereo camera and structured light

    The aim of this work is the development and analysis of algorithmic methods for reconstructing a parametric surface model of a rotationally symmetric object from a sequence of sparse 3-D point clouds. A novel measurement system with a large field of view is employed, which can also be used under difficult conditions. While the sequence is being acquired, the object under measurement may undergo a motion that can be formulated as an analytical model. The procedure is developed and analysed on a practical application: recovering the surface of a wheel. It is shown that the accuracy achievable by fitting a simple model to each individual measurement can be improved considerably by fitting a global model that incorporates all individual measurements simultaneously and accounts for a suitable motion model. The three-dimensional point data are acquired with a stereo camera system combined with active illumination in the form of a dot pattern. A relatively high point density across the entire field of view of the stereo camera system is achieved by combining several laser projectors into a single projection unit. Through exact calibration of the camera system and the projection unit, and by exploiting trifocal geometric constraints, high accuracy of the three-dimensional point data is achieved despite the large scatter of the laser points in the camera image.
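    The central claim above, that one global fit over all measurements beats fitting each sparse measurement separately, can be illustrated with a toy cross-section example: a surface of revolution reduces to a circle in a cross-sectional plane. The Kåsa algebraic circle fit, the number of views, and the noise level are illustrative assumptions, and the motion/registration model is omitted for brevity.

```python
# Toy comparison: fitting a circle (one cross-section of a surface of
# revolution) to each sparse view separately versus one global fit that
# pools all views.
import numpy as np

def fit_circle(pts):
    # Kasa algebraic circle fit: x^2 + y^2 + a*x + b*y + c = 0.
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    rhs = -(pts[:, 0] ** 2 + pts[:, 1] ** 2)
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = -a / 2, -b / 2
    return cx, cy, np.sqrt(cx ** 2 + cy ** 2 - c)

rng = np.random.default_rng(1)
true_r = 0.35                      # "wheel" radius in metres (illustrative)
views = []
for _ in range(8):                 # 8 sparse, noisy views of the same circle
    th = rng.uniform(0, 2 * np.pi, 12)
    noise = rng.normal(0, 0.01, (12, 2))
    views.append(np.c_[true_r * np.cos(th), true_r * np.sin(th)] + noise)

per_view_err = np.mean([abs(fit_circle(v)[2] - true_r) for v in views])
global_err = abs(fit_circle(np.vstack(views))[2] - true_r)
print(f"mean per-view radius error: {per_view_err:.4f}")
print(f"global-fit radius error:    {global_err:.4f}")
```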

    A Universal Parametrization in B-Spline Curve and Surface Interpolation and Its Performance Evaluation.

    The choice of a proper parametrization method is critical in curve and surface fitting with parametric B-splines. Conventional parametrization methods do not work well, partly because they are based only on geometric properties of the given data points, such as the distances between consecutive data points and the angles between consecutive line segments. The resulting interpolation curves don't look natural, and they are often not affine invariant. Conventional parametrization methods also do not work well for odd orders k. Moreover, if a data point is altered, the effect is not limited locally at all with these methods; this localness property with respect to data points is critical in interactive modeling. We present a new parametrization based on the nature of the B-spline basis functions: it assigns to each data point the parameter value at which the corresponding basis function N_{i,k}(t) is maximal. The new method overcomes all four problems mentioned above: (1) it works well for all orders k; (2) it generates affine invariant curves; (3) the resulting curves generally look more natural; and (4) it has the semi-localness property with respect to data points. The new method is also computationally more efficient, and the resulting curve exhibits more regular curvature behavior. Fairness evaluation and knot removal are performed on curves obtained from various parametrizations, and the results also show that the new parametrization is superior. Fairness is evaluated in terms of total curvature, total length, and the curvature plot. The curvature plots of curves obtained with the new parametrization look natural, and knot removal applied to these curves yields curves very close to the originals. A more efficient and effective method for knot removal in B-spline curves is also presented: unlike other methods, which use local norms, it employs a global norm for approximation, and a geometric view makes the computation more efficient.
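    The proposed parameter assignment, where data point i receives the parameter at which its basis function N_{i,k}(t) peaks, can be sketched numerically. The clamped uniform knot vector and the SciPy-based maximization below are illustrative assumptions.

```python
# Sketch of the proposed parametrization: data point i receives the
# parameter value at which the basis function N_{i,k}(t) is maximal.
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize_scalar

def universal_parameters(n_points, degree=3):
    # Clamped uniform knot vector for n_points basis functions.
    knots = np.concatenate(([0.0] * degree,
                            np.linspace(0, 1, n_points - degree + 1),
                            [1.0] * degree))
    params = []
    for i in range(n_points):
        coeffs = np.zeros(n_points)
        coeffs[i] = 1.0                    # isolate the i-th basis function
        basis = BSpline(knots, coeffs, degree)
        lo, hi = knots[i], knots[i + degree + 1]   # support of N_{i,k}
        res = minimize_scalar(lambda t: -float(basis(t)),
                              bounds=(lo, hi), method="bounded")
        params.append(res.x)
    return np.array(params)

t = universal_parameters(7)
print(np.round(t, 3))    # monotonically increasing parameter values
```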

    Automated Morphology Analysis of Nanoparticles

    The functional properties of nanoparticles depend strongly on the surface morphology of the particles, so precise measurements of a particle's morphology enable reliable characterization of the nanoparticle's properties. Obtaining these measurements requires image analysis of electron micrographs of nanoparticles. Today's labor-intensive image analysis of electron micrographs is a significant bottleneck for efficient material characterization. The objective of this dissertation is to develop automated morphology analysis methods. Morphology analysis comprises three tasks: separating individual particles from an agglomerate of overlapping nano-objects (image segmentation); inferring a particle's missing contours (shape inference); and, ultimately, classifying the particles by shape based on their complete contours (shape classification). Two approaches are proposed in this dissertation: the divide-and-conquer approach and the convex shape analysis approach. The divide-and-conquer approach solves each task separately, taking less than one minute to complete the required analysis even for the largest micrograph. However, its ability to separate overlapping particles is limited: it can split only touching particles. The convex shape analysis approach solves shape inference and classification simultaneously for better accuracy, but it requires more computation time, up to ten minutes for the largest electron micrograph. At this modest cost in run time, the second approach achieves far superior separation and handles chain-linked structures of overlapping particles well. The capabilities of the two proposed methods cannot be matched by generic image processing and bio-imaging methods, owing to the unique features of electron micrographs of nanoparticles, including special particle overlap structures and the large number of particles to be processed. Applying the proposed methods to real electron micrographs showed that they extract morphology information more effectively than state-of-the-art methods. When nanoparticles have few overlaps, the divide-and-conquer approach performs adequately; when they have many overlaps, forming chain-linked clusters, the convex shape analysis approach performs much better than state-of-the-art alternatives in bio-imaging. The author believes that the proposed methods expedite the morphology characterization of nanoparticles, and further conjectures that their technical generality could make them a competent alternative to current methods for analyzing general overlapping convex-shaped objects beyond nanoparticles.

    Theory and applications of bijective barycentric mappings

    Barycentric coordinates provide a convenient way to represent a point inside a triangle as a convex combination of the triangle's vertices, and to linearly interpolate data given at these vertices. Owing to their favourable properties, they are commonly applied in geometric modelling, finite element methods, computer graphics, and many other fields. In some of these applications it is desirable to extend the concept of barycentric coordinates from triangles to polygons, and several variants of such generalized barycentric coordinates have been proposed in recent years. An important application of barycentric coordinates is the barycentric mapping, which naturally warps a source polygon to a corresponding target polygon or, more generally, creates mappings between closed curves or polyhedra. The principal practical application is image warping: a control polygon is drawn around an image, and the image is smoothly warped by moving the polygon's vertices. A required property of image warping is the avoidance of fold-overs in the resulting image. The fold-over problem is a manifestation of a larger problem: the lack of bijectivity of the barycentric mapping. Unfortunately, bijectivity of such barycentric mappings can be guaranteed only in the special case of warping between convex polygons, or by triangulating the domain and hence renouncing smoothness. In fact, for any barycentric coordinates it is always possible to construct a pair of polygons such that the barycentric mapping is not bijective. In the first part of this thesis we present three methods for achieving bijective mappings. The first is based on the intuition that, if two polygons are sufficiently close, then the mapping is close to the identity and hence bijective. This suggests ``splitting'' the mapping into several intermediate mappings and creating a composite barycentric mapping, which is guaranteed to be bijective between arbitrary polygons, polyhedra, or closed planar curves. We provide theoretical bounds on the bijectivity of the composite mapping in terms of the norm of the gradient of the coordinates; since the bound depends on the gradient, it exists only if the gradient of the coordinates is bounded. We focus on mean value coordinates and analyse the behaviour of their directional derivatives and gradient at the vertices of a polygon. Composing barycentric mappings for closed planar curves leads to the problem of blending between two planar curves. We propose solving it by linearly interpolating the signed curvature and then reconstructing the intermediate curve from the interpolated curvature values. However, when both input curves are closed, this strategy can lead to open intermediate curves. We present a new algorithm that solves this problem by finding the closed curve whose curvature is closest to the interpolated values. Our method relies on a suitable metric for measuring the distance between two planar curves and an appropriate discretization of the signed curvature functions. The second method constructs smooth bijective mappings with prescribed behaviour along the domain boundary by exploiting the properties of harmonic maps. These maps can be approximated in different ways, and we discuss their respective advantages and disadvantages. We further present a simple procedure for reducing their distortion and demonstrate the effectiveness of our approach with examples. The last method relies on a reformulation of complex barycentric mappings that allows us to modify the ``speed'' along the edges; we provide initial results and an optimization procedure that creates complex bijective maps.
    In the second part we present two main applications of bijective mappings. The first is in the context of finite element simulations, where the discretization of the computational domain plays a central role. In the standard discretization, the domain is triangulated with a mesh and its boundary is approximated by a polygon. We present an approach that combines parametric finite elements with smooth bijective mappings, leaving the choice of approximation spaces free. This approach can represent arbitrarily complex geometries on coarse meshes with curved edges, regardless of the complexity of the domain boundary. The main idea is to use a bijective mapping to automatically warp the volume of a simple parametrization domain to the complex computational domain, thus creating a curved mesh of the latter. The second application addresses the meshing problem and the possibility of running finite element simulations on polygonal meshes; in this context we present several methods for discretizing the bijective mapping to create polygonal and piecewise polynomial meshes.
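    As a concrete instance of the generalized barycentric coordinates discussed above, the sketch below implements Floater's mean value coordinates for a point inside a polygon and checks the two defining properties, partition of unity and linear reproduction. The polygon and query point are illustrative.

```python
# Floater's mean value coordinates for a point strictly inside a simple
# polygon (counter-clockwise vertices), followed by a check of the two
# defining properties: partition of unity and linear reproduction.
import numpy as np

def mean_value_coords(poly, x):
    d = poly - x                             # spokes from x to each vertex
    r = np.linalg.norm(d, axis=1)
    n = len(poly)
    t = np.zeros(n)                          # tan(alpha_i / 2) for each edge
    for i in range(n):
        j = (i + 1) % n
        cross = d[i, 0] * d[j, 1] - d[i, 1] * d[j, 0]
        dot = d[i] @ d[j]
        t[i] = (r[i] * r[j] - dot) / cross   # half-angle tangent identity
    w = np.array([(t[i - 1] + t[i]) / r[i] for i in range(n)])
    return w / w.sum()                       # normalize to sum to 1

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
x = np.array([0.3, 0.6])
lam = mean_value_coords(square, x)
print(np.round(lam, 4))        # barycentric coordinates of x
print(lam @ square)            # linear reproduction: recovers x itself
```

    A barycentric mapping then simply replaces the source vertices in `lam @ square` with the target polygon's vertices; as the abstract notes, bijectivity of that warp is not guaranteed for non-convex targets.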

    Fairing-PIA: Progressive iterative approximation for fairing curve and surface generation

    Fairing curves and surfaces are used extensively in geometric design, modeling, and industrial manufacturing. However, most conventional fairing approaches are based on energy minimization problems and lack sufficient parameters to improve fairness. In this study, we develop a novel progressive-iterative approximation method for fairing curve and surface generation (fairing-PIA). Fairing-PIA is an iterative method that generates a series of curves (surfaces) by adjusting the control points of B-spline curves (surfaces). In fairing-PIA, each control point is endowed with an individual weight, so fairing-PIA has many parameters with which to optimize the shapes of curves and surfaces. Not only can a fairing curve (surface) be generated globally through fairing-PIA, but the curve (surface) can also be improved locally. Moreover, we prove the convergence of fairing-PIA and show that the conventional energy-minimization fairing model is a special case of fairing-PIA. Finally, numerical examples indicate that the proposed method is effective and efficient.
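    The base iteration that fairing-PIA extends can be sketched as classical progressive-iterative approximation: each control point is repeatedly displaced by the residual between its data point and the current curve. The per-point weights and fairing terms of the paper are omitted here, and the data, knot vector, and parameter choices are illustrative.

```python
# Classical progressive-iterative approximation (PIA), the base iteration
# that fairing-PIA extends: each control point is displaced by the residual
# between its data point and the current curve.
import numpy as np
from scipy.interpolate import BSpline

degree = 3
data = np.array([[0, 0], [1, 2], [2, 3], [3, 3], [4, 1], [5, 0]], float)
n = len(data)
knots = np.concatenate(([0.0] * degree,
                        np.linspace(0, 1, n - degree + 1),
                        [1.0] * degree))
t = np.linspace(0, 1, n)          # one parameter value per data point

ctrl = data.copy()                # PIA starts from the data points themselves
for _ in range(100):
    residual = data - BSpline(knots, ctrl, degree)(t)
    ctrl = ctrl + residual        # move each control point by its residual

err = np.max(np.linalg.norm(BSpline(knots, ctrl, degree)(t) - data, axis=1))
print(f"max interpolation error after 100 iterations: {err:.2e}")
```

    Fairing-PIA's contribution is to weight these updates per control point and fold a fairness term into the iteration, which this plain version does not show.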

    Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age

    Simultaneous Localization and Mapping (SLAM) consists of the concurrent construction of a model of the environment (the map) and the estimation of the state of the robot moving within it. The SLAM community has made astonishing progress over the last 30 years, enabling large-scale real-world applications and witnessing a steady transition of this technology to industry. We survey the current state of SLAM. We start by presenting what is now the de-facto standard formulation for SLAM. We then review related work, covering a broad set of topics including robustness and scalability in long-term mapping, metric and semantic representations for mapping, theoretical performance guarantees, active SLAM and exploration, and other new frontiers. This paper serves simultaneously as a position paper and as a tutorial for users of SLAM. By looking at the published research with a critical eye, we delineate open challenges and new research issues that still deserve careful scientific investigation. The paper also contains the authors' take on two questions that often animate discussions at robotics conferences: do robots need SLAM, and is SLAM solved?
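    The de-facto standard formulation mentioned above poses SLAM as maximum-a-posteriori estimation over a factor graph. A toy one-dimensional pose graph, three odometry factors plus one loop closure, reduces to a small linear least-squares problem; all measurements below are invented for illustration.

```python
# Toy MAP/pose-graph formulation: three 1-D odometry factors and one
# loop-closure factor over poses x0..x3, with x0 fixed at 0 as the gauge.
# With Gaussian noise this MAP problem is linear least squares.
import numpy as np

# Each row is one factor acting on the unknowns (x1, x2, x3).
A = np.array([
    [ 1,  0,  0],    # odometry: x1 - x0 = 1.00
    [-1,  1,  0],    # odometry: x2 - x1 = 1.10
    [ 0, -1,  1],    # odometry: x3 - x2 = 0.90
    [ 0,  0,  1],    # loop closure: x3 - x0 = 3.15
], float)
b = np.array([1.00, 1.10, 0.90, 3.15])

x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))    # MAP estimate spreads the loop-closure correction
```

    Real SLAM back-ends solve the same structure nonlinearly over SE(2)/SE(3) poses and landmarks, typically with sparse Gauss-Newton or Levenberg-Marquardt.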