
    On Point Spread Function modelling: towards optimal interpolation

    Point spread function (PSF) modelling is a central part of any astronomy data analysis relying on measuring the shapes of objects. It is especially crucial for weak gravitational lensing, in order to beat down systematics and allow one to reach the full potential of weak lensing in measuring dark energy. A PSF modelling pipeline is made of two main steps: the first one is to assess its shape on stars, and the second is to interpolate it at any desired position (usually galaxies). We focus on the second part, and compare different interpolation schemes, including polynomial interpolation, radial basis functions, Delaunay triangulation and Kriging. For that purpose, we develop simulations of PSF fields, in which stars are built from a set of basis functions defined from a principal components analysis of a real ground-based image. We find that Kriging gives the most reliable interpolation, significantly better than the traditionally used polynomial interpolation. We also note that although a Kriging interpolation on individual images is enough to control systematics at the level necessary for current weak lensing surveys, more elaborate techniques will have to be developed to reach future ambitious surveys' requirements. Comment: Accepted for publication in MNRAS
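    As a rough illustration of the interpolation scheme the abstract favours, the following is a minimal pure-NumPy sketch of ordinary kriging with a Gaussian covariance model. The covariance form, its hyperparameters, and all function names here are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def ordinary_kriging(xy_obs, z_obs, xy_new, length_scale=1.0, sigma2=1.0):
    """Ordinary kriging with a Gaussian covariance model (a sketch).

    xy_obs : (n, 2) star positions; z_obs : (n,) PSF quantity measured
    at the stars; xy_new : (m, 2) positions (e.g. galaxies) at which to
    interpolate.  Returns the (m,) interpolated values.
    """
    def cov(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return sigma2 * np.exp(-0.5 * d2 / length_scale**2)

    n = len(xy_obs)
    # Ordinary-kriging system: covariances bordered by a Lagrange row
    # that enforces the unbiasedness constraint (weights sum to one).
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = cov(xy_obs, xy_obs)
    K[n, n] = 0.0
    k = np.ones((n + 1, len(xy_new)))
    k[:n] = cov(xy_obs, xy_new)
    w = np.linalg.solve(K, k)      # kriging weights (+ Lagrange multiplier)
    return w[:n].T @ z_obs
```

    Like any kriging predictor, this sketch is an exact interpolator: evaluated at a star position it reproduces the measured value there.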

    A Blue-Noise Point Sampling Algorithm Based on Centroidal Delaunay Triangulation

    To generate sampling distributions with high-quality blue-noise properties, a point sampling algorithm based on centroidal Delaunay triangulation is proposed. The algorithm combines Delaunay triangulation with the centroids of each point's 1-ring neighbourhood of triangles: it iteratively moves every sample point to the centroid of its 1-ring triangles and updates the topological connectivity between the sample points, with the centroids computed from a given density function. Experimental results show that the algorithm has advantages in both running efficiency and robustness. Supported by the National Natural Science Foundation of China (61472332) and the Natural Science Foundation of Fujian Province (2018J01104)
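    One iteration of the centroid-relaxation idea described above can be sketched with SciPy's Delaunay triangulation. This assumes a uniform density (the paper weights centroids by a user-supplied density function) and keeps convex-hull points fixed, a boundary treatment the abstract does not specify.

```python
import numpy as np
from scipy.spatial import Delaunay

def centroidal_delaunay_step(points):
    """One relaxation step: move each interior point to the area-weighted
    average of the centroids of its incident (1-ring) triangles.  The
    caller re-triangulates between iterations, which updates the
    topological connectivity."""
    tri = Delaunay(points)
    simplices = tri.simplices                  # (t, 3) vertex indices
    verts = points[simplices]                  # (t, 3, 2) coordinates
    cents = verts.mean(axis=1)                 # triangle centroids
    e1 = verts[:, 1] - verts[:, 0]
    e2 = verts[:, 2] - verts[:, 0]
    areas = 0.5 * np.abs(e1[:, 0] * e2[:, 1] - e1[:, 1] * e2[:, 0])
    new_pts = points.copy()
    hull = set(tri.convex_hull.ravel())        # keep boundary fixed
    for i in range(len(points)):
        if i in hull:
            continue
        mask = (simplices == i).any(axis=1)    # 1-ring triangles of i
        w = areas[mask]
        new_pts[i] = (w[:, None] * cents[mask]).sum(0) / w.sum()
    return new_pts
```

    Iterating this step is closely related to Lloyd relaxation, but with triangle centroids instead of Voronoi-cell centroids.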

    Contact based void partitioning to assess filtration properties in DEM simulations

    Discrete element method (DEM) simulations model the behaviour of a granular material by explicitly considering the individual particles. In principle, DEM analyses then provide a means to relate particle-scale mechanisms with the overall, macro-scale response. However, interpretative algorithms must be applied to gain useful scientific insight from the very large amount of data available from DEM simulations. The particle and contact coordinates as well as the contact orientations can be directly obtained from a DEM simulation, and the application of measures such as the coordination number and the fabric tensor to describe these data is now well-established. However, a granular material has two phases, and a full description of the material also requires consideration of the voids. Quantitative analysis of the void space can give further insight into directional fabric and is also useful in assessing the filtration characteristics of a granular material. The void topology is not directly given by the DEM simulation data; rather, it must be inferred from the geometry of the particle phase. The current study considers the use of the contact coordinates to partition the void space for 3D DEM simulation datasets and to define individual voids as well as the boundaries or constrictions between the voids. The measured constriction sizes are comparable to those calculated using Delaunay-triangulation based methods, and the contact-based method has the advantage of being less subjective. In an example application, the method was applied to DEM models of reservoir sandstones to establish the relationship between particle and constriction sizes as well as the relationship between the void topology and the coordination number, and the evolution of these properties during shearing.
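    A classical baseline for the constriction size between three mutually touching particles (the kind of local configuration both the contact-based and Delaunay-based partitions resolve) is the radius of the small circle internally tangent to all three, given by Descartes' circle theorem. This is only a textbook estimate for the simplest contact configuration, not the paper's partitioning method.

```python
import math

def constriction_radius(r1, r2, r3):
    """Radius of the circle internally tangent to three mutually touching
    circles of radii r1, r2, r3 (Descartes' circle theorem) -- a simple
    estimate of the constriction between three particles in mutual
    contact, viewed in the plane through the contact points."""
    k1, k2, k3 = 1 / r1, 1 / r2, 1 / r3
    k4 = k1 + k2 + k3 + 2 * math.sqrt(k1 * k2 + k2 * k3 + k3 * k1)
    return 1 / k4
```

    For three equal spheres this gives the familiar constriction-to-particle radius ratio of about 0.155.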

    Gap Processing for Adaptive Maximal Poisson-Disk Sampling

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii. First, we present a geometric analysis of gaps in such disk sets. This analysis is the basis for maximal and adaptive sampling in Euclidean space and on manifolds. Second, we propose efficient algorithms and data structures to detect gaps and update them when disks are inserted, deleted, moved, or have their radius changed. We build on the concepts of the regular triangulation and the power diagram. Third, we show how our analysis contributes to the state of the art in surface remeshing. Comment: 16 pages. ACM Transactions on Graphics, 201
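    For contrast with the paper's gap-processing approach, the simplest way to produce a (non-maximal) varying-radius Poisson-disk set is plain dart throwing. This baseline is not the paper's algorithm: it offers no maximality guarantee and uses a simple non-overlap conflict rule, both of which are assumptions made here for illustration.

```python
import random

def dart_throwing(radius_fn, n_darts=2000, seed=0):
    """Naive varying-radius Poisson-disk sampling in the unit square by
    rejection (dart throwing).  radius_fn(x, y) -> local disk radius.
    Stops after n_darts throws, so gaps may remain -- exactly the
    deficiency the paper's gap detection is designed to fix."""
    rng = random.Random(seed)
    samples = []                                   # (x, y, r) triples
    for _ in range(n_darts):
        x, y = rng.random(), rng.random()
        r = radius_fn(x, y)
        # accept only if the new disk overlaps no accepted disk
        ok = all((x - sx) ** 2 + (y - sy) ** 2 >= (r + sr) ** 2
                 for sx, sy, sr in samples)
        if ok:
            samples.append((x, y, r))
    return samples
```

    The power diagram of the accepted disks is what lets the paper find the remaining uncovered gaps and fill them deterministically.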

    A Systematic Review of Algorithms with Linear-time Behaviour to Generate Delaunay and Voronoi Tessellations

    Triangulations and tetrahedrizations are important geometrical discretization procedures applied to several areas, such as the reconstruction of surfaces and data visualization. Delaunay and Voronoi tessellations are discretization structures of domains with desirable geometrical properties. In this work, a systematic review of algorithms with linear-time behaviour to generate 2D/3D Delaunay and/or Voronoi tessellations is presented
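    The two tessellations surveyed above are dual structures, which a few lines of SciPy make concrete. Note that SciPy's implementation (Qhull) is not one of the linear-time algorithms the review covers; it is used here only to illustrate the duality.

```python
import numpy as np
from scipy.spatial import Delaunay, Voronoi

rng = np.random.default_rng(42)
pts = rng.random((30, 2))

tri = Delaunay(pts)     # tri.simplices: (t, 3) triangle vertex indices
vor = Voronoi(pts)      # vor.vertices, vor.regions, vor.ridge_points

# Collect the Delaunay edges as index pairs.
edges = set()
for a, b, c in tri.simplices:
    for u, v in ((a, b), (b, c), (a, c)):
        edges.add((min(u, v), max(u, v)))

# Duality: every Voronoi ridge separates a pair of input points that
# are joined by a Delaunay edge.
ridge_pairs = {(min(u, v), max(u, v)) for u, v in vor.ridge_points}
assert ridge_pairs <= edges
```

    This duality is why an algorithm generating one tessellation in linear time immediately yields the other.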

    Localization in Unstructured Environments: Towards Autonomous Robots in Forests with Delaunay Triangulation

    Autonomous harvesting and transportation is a long-term goal of the forest industry. One of the main challenges is the accurate localization of both vehicles and trees in a forest. Forests are unstructured environments where it is difficult to find a group of significant landmarks for current fast feature-based place recognition algorithms. This paper proposes a novel approach in which local observations are matched to a general tree map using the Delaunay triangulation as the representation format. Instead of point-cloud-based matching methods, we utilize a topology-based method. First, tree trunk positions are registered during a prior run done by a forest harvester. Second, the resulting map is Delaunay-triangulated. Third, a local submap of the autonomous robot is registered, triangulated and matched using triangular similarity maximization to estimate the position of the robot. We test our method on a dataset accumulated from a forestry site at Lieksa, Finland. A total length of 2100 m of harvester path was recorded by an industrial harvester with a 3D laser scanner and a geolocation unit fixed to the frame. Our experiments show a 12 cm standard deviation in location accuracy, with real-time data processing for speeds not exceeding 0.5 m/s. The accuracy and speed limit are realistic during forest operations.
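    The appeal of matching triangles rather than raw points is that a triangle's side lengths are invariant to the rotation and translation between the robot's submap and the global map. The sketch below uses sorted side lengths as the triangle descriptor and a greedy threshold match; both choices are simplifying assumptions, not the paper's similarity-maximization procedure.

```python
import math

def side_descriptor(p, q, r):
    """Sorted side lengths of a triangle of tree trunks -- invariant to
    the rigid transform between a local submap and the global map."""
    return tuple(sorted((math.dist(p, q), math.dist(q, r), math.dist(r, p))))

def match_triangles(local_tris, map_tris, tol=1e-6):
    """Greedily match each local triangle to the first map triangle whose
    side descriptor agrees within tol.  Returns (local_idx, map_idx)
    pairs; matched pairs would then vote for a rigid transform."""
    matches = []
    for i, lt in enumerate(local_tris):
        dl = side_descriptor(*lt)
        for j, mt in enumerate(map_tris):
            dm = side_descriptor(*mt)
            if max(abs(a - b) for a, b in zip(dl, dm)) <= tol:
                matches.append((i, j))
                break
    return matches
```

    In practice one would aggregate many such matches and keep the transform consistent with the largest subset, since individual triangles can be ambiguous.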

    Modelling Accessibility to Public Transport Based on Spatial Movement Data

    The thesis serves three objectives: 1) exploration of biking distances at individual transit stations from trajectory and smart card data, 2) investigation of the transit catchment area to raise public awareness of transit accessibility at a general level, and 3) inspection of accessibility constrained by crowdedness at a fine-grained level.

    Ground states in the Many Interacting Worlds approach

    Recently the Many-Interacting-Worlds (MIW) approach to a quantum theory without wave functions was proposed. This approach leads quite naturally to numerical integrators of the Schrödinger equation. It has been suggested that such integrators may feature advantages over fixed-grid methods for higher numbers of degrees of freedom. However, as yet, little is known about concrete MIW models for more than one spatial dimension and/or more than one particle. In this work we develop the MIW approach further to treat arbitrary degrees of freedom, and provide a systematic study of a corresponding numerical implementation for computing one-particle ground and excited states in one dimension, and ground states in two spatial dimensions. With this step towards the treatment of higher degrees of freedom we hope to stimulate their further study. Comment: 16 pages, 8 figures

    Efficient Generating And Processing Of Large-Scale Unstructured Meshes

    Unstructured meshes are used in a variety of disciplines to represent simulations and experimental data. Scientists who want to increase the accuracy of simulations by increasing resolution must also increase the size of the resulting dataset. However, generating and processing extremely large unstructured meshes remains a barrier. Researchers have published many parallel Delaunay triangulation (DT) algorithms, often focusing on partitioning the initial mesh domain so that each rectangular partition can be triangulated in parallel. However, the common problems for this method are how to merge all triangulated partitions into a single domain-wide mesh, and the significant cost of communicating the sub-region borders. We devised a novel algorithm, Triangulation of Independent Partitions in Parallel (TIPP), to deal with very large DT problems without requiring inter-processor communication while still guaranteeing the Delaunay criteria. The core of the algorithm is to find a set of independent partitions such that the circumcircles of triangles in one partition do not enclose any vertex in other partitions; such partitions can therefore be triangulated in parallel without affecting each other. The result of mesh generation is a large unstructured mesh, comprising vertex index and vertex coordinate files, which introduces a new challenge: locality. Partitioning unstructured meshes to improve locality is a key part of our own approach: elements that were widely scattered in the original dataset are grouped together, speeding data access. To further improve unstructured mesh handling, we also describe a new approach, Direct Load, which mitigates the challenges of unstructured meshes by maximizing the proportion of useful data retrieved during each read from disk, which in turn reduces the total number of read operations, boosting performance.
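    The independence criterion at the heart of TIPP, as stated in the abstract, reduces to a circumcircle containment test. The 2D sketch below checks that criterion for a pair of partitions; the function names and the brute-force loop are illustrative assumptions, not the paper's implementation.

```python
import math

def circumcircle(a, b, c):
    """Circumcenter and circumradius of triangle abc in 2D."""
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def partitions_independent(tris_a, verts_b):
    """TIPP's independence criterion (sketch): partition A can be
    triangulated without communicating with partition B if no
    circumcircle of a triangle in A encloses a vertex of B."""
    for tri in tris_a:
        (ux, uy), r = circumcircle(*tri)
        for (x, y) in verts_b:
            if math.hypot(x - ux, y - uy) < r - 1e-12:
                return False
    return True
```

    When this predicate holds both ways for every pair of partitions, each one can be triangulated in parallel and the results concatenated without a merge step.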