49 research outputs found

    Volume MLS Ray Casting

    Monte Carlo simulations of random non-commutative geometries

    Random non-commutative geometries are introduced by integrating over the space of Dirac operators that form a spectral triple with a fixed algebra and Hilbert space. The cases with the simplest types of Clifford algebra are investigated using Monte Carlo simulations to compute the integrals. Various qualitatively different types of behaviour of these random Dirac operators are exhibited. Some features are explained in terms of the theory of random matrices but other phenomena remain mysterious. Some of the models with a quartic action of symmetry-breaking type display a phase transition. Close to the phase transition the spectrum of a typical Dirac operator shows manifold-like behaviour for the eigenvalues below a cut-off scale.
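
    The integrals here are over matrix ensembles, so the computational core is a Metropolis-style random walk in the space of Dirac operators. A minimal sketch of that idea, reduced to a single random Hermitian matrix with a quartic action rather than the spectral-triple construction used in the paper (matrix size, couplings and step size below are arbitrary placeholders):

        import numpy as np

        rng = np.random.default_rng(0)

        def action(D, g2=-2.0, g4=1.0):
            # Quartic matrix action S(D) = g2 Tr(D^2) + g4 Tr(D^4)
            D2 = D @ D
            return g2 * np.trace(D2).real + g4 * np.trace(D2 @ D2).real

        def metropolis(N=10, steps=20000, eps=0.05):
            # Random Hermitian matrix playing the role of a Dirac operator
            D = np.zeros((N, N), dtype=complex)
            S = action(D)
            spectra = []
            for t in range(steps):
                # Propose a small Hermitian perturbation
                A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
                dD = eps * (A + A.conj().T) / 2
                S_new = action(D + dD)
                if S_new < S or rng.random() < np.exp(S - S_new):
                    D, S = D + dD, S_new
                if t % 100 == 0:
                    spectra.append(np.linalg.eigvalsh(D))
            return np.concatenate(spectra)

        eigs = metropolis()
        print(eigs.min(), eigs.max())  # crude look at the sampled spectrum

    In such a toy model, histogramming the collected eigenvalues is how one would examine spectral behaviour below a cut-off scale.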

    Learning Graph-Convolutional Representations for Point Cloud Denoising

    Point clouds are an increasingly relevant data type but they are often corrupted by noise. We propose a deep neural network based on graph-convolutional layers that can elegantly deal with the permutation-invariance problem encountered by learning-based point cloud processing methods. The network is fully-convolutional and can build complex hierarchies of features by dynamically constructing neighborhood graphs from similarity among the high-dimensional feature representations of the points. When coupled with a loss promoting proximity to the ideal surface, the proposed approach significantly outperforms state-of-the-art methods on a variety of metrics. In particular, it is able to improve in terms of Chamfer measure and of quality of the surface normals that can be estimated from the denoised data. We also show that it is especially robust both at high noise levels and in the presence of structured noise such as the one encountered in real LiDAR scans. Comment: European Conference on Computer Vision (ECCV) 2020.
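
    A minimal NumPy sketch of the dynamic graph-convolution idea described above, assuming nothing about the actual architecture: neighborhood graphs are rebuilt from distances between high-dimensional features rather than fixed once from the input coordinates, and each layer aggregates edge features over those neighbors. The value of k, the layer widths and the random weights are placeholders; the real network, loss and training loop are not reproduced.

        import numpy as np

        def knn_graph(features, k=8):
            # k-nearest-neighbour indices computed from pairwise feature distances
            d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
            np.fill_diagonal(d2, np.inf)          # exclude self-loops
            return np.argsort(d2, axis=1)[:, :k]  # (n_points, k)

        def edge_conv(features, neighbors, W):
            # Simplified edge convolution: aggregate [x_i, x_j - x_i] over neighbours
            xi = features[:, None, :].repeat(neighbors.shape[1], axis=1)
            xj = features[neighbors]
            edge_feat = np.concatenate([xi, xj - xi], axis=-1)  # (n, k, 2f)
            return np.maximum(edge_feat @ W, 0).max(axis=1)     # ReLU + max over edges

        rng = np.random.default_rng(0)
        pts = rng.normal(size=(1024, 3))          # noisy point cloud stand-in
        W1 = rng.normal(scale=0.1, size=(6, 32))
        h = edge_conv(pts, knn_graph(pts), W1)    # first layer: graph from coordinates
        W2 = rng.normal(scale=0.1, size=(64, 32))
        h = edge_conv(h, knn_graph(h), W2)        # next layer: graph rebuilt in feature space
        print(h.shape)                            # (1024, 32)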

    Robust Poisson Surface Reconstruction

    We propose a method to reconstruct surfaces from oriented point clouds with non-uniform sampling and noise by formulating the problem as a convex minimization that reconstructs the indicator function of the surface’s interior. Compared to previous models, our reconstruction is robust to noise and outliers because it substitutes the least-squares fidelity term by a robust Huber penalty; this allows us to recover sharp corners and avoids the shrinking bias of least squares. We choose an implicit parametrization to reconstruct surfaces of unknown topology and close large gaps in the point cloud. For an efficient representation, we approximate the implicit function by a hierarchy of locally supported basis elements adapted to the geometry of the surface. Unlike ad-hoc bases over an octree, our hierarchical B-splines from isogeometric analysis locally adapt the mesh and degree of the splines during reconstruction. The hierarchical structure of the basis speeds up the minimization and efficiently represents clustered data. We also advocate for convex optimization, instead of isogeometric finite-element techniques, to efficiently solve the minimization and allow for non-differentiable functionals. Experiments show state-of-the-art performance within a more flexible framework.
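
    For reference, one common parameterization of the Huber penalty that could stand in for the robust fidelity term (the exact functional and threshold used in the paper may differ) is quadratic near zero and linear in the tails, which is why large outlier residuals pull on the reconstruction far less than under a least-squares term:

        import numpy as np

        def huber(r, delta=0.05):
            # Huber penalty: quadratic for small residuals, linear for large ones,
            # so outliers contribute far less than under a squared residual.
            a = np.abs(r)
            return np.where(a <= delta, 0.5 * r**2 / delta, a - 0.5 * delta)

        r = np.linspace(-1, 1, 5)
        print(huber(r))              # compare with the quadratic 0.5 * r**2 / delta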

    Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen’s system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system GiNaC with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA’s performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.
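
    As a minimal illustration of what the Linear Noise Approximation computes (a single birth-death reaction, nothing iNA-specific; the rate constants and system size below are arbitrary), the macroscopic rate equation and the LNA variance equation can be integrated side by side:

        import numpy as np

        # Birth-death process: production at rate k (per unit volume), linear decay at rate g.
        k, g, omega = 5.0, 1.0, 100.0   # arbitrary illustrative parameters
        dt, T = 1e-3, 10.0

        phi, sigma2 = 0.0, 0.0          # macroscopic concentration and LNA fluctuation variance
        for _ in range(int(T / dt)):
            J = -g                                          # Jacobian of the rate equation
            phi    += dt * (k - g * phi)                    # rate equation d(phi)/dt = k - g*phi
            sigma2 += dt * (2 * J * sigma2 + k + g * phi)   # Lyapunov equation of the LNA

        mean_n = omega * phi            # mean copy number
        var_n  = omega * sigma2         # LNA copy-number variance
        print(mean_n, var_n)            # both approach k/g * omega (Poissonian, as expected)

    For this linear example the approximation is exact and recovers the Poissonian result variance = mean; iNA automates the same expansion for networks specified in SBML.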

    Least Squares Subdivision Surfaces

    Deferred Splatting

    In recent years it has been shown that, above a certain complexity, points become the most efficient rendering primitives. Although the programmability of the latest graphics hardware allows efficient implementation of high quality surface splatting algorithms, their performance remains below that obtained with simpler point based rendering algorithms when they are used for scenes of high complexity. In this paper, our goal is to apply high quality point based rendering algorithms to complex scenes. For this purpose, we show how to take advantage of temporal coherency in a very accurate hardware accelerated point selection algorithm, allowing the expensive computations to be performed only on visible points. Our algorithm is based on a multi-pass hardware accelerated EWA splatting. It is also suitable for any rendering application since no pre-process is needed and no assumption is made on the data structure. In addition, we briefly discuss the association of our method with other existing culling techniques and optimization for particular applications.
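
    A CPU-side toy of the temporal-coherence idea only, not the multi-pass GPU pipeline of the paper: a cheap visibility pass keeps the closest point per pixel, that visible subset is cached and reused for the expensive high-quality pass over several frames, and the cache is refreshed periodically. The resolution, refresh interval and orthographic projection are placeholders.

        import numpy as np

        def visible_indices(points, res=64):
            # Cheap visibility pass: orthographic projection onto a res x res depth buffer,
            # keeping for every pixel only the closest point (stand-in for a GPU depth pass).
            xy = points[:, :2]
            lo, hi = xy.min(0), xy.max(0)
            pix = np.clip(((xy - lo) / (hi - lo + 1e-9) * (res - 1)).astype(int), 0, res - 1)
            cell = pix[:, 0] * res + pix[:, 1]
            order = np.argsort(points[:, 2])          # near-to-far
            first = {}
            for i in order:
                first.setdefault(cell[i], i)          # closest point wins the pixel
            return np.fromiter(first.values(), dtype=int)

        def render_frame(points, cache, frame, refresh_every=8):
            # Temporal coherence: refresh the visible set only every few frames and run
            # the expensive splatting (here just a placeholder) on that subset.
            if cache is None or frame % refresh_every == 0:
                cache = visible_indices(points)
            expensive_splat = points[cache]           # placeholder for high-quality EWA splatting
            return cache, expensive_splat

        rng = np.random.default_rng(0)
        pts = rng.normal(size=(100_000, 3))
        cache = None
        for frame in range(30):
            cache, visible = render_frame(pts, cache, frame)
        print(len(visible), "of", len(pts), "points sent to the expensive pass")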

    Computing Contour Trees for 2D Piecewise Polynomial Functions

    Contour trees are extensively used in scalar field analysis. The contour tree is a data structure that tracks the evolution of level set topology in a scalar field. Scalar fields are typically available as samples at vertices of a mesh and are linearly interpolated within each cell of the mesh. A more suitable way of representing scalar fields, especially when a smoother function needs to be modeled, is via higher order interpolants. We propose an algorithm to compute the contour tree for such functions. The algorithm computes a local structure by connecting critical points using a numerically stable monotone path tracing procedure. Such structures are computed for each cell and are stitched together to obtain the contour tree of the function. The algorithm is scalable to higher degree interpolants, whereas previous methods were restricted to quadratic or linear interpolants. The algorithm is intrinsically parallelizable and has potential applications to isosurface extraction.
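
    The monotone path tracing and cell-wise stitching above are specific to higher order interpolants, but the classical piecewise-linear baseline gives a sense of the data structure: a join tree can be built by sweeping vertices from high to low value with a union-find, and merging it with the analogous split tree yields the contour tree. A minimal sketch of that baseline, not of the paper's algorithm:

        def join_tree(values, edges):
            # Join tree of a piecewise-linear scalar field on a graph: sweep vertices
            # from high to low value; whenever the new vertex connects two previously
            # separate components, record an arc and union them.
            n = len(values)
            parent = list(range(n))
            lowest = list(range(n))   # per component root: most recently swept vertex

            def find(v):
                while parent[v] != v:
                    parent[v] = parent[parent[v]]
                    v = parent[v]
                return v

            adj = [[] for _ in range(n)]
            for a, b in edges:
                adj[a].append(b)
                adj[b].append(a)

            seen = [False] * n
            arcs = []
            for v in sorted(range(n), key=lambda i: -values[i]):
                seen[v] = True
                for w in adj[v]:
                    if not seen[w]:
                        continue
                    rw, rv = find(w), find(v)
                    if rw != rv:
                        arcs.append((lowest[rw], v))   # component containing w joins at v
                        parent[rw] = rv
                        lowest[rv] = v
                lowest[find(v)] = v
            return arcs

        # Tiny example: a path graph with two maxima (values 3 and 4) joining at value 1.
        print(join_tree([3.0, 1.0, 4.0], [(0, 1), (1, 2)]))   # [(0, 1), (2, 1)]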