TVL<sub>1</sub> Planarity Regularization for 3D Shape Approximation
The modern emergence of automation in many industries has given impetus to extensive research into mobile robotics. Novel perception technologies now enable cars to drive autonomously, tractors to till a field automatically, and underwater robots to construct pipelines. An essential requirement for both perception and autonomous navigation is the analysis of the 3D environment using sensors such as laser scanners or stereo cameras. 3D sensors generate a very large number of 3D data points when sampling object shapes within an environment, but crucially do not provide any intrinsic information about the environment in which the robots operate.
This work focuses on the fundamental task of 3D shape reconstruction and modelling from 3D point clouds. The novelty lies in the representation of surfaces by algebraic functions with limited support, which enables the extraction of smooth, consistent implicit shapes from noisy samples with heterogeneous density. Minimizing the total variation of the second differential degree makes it possible to enforce the planar surfaces which often occur in man-made environments. Applying the new technique means that less accurate, low-cost 3D sensors can be employed without sacrificing 3D shape reconstruction accuracy.
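The planarity-promoting effect of an L1 penalty on second differences can be illustrated with a small sketch (a toy 1D illustration, not the paper's implementation): a piecewise-linear "roof" and a smooth parabola of comparable size have similar penalty values, but the roof concentrates the penalty in a single nonzero second difference, and it is exactly this sparsity that L1 minimization promotes, favouring piecewise-planar reconstructions.

```python
import numpy as np

def tv2(x):
    # L1 norm of second-order finite differences: sum_i |x[i] - 2 x[i+1] + x[i+2]|
    return np.abs(np.diff(x, n=2)).sum()

t = np.linspace(0.0, 1.0, 101)
roof = np.minimum(t, 1.0 - t)   # piecewise linear: two "planar" pieces, one crease
parabola = t * (1.0 - t)        # smooth curve of comparable size

# Similar penalty values, but very different sparsity of the second differences:
print(tv2(roof), np.count_nonzero(np.abs(np.diff(roof, n=2)) > 1e-8))
print(tv2(parabola), np.count_nonzero(np.abs(np.diff(parabola, n=2)) > 1e-8))
```

The roof has exactly one nonzero second difference (at the crease), while the parabola has one at every interior sample; an L1 regularizer therefore drives smooth regions toward planar pieces.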
TVL<sub>1</sub> shape approximation from scattered 3D data
With the emergence of 3D sensors such as laser scanners and 3D reconstruction from cameras, large 3D point clouds can now be sampled from physical objects within a scene. The raw 3D samples delivered by these sensors, however, contain only a limited degree of information about the environment the objects exist in, which means that further high-level geometrical modelling is essential. In addition, issues like sparse data measurements, noise, missing samples due to occlusion, and the inherently huge datasets involved in such representations make this task extremely challenging. This paper addresses these issues by presenting a new 3D shape modelling framework for samples acquired from 3D sensors. Motivated by the success of nonlinear kernel-based approximation techniques in the statistics domain, existing methods using radial basis functions are applied to 3D object shape approximation. The task is framed as an optimization problem and is extended using non-smooth L1 total variation regularization. Appropriate convex energy functionals are constructed and solved by applying the Alternating Direction Method of Multipliers approach, which is then extended using Gauss-Seidel iterations. This significantly lowers the computational complexity involved in generating 3D shape from 3D samples, while both numerical and qualitative analysis confirms the superior shape modelling performance of this new framework compared with existing 3D shape reconstruction techniques.
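The optimization pipeline described above can be sketched in miniature. The fragment below is a toy 1D stand-in (all parameter values are assumptions chosen for illustration): Gaussian radial basis functions are fitted to noisy samples by ADMM with an L1 penalty on the coefficients. The paper's framework regularizes total variation and adds Gauss-Seidel iterations, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy 1D samples standing in for scattered sensor data.
t = np.linspace(0.0, 1.0, 60)
b = np.sin(2.0 * np.pi * t) + 0.05 * rng.standard_normal(t.size)

# Gaussian RBF design matrix: A[i, j] = phi_j(t_i).
centers = np.linspace(0.0, 1.0, 15)
A = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * 0.05 ** 2))

# ADMM for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
lam, rho = 0.05, 1.0
m = centers.size
x, z, u = np.zeros(m), np.zeros(m), np.zeros(m)
AtA, Atb = A.T @ A, A.T @ b
for _ in range(500):
    x = np.linalg.solve(AtA + rho * np.eye(m), Atb + rho * (z - u))  # x-update
    z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # soft-threshold
    u += x - z                                                       # dual update

fit = A @ z
```

The x-update is a small linear solve and the z-update a closed-form soft-threshold, which is what makes ADMM attractive for non-smooth L1 terms.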
A Knowledge Integration Framework for 3D Shape Reconstruction
The modern emergence of automation in many industries has given impetus to extensive research into mobile robotics. Novel perception technologies now enable cars to drive autonomously, tractors to till a field automatically, and underwater robots to construct pipelines. An essential requirement for both perception and autonomous navigation is the analysis of the 3D environment using sensors such as laser scanners or stereo cameras. 3D sensors generate a very large number of 3D data points when sampling object shapes within an environment, but crucially do not provide any intrinsic information about the environment in which the robots operate. This means unstructured 3D samples must be processed by application-specific models to enable a robot, for instance, to detect and identify objects and infer the scene geometry for path-planning more efficiently than by using raw 3D data. This thesis specifically focuses on the fundamental task of 3D shape reconstruction and modelling by presenting a new knowledge integration framework for unstructured 3D samples. The novelty lies in the representation of surfaces by algebraic functions with limited support, which enables the extraction of smooth, consistent shapes from noisy samples with heterogeneous density. Moreover, many surfaces in urban environments can reasonably be assumed to be planar, and the framework exploits this knowledge to enable effective noise suppression without loss of detail. This is achieved by using a convex optimization technique with linear computational complexity, which is thus much more efficient than existing solutions. The new framework has been validated by critical experimental analysis and evaluation and has been shown to increase the accuracy of the reconstructed shape significantly compared to state-of-the-art methods.
Applying this new knowledge integration framework means that less accurate, low-cost 3D sensors can be employed without compromising the high accuracy that 3D perception demands. This links well to the area of robotic inspection, for example small drones that carry lightweight but less accurate image sensors.
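The planarity assumption exploited above can be illustrated with a generic total-least-squares plane fit (a toy sketch with synthetic data, not the thesis's algorithm): the singular vector of least variance of the centred samples is the surface normal, and residual distances to the fitted plane stay at the noise level.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples from a hypothetical planar patch z = 0.5 x - 0.25 y + 2.
xy = rng.uniform(-1.0, 1.0, size=(500, 2))
z = 0.5 * xy[:, 0] - 0.25 * xy[:, 1] + 2.0 + 0.01 * rng.standard_normal(500)
pts = np.column_stack([xy, z])

# Total least-squares plane: centroid plus direction of least variance.
centroid = pts.mean(axis=0)
_, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
normal = vt[-1]

# Signed distance of each sample to the fitted plane.
dist = (pts - centroid) @ normal
```

Once planar patches are identified this way, noise perpendicular to the plane can be suppressed without smoothing away genuine creases between patches.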
Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches
Imaging spectrometers measure electromagnetic energy scattered in their
instantaneous field of view in hundreds or thousands of spectral channels with
higher spectral resolution than multispectral cameras. Imaging spectrometers
are therefore often referred to as hyperspectral cameras (HSCs). Higher
spectral resolution enables material identification via spectroscopic analysis,
which facilitates countless applications that require identifying materials in
scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial
resolution of HSCs, microscopic material mixing, and multiple scattering, the
spectra measured by HSCs are mixtures of the spectra of materials in a scene. Thus,
accurate estimation requires unmixing. Pixels are assumed to be mixtures of a
few materials, called endmembers. Unmixing involves estimating all or some of:
the number of endmembers, their spectral signatures, and their abundances at
each pixel. Unmixing is a challenging, ill-posed inverse problem because of
model inaccuracies, observation noise, environmental conditions, endmember
variability, and data set size. Researchers have devised and investigated many
models searching for robust, stable, tractable, and accurate unmixing
algorithms. This paper presents an overview of unmixing methods from the time
of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models
are first discussed. Signal-subspace, geometrical, statistical, sparsity-based,
and spatial-contextual unmixing algorithms are described. Mathematical problems
and potential solutions are described. Algorithm characteristics are
illustrated experimentally.
Comment: This work has been accepted for publication in the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing.
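The linear mixing model at the heart of many of the surveyed methods can be sketched as follows (synthetic endmembers and abundances, chosen purely for illustration; a practical unmixing algorithm would also enforce the nonnegativity and sum-to-one constraints on the abundances):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical endmember signatures: 3 materials, 8 spectral bands,
# modelled as Gaussian bumps centred on different bands.
bands = np.arange(8.0)
E = np.exp(-((bands[None, :] - np.array([[1.0], [4.0], [7.0]])) ** 2) / 2.0)

# True abundances for 100 pixels: nonnegative, summing to one per pixel.
a = rng.dirichlet([1.0, 1.0, 1.0], size=100)

# Linear mixing model: each pixel spectrum is an abundance-weighted sum
# of the endmember spectra plus observation noise.
pixels = a @ E + 0.001 * rng.standard_normal((100, 8))

# Unconstrained least-squares inversion as a minimal unmixing sketch.
a_hat = np.linalg.lstsq(E.T, pixels.T, rcond=None)[0].T
```

With known, well-conditioned endmembers the inversion is easy; the survey's hard cases arise when the endmembers themselves, their number, and their variability must be estimated from the data.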
Shape reconstruction from gradient data
We present a novel method for reconstructing the shape of an object from
measured gradient data. A certain class of optical sensors does not measure the
shape of an object, but its local slope. These sensors display several
advantages, including high information efficiency, sensitivity, and robustness.
For many applications, however, it is necessary to acquire the shape, which
must be calculated from the slopes by numerical integration. Existing
integration techniques show drawbacks that render them unusable in many cases.
Our method is based on approximation employing radial basis functions. It can
be applied to irregularly sampled, noisy, and incomplete data, and it
reconstructs surfaces both locally and globally with high accuracy.
Comment: 16 pages, 5 figures, zip-file, submitted to Applied Optics.
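A 1D version of RBF-based integration of slope data can be sketched as follows (all parameters are illustrative assumptions, not the paper's settings): the basis coefficients are fitted so that the analytic *derivatives* of the Gaussian RBFs match the measured slopes, after which evaluating the basis itself yields the shape, up to an integration constant.

```python
import numpy as np

rng = np.random.default_rng(3)

# A slope-measuring sensor observes f'(t), not the height f(t) itself.
t = np.linspace(0.0, 1.0, 80)
f_true = np.sin(2.0 * np.pi * t)
slopes = 2.0 * np.pi * np.cos(2.0 * np.pi * t) + 0.01 * rng.standard_normal(t.size)

# Gaussian RBF basis and its analytic derivative.
centers = np.linspace(0.0, 1.0, 20)
sig = 0.08
diff = t[:, None] - centers[None, :]
Phi = np.exp(-diff ** 2 / (2.0 * sig ** 2))
dPhi = -diff / sig ** 2 * Phi

# Least-squares fit of the derivative basis to the measured slopes.
coef = np.linalg.lstsq(dPhi, slopes, rcond=None)[0]
f_hat = Phi @ coef
f_hat += f_true[0] - f_hat[0]   # integration constant: anchored at one known point
```

Because the fit is a global least-squares approximation rather than a cumulative sum, noise in individual slope samples is not accumulated along an integration path.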
Principal component and Voronoi skeleton alternatives for curve reconstruction from noisy point sets
Surface reconstruction from noisy point samples must take into consideration the stochastic nature of the sample. In other words, geometric algorithms reconstructing the surface or curve should not insist on following each sampled point literally. Instead, they must interpret the sample as a "point cloud" and try to build the surface passing through the best possible (in the statistical sense) geometric locus that represents the sample. This work presents two new methods to find a piecewise linear approximation from a Nyquist-compliant stochastic sampling of a quasi-planar C1 curve C(u) : R → R3 whose velocity vector never vanishes. One of the methods articulates in an entirely new way Principal Component Analysis (statistical) and Voronoi-Delaunay (deterministic) approaches: it uses these two methods to calculate the best possible tape-shaped polygon covering the planarised point set, and then approximates the manifold by the medial axis of such a polygon. The other method applies Principal Component Analysis to find a direct piecewise linear approximation of C(u). A complexity comparison of these two methods is presented, along with a qualitative comparison with previously developed ones. It turns out that the method solely based on Principal Component Analysis is simpler and more robust for non-self-intersecting curves. For self-intersecting curves the Voronoi-Delaunay-based medial axis approach is more robust, at the price of higher computational complexity. An application is presented in the integration of meshes originating from range images of an art piece; this application reaches the point of complete reconstruction of a unified mesh.
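A minimal flavour of the PCA-only approach for a non-self-intersecting curve can be sketched as follows (synthetic data and parameters are illustrative assumptions): the noisy cloud is centred, its principal directions are found by SVD, and the samples are ordered along the first principal direction to form a piecewise linear approximation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy samples of a quasi-planar, non-self-intersecting curve in R^3.
u = np.linspace(0.0, 1.0, 50)
curve = np.column_stack([u, 0.15 * np.sin(2.0 * np.pi * u), np.zeros_like(u)])
pts = curve + 0.005 * rng.standard_normal(curve.shape)

# Principal Component Analysis of the centred cloud.
centred = pts - pts.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)

# Order the samples along the first principal direction and join them
# into a piecewise linear approximation of the curve.
order = np.argsort(centred @ vt[0])
polyline = pts[order]
```

This ordering is only reliable when the curve is monotone along its principal direction, which is why the abstract's medial-axis alternative is needed for self-intersecting curves.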
Image Reconstruction from Undersampled Confocal Microscopy Data using Multiresolution Based Maximum Entropy Regularization
We consider the problem of reconstructing 2D images from randomly
under-sampled confocal microscopy samples. The well known and widely celebrated
total variation regularization, which is the L1 norm of derivatives, turns out
to be unsuitable for this problem; it is unable to handle both noise and
under-sampling together. This issue is linked with the phase transition
phenomenon observed in compressive sensing research, which is essentially the
break-down of total variation methods when the sampling density falls below a
certain threshold. The severity of this breakdown is determined
by the so-called mutual incoherence between the derivative operators and
measurement operator. In our problem, the mutual incoherence is low, and hence
the total variation regularization gives serious artifacts in the presence of
noise even when the sampling density is not very low. There have been very few
attempts at developing regularization methods that perform better than total
variation regularization for this problem. We develop a multi-resolution based
regularization method that is adaptive to image structure. In our approach, the
desired reconstruction is formulated as a series of coarse-to-fine
multi-resolution reconstructions; for reconstruction at each level, the
regularization is constructed to be adaptive to the image structure, where the
information for adaptation is obtained from the reconstruction at the coarser
resolution level. This adaptation is achieved by using the maximum entropy
principle, where the required adaptive regularization is determined as the
maximizer of entropy subject to constraints given by the information extracted
from the coarse reconstruction. We demonstrate the superiority of the proposed
regularization method over existing ones using several reconstruction examples.
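For orientation, the total variation regularization under discussion can be sketched in its basic 1D denoising form (a generic textbook instance, not the paper's under-sampled microscopy setting): minimize ½‖x − y‖² + λ·TV(x) by projected gradient ascent on the dual variable associated with the differences.

```python
import numpy as np

rng = np.random.default_rng(5)

# Piecewise-constant signal plus noise.
clean = np.concatenate([np.zeros(30), np.ones(30)])
noisy = clean + 0.1 * rng.standard_normal(60)

def Dt(p):
    # Adjoint of the forward-difference operator np.diff.
    out = np.zeros(p.size + 1)
    out[:-1] -= p
    out[1:] += p
    return out

# Dual projected-gradient method for min_x 0.5*||x - y||^2 + lam * sum|diff(x)|:
# x = y - Dt(p), with the dual variable p confined to [-lam, lam].
lam, tau = 0.3, 0.25   # tau <= 1 / ||D||^2 ensures convergence
p = np.zeros(59)
for _ in range(300):
    p = np.clip(p + tau * np.diff(noisy - Dt(p)), -lam, lam)
denoised = noisy - Dt(p)
```

On fully sampled data like this, TV denoising recovers the piecewise-constant structure well; the paper's point is that it breaks down once random under-sampling is added, motivating the adaptive multi-resolution regularizer.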
Interpolation and scattered data fitting on manifolds using projected Powell–Sabin splines
We present methods for either interpolating data or for fitting scattered data on a two-dimensional smooth manifold. The methods are based on a local bivariate Powell–Sabin interpolation scheme, and make use of a family of charts {(Uξ, φξ)}, indexed by ξ, satisfying certain conditions of smooth dependence on ξ. If the manifold is a C2-manifold embedded into R3, then projections into tangent planes can be employed. The data fitting method is a two-stage method. We prove that the resulting function on the manifold is continuously differentiable, and establish error bounds for both methods for the case when the data are generated by a smooth function.
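The tangent-plane projection mentioned above can be sketched numerically with generic local PCA (synthetic sphere samples; the paper's charts and the Powell–Sabin construction itself are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(6)

# Samples of a C^2 manifold (the unit sphere) near the point xi = (0, 0, 1).
theta = rng.uniform(0.0, 0.25, 200)          # polar angle: stay in a small cap
phi = rng.uniform(0.0, 2.0 * np.pi, 200)
pts = np.column_stack([np.sin(theta) * np.cos(phi),
                       np.sin(theta) * np.sin(phi),
                       np.cos(theta)])

# Estimate the tangent plane at xi by PCA: the two leading principal
# directions span the plane, the third approximates the surface normal.
centred = pts - pts.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
normal = vt[-1]

# Chart coordinates: project the samples into the tangent plane.
coords = centred @ vt[:2].T
```

The projected 2D coordinates can then serve as the domain for a planar scheme such as Powell–Sabin interpolation, which is the role the tangent-plane charts play in the abstract.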