6 research outputs found

    Self-similar prior and wavelet bases for hidden incompressible turbulent motion

    This work is concerned with the ill-posed inverse problem of estimating turbulent flows from the observation of an image sequence. From a Bayesian perspective, a divergence-free isotropic fractional Brownian motion (fBm) is chosen as a prior model for instantaneous turbulent velocity fields. This self-similar prior accurately characterizes the second-order statistics of velocity fields in incompressible isotropic turbulence. Nevertheless, the associated maximum a posteriori estimate involves a fractional Laplacian operator which is delicate to implement in practice. To deal with this issue, we propose to decompose the divergence-free fBm over well-chosen wavelet bases. As a first alternative, we design wavelets that act as whitening filters; we show that these filters are fractional Laplacian wavelets composed with the Leray projector. As a second alternative, we use a divergence-free wavelet basis, which implicitly takes into account the incompressibility constraint arising from physics. Although the latter decomposition involves correlated wavelet coefficients, we are able to handle this dependence in practice. Based on these two wavelet decompositions, we finally provide effective and efficient algorithms to approach the maximum a posteriori. An intensive numerical evaluation proves the relevance of the proposed wavelet-based self-similar priors.
    Comment: SIAM Journal on Imaging Sciences, 201
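
    As a rough illustration of the spectral operators mentioned in this abstract (the Leray projector and the fractional Laplacian), here is a minimal numpy sketch on a periodic 2D grid. The function names and the exponent `alpha` are illustrative assumptions for this listing, not the authors' implementation.

```python
import numpy as np

def leray_project(u, v):
    """Project a 2D velocity field (u, v) onto its divergence-free part
    using the Leray projector, applied in Fourier space (periodic grid assumed)."""
    ny, nx = u.shape
    kx = np.fft.fftfreq(nx).reshape(1, nx)
    ky = np.fft.fftfreq(ny).reshape(ny, 1)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # avoid division by zero at the mean mode
    uh, vh = np.fft.fft2(u), np.fft.fft2(v)
    div = kx * uh + ky * vh              # divergence symbol (the i factors cancel below)
    uh -= kx * div / k2
    vh -= ky * div / k2
    return np.real(np.fft.ifft2(uh)), np.real(np.fft.ifft2(vh))

def fractional_laplacian(u, alpha):
    """Apply (-Laplacian)^(alpha/2) spectrally by multiplying by |k|^alpha
    (up to a 2*pi convention factor); for an fBm-like field an appropriate
    alpha makes this act as an approximate whitening filter (assumption)."""
    ny, nx = u.shape
    kx = np.fft.fftfreq(nx).reshape(1, nx)
    ky = np.fft.fftfreq(ny).reshape(ny, 1)
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0
    return np.real(np.fft.ifft2(np.fft.fft2(u) * k**alpha))
```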

    Spatially-variant kernel for optical flow under low signal-to-noise ratios: application to microscopy

    Local and global approaches can be identified as the two main classes of optical flow estimation methods. In this paper, we propose a framework that combines the advantages of these two principles, namely the robustness to noise of the local approach and the discontinuity preservation of the global approach. This is particularly crucial in biological imaging, where the noise produced by microscopes is one of the main issues for optical flow estimation. The idea is to spatially adapt the support of the local parametric constraint in the combined local-global model [6]. To this end, we jointly estimate the motion field and the parameters of the spatial support. We apply our approach to the case of Gaussian filtering, and we derive efficient minimization schemes for usual data terms. The estimation of a spatially varying standard-deviation map prevents the smoothing of motion discontinuities while ensuring robustness to noise. We validate our method for a standard model and demonstrate how a baseline approach with a pixel-wise data term can be improved when integrated in our framework. The method is evaluated on the Middlebury benchmark with ground truth and on real fluorescence microscopy data.
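
    To make the adaptive-support idea concrete, the following is a minimal sketch (not the paper's joint estimation scheme) of a Lucas-Kanade-style local solve in which the Gaussian smoothing of the structure tensor is selected per pixel from a small set of candidate scales according to a hypothetical `sigma_map`. It assumes numpy and scipy and a simple nearest-scale approximation of spatially variant filtering.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lk_flow_variable_window(I1, I2, sigma_map, sigmas=(1.0, 2.0, 4.0)):
    """Local optical flow with an approximately spatially varying Gaussian window:
    structure-tensor products are smoothed at a few fixed scales and, per pixel,
    the scale closest to sigma_map is selected (illustrative assumption)."""
    Iy, Ix = np.gradient(I1)                 # spatial derivatives
    It = I2 - I1                             # temporal derivative
    prods = [Ix * Ix, Iy * Iy, Ix * Iy, Ix * It, Iy * It]
    sig = np.asarray(sigmas)
    sel = np.abs(sig[:, None, None] - sigma_map[None]).argmin(0)
    smoothed = []
    for p in prods:
        stack = np.stack([gaussian_filter(p, s) for s in sigmas])
        smoothed.append(np.take_along_axis(stack, sel[None], axis=0)[0])
    Jxx, Jyy, Jxy, Jxt, Jyt = smoothed
    det = Jxx * Jyy - Jxy ** 2
    det = np.where(np.abs(det) < 1e-9, 1e-9, det)   # guard near-singular pixels
    u = (-Jyy * Jxt + Jxy * Jyt) / det
    v = ( Jxy * Jxt - Jxx * Jyt) / det
    return u, v
```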

    Bayesian Estimation of Turbulent Motion


    RGB-D Scene Flow via Grouping Rigid Motions

    Robotics and artificial intelligence have seen drastic advancements in technology and algorithms over the last decade. Computer vision algorithms play a crucial role in enabling robots and machines to understand their environment. A fundamental cue in understanding environments is analyzing the motions within the scene, otherwise known as scene flow. Scene flow estimates the 3D velocity of each imaged point captured by a camera. The 3D information of the scene can be acquired by RGB-D cameras, which produce both colour and depth images and have been proven useful for solving many computer vision tasks. Scene flow has numerous applications such as motion segmentation, 3D mapping, robotic navigation and obstacle avoidance, gesture recognition, etc. Most state-of-the-art RGB-D scene flow methods are set in a variational framework and formulated as an energy minimization problem. While these methods are able to provide high accuracy, they are computationally expensive and not robust under larger motions in the scene. The main contribution of this research is a method for efficiently estimating approximate RGB-D scene flow. A new approach to scene flow estimation is introduced, based on matching 3D points from one frame to the next in a hierarchical fashion. One key observation is that most scene motions in everyday life consist of rigid motions, so large parts of the scene follow the same motion. The new method takes advantage of this fact by grouping the 3D data in each frame according to like motions using concepts from spectral clustering. A simple coarse-to-fine voxelization scheme provides fast motion estimates and accommodates larger motions. This is a much more tractable approach than existing methods and does not depend on the convergence of an objective function in an optimization framework. By assuming the scene is composed of rigidly moving parts, non-rigid motions are not accurately estimated, hence the method is an approximate scene flow estimation. Still, quickly determining approximate motions in a scene is tremendously useful for any computer vision task that benefits from motion cues. Evaluation is performed on a custom RGB-D dataset, because existing RGB-D scene flow datasets presented to date are mostly based on qualitative evaluation. The dataset consists of real scenes that demonstrate realistic scene flow. Experimental results show that the presented method provides reliable scene flow estimates at significantly faster runtimes and handles larger motions better than current methods.
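
    The core computation behind grouping like motions is fitting a rigid transform to matched 3D points. The following is a minimal Kabsch/Procrustes sketch in numpy, an assumed building block for illustration rather than the thesis' full grouping pipeline; points whose residuals under a candidate (R, t) are small could then be grouped as one rigid motion.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (R, t) mapping matched 3D points P onto Q.
    P, Q: (N, 3) arrays of corresponding points."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t
```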

    Adaptive Sampling in Particle Image Velocimetry


    Variational Adaptive Correlation Method for Flow Estimation

    A variational approach is presented for estimating turbulent fluid flow from particle image sequences in experimental fluid mechanics. The approach comprises two coupled optimizations: one adapts the size and shape of a Gaussian correlation window at each location, and the other estimates the flow. The method copes with a wide range of particle densities and image noise levels without any data-specific parameter tuning. Based on a careful implementation of a multiscale nonlinear optimization technique, we demonstrate robustness of the solution over typical experimental scenarios and the highest estimation accuracy on an international benchmark data set (PIV Challenge).
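
    As an illustration of the correlation step that such window adaptation acts on, here is a minimal numpy sketch of Gaussian-weighted, FFT-based cross-correlation between two square interrogation windows. The isotropic Gaussian weight and the integer-pixel peak are simplifying assumptions; the paper adapts both window size and shape within a variational scheme.

```python
import numpy as np

def gaussian_window(size, sigma):
    """Isotropic Gaussian weight for a square interrogation window."""
    r = np.arange(size) - (size - 1) / 2.0
    g = np.exp(-0.5 * (r / sigma) ** 2)
    return np.outer(g, g)

def piv_displacement(win1, win2, sigma):
    """Integer-pixel displacement between two equal-sized square windows via
    Gaussian-weighted circular cross-correlation (FFT-based). Returns (dx, dy)."""
    w = gaussian_window(win1.shape[0], sigma)
    a = (win1 - win1.mean()) * w
    b = (win2 - win2.mean()) * w
    corr = np.fft.fftshift(np.real(np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b))))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return peak[1] - corr.shape[1] // 2, peak[0] - corr.shape[0] // 2
```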