
    Machine learning for flow field measurements: a perspective

    Advancements in machine-learning (ML) techniques are driving a paradigm shift in image processing, and flow diagnostics with optical techniques is no exception. Considering the existing and foreseeable disruptive developments in flow field measurement techniques, we elaborate this perspective with a particular focus on the field of particle image velocimetry. The driving forces behind recent advancements in ML methods for flow field measurements are reviewed in terms of image preprocessing, data treatment and conditioning. Finally, possible routes for further developments are highlighted. Stefano Discetti acknowledges funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 949085). Yingzheng Liu acknowledges financial support from the National Natural Science Foundation of China (11725209).

    Advances in Motion Estimators for Applications in Computer Vision

    Motion estimation is a core task in computer vision, and many applications use optical flow methods as fundamental tools to analyze motion in images and videos. Optical flow is the apparent motion of objects in image sequences that results from relative motion between the objects and the imaging perspective. Today, optical flow fields are used to solve problems in areas such as object detection and tracking, interpolation, and visual odometry. In this dissertation, three problems from different areas of computer vision, and solutions that make use of modified optical flow methods, are presented. The contributions are approaches and frameworks that introduce (i) a new optical flow-based interpolation method to achieve minimally divergent velocimetry data, (ii) a framework that improves the accuracy of change detection algorithms in synthetic aperture radar (SAR) images, and (iii) a set of new methods to integrate proton magnetic resonance spectroscopy (1H-MRSI) data into three-dimensional (3D) neuronavigation systems for tumor biopsies. In the first application, an optical flow-based approach for the interpolation of minimally divergent velocimetry data is proposed. The velocimetry data of incompressible fluids contain signals that describe the flow velocity, and the approach uses this additional velocity information to guide the interpolation process towards reduced divergence in the interpolated data. In the second application, a framework consisting mainly of optical flow methods together with other image processing and computer vision techniques is proposed to improve object extraction from synthetic aperture radar images. The framework distinguishes between actual motion and motion detected due to misregistration in SAR image sets, leading to more accurate and meaningful change detection and improved object extraction from SAR datasets.
In the third application, a set of new methods that aim to improve upon the current state of the art in neuronavigation through the use of detailed three-dimensional (3D) 1H-MRSI data is proposed. The result is a progressive form of online MRSI-guided neuronavigation that is demonstrated through phantom validation and clinical application.
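The divergence criterion that drives the first application can be checked directly on a gridded velocity field. The sketch below (a generic numpy illustration, not the dissertation's code) computes the discrete divergence that a minimally divergent interpolation seeks to keep small; for incompressible flow the true divergence is zero:

```python
import numpy as np

def divergence_2d(u, v, dx=1.0, dy=1.0):
    """Discrete divergence of a 2D velocity field (u, v).

    For incompressible flow the physical divergence vanishes, so the
    magnitude of this quantity serves as an interpolation-error proxy.
    """
    du_dx = np.gradient(u, dx, axis=1)  # d(u)/dx, columns are x
    dv_dy = np.gradient(v, dy, axis=0)  # d(v)/dy, rows are y
    return du_dx + dv_dy

# A solenoidal (divergence-free) test field: u = y, v = -x.
y, x = np.mgrid[0:32, 0:32].astype(float)
u, v = y, -x
div = divergence_2d(u, v)
```

An interpolator guided by this quantity would penalize candidate fields whose `div` grows large between the measured vectors.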

    Variational Correlation and Decomposition Methods for Particle Image Velocimetry

    Particle Image Velocimetry (PIV) is a non-intrusive optical measurement technique for industrial fluid flow problems. Small particles are introduced into liquids or gases and act as indicators for the movement of the investigated substance around obstacles or in regions where fluids mix. In the two-dimensional variant of the PIV method, a thin plane is illuminated by laser light, rendering the particles therein visible. A high-speed camera system records an image sequence of the highlighted area, and the analysis of this data makes it possible to determine the movement of the particles and thereby to measure the speed, turbulence or other derived physical properties of the fluid. In state-of-the-art implementations, correspondences between regions of two subsequent image frames are determined using cross-correlation as the similarity measure, which in practice has proven to be robust against the disturbances typically found in PIV data. Usually, an exhaustive search over a discrete set of velocity vectors is performed to find the one that describes the data best. In our work we consider a variational formulation of this problem, motivated by the extensive work on variational optical flow methods, which makes it possible to incorporate physical priors on the fluid. Furthermore, we replace the usually square-shaped correlation window, which defines the image regions whose correspondence is investigated, by a Gaussian function. This design drastically increases the flexibility of the process to adjust to features in the experimental data. A sound criterion is proposed to adapt the size and shape of the correlation window, directly formulating the aim of improving the measurement accuracy. The velocity measurement and window adaption are formulated as an interdependent variational problem, and we apply continuous optimisation methods to determine a solution to this non-linear and non-convex problem.
In the experimental section, we demonstrate that our approach can handle both synthetic and real data with high accuracy and compare its performance to state-of-the-art methods. Furthermore, we show that the proposed window adaption scheme increases the measurement accuracy; in particular, high gradients in motion fields are resolved well. In the second part of our work, we investigate an approach for solving very large convex optimisation problems. This is motivated by the fact that a variational formulation, on the one hand, makes it easy to incorporate prior knowledge on data and variables to improve the quality of the solution; on the other hand, convex problems often occur as subprograms of solvers for non-convex optimisation tasks, as is the case in the first part of this work. However, the extension of two-dimensional approaches to 3D, or to the time axis, as well as the ever-increasing resolution of sensors, causes the number of variables to explode. For many interesting applications, e.g. in medical imaging or fluid mechanics, the problem description easily exceeds the memory limits of available single computational nodes. We therefore investigate a decomposition method for the class of unconstrained, convex and quadratic optimisation problems. Our approach is based on the idea of Dual Decomposition, or Lagrangian Relaxation, and splits the problem into several smaller tasks that can be distributed to parallel hardware. Each subproblem is again quadratic and convex and can thus be solved efficiently using standard methods. Their interconnection is respected to ensure that we find a solution to the original, non-decomposed problem. Furthermore, we propose a framework to modify the numerical properties of the subproblems, which enables us to improve their convergence rates. The theoretical part is completed by an analysis of convergence conditions and rates.
Finally, we demonstrate our approach by means of three relevant variational problems from image processing. Error measurements in comparison to single-domain solutions are presented to assess the accuracy of the decomposition.
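The dual-decomposition idea for convex quadratic problems can be illustrated on a toy consensus problem. The sketch below is a generic Lagrangian-relaxation example, not the thesis implementation: a quadratic objective is split into two subproblems that are solved independently and coupled only through a dual variable, as they would be across parallel compute nodes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Two convex quadratic subproblems f_i(x) = 0.5 x^T A_i x - b_i^T x,
# coupled through the consensus constraint x = z.
M1 = rng.normal(size=(n, n))
M2 = rng.normal(size=(n, n))
A1 = M1 @ M1.T + n * np.eye(n)   # symmetric positive definite
A2 = M2 @ M2.T + n * np.eye(n)
b1, b2 = rng.normal(size=n), rng.normal(size=n)

lam = np.zeros(n)                # Lagrange multiplier of x = z
alpha = 0.5                      # dual ascent step size
for _ in range(2000):
    x = np.linalg.solve(A1, b1 - lam)   # subproblem 1 (independent)
    z = np.linalg.solve(A2, b2 + lam)   # subproblem 2 (independent)
    lam += alpha * (x - z)              # price update on the coupling

# Reference: the non-decomposed problem min 0.5 x^T(A1+A2)x - (b1+b2)^T x
x_star = np.linalg.solve(A1 + A2, b1 + b2)
```

At convergence the two local solutions agree with each other and with the solution of the non-decomposed system, which is exactly the property the thesis exploits to distribute very large problems.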

    Assessment and application of wavelet-based optical flow velocimetry (wOFV) to wall-bounded turbulent flows

    The performance of a wavelet-based optical flow velocimetry (wOFV) algorithm in extracting high-accuracy, high-resolution velocity fields from particle images in wall-bounded turbulent flows is assessed. wOFV is first evaluated using synthetic particle images generated from a channel flow DNS of a turbulent boundary layer. The sensitivity of wOFV to the regularization parameter (lambda) is quantified and results are compared to PIV. Results on synthetic particle images indicated different sensitivity to under-regularization or over-regularization depending on which region of the boundary layer is analyzed. Synthetic data revealed that wOFV can modestly outperform PIV in vector accuracy across a broad lambda range. wOFV showed clear advantages over PIV in resolving the viscous sublayer and obtaining highly accurate estimates of the wall shear stress. wOFV was also applied to experimental data of a developing turbulent boundary layer. Overall, wOFV showed good agreement with both PIV and PIV + PTV. However, wOFV was able to successfully resolve the wall shear stress and correctly normalize the boundary layer streamwise velocity to wall units, where PIV and PIV + PTV showed larger deviations. Analysis of the turbulent velocity fluctuations revealed spurious results for PIV in close proximity to the wall, leading to significantly exaggerated and non-physical turbulence intensity there; PIV + PTV showed only a minor improvement in this respect. wOFV did not exhibit this effect, revealing that it is more accurate in capturing small-scale turbulent motion in the vicinity of boundaries. The enhanced vector resolution of wOFV also enabled improved estimation of instantaneous derivative quantities and of intricate flow structure closer to the wall. These aspects show that, within a reasonable lambda range, wOFV improves the resolution of turbulent motion occurring near physical boundaries.
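The wall shear stress estimate discussed above follows from the wall-normal velocity gradient, tau_w = mu * (du/dy) at y = 0, which is why resolving the viscous sublayer matters. A short numpy sketch with a synthetic linear sublayer profile (illustrative values, not data from the paper):

```python
import numpy as np

mu = 1.0e-3        # dynamic viscosity [Pa s] (illustrative value)
tau_w_true = 0.5   # wall shear stress used to build the profile [Pa]

# In the viscous sublayer the streamwise velocity varies linearly with
# wall distance: u(y) = (tau_w / mu) * y.
y = np.linspace(0.0, 1.0e-4, 20)   # wall-normal coordinate [m]
u = (tau_w_true / mu) * y          # synthetic near-wall profile [m/s]

# Recover tau_w from the velocity gradient evaluated at the wall,
# as one would from near-wall vectors resolved by a velocimetry method.
dudy_wall = np.gradient(u, y)[0]
tau_w_est = mu * dudy_wall
```

With real data the accuracy of `tau_w_est` hinges on how close to the wall valid vectors exist, which is the advantage the paper reports for wOFV.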

    Variational Fluid Motion Estimation with Physical Priors

    In this thesis, techniques for Particle Image Velocimetry (PIV) and Particle Tracking Velocimetry (PTV) are developed that are based on variational methods. The basic idea is not to estimate displacement vectors locally and individually, but to estimate vector fields as a whole by minimizing a suitable functional defined over the entire image domain (which may be 2D or 3D and may also include the temporal dimension). Such functionals typically comprise two terms: a data term measuring how well two images of a sequence match as a function of the vector field to be estimated, and a regularization term that brings prior knowledge into the energy functional. Our starting point is a family of methods originally developed in the field of computer vision, which we modify for the purpose of PIV. These methods are based on the so-called optical flow: the estimated velocity vector inferred from relative motion of camera and image scene, based on the assumption of gray value conservation (i.e. the total derivative of the image gray value over time is zero). A regularization term (demanding e.g. smoothness of the velocity field, or of its divergence and rotation) renders the system mathematically well-posed. Experimental evaluation shows that this type of variational approach is able to outperform standard cross-correlation methods. In order to develop a variational method for PTV, we replace the continuous data term of variational approaches to PIV with a discrete non-differentiable particle matching term. This raises the problem of minimizing such data terms together with continuous regularization terms. We accomplish this with an advanced mathematical method which guarantees convergence to a local minimum of such a non-convex variational approach to PTV.
With this novel variational approach (there has been no previous work on modeling PTV methods with global variational approaches), we achieve results for image pairs and sequences in two and three dimensions that outperform the relaxation methods traditionally used for particle tracking. The key advantage of our variational particle image velocimetry methods is the possibility of including prior knowledge in a natural way. In the fluid environments considered in this thesis, it is especially attractive to use priors that can be motivated from a physical point of view. Firstly, we present a method that only allows flow fields satisfying the Stokes equation. The latter equation includes control variables that make it possible to steer the optical flow so as to fit the apparent velocities of particles in a given image pair. Secondly, we present a variational approach to motion estimation of unsteady fluid flows. This approach extends the prior method along two directions: (i) the full incompressible Navier-Stokes equation is employed in order to obtain a physically consistent regularization which does not suppress turbulent flow variations; (ii) regularization along the time axis is employed as well, but formulated in a receding-horizon manner, contrary to previous approaches to spatio-temporal regularization. Ground-truth evaluations for simulated turbulent flows demonstrate that the accuracy of both types of physically plausible regularization compares favorably with advanced cross-correlation approaches. Furthermore, the direct estimation of, e.g., pressure or vorticity becomes possible.
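The generic variational scheme described above, a gray value conservation data term plus a quadratic smoothness regularizer, is the classical Horn-Schunck functional; the thesis replaces the smoothness term with physically motivated priors. A minimal numpy sketch of the generic baseline (an illustration, not the thesis code):

```python
import numpy as np

def horn_schunck(I1, I2, alpha=0.5, n_iter=500):
    """Minimal Horn-Schunck optical flow: gray value conservation plus
    quadratic smoothness, solved by Jacobi iteration on the
    Euler-Lagrange equations."""
    Ix = 0.5 * (np.gradient(I1, axis=1) + np.gradient(I2, axis=1))
    Iy = 0.5 * (np.gradient(I1, axis=0) + np.gradient(I2, axis=0))
    It = I2 - I1
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iter):
        # 4-neighbour average enforces the smoothness prior
        u_bar = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                        + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        v_bar = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0)
                        + np.roll(v, 1, 1) + np.roll(v, -1, 1))
        num = Ix * u_bar + Iy * v_bar + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v

# Synthetic test: a sinusoidal pattern translated by one pixel in x.
N = 32
cols = np.arange(N)
I1 = np.tile(np.sin(2 * np.pi * cols / N), (N, 1))
I2 = np.roll(I1, 1, axis=1)
u, v = horn_schunck(I1, I2)
```

The recovered field is close to the uniform one-pixel shift; swapping the smoothness term for, e.g., a Stokes or Navier-Stokes constraint is the step the thesis takes beyond this baseline.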

    Two-Dimensional Gel Electrophoresis Image Registration Using Block-Matching Techniques and Deformation Models

    Block-matching techniques have been widely used in the task of estimating displacement in medical images, and they represent the best approach in scenes with deformable structures such as tissues, fluids, and gels. In this article, a new iterative block-matching technique, based on successive deformation, search, fitting, filtering, and interpolation stages, is proposed to measure elastic displacements in two-dimensional polyacrylamide gel electrophoresis (2D-PAGE) images. The proposed technique uses different deformation models in the task of correlating proteins in real 2D electrophoresis gel images, obtaining an accuracy of 96.6% and improving the results obtained with other techniques. This technique represents a general solution that is easy to adapt to different 2D deformable cases and provides an experimental reference for block-matching algorithms. Galicia. Consellería de Economía e Industria; 10MDS014CT. Galicia. Consellería de Economía e Industria; 10SIN105004PR. Instituto de Salud Carlos III; PI13/0028.
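The search stage of a block-matching pipeline can be sketched in a few lines: an exhaustive scan that scores each candidate position with normalized cross-correlation (a generic illustration, not the authors' implementation):

```python
import numpy as np

def best_match(block, search, step=1):
    """Exhaustive block matching: slide `block` over `search` and
    return the (row, col) offset with the highest normalized
    cross-correlation (NCC) score."""
    bh, bw = block.shape
    b = block - block.mean()
    best, best_score = (0, 0), -np.inf
    for i in range(0, search.shape[0] - bh + 1, step):
        for j in range(0, search.shape[1] - bw + 1, step):
            w = search[i:i + bh, j:j + bw]
            wc = w - w.mean()
            denom = np.sqrt((b * b).sum() * (wc * wc).sum())
            score = (b * wc).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best = score, (i, j)
    return best

rng = np.random.default_rng(1)
img = rng.random((40, 40))
block = img[10:18, 14:22]        # 8x8 reference block cut from img
offset = best_match(block, img)  # recovers (10, 14)
```

The article's iterative scheme wraps a search like this in deformation, fitting, filtering and interpolation stages so that elastic (not just translational) displacements can be measured.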

    Stochastic uncertainty models for the luminance consistency assumption

    In this paper, a stochastic formulation of the brightness consistency assumption used in many computer vision problems involving dynamic scenes (motion estimation or point tracking, for instance) is proposed. Usually, this model, which assumes that the luminance of a point is constant along its trajectory, is expressed in differential form through the total derivative of the luminance function. This differential equation linearly links the point velocity to the spatial and temporal gradients of the luminance function. However, when dealing with images, the available information holds only at discrete times and on a discrete grid. In this paper we formalize the image luminance as a continuous function transported by a flow known only up to some uncertainties related to this discretization process. Relying on stochastic calculus, we define a formulation of luminance preservation in which these uncertainties are taken into account. Within this framework, it can be shown that the usual deterministic optical flow constraint equation corresponds to our stochastic evolution under some strong constraints. These constraints can be relaxed by imposing a weaker temporal assumption on the luminance function and by introducing anisotropic intensity-based uncertainties. In addition, we show that these uncertainties can be computed at each point of the image grid from the image data, and hence provide meaningful information on the reliability of the motion estimates. To demonstrate the benefit of such a stochastic formulation of the brightness consistency assumption, we have considered a local least-squares motion estimator relying on this new constraint. This new motion estimator significantly improves the quality of the results.
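In its deterministic form, the constraint in question is the standard optical flow constraint equation, obtained by setting the total derivative of the luminance function $I(\mathbf{x}, t)$ to zero along a trajectory with velocity $\mathbf{w} = (u, v)$:

```latex
\frac{\mathrm{d}I}{\mathrm{d}t}
  \;=\; \frac{\partial I}{\partial t} + \nabla I \cdot \mathbf{w}
  \;=\; 0,
\qquad
\nabla I = \left(\frac{\partial I}{\partial x},\,
                 \frac{\partial I}{\partial y}\right)
```

This is the linear relation between the velocity and the spatial and temporal gradients mentioned in the abstract; the paper's stochastic evolution recovers it as a special case under strong constraints on the discretization uncertainties.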

    Fluid flow estimation with multiscale ensemble filters based on motion measurements under location uncertainty

    This paper proposes a novel multiscale fluid flow data assimilation approach that builds on a Bayesian sequential assimilation technique, the Weighted Ensemble Kalman Filter (WEnKF), and complements its advantages with measurements provided by an efficient multiscale stochastic formulation of the well-known Lucas-Kanade (LK) estimator. This estimator has the great advantage of providing the uncertainties associated with the motion measurements at different scales. The proposed assimilation scheme benefits from this multiscale uncertainty information, making it possible to enforce a physically plausible dynamical consistency of the estimated motion fields along the image sequence. Experimental evaluations are presented on synthetic and real fluid flow sequences.
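The Lucas-Kanade estimator at the core of the measurement stage is a local least-squares fit of the optical flow constraint over a small window; the normal-equation matrix also supplies a per-measurement uncertainty of the kind a filter can assimilate. A minimal single-scale numpy sketch (illustrative only, not the paper's multiscale stochastic formulation):

```python
import numpy as np

def lucas_kanade_point(Ix, Iy, It, cy, cx, r=3):
    """Local least-squares (Lucas-Kanade) flow estimate at (cy, cx)
    from image gradients over a (2r+1) x (2r+1) window."""
    win = np.s_[cy - r:cy + r + 1, cx - r:cx + r + 1]
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)
    b = -It[win].ravel()
    AtA = A.T @ A
    flow = np.linalg.solve(AtA, A.T @ b)   # (u, v)
    # inv(AtA) plays the role of a covariance: a measurement
    # uncertainty that an ensemble filter such as the WEnKF can weigh.
    return flow, np.linalg.inv(AtA)

# Synthetic pair: a Gaussian blob translated by one pixel in x.
N, sigma = 32, 4.0
yy, xx = np.mgrid[0:N, 0:N].astype(float)
I1 = np.exp(-((xx - 16) ** 2 + (yy - 16) ** 2) / (2 * sigma ** 2))
I2 = np.roll(I1, 1, axis=1)
Ix, Iy = np.gradient(I1, axis=1), np.gradient(I1, axis=0)
It = I2 - I1
(u, v), cov = lucas_kanade_point(Ix, Iy, It, cy=16, cx=19)
```

Estimating at several scales and propagating `cov` through the filter is, in essence, what gives the paper's scheme its multiscale uncertainty information.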

    Modular Optical Flow Estimation With Applications To Fluid Dynamics

    Optical flow is the apparent motion of intensities in an image sequence. Its estimation has been studied for almost three decades, and the results can be used in a wealth of possible applications, ranging from scientific applications such as experimental fluid dynamics, through medical imaging, to mobile computer games. The development of a single solution for all optical flow problems seems a worthwhile goal. However, in this thesis we argue that this goal is unlikely to be achieved, and we thoroughly motivate this hypothesis with theoretical and practical considerations. Based on the results, we identify two major problems that significantly complicate the research and development of new optical flow algorithms: first, very few reference implementations are publicly available; second, not all relevant properties of the proposed algorithms are described in the literature. In the first part of this thesis, our contribution is to alleviate both problems. First, we discuss a number of algorithm properties which should be known to the user. Second, by decomposing existing optical flow methods into their individual building blocks, called modules for short, we propose to analyze the properties of each module independently. A large number of existing techniques are composed of relatively few modules. By implementing these modules in a software library called Charon and adding tools for the evaluation of the results, we contribute to the accessibility of reference implementations and to the possibility of analyzing algorithms experimentally. In the second part of this thesis, we contribute two modules which are vital for the estimation of fluid flows. They are specifically tuned to the imagery obtained for particle tracking velocimetry (PTV). We call the first module an estimatibility measure; it detects those particle locations where fluid motion can be estimated.
It is based on the constant position of the center of gravity of the connected components generated by a large number of thresholded versions of the original image, and it only needs a few intuitive parameters. Experiments indicate its robustness with respect to noise of varying mean and variance. To analyze the properties of this module, we also provide a framework for simulating the particle image generation. The second module is a motion model based on unsupervised learning via principal component analysis. Training data are provided through Computational Fluid Dynamics (CFD) simulations. The model describes local ensembles of trajectories which can be fitted to the image sequence by means of a similarity measure. Together with a standard similarity measure and a simple optimization scheme, we derive a new PTV method. Compared to existing techniques, we obtained superior results with respect to accuracy on real and synthetic sequences with known ground truth. All source code developed during the thesis is available as Open Source under the GNU Lesser General Public License (LGPL).
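The unsupervised motion model can be sketched as a PCA over an ensemble of flattened trajectories. The toy example below substitutes random synthetic trajectories for the CFD training data (purely illustrative; the mode shapes and sample counts are invented for the demo):

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-in for CFD-simulated training data: each sample is a short
# trajectory of T positions (x-coordinates only, for brevity),
# flattened into a feature vector.
T, n_samples = 10, 200
t = np.linspace(0.0, 1.0, T)
modes = np.stack([t, t ** 2], axis=1)       # (T, 2) latent mode shapes
coeffs = rng.normal(size=(n_samples, 2))    # per-trajectory amplitudes
data = coeffs @ modes.T + 0.01 * rng.normal(size=(n_samples, T))

# Unsupervised learning of the motion model: PCA via SVD of the
# mean-centred trajectory matrix.
mean = data.mean(axis=0)
U, s, Vt = np.linalg.svd(data - mean, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
# Rows of Vt are the learned trajectory modes; a local flow estimate
# is obtained by fitting their coefficients to the image sequence.
```

Because the synthetic data contain two latent modes, the first two principal components capture nearly all of the trajectory variance, which is the low-dimensional structure such a model exploits when matched against image sequences.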