
    Volumetric velocimetry for fluid flows

    In recent years, several techniques have been introduced that are capable of extracting 3D three-component velocity fields in fluid flows. Fast-paced developments in both hardware and processing algorithms have generated a diverse set of methods, with a growing range of applications in flow diagnostics. This has been further enriched by the increasingly marked trend of hybridization, in which the differences between techniques are fading. In this review, we carry out a survey of the prominent methods, including optical techniques and approaches based on medical imaging. An overview of each is given with an example of an application from the literature, while focusing on their respective strengths and challenges. A framework for the evaluation of velocimetry performance in terms of dynamic spatial range is discussed, along with technological trends and emerging strategies to exploit 3D data. While critical challenges still exist, these observations highlight how volumetric techniques are transforming experimental fluid mechanics, and that the possibilities they offer have just begun to be explored.

    SD was partially supported under Grant No. DPI2016-79401-R funded by the Spanish State Research Agency (SRA) and the European Regional Development Fund (ERDF). FC was partially supported by the U.S. National Science Foundation (Chemical, Bioengineering, Environmental, and Transport Systems, Grant No. 1453538).
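The dynamic spatial range (DSR) mentioned in the abstract is commonly defined as the ratio of the largest resolvable length scale of a measurement (the size of the measurement domain) to the smallest resolvable scale. A minimal sketch of this definition, with hypothetical pixel values chosen purely for illustration:

```python
# Dynamic spatial range (DSR): ratio of the largest resolvable length
# scale (the measurement domain) to the smallest resolvable scale
# (e.g. the interrogation-window or particle-spacing limit).
def dynamic_spatial_range(domain_size_px: float, smallest_scale_px: float) -> float:
    """Return the DSR as a dimensionless ratio."""
    if smallest_scale_px <= 0:
        raise ValueError("smallest resolvable scale must be positive")
    return domain_size_px / smallest_scale_px

# Hypothetical example: a 1024-pixel field of view resolved down to
# 32-pixel interrogation windows gives a DSR of 32.
print(dynamic_spatial_range(1024, 32))  # 32.0
```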

    Inverse problems in acoustic tomography: theory and applications

    Acoustic tomography aims at recovering the unknown parameters that describe a field of interest by studying the physical characteristics of sound propagating through the considered field. The tomographic approach is appealing in that it is non-invasive and allows one to obtain a significantly larger amount of data compared to the classical one-sensor one-measurement setup. It has, however, two major drawbacks which may limit its applicability in a practical setting: the methods by which the tomographic data are acquired and then converted to the field values are computationally intensive and often ill-conditioned. This thesis specifically addresses these two shortcomings by proposing novel acoustic tomography algorithms for signal acquisition and field reconstruction. The first part of our exposition deals with some theoretical aspects of the tomographic sampling problem and the associated reconstruction schemes for scalar and vector tomography. We show that the classical time-of-flight measurements are not sufficient for full vector field reconstruction, and propose an additional set of measurements as a solution. The main advantage of the proposed set is that it can be computed directly from acoustic measurements, thus avoiding the need for extra measuring devices. We then describe three novel reconstruction methods that are conceptually quite different. The first is based on quadratic optimization and does not require any a priori information. The second builds upon the notion of sparsity in order to increase the reconstruction accuracy when little data is available. The third views tomographic reconstruction as a parametric estimation problem and solves it using recent sampling results on non-bandlimited signals. The proposed methods are compared and their respective advantages are outlined.
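To give a flavour of the quadratic-optimization route mentioned above, a toy straight-ray time-of-flight tomography problem can be posed as a linear system and solved by least squares. This is a generic sketch, not the thesis' actual algorithm: the grid, ray geometry, and sound speeds below are all hypothetical.

```python
import numpy as np

# Toy straight-ray time-of-flight tomography.
# Each ray's travel time is the line integral of slowness (1/speed):
#   t_i = sum_j A[i, j] * s_j,   A[i, j] = path length of ray i in cell j.
# Recovering s from t is a linear inverse problem; a quadratic-optimization
# approach amounts to a (possibly regularized) least-squares solve.

rng = np.random.default_rng(0)
n_cells, n_rays = 16, 40

# Hypothetical path-length matrix and ground-truth slowness field
# (speeds in the range of sound in water, ~1400-1600 m/s).
A = rng.uniform(0.0, 1.0, size=(n_rays, n_cells))
s_true = 1.0 / rng.uniform(1400.0, 1600.0, size=n_cells)

t = A @ s_true  # simulated (noise-free) travel times

# Least-squares reconstruction: min_s ||A s - t||^2.
s_hat, *_ = np.linalg.lstsq(A, t, rcond=None)

print(np.allclose(s_hat, s_true, rtol=1e-6))  # True in this noise-free case
```

With noisy or sparse data the system becomes ill-conditioned, which is exactly the shortcoming the thesis' regularized and sparsity-based methods address.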
    The second part of our work is dedicated to the application of the proposed algorithms to three practical problems: breast cancer detection, thermal therapy monitoring, and temperature monitoring in the atmosphere. We address the problem of breast cancer detection by computing a map of sound speed in breast tissue. A noteworthy contribution of this thesis is the development of a signal processing technique that significantly reduces the artifacts that arise in very inhomogeneous and absorbent tissue. Temperature monitoring during thermal therapies is then considered. We show how some of our algorithms allow for increased spatial resolution and propose ways to reduce the computational complexity. Finally, we demonstrate the feasibility of tomographic temperature monitoring in the atmosphere using a custom-built laboratory-scale experiment. In particular, we discuss various practical aspects of time-of-flight measurement using cheap, off-the-shelf sensing devices.
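The atmospheric application rests on the fact that the speed of sound in air depends on temperature; a standard approximation near room temperature is c(T) ≈ 331.3 + 0.606·T m/s with T in °C. A minimal sketch of inverting a single measured time-of-flight over a known path into a path-averaged temperature (the path length and timing below are illustrative, not from the thesis):

```python
# Estimate path-averaged air temperature from one acoustic time-of-flight,
# using the common linear model c(T) ≈ 331.3 + 0.606 * T (m/s, T in °C),
# valid near room temperature.

def temperature_from_tof(path_length_m: float, tof_s: float) -> float:
    """Invert c = L / t through the linear sound-speed model."""
    c = path_length_m / tof_s          # average sound speed along the path
    return (c - 331.3) / 0.606         # path-averaged temperature in deg C

# Illustrative example: a 10 m path traversed in ~29.1 ms corresponds to
# c ≈ 343.4 m/s, i.e. roughly 20 deg C.
print(round(temperature_from_tof(10.0, 10.0 / 343.42), 1))  # 20.0
```

Tomographic monitoring repeats this inversion over many crossing paths to recover a spatial temperature map rather than a single average.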

    Main results of the first Lagrangian Particle Tracking Challenge

    This work presents the main results of the first Lagrangian Particle Tracking challenge, conducted within the framework of the European Union’s Horizon 2020 project HOMER (Holistic Optical Metrology for Aero-Elastic Research), grant agreement number 769237. The challenge, jointly organised by the research groups of DLR, ONERA and TU Delft, considered a synthetic experiment reproducing the wall-bounded flow in the wake of a cylinder, simulated by LES. The participants received the calibration images and sets of particle images acquired by four virtual cameras, and were asked to produce as output the particles' positions, velocities and accelerations (when possible) at a specific time instant. Three different image acquisition strategies were addressed, namely two-pulse (TP), four-pulse (FP) and time-resolved (TR) acquisitions, each with varying tracer particle concentrations (number of particles per pixel, ppp). The participants' outputs were analysed in terms of the percentages of correctly reconstructed particles, missed particles, ghost particles, correct tracks and wrong tracks, as well as in terms of position, velocity and acceleration errors, along with their distributions. The analysis of the results showed that the best-performing algorithms allow for a correct reconstruction of more than 99% of the tracer particles with positional errors below 0.1 pixels even at ppp values exceeding 0.15, whereas other algorithms are more prone to the presence of ghost particles already at ppp < 0.1. While the velocity errors remained within a small percentage of the bulk velocity, acceleration errors as large as 50% of the actual acceleration magnitude were observed.
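The correct/missed/ghost bookkeeping used to score the participants can be sketched as a nearest-neighbour matching between reconstructed and ground-truth particle positions within a tolerance. This is a generic illustration, not the challenge's exact scoring code; the 1-pixel threshold and the coordinates are assumptions.

```python
import numpy as np

# Score a reconstructed particle set against ground truth by greedy
# nearest-neighbour matching within a tolerance (1 px here, an assumed
# threshold): matched truths are "correct", unmatched truths are
# "missed", unmatched reconstructions are "ghosts".

def score_particles(true_pos, recon_pos, tol=1.0):
    true_pos = np.asarray(true_pos, dtype=float)
    recon_pos = np.asarray(recon_pos, dtype=float)
    used = np.zeros(len(recon_pos), dtype=bool)
    correct = 0
    for p in true_pos:
        d = np.linalg.norm(recon_pos - p, axis=1)
        d[used] = np.inf                 # each reconstruction matches once
        j = int(np.argmin(d))
        if d[j] <= tol:
            used[j] = True
            correct += 1
    return {"correct": correct,
            "missed": len(true_pos) - correct,
            "ghost": int((~used).sum())}

# Hypothetical example: two good reconstructions, one ghost, one miss.
truth = [[0.0, 0.0], [5.0, 5.0], [9.0, 1.0]]
recon = [[0.05, -0.02], [5.1, 4.9], [2.0, 7.0]]
print(score_particles(truth, recon))  # {'correct': 2, 'missed': 1, 'ghost': 1}
```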

    Uncovering solutions from data corrupted by systematic errors: A physics-constrained convolutional neural network approach

    Information on natural phenomena and engineering systems is typically contained in data. Data can be corrupted by systematic errors in models and experiments. In this paper, we propose a tool to uncover the spatiotemporal solution of the underlying physical system by removing the systematic errors from data. The tool is the physics-constrained convolutional neural network (PC-CNN), which combines information from both the system's governing equations and data. We focus on fundamental phenomena that are modelled by partial differential equations, such as linear convection, the Burgers equation, and two-dimensional turbulence. First, we formulate the problem, describe the physics-constrained convolutional neural network, and parameterise the systematic error. Second, we uncover the solutions from data corrupted by large multimodal systematic errors. Third, we perform a parametric study for different systematic errors and show that the method is robust. Fourth, we analyse the physical properties of the uncovered solutions. We show that the solutions inferred from the PC-CNN are physical, in contrast to the corrupted data, which do not fulfil the governing equations. This work opens opportunities for removing epistemic errors from models and systematic errors from measurements.
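The core idea of constraining a network with the governing equations can be illustrated with a loss that combines a data-misfit term and a PDE-residual term, here for 1D linear convection u_t + c·u_x = 0. This is a conceptual sketch, not the paper's PC-CNN implementation; the finite-difference stencils and the weighting `lam` are assumptions.

```python
import numpy as np

# Sketch of a physics-constrained loss for 1D linear convection,
#   u_t + c * u_x = 0.
# Finite differences approximate the derivatives on a space-time field
# u[t, x]; predictions that violate the PDE are penalised through the
# residual term, weighted by the (hypothetical) hyperparameter lam.

def convection_residual(u, dt, dx, c):
    """PDE residual u_t + c u_x evaluated on interior points of u[t, x]."""
    u_t = (u[1:, 1:-1] - u[:-1, 1:-1]) / dt            # forward in time
    u_x = (u[:-1, 2:] - u[:-1, :-2]) / (2.0 * dx)      # centred in space
    return u_t + c * u_x

def physics_constrained_loss(u_pred, u_data, dt, dx, c, lam=1.0):
    data_loss = np.mean((u_pred - u_data) ** 2)
    phys_loss = np.mean(convection_residual(u_pred, dt, dx, c) ** 2)
    return data_loss + lam * phys_loss

# Sanity check: the exact travelling wave u(x, t) = sin(x - c t) satisfies
# the PDE, so its physics term vanishes to discretisation accuracy.
c, dx, dt = 1.0, 0.01, 0.001
x = np.arange(0.0, 2 * np.pi, dx)
t = np.arange(0.0, 0.05, dt)
u_exact = np.sin(x[None, :] - c * t[:, None])
print(physics_constrained_loss(u_exact, u_exact, dt, dx, c) < 1e-4)  # True
```

A corrupted field, by contrast, would leave a large residual, which is what allows the physics term to discriminate the systematic error from the underlying solution.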

    Enforcing Temporal Consistency in Physically Constrained Flow Field Reconstruction with FlowFit by Use of Virtual Tracer Particles

    Processing techniques for particle-based optical flow measurement data, such as 3D Particle Tracking Velocimetry (PTV) or the novel dense Lagrangian Particle Tracking method Shake-The-Box (STB), can provide time-series of velocity and acceleration information scattered in space. The subsequent post-processing is key to the quality of space-filling velocity and pressure field reconstruction from the scattered particle data. In this work we describe a straightforward extension of the recently developed data assimilation scheme FlowFit, which applies physical constraints from the Navier-Stokes equations in order to simultaneously determine velocity and pressure fields as solutions to an inverse problem. We propose the use of additional artificial Lagrangian tracers (virtual particles), which are advected between the flow fields at single time instants to achieve meaningful temporal coupling. This is the most natural way of imposing a temporal constraint in the Lagrangian data framework. FlowFit's core method is not altered in the current work; only its input, in the form of Lagrangian tracks, is modified. This work shows that the introduction of such particle memory into the reconstruction process significantly improves the resulting flow fields. The method is validated in virtual experiments with two independent DNS test cases. Several contributing factors are examined to explain the improvements, including correlations of velocity and acceleration errors in the reconstructions and the flow field regularization within the inverse problem.
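The advection of virtual tracers between reconstructed flow fields can be sketched as numerical integration of particle positions through a velocity field. The sketch below uses a classical RK4 step and a hypothetical solid-body-rotation field as a stand-in for a FlowFit reconstruction; it illustrates the mechanism, not the paper's actual implementation.

```python
import numpy as np

# Sketch of the virtual-tracer idea: artificial Lagrangian particles are
# advected from one reconstructed flow field to the next, coupling
# consecutive single-snapshot reconstructions in time.

def velocity(pos, t):
    """Hypothetical velocity field: solid-body rotation about the origin."""
    x, y = pos
    return np.array([-y, x])

def rk4_advect(pos, t, dt):
    """Advance a virtual tracer by one RK4 step of size dt."""
    k1 = velocity(pos, t)
    k2 = velocity(pos + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = velocity(pos + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = velocity(pos + dt * k3, t + dt)
    return pos + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Sanity check: a tracer started at (1, 0) and advected through a quarter
# turn of the rotation should end up near (0, 1).
pos, t, dt = np.array([1.0, 0.0]), 0.0, 0.01
for _ in range(int(round((np.pi / 2) / dt))):
    pos = rk4_advect(pos, t, dt)
    t += dt
print(np.allclose(pos, [0.0, 1.0], atol=1e-2))  # True
```

In the actual scheme, the tracer positions predicted from the field at one instant serve as additional constraints on the reconstruction at the next, which is the "particle memory" the abstract refers to.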

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability for system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    Dynamic response testing method of damage detection in fibre composite structures

    This dissertation analyses the practicality of using vibration signatures to monitor the structural health of a glass pultrusion square hollow section member and to detect the existence, magnitude and location of damage. Twelve nodes spaced evenly along the 1 m long beam served as test locations, where an accelerometer was placed and an impact hammer was used to induce forced oscillation. The data were collected using an LMS data acquisition system coupled with the LMS Xpress testware. Two methods of identifying damage were applied: empirical mode decomposition and the Hilbert-Huang transformation. Empirical mode decomposition proved inadequate for signifying damage; the Hilbert-Huang transformation, however, showed clear indication of the introduction of damage. Figures of all numerical analyses and results are included, along with pictures to aid understanding of the experiments and a copy of the entire Matlab code used in the numerical analysis of the oscillatory signal. Based on the results of the experiments and numerical analysis, it is deemed that, with further research, it may be possible to build an industrially suitable dynamic testing method for a wide variety of complex materials that avoids the need for specialised technicians, extensive apparatus, lengthy procedures and costly processes.
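The Hilbert step of the Hilbert-Huang approach extracts an instantaneous frequency from an oscillatory signal; damage-induced stiffness changes then show up as shifts in that frequency. The sketch below (in Python rather than the dissertation's Matlab) builds the analytic signal with an FFT-based Hilbert transform on an illustrative 50 Hz tone; the sampling rate and tone are assumptions.

```python
import numpy as np

# Analytic signal via the frequency-domain Hilbert transform: zero the
# negative frequencies and double the positive ones (this mirrors the
# construction used by scipy.signal.hilbert).
def analytic_signal(x):
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(spectrum * h)

fs = 1000.0                               # sampling rate, Hz (illustrative)
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50.0 * t)          # clean 50 Hz oscillation

# Instantaneous frequency = derivative of the unwrapped analytic phase.
phase = np.unwrap(np.angle(analytic_signal(x)))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # Hz

print(round(float(np.median(inst_freq)), 1))  # 50.0
```

For a mono-component signal the estimate recovers the true frequency; in the dissertation's setting, the empirical mode decomposition would first be applied to separate the measured response into such mono-component modes.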