
    3D Lagrangian Particle Tracking in Fluid Mechanics

    In the past few decades, various particle-image-based volumetric flow measurement techniques have been developed that have demonstrated their potential for quantitatively accessing unsteady flow properties in a wide range of experimental applications in fluid mechanics. This article focuses on the physical properties and circumstances of 3D particle-based measurements and on how that knowledge can be used to improve reconstruction accuracy, spatial and temporal resolution, and completeness. The natural candidate for this focus is 3D Lagrangian Particle Tracking (LPT), which determines position, velocity, and acceleration along a large number of individual particle tracks in the investigated volume. The advent of the dense 3D LPT technique Shake-The-Box over the past decade has opened further possibilities for characterizing unsteady flows by delivering input data for powerful data assimilation techniques that use Navier-Stokes constraints. As a result, high-resolution Lagrangian and Eulerian data can be obtained, including long particle trajectories embedded in time-resolved 3D velocity and pressure fields.
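    To illustrate the Lagrangian quantities mentioned above (position, velocity, and acceleration along a track), here is a minimal sketch, not the Shake-The-Box algorithm itself: a polynomial is fitted to noisy 3D positions of a single particle, and velocity and acceleration are taken as derivatives of the fit. The function name and the synthetic helical track are purely illustrative assumptions.

```python
import numpy as np

# Minimal sketch: recover velocity and acceleration along one particle
# track by fitting a polynomial to noisy 3D positions over time.
# Illustrative stand-in only, not the Shake-The-Box reconstruction.

def fit_track(t, positions, order=3):
    """Fit each coordinate with a polynomial in time; return fitted
    position, velocity and acceleration sampled at the input times."""
    pos, vel, acc = [], [], []
    for dim in range(positions.shape[1]):
        p = np.poly1d(np.polyfit(t, positions[:, dim], order))
        pos.append(p(t))
        vel.append(p.deriv(1)(t))
        acc.append(p.deriv(2)(t))
    return np.stack(pos, 1), np.stack(vel, 1), np.stack(acc, 1)

# Synthetic example: a helical track sampled over 50 frames with noise.
t = np.linspace(0.0, 0.05, 50)
true = np.stack([np.cos(40 * t), np.sin(40 * t), 2.0 * t], axis=1)
noisy = true + 1e-3 * np.random.default_rng(0).normal(size=true.shape)

pos, vel, acc = fit_track(t, noisy)
print("velocity at mid-track:", vel[len(t) // 2])
```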

    Segmentation-Driven Tomographic Reconstruction.


    System Characterizations and Optimized Reconstruction Methods for Novel X-ray Imaging

    In the past decade, many new X-ray-based imaging technologies have emerged for different diagnostic purposes and imaging tasks. However, each faces specific problems that prevent it from being employed effectively or efficiently. This dissertation discusses four novel X-ray-based imaging technologies: propagation-based phase-contrast (PB-XPC) tomosynthesis, differential X-ray phase-contrast tomography (D-XPCT), projection-based dual-energy computed radiography (DECR), and tetrahedron beam computed tomography (TBCT). For these modalities, system characteristics are analyzed or optimized reconstruction methods are proposed. In the first part, we investigated the unique properties of the propagation-based phase-contrast imaging technique when combined with X-ray tomosynthesis. The Fourier slice theorem implies that the high-frequency components collected in the tomosynthesis data can be reconstructed more reliably. It is observed that the fringes or boundary enhancement introduced by the phase-contrast effects can serve as an accurate indicator of the true depth position in the tomosynthesis in-plane image. In the second part, we derived a subspace framework to reconstruct images from few-view D-XPCT data sets. By introducing a proper mask, the high-frequency content of the image can theoretically be preserved within a certain region of interest. A two-step reconstruction strategy is developed to mitigate the risk of subtle structures being oversmoothed when the commonly used total-variation regularization is employed in the conventional iterative framework. In the third part, we proposed a practical method to improve the quantitative accuracy of projection-based dual-energy material decomposition. It is demonstrated that applying a total-projection-length constraint along with the dual-energy measurements stabilizes the numerical solution of the decomposition problem, overcoming the extreme noise sensitivity of the conventional approach. In the final part, we describe modified filtered backprojection and iterative image reconstruction algorithms developed specifically for TBCT. Special parallelization strategies are designed to facilitate the use of GPU computing, demonstrating the capability to produce high-quality reconstructed volumetric images at very high computational speed. For all the investigations mentioned above, both simulation and experimental studies have been conducted to demonstrate the feasibility and effectiveness of the proposed methodologies.
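    For orientation on the third part, a simplified sketch of two-material dual-energy decomposition with a total-projection-length constraint appended as a soft least-squares equation. The effective monoenergetic attenuation coefficients and the weighting are illustrative assumptions; the dissertation's actual method handles polychromatic spectra and noise models that this sketch omits.

```python
import numpy as np

# Simplified projection-based dual-energy decomposition for two basis
# materials, with an optional total-projection-length constraint added
# as an extra weighted equation. Coefficients below are illustrative.

MU = np.array([[0.40, 0.25],    # low-energy row:  [mu_L1, mu_L2] (1/cm)
               [0.22, 0.18]])   # high-energy row: [mu_H1, mu_H2] (1/cm)

def decompose(p_low, p_high, total_length=None, weight=10.0):
    """Return basis-material path lengths (A1, A2) in cm from a
    dual-energy projection pair; optionally enforce A1 + A2 ~= L."""
    A = MU.copy()
    b = np.array([p_low, p_high], dtype=float)
    if total_length is not None:
        # Constraint row with a large weight acts as a soft equality.
        A = np.vstack([A, weight * np.ones(2)])
        b = np.append(b, weight * total_length)
    lengths, *_ = np.linalg.lstsq(A, b, rcond=None)
    return lengths

# Ground truth: 3 cm of material 1 and 5 cm of material 2.
a_true = np.array([3.0, 5.0])
p_low, p_high = MU @ a_true + np.random.default_rng(1).normal(0, 0.01, 2)

print("unconstrained:", decompose(p_low, p_high))
print("constrained:  ", decompose(p_low, p_high, total_length=8.0))
```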

    Bayesian Variational Regularisation for Dark Matter Reconstruction with Uncertainty Quantification

    Despite the great wealth of cosmological knowledge accumulated since the early 20th century, the nature of dark matter, which accounts for ~85% of the matter content of the Universe, remains elusive. Unfortunately, although dark matter is scientifically interesting, with implications for our fundamental understanding of the Universe, it cannot be directly observed. Instead, dark matter may be inferred from, e.g., the optical distortion (lensing) of distant galaxies which, at linear order, manifests as a perturbation to the apparent magnitude (convergence) and ellipticity (shearing). Ensemble observations of the shear are collected and leveraged to construct estimates of the convergence, which can be related directly to the universal dark matter distribution. Imminent stage IV surveys are forecast to accrue an unprecedented quantity of cosmological information, a discriminative partition of which is accessible through the convergence and is disproportionately concentrated at high angular resolutions, where the echoes of cosmological evolution under gravity are most apparent. Capitalising on advances in probability concentration theory, this thesis merges the paradigms of Bayesian inference and optimisation to develop hybrid convergence inference techniques which are scalable, statistically principled, and operate over the Euclidean plane, celestial sphere, and 3-dimensional ball. Such techniques can quantify the plausibility of inferences at one-millionth the computational overhead of competing sampling methods. These Bayesian techniques are applied to the hotly debated Abell-520 merging cluster, concluding that observational catalogues contain insufficient information to determine the existence of dark matter self-interactions. Further, these techniques were applied to all public lensing catalogues, recovering what was then the largest global dark matter mass map. The primary methodological contributions of this thesis depend only on posterior log-concavity, paving the way towards a potentially revolutionary, complete hybridisation with artificial intelligence techniques. These next-generation techniques are the first to operate over the full 3-dimensional ball, laying the foundations for statistically principled universal dark matter cartography and the cosmological insights such advances may provide.
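    As a concrete reference point for the shear-to-convergence step described above, here is a minimal flat-sky Kaiser-Squires-style inversion in Fourier space. It is a standard planar estimator shown purely for orientation; the thesis itself develops Bayesian variational estimators on the plane, sphere, and 3D ball, which this sketch does not attempt to reproduce.

```python
import numpy as np

# Flat-sky Kaiser-Squires-style inversion: recover the convergence field
# kappa from a shear field (gamma1, gamma2) in Fourier space, using
# gamma_hat = D * kappa_hat with D = (k1^2 - k2^2 + 2i k1 k2) / k^2.

def kaiser_squires(gamma1, gamma2):
    ny, nx = gamma1.shape
    k1 = np.fft.fftfreq(nx)[np.newaxis, :]
    k2 = np.fft.fftfreq(ny)[:, np.newaxis]
    k_sq = k1**2 + k2**2
    k_sq[0, 0] = 1.0                      # avoid division by zero at k = 0
    D = (k1**2 - k2**2 + 2j * k1 * k2) / k_sq
    gamma_hat = np.fft.fft2(gamma1 + 1j * gamma2)
    kappa_hat = np.conj(D) * gamma_hat    # |D| = 1 away from k = 0
    kappa_hat[0, 0] = 0.0                 # mean convergence is unconstrained
    return np.fft.ifft2(kappa_hat).real   # E-mode convergence map

# Round-trip check on a random zero-mean convergence field.
rng = np.random.default_rng(2)
kappa_true = rng.normal(size=(128, 128))
kappa_true -= kappa_true.mean()
k1 = np.fft.fftfreq(128)[np.newaxis, :]
k2 = np.fft.fftfreq(128)[:, np.newaxis]
k_sq = k1**2 + k2**2; k_sq[0, 0] = 1.0
D = (k1**2 - k2**2 + 2j * k1 * k2) / k_sq
gamma = np.fft.ifft2(D * np.fft.fft2(kappa_true))
kappa_rec = kaiser_squires(gamma.real, gamma.imag)
print("max reconstruction error:", np.abs(kappa_rec - kappa_true).max())
```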

    Microwave Sensing and Imaging

    In recent years, microwave sensing and imaging have acquired ever-growing importance in several applicative fields, such as non-destructive evaluation in industry and civil engineering, subsurface prospection, security, and biomedical imaging. Indeed, microwave techniques allow, in principle, information to be obtained directly regarding the physical parameters of the inspected targets (dielectric properties, shape, etc.) by using safe electromagnetic radiation and cost-effective systems. Consequently, a great deal of research activity has recently been devoted to the development of efficient and reliable measurement systems, effective data processing algorithms for solving the underlying electromagnetic inverse scattering problem, and efficient forward solvers to model electromagnetic interactions. Within this framework, this Special Issue aims to provide some insights into recent microwave sensing and imaging systems and techniques.
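    To make the "inverse scattering problem" mentioned above more tangible, a schematic sketch of the linearized (Born-approximation) formulation: scattered-field data are modelled as y = G x, where x is the unknown dielectric contrast on an imaging grid, and the inversion is regularised. The matrix G here is a random stand-in (the real operator depends on antenna geometry, frequency, and Green's functions), so this only illustrates the regularised inversion step, not a full microwave imaging solver.

```python
import numpy as np

# Schematic Born-approximation inverse scattering step: y = G x, with G a
# random stand-in for the Green's-function-based sensing operator, and a
# Tikhonov-regularised least-squares inversion for the contrast x.

rng = np.random.default_rng(3)
n_meas, n_pix = 200, 400
G = rng.normal(size=(n_meas, n_pix)) + 1j * rng.normal(size=(n_meas, n_pix))

# Synthetic contrast profile: a few point-like scatterers on the grid.
x_true = np.zeros(n_pix, dtype=complex)
x_true[[50, 180, 310]] = [1.0, 0.5 + 0.2j, 0.8]

y = G @ x_true + 0.01 * rng.normal(size=n_meas)

# Tikhonov-regularised solution: x = (G^H G + alpha I)^-1 G^H y.
alpha = 1.0
x_rec = np.linalg.solve(G.conj().T @ G + alpha * np.eye(n_pix),
                        G.conj().T @ y)

print("relative error:",
      np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```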

    On the clinical potential of ion computed tomography with different detector systems and ion species


    Non-equilibrium dynamics in quantum simulators

    This thesis summarizes a course of investigation into various aspects of non-equilibrium dynamics in isolated quantum systems that can be controlled to the extent that one can speak not just of realizing but of simulating a desired physical effect. The first subject considered concerns the general question of relaxation in a large class of physical models. It is rigorously proven that equilibration can occur for arbitrary local observables despite the entire system being perfectly isolated. Various mechanisms responsible for convergence to local equilibrium are highlighted. These involve, in particular, the memory loss of non-Gaussian correlations following an interaction quench, a notion of Gaussian ergodicity, and a proof of the emergence of translation invariance of correlations due to the presence of this symmetry in the Hamiltonian governing the evolution. These results provide the long-time and large-system-size asymptotics underlying a thermodynamic limit, but are at the same time relevant for state-of-the-art quantum simulation experiments with large numbers of ultra-cold atoms: a related effect has been observed in a one-dimensional phononic quantum field simulator, and a method is additionally provided to study relaxation dynamics of this type in optical lattice quantum simulators. Within the second theme explored in this thesis, a novel quantum read-out method is proposed and applied in continuous-field quantum simulators, which allowed, for the first time, experimental measurement of various thermodynamic properties of one-dimensional quasi-condensates. In particular, tomographic results concerning thermal properties, non-commuting observables, and momentum- and time-resolved occupation numbers of phonons are presented. Finally, ideas for the practical benchmarking of the dynamics of certain closed quantum systems are put forward, based on the concept of a fidelity witness. It is demonstrated that fidelity, despite being a fragile quantity for large systems, can be efficiently estimated for non-equilibrium dynamics in coherent quantum simulators implementing paradigmatic models of condensed matter physics. The method developed has already found an independent application in studies of variational quantum circuits aiming at so-called quantum chemistry accuracy on the Sycamore quantum processor. The fact that all three themes of research laid out in this thesis have found an experimental realization suggests a prognosis for future developments in physics: it will become standard for quantum simulators to realize novel theoretical ideas experimentally on demand, and the time between theoretical insights and experimental observations will be dramatically shortened.
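    To make the equilibration statement in the first theme concrete, here is a small exact-diagonalization toy (not code from the thesis): a transverse-field Ising chain is quenched from a product state, and the time-evolved expectation value of a single-site observable is compared with its diagonal-ensemble (infinite-time-average) value. The chain length, couplings, and initial state are illustrative choices.

```python
import numpy as np

# Toy illustration of equilibration of a local observable in an isolated
# system: a small transverse-field Ising chain is quenched from a product
# state, and the mid-chain magnetisation relaxes towards the value
# predicted by the diagonal ensemble (infinite-time average).

N = 8                                     # number of spins (2^N = 256 states)
sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)

def site_op(op, i):
    """Embed a single-site operator at site i of the N-spin chain."""
    mats = [np.eye(2)] * N
    mats[i] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# H = -J sum_i sz_i sz_{i+1} - h sum_i sx_i  (open boundary conditions)
J, h = 1.0, 1.2
H = sum(-J * site_op(sz, i) @ site_op(sz, i + 1) for i in range(N - 1))
H += sum(-h * site_op(sx, i) for i in range(N))

# Initial state: all spins up in the z basis; switching on h is the quench.
psi0 = np.zeros(2**N)
psi0[0] = 1.0

E, V = np.linalg.eigh(H)                  # exact diagonalisation
c = V.conj().T @ psi0                     # overlaps with energy eigenstates
O = site_op(sz, N // 2)                   # local observable: sz at mid-chain
O_eig = V.conj().T @ O @ V

# Diagonal-ensemble prediction for the long-time average.
O_diag = np.sum(np.abs(c)**2 * np.diag(O_eig).real)

for t in [0.0, 1.0, 5.0, 20.0]:
    ct = c * np.exp(-1j * E * t)
    O_t = np.real(ct.conj() @ O_eig @ ct)
    print(f"t = {t:5.1f}: <sz_mid> = {O_t:+.3f}  (diagonal ensemble {O_diag:+.3f})")
```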