
    Field theoretic formulation and empirical tracking of spatial processes

    Spatial processes are attacked on two fronts. On the one hand, tools from theoretical and statistical physics can be used to understand behaviour in complex, spatially-extended multi-body systems. On the other hand, computer vision and statistical analysis can be used to study 4D microscopy data to observe and understand real spatial processes in vivo. On the first of these fronts, analytical models are developed for abstract processes, which can be simulated on graphs and lattices before considering real-world applications in fields such as biology, epidemiology or ecology. In the field theoretic formulation of spatial processes, techniques originating in quantum field theory such as canonical quantisation and the renormalization group are applied to reaction-diffusion processes by analogy. These techniques are combined in the study of critical phenomena or critical dynamics. At this level, one is often interested in the scaling behaviour: how the correlation functions scale for different dimensions in geometric space. This can lead to a better understanding of how macroscopic patterns relate to microscopic interactions. In this vein, the trace of a branching random walk on various graphs is studied. In the thesis, a distinctly abstract approach is emphasised in order to support an algorithmic approach to parts of the formalism. A model of self-organised criticality, the Abelian sandpile model, is also considered. By exploiting a bijection between recurrent configurations and spanning trees, an efficient Monte Carlo algorithm is developed to simulate sandpile processes on large lattices. On the second front, two case studies are considered: migratory patterns of leukaemia cells and mitotic events in Arabidopsis roots. In the first case, tools from statistical physics are used to study the spatial dynamics of different leukaemia cell lineages before and after a treatment. One key result is that we can discriminate between migratory patterns in response to treatment, classifying cell motility in terms of sub-, super- and normally diffusive regimes. For the second case study, a novel algorithm is developed to process a 4D light-sheet microscopy dataset. The combination of transient fluorescent markers and a poorly localised specimen in the field of view leads to a challenging tracking problem. A fuzzy registration-tracking algorithm is developed to track mitotic events so as to understand their spatiotemporal dynamics under normal conditions and after tissue damage.
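    The sub-/super-diffusive classification mentioned above is conventionally read off the scaling of the mean squared displacement with lag time, MSD(t) ∝ t^α, with α < 1 subdiffusive, α ≈ 1 diffusive and α > 1 superdiffusive. The sketch below is a minimal illustration of that idea only; the trajectory format, fitting range and tolerance are assumptions for demonstration, not details taken from the thesis.

    ```python
    import numpy as np

    def msd_exponent(positions, dt=1.0, max_lag=None):
        """Estimate the anomalous-diffusion exponent alpha from one
        trajectory, assuming MSD(t) ~ t**alpha.

        positions: (T, d) array of coordinates sampled every dt.
        """
        T = len(positions)
        max_lag = max_lag or T // 4  # short lags are the most reliable
        lags = np.arange(1, max_lag)
        msd = np.array([
            np.mean(np.sum((positions[lag:] - positions[:-lag]) ** 2, axis=1))
            for lag in lags
        ])
        # alpha is the slope of log MSD against log t
        alpha, _ = np.polyfit(np.log(lags * dt), np.log(msd), 1)
        return alpha

    def classify(alpha, tol=0.1):
        """Crude regime labels; the tolerance is an assumption."""
        if alpha < 1 - tol:
            return "subdiffusive"
        if alpha > 1 + tol:
            return "superdiffusive"
        return "diffusive"

    # Sanity check: an unbiased random walk should give alpha close to 1.
    rng = np.random.default_rng(0)
    walk = np.cumsum(rng.normal(size=(1000, 2)), axis=0)
    a = msd_exponent(walk)
    print(a, classify(a))
    ```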

    Characterising population variability in brain structure through models of whole-brain structural connectivity

    Models of whole-brain connectivity are valuable for understanding neurological function. This thesis seeks to develop an optimal framework for extracting models of whole-brain connectivity from clinically acquired diffusion data. We propose new approaches for studying these models. The aim is to develop techniques which can take models of brain connectivity and use them to identify biomarkers or phenotypes of disease. The models of connectivity are extracted using a standard probabilistic tractography algorithm, modified to assess the structural integrity of tracts through estimates of white matter anisotropy. Connections are traced between 77 regions of interest, automatically extracted by label propagation from multiple brain atlases followed by classifier fusion. The estimates of tissue integrity for each tract are entered into 77×77 “connectivity” matrices, extracted for large populations of clinical data. These are compared in subsequent studies. To date, most whole-brain connectivity studies have characterised population differences using graph theory techniques. However, these can be limited in their ability to pinpoint the locations of differences in the underlying neural anatomy. Therefore, this thesis proposes new techniques. These include a spectral clustering approach for comparing population differences in the clustering properties of weighted brain networks. In addition, machine learning approaches are suggested for the first time. These are particularly advantageous as they allow classification of subjects and extraction of features which best represent the differences between groups. One limitation of the proposed approach is that errors propagate from segmentation and registration steps prior to tractography. This can culminate in the assignment of false positive connections, where the contribution of these factors may vary across populations, causing the appearance of population differences where there are none. The final contribution of this thesis is therefore to develop a common co-ordinate space approach. This combines probabilistic models of voxel-wise diffusion for each subject into a single probabilistic model of diffusion for the population. This allows tractography to be performed only once, ensuring that there is one model of connectivity. Cross-subject differences can then be identified by mapping individual subjects’ anisotropy data to this model. The approach is used to compare populations separated by age and gender.
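    As a rough illustration of the spectral clustering approach mentioned above, the sketch below partitions the nodes of a weighted connectivity matrix using the eigenvectors of its symmetric normalized graph Laplacian. The number of clusters, the k-means step and the toy input are assumptions for demonstration, not the thesis's implementation.

    ```python
    import numpy as np
    from scipy.cluster.vq import kmeans2

    def spectral_clusters(W, k):
        """Cluster the nodes of a weighted, symmetric connectivity
        matrix W (e.g. 77x77) via the normalized graph Laplacian.
        """
        d = W.sum(axis=1)
        d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))  # guard isolated nodes
        # Symmetric normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}
        L = np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
        # Eigenvectors of the k smallest eigenvalues carry the cluster structure
        vals, vecs = np.linalg.eigh(L)
        U = vecs[:, :k]
        U = U / np.linalg.norm(U, axis=1, keepdims=True)  # row-normalize
        _, labels = kmeans2(U, k, minit="++", seed=0)
        return labels

    # Toy usage on a random symmetric "connectivity" matrix:
    rng = np.random.default_rng(0)
    A = rng.random((77, 77))
    W = (A + A.T) / 2
    print(spectral_clusters(W, k=4))
    ```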

    Variational methods and its applications to computer vision

    Many computer vision applications such as image segmentation can be formulated in a “variational” way as energy minimization problems. Unfortunately, the computational task of minimizing these energies is usually difficult, as it generally involves non-convex functions in a space with thousands of dimensions, and often the associated combinatorial problems are NP-hard. Furthermore, they are ill-posed inverse problems and therefore extremely sensitive to perturbations (e.g. noise). For this reason, in order to compute a physically reliable approximation from given noisy data, it is necessary to incorporate into the mathematical model appropriate regularizations that require complex computations. The main aim of this work is to describe variational segmentation methods that are particularly effective for curvilinear structures. Due to their complex geometry, classical regularization techniques cannot be adopted because they lead to the loss of most low-contrast details. In contrast, the proposed method not only better preserves curvilinear structures, but also reconnects some parts that may have been disconnected by noise. Moreover, it can easily be extended to graphs and successfully applied to different types of data such as medical imagery (e.g. vessels, heart coronaries, etc.), material samples (e.g. concrete) and satellite signals (e.g. streets, rivers, etc.). In particular, we will show results and performance for an implementation targeting a new generation of High Performance Computing (HPC) architectures in which different types of coprocessors cooperate. The dataset involved consists of approximately 200 images of cracks, captured in three different tunnels by a robotic machine designed for the European ROBO-SPECT project.
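    For concreteness, a canonical example of such a variational energy is the total-variation (ROF-type) model below, with a data-fidelity term and a regularization term traded off by a parameter λ. This is standard background only; the curvilinear-structure functional proposed in the thesis is a different, more specialised energy.

    ```latex
    % Total-variation regularized energy (ROF model): a standard example of
    % a variational denoising/segmentation energy, not the thesis's functional.
    % f is the noisy image, u the reconstruction, lambda the trade-off weight.
    \begin{equation*}
    E(u) \;=\;
    \underbrace{\int_{\Omega} \bigl(u(x) - f(x)\bigr)^2 \, dx}_{\text{data fidelity}}
    \;+\;
    \lambda \underbrace{\int_{\Omega} \lvert \nabla u(x) \rvert \, dx}_{\text{regularization}}
    \end{equation*}
    ```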

    Computational Multiscale Methods

    Many physical processes in material sciences or geophysics are characterized by inherently complex interactions across a large range of non-separable scales in space and time. The resolution of all features on all scales in a computer simulation easily exceeds today's computing resources by multiple orders of magnitude. The observation and prediction of physical phenomena from multiscale models hence requires insightful numerical multiscale techniques to adaptively select relevant scales and effectively represent unresolved scales. This workshop enhanced the development of such methods and the mathematics behind them, so that the reliable and efficient numerical simulation of some challenging multiscale problems eventually becomes feasible in high performance computing environments.

    Diffusion MRI tractography for oncological neurosurgery planning: Clinical research prototype

    Mining for cosmological information: Simulation-based methods for Redshift Space Distortions and Galaxy Clustering

    The standard model of cosmology describes the complex large scale structure of the Universe with fewer than 10 free parameters. However, concordance with observations requires that about 95% of the energy content of the Universe is invisible to us. Most of this energy is postulated to be in the form of a cosmological constant, Λ, which drives the observed accelerated expansion of the Universe. Its nature is, however, unknown. This mystery forces cosmologists to look for inconsistencies between theory and data, searching for clues. But finding statistically significant contradictions requires extremely accurate measurements of the composition of the Universe, which are at present limited by our inability to extract all the information contained in the data, rather than by the data itself. In this Thesis, we study how we can overcome these limitations by i) modelling how galaxies cluster on small scales with simulation-based methods, where perturbation theory fails to provide accurate predictions, and ii) developing summary statistics of the density field that are capable of extracting more information than the commonly used two-point functions. In the first half, we show how the real to redshift space mapping can be modelled accurately by going beyond the Gaussian approximation for the pairwise velocity distribution. We then show that simulation-based models can accurately predict the full shape of galaxy clustering in real space, increasing the constraining power on some of the cosmological parameters by a factor of 2 compared to perturbation theory methods. In the second half, we measure the information content of density-dependent clustering. We show that it can improve the constraints on all cosmological parameters by factors between 3 and 8 over the two-point function. In particular, exploiting the environment dependence can constrain the mass of neutrinos a factor of 8 better than the two-point correlation function alone. We hope that the techniques described in this thesis will contribute to extracting all the cosmological information contained in ongoing and upcoming galaxy surveys, and provide insight into the nature of the accelerated expansion of the Universe.
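    The real-to-redshift-space mapping referred to above is commonly expressed through the streaming model, in which the redshift-space correlation function follows from the real-space one convolved with the pairwise line-of-sight velocity distribution. The equation below is the standard textbook form; the thesis's contribution concerns going beyond a Gaussian choice for the distribution 𝒫.

    ```latex
    % Streaming model for redshift-space distortions. xi_s is the
    % redshift-space correlation function, xi_r the real-space one, and
    % P(v_par | r) the pairwise line-of-sight velocity PDF, with
    % s_perp = r_perp and r = sqrt(r_perp^2 + r_par^2).
    \begin{equation*}
    1 + \xi_s(s_\perp, s_\parallel)
      = \int_{-\infty}^{\infty}
        \bigl[\, 1 + \xi_r(r) \,\bigr]\,
        \mathcal{P}\bigl(v_\parallel = s_\parallel - r_\parallel \,\big|\, \mathbf{r}\bigr)\,
        dr_\parallel
    \end{equation*}
    ```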

    Anisotropy Across Fields and Scales

    This open access book focuses on the processing, modeling, and visualization of anisotropy information, which are often addressed by employing sophisticated mathematical constructs such as tensors and other higher-order descriptors. It also discusses adaptations of such constructs to problems encountered in seemingly dissimilar areas of medical imaging, physical sciences, and engineering. Featuring original research contributions as well as insightful reviews for scientists interested in handling anisotropy information, it covers topics such as pertinent geometric and algebraic properties of tensors and tensor fields, challenges faced in processing and visualizing different types of data, statistical techniques for data processing, and specific applications like mapping white-matter fiber tracts in the brain. The book helps readers grasp the current challenges in the field and provides information on the techniques devised to address them. Further, it facilitates the transfer of knowledge between different disciplines in order to advance the research frontiers in these areas. This multidisciplinary book presents, in part, the outcomes of the seventh in a series of Dagstuhl seminars devoted to visualization and processing of tensor fields and higher-order descriptors, which was held in Dagstuhl, Germany, on October 28–November 2, 2018.

    Molecular Dynamics Simulation

    Condensed matter systems, ranging from simple fluids and solids to complex multicomponent materials and even biological matter, are governed by well-understood laws of physics, within the formal theoretical framework of quantum theory and statistical mechanics. On the relevant scales of length and time, the appropriate ‘first-principles’ description needs only the Schrödinger equation together with Gibbs averaging over the relevant statistical ensemble. However, this program cannot be carried out straightforwardly: dealing with electron correlations is still a challenge for the methods of quantum chemistry. Similarly, standard statistical mechanics makes precise explicit statements only on the properties of systems for which the many-body problem can be effectively reduced to one of independent particles or quasi-particles. [...]
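    A minimal sketch of what such a classical simulation looks like in practice: velocity-Verlet integration of Newton's equations under an empirical pair potential. The Lennard-Jones form, the reduced units, and the absence of a cutoff or periodic boundaries are simplifying assumptions made here purely for illustration.

    ```python
    import numpy as np

    def lj_forces(pos, eps=1.0, sigma=1.0):
        """Pairwise Lennard-Jones forces (reduced units, no cutoff/PBC)."""
        n = len(pos)
        f = np.zeros_like(pos)
        for i in range(n):
            for j in range(i + 1, n):
                r = pos[i] - pos[j]
                r2 = r @ r
                inv6 = (sigma**2 / r2) ** 3
                # F_ij = 24*eps*(2*(sigma/r)^12 - (sigma/r)^6) / r^2 * r_vec
                fij = 24 * eps * (2 * inv6**2 - inv6) / r2 * r
                f[i] += fij
                f[j] -= fij
        return f

    def velocity_verlet(pos, vel, dt=1e-3, steps=1000, mass=1.0):
        """Integrate Newton's equations with the velocity-Verlet scheme."""
        f = lj_forces(pos)
        for _ in range(steps):
            vel += 0.5 * dt * f / mass   # half-kick
            pos += dt * vel              # drift
            f = lj_forces(pos)
            vel += 0.5 * dt * f / mass   # half-kick
        return pos, vel

    # Tiny demo: a slightly perturbed cubic cluster of 8 atoms.
    rng = np.random.default_rng(0)
    pos = np.array([[x, y, z] for x in (0.0, 1.5)
                    for y in (0.0, 1.5) for z in (0.0, 1.5)])
    pos += 0.01 * rng.normal(size=pos.shape)
    vel = np.zeros_like(pos)
    pos, vel = velocity_verlet(pos, vel)
    ```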

    CHIASM, the human brain albinism and achiasma MRI dataset

    We describe a collection of T1-, diffusion- and functional T2*-weighted magnetic resonance imaging data from human individuals with albinism and achiasma. This repository can be used as a test-bed to develop and validate tractography methods like diffusion-signal modeling and fiber tracking, as well as to investigate the properties of the human visual system in individuals with congenital abnormalities. The MRI data is provided together with tools and files allowing for its preprocessing and analysis, along with data derivatives such as manually curated masks and regions of interest for performing tractography.
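    The simplest diffusion-signal model that such a test-bed can validate is the diffusion tensor, which relates the measured signal to the applied gradient direction. It is shown below purely for orientation; the dataset itself is agnostic to the choice of model.

    ```latex
    % Diffusion tensor model: the simplest diffusion-signal model this kind
    % of data can validate. S_0 is the non-diffusion-weighted signal, b the
    % diffusion weighting, g the unit gradient direction, D the 3x3 tensor.
    \begin{equation*}
    S(\mathbf{g}, b) = S_0 \, \exp\!\bigl( -b \, \mathbf{g}^{\mathsf{T}} \mathbf{D} \, \mathbf{g} \bigr)
    \end{equation*}
    ```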