101 research outputs found

    Information theoretic regularization in diffuse optical tomography

    Diffuse optical tomography (DOT) retrieves the spatially distributed optical characteristics of a medium from external measurements. Recovering these parameters of interest involves solving a non-linear and severely ill-posed inverse problem. In this thesis we propose methods for regularizing DOT via the introduction of spatially unregistered a priori information from high-resolution anatomical modalities, using the information-theoretic concepts of joint entropy (JE) and mutual information (MI). Such functionals evaluate the similarity between the reconstructed optical image and the prior image while bypassing the multi-modality barrier, which manifests as the incommensurate relation between the gray-value representations of corresponding anatomical features in the modalities involved. By introducing structural a priori information into the image reconstruction process, we aim to improve the spatial resolution and quantitative accuracy of the solution. A further condition for the accurate incorporation of a priori information is the establishment of correct alignment between the prior image and the probed anatomy in a common coordinate system. However, limited information about the probed anatomy is available before reconstruction. In this work we therefore explore the possibility of spatially registering the prior image simultaneously with solving the reconstruction problem. We provide a thorough explanation of the theory from an imaging perspective, accompanied by preliminary results obtained from numerical simulations as well as experimental data. In addition, we compare the performance of MI and JE. Finally, we propose a method for fast joint-entropy evaluation and optimization, which we later employ for the information-theoretic regularization of DOT. The main areas involved in this thesis are: inverse problems, image reconstruction & regularization, diffuse optical tomography, and medical image registration.
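The joint-histogram route to such similarity measures can be made concrete. The sketch below is not the thesis implementation; the bin count and input images are illustrative assumptions. It estimates the mutual information between a reconstructed image and a prior image from their joint intensity histogram:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Estimate MI between two equally shaped images from a joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()             # joint probability p(a, b)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(a), column vector
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(b), row vector
    nz = pxy > 0                          # avoid log(0) on empty bins
    # MI = KL divergence between p(a, b) and p(a) p(b); always >= 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

A perfectly aligned prior yields a concentrated joint histogram and a high MI, while a misregistered or unrelated prior spreads the histogram out and lowers it, which is what makes MI usable as a registration-aware regularization functional.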

    Image Restoration

    This book presents a sample of recent contributions from researchers around the world in the field of image restoration. It consists of 15 chapters organized into three main sections (Theory, Applications, Interdisciplinarity). The topics cover different aspects of the theory of image restoration, but the book is also an occasion to highlight new research topics arising from the emergence of original imaging devices. From these devices arise genuinely challenging image reconstruction/restoration problems that open the way to new fundamental scientific questions, closely related to the world we interact with.

    Hybrid Inflation: Multi-field Dynamics and Cosmological Constraints

    The dynamics of hybrid models is usually approximated by the evolution of a scalar field slowly rolling along a nearly flat valley. Inflation ends with a waterfall phase due to a tachyonic instability, and this final phase is usually assumed to be nearly instantaneous. In this thesis, we go beyond these approximations and analyze the exact two-field dynamics of hybrid models. Several effects are put in evidence: 1) possible slow-roll violations along the valley prevent inflation at small field values; provided super-Planckian fields, the scalar spectrum of the original model is red, in agreement with observations. 2) The initial field values are not fine-tuned along the valley but also occupy a considerable part of the field space exterior to it, forming a structure with fractal boundaries. Using Bayesian methods, their distribution in the whole parameter space is studied and natural bounds on the potential parameters are derived. 3) For the original model, inflation is found to continue for more than 60 e-folds along waterfall trajectories in part of the parameter space. The scalar power spectrum of adiabatic perturbations is modified and is generically red, possibly in agreement with CMB observations; topological defects are conveniently stretched outside the observable Universe. 4) The analysis of initial conditions is extended to the case of a closed Universe, in which the initial singularity is replaced by a classical bounce. In the third part of the thesis, we study how present CMB constraints on the cosmological parameters could be improved by observation of the 21cm cosmic background with future giant radio telescopes. Forecasts are determined for a characteristic Fast Fourier Transform Telescope, using both Fisher-matrix and MCMC methods. Comment: 218 pages, PhD thesis, June 201
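As a minimal illustration of slow-roll violations along the valley, the sketch below evaluates the first slow-roll parameter ε₁ = (M_p²/2)(V'/V)² by finite differences. The effective one-field valley potential V(φ) = Λ⁴(1 + φ²/μ²) and the parameter values are assumptions for illustration, not the thesis's exact setup; for this potential ε₁ peaks at φ = μ with maximum M_p²/(2μ²), so sub-Planckian μ violates slow roll there:

```python
import numpy as np

def slow_roll_eps(V, phi, mp=1.0, h=1e-6):
    """First slow-roll parameter eps1 = (mp^2 / 2) * (V'/V)^2,
    with V' computed by central finite differences."""
    dV = (V(phi + h) - V(phi - h)) / (2.0 * h)
    return 0.5 * mp**2 * (dV / V(phi)) ** 2

def valley_potential(mu):
    """Illustrative effective valley potential V = 1 + (phi/mu)^2
    (the overall Lambda^4 scale cancels in eps1)."""
    return lambda phi: 1.0 + (phi / mu) ** 2
```

For example, μ = 0.1 M_p gives ε₁(φ = μ) = 50 ≫ 1 (no slow-roll inflation at that point), while μ = 10 M_p gives ε₁ = 0.005 ≪ 1.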

    Computational methods to engineer process-structure-property relationships in organic electronics: The case of organic photovoltaics

    Ever since the Nobel-prize-winning work by Heeger and his colleagues, organic electronics have enjoyed increasing attention from researchers all over the world. While there is large potential for organic electronics in transistors, solar cells, diodes, flexible displays, RFIDs, smart textiles, smart tattoos, artificial skin, bio-electronics, medical devices, and many more areas, very few applications have reached the market. Organic photovoltaics in particular could tap the large untapped market for portable and affordable solar-conversion devices. While there are several reasons for their unavailability, a major one is the challenge of controlling device morphology at several scales simultaneously. The morphology is intricately related to the processing of the device and strongly influences performance. Added to this is the unending development of new polymeric materials in search of high power-conversion efficiencies. Fully understanding this intricate relationship between materials, processing conditions, and power conversion is highly resource- and time-intensive. The goal of this work is to provide tightly coupled computational routes alongside these expensive experiments, and to demonstrate process control using in-silico experiments. This goal is achieved in multiple stages of what is commonly called the process-structure-property loop in the materials science community. We leverage recent advances in high-performance computing (HPC) and high-throughput computing (HTC) towards this end. Two open-source software packages were developed: GRATE and PARyOpt. GRATE provides a means to reliably and repeatably quantify TEM images for identifying transport characteristics, solving the problem of manually quantifying a large number of large images with fine details. PARyOpt is a Gaussian-process-based optimization library that is especially useful for optimizing expensive phenomena. Both are highly modular and designed to be easily integrated with existing software. It is anticipated that the organic electronics community will use these tools to accelerate the discovery and development of new-age devices.
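PARyOpt's actual API is not reproduced here, but the core idea behind Gaussian-process-based optimization of an expensive function can be sketched in a few lines. The kernel, length scale, grid search, and lower-confidence-bound acquisition are illustrative choices, not PARyOpt's defaults:

```python
import numpy as np

def rbf(a, b, ell=0.3):
    """Squared-exponential kernel between two sets of 1-D points."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_minimize(f, lo, hi, n_init=4, n_iter=10, jitter=1e-6, seed=0):
    """Minimize an expensive 1-D function with a GP surrogate and a
    lower-confidence-bound (LCB) acquisition function."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n_init)        # initial design points
    y = np.array([f(v) for v in x])
    grid = np.linspace(lo, hi, 200)        # candidate evaluation points
    for _ in range(n_iter):
        K = rbf(x, x) + jitter * np.eye(len(x))
        Ks = rbf(grid, x)
        ym = y.mean()                      # center targets for a zero-mean GP
        mu = Ks @ np.linalg.solve(K, y - ym) + ym            # posterior mean
        var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
        lcb = mu - 2.0 * np.sqrt(np.clip(var, 0.0, None))    # explore/exploit
        x_next = grid[np.argmin(lcb)]      # most promising next experiment
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))
    best = int(np.argmin(y))
    return x[best], y[best]
```

Each loop iteration proposes the next "experiment" by balancing the surrogate's predicted value against its uncertainty, which is exactly why this class of methods suits expensive process-optimization runs where every evaluation costs hours of simulation or lab time.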

    4-D Tomographic Inference: Application to SPECT and MR-driven PET

    Emission tomographic imaging is framed in the Bayesian and information theoretic framework. The first part of the thesis is inspired by the new possibilities offered by PET-MR systems, formulating models and algorithms for 4-D tomography and for the integration of information from multiple imaging modalities. The second part of the thesis extends the models described in the first part, focusing on the imaging hardware. Three key aspects for the design of new imaging systems are investigated: criteria and efficient algorithms for the optimisation and real-time adaptation of the parameters of the imaging hardware; learning the characteristics of the imaging hardware; exploiting the rich information provided by depth-of-interaction (DOI) and energy-resolving devices. The document concludes with the description of the NiftyRec software toolkit, developed to enable 4-D multi-modal tomographic inference.
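A classic baseline underlying statistical emission-tomography reconstruction is maximum-likelihood expectation-maximization (MLEM). The sketch below is a generic dense-matrix MLEM update, not NiftyRec's implementation; the system matrix, counts, and iteration budget are illustrative assumptions:

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """MLEM reconstruction for emission tomography.
    A: (n_detectors, n_voxels) system matrix; y: measured counts."""
    x = np.ones(A.shape[1])            # flat initial activity estimate
    sens = A.sum(axis=0) + eps         # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x + eps             # forward projection of current estimate
        x *= (A.T @ (y / proj)) / sens # multiplicative EM update, keeps x >= 0
    return x
```

The multiplicative form preserves non-negativity of the activity image automatically, which is one reason EM-type updates remain the workhorse for Poisson-distributed emission data.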

    Quantum statistical inference and communication

    This thesis studies the limits on the performance of inference tasks with quantum data and quantum operations. Our results can be divided into two main parts. In the first part, we study how to infer relative properties of sets of quantum states, given a certain number of copies of the states. We investigate the performance of optimal inference strategies according to several figures of merit that quantify the precision of the inference. Since we are not interested in obtaining a complete reconstruction of the states, optimal strategies do not require performing quantum tomography. In particular, we address the following problems: - We evaluate the asymptotic error probabilities of optimal learning machines for quantum state discrimination. Here, a machine receives a number of copies of a pair of unknown states, which can be seen as training data, together with a test system initialized in one of the two states with equal probability. The goal is to implement a measurement that discriminates which state the test system is in, minimizing the error probability. We analyze the optimal strategies for a number of different settings, differing in the prior incomplete information on the states available to the agent. - We evaluate the limits on the precision of estimating the overlap between two unknown pure states, given N copies of one state and M copies of the other. We find an asymptotic expansion of a Fisher information associated with the estimation problem, which gives a lower bound on the mean square error of any estimator. We compute the minimum average mean square error for random pure states, and we evaluate the effect of depolarizing noise on qubit states. We compare the performance of the optimal estimation strategy with that of other intuitive strategies, such as the swap test and measurements based on estimating the states.
    - We evaluate how many samples from a collection of N d-dimensional states are necessary to decide, with high probability, whether the collection is made of identical states or the states differ by more than a threshold ε according to a motivated closeness measure. Access to copies of the states in the collection is given as follows: each time the agent asks for a copy, the agent receives one of the states with some fixed probability, together with a distinct label for each state in the collection. We prove that the problem can be solved with O(√N d/ε²) copies, and that this scaling is optimal up to a constant independent of d, N, and ε. In the second part, we study optimal classical and quantum communication rates for several physically motivated noise models. - The quantum and private capacities of most realistic channels cannot be evaluated from their regularized expressions. We design degradable extensions for several notable channels, obtaining upper bounds on the quantum and private capacities of the original channels. We obtain sufficient conditions for the degradability of flagged extensions of channels that are convex combinations of other channels. These sufficient conditions are easy to verify and simplify the construction of degradable extensions. - We consider the problem of transmitting classical information with continuous-variable systems under an energy constraint, when it is impossible to maintain a shared reference frame and in the presence of losses. In contrast with phase-insensitive noise models, we show that, in some regimes, squeezing improves the communication rates with respect to coherent-state sources and with respect to sources producing up to two-photon Fock states. We give upper and lower bounds on the optimal coherent-state rate and show that using part of the energy to repeatedly restore a phase reference is strictly suboptimal at high energies.
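The swap test mentioned above admits a compact classical simulation: the ancilla is measured in |0⟩ with probability P(0) = (1 + |⟨ψ|φ⟩|²)/2, so repeating the test yields an estimator of the overlap. The state vectors and shot count below are illustrative:

```python
import numpy as np

def swap_test_p0(psi, phi):
    """Probability of measuring the ancilla in |0> in a swap test:
    P(0) = (1 + |<psi|phi>|^2) / 2, for normalized state vectors."""
    overlap_sq = np.abs(np.vdot(psi, phi)) ** 2
    return 0.5 * (1.0 + overlap_sq)

def estimate_overlap(psi, phi, shots=10_000, seed=1):
    """Monte-Carlo estimate of |<psi|phi>|^2 from repeated swap tests."""
    rng = np.random.default_rng(seed)
    zeros = rng.binomial(shots, swap_test_p0(psi, phi))  # simulated outcomes
    return 2.0 * zeros / shots - 1.0                     # invert P(0) formula
```

The estimator's variance shrinks only as 1/shots, which is why the thesis's comparison against the optimal overlap-estimation strategy is meaningful: the swap test is intuitive but not generally optimal.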

    Molecular Imaging

    The present book gives an exceptional overview of molecular imaging. A practical approach runs as a common thread through the whole book, which at the same time covers detailed background information reaching deep into the molecular and cellular levels. Ideas about how molecular imaging will develop in the near future are a particular highlight. This should be of special interest, as the contributors are members of leading research groups from all over the world.

    Development of Statistical Models for Functional Near-infrared Spectroscopy Data Analysis Incorporating Anatomical and Probe Registration Prior Information

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive technology that uses low levels of non-ionizing light in the range of 650 -- 900 nm (red and near-infrared) to record changes in the optical absorption and scattering of tissue. In particular, oxy-hemoglobin (HbO) and deoxy-hemoglobin (HbR) have characteristic absorption spectra at these wavelengths, which are used to discriminate changes in blood flow and oxygen metabolism. Compared with functional magnetic resonance imaging (fMRI), fNIRS is less costly, more portable, and allows a wider range of experimental scenarios, because it neither requires a dedicated scanner nor needs the subject to lie supine. Current challenges in fNIRS data analysis include: (i) because of the existence of ``blind spots'', a small change in brain anatomy or optical probe positioning can create large differences in fNIRS measurements even though the underlying brain activity remains the same; (ii) fNIRS image reconstruction is a high-dimensional, under-determined, and ill-posed problem, in which there are thousands of parameters to estimate while only tens of measurements are available, and existing methods notably overestimate the false positive rate; (iii) brain anatomical information has rarely been used in current fNIRS data analyses. This dissertation proposes two new methods aiming to improve fNIRS data analysis and overcome these challenges: one is a channel-space method based on anatomically defined regions of interest (ROIs), and the other is an image reconstruction method incorporating anatomical and physiological prior information. The two methods are developed using advanced statistical models, including a combination of regularization models and Bayesian hierarchical modeling. Their performance is validated via numerical simulations and evaluated using receiver operating characteristic (ROC)-based tools. Statistical comparisons with conventional methods suggest significant improvements.
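The under-determined reconstruction problem in (ii) is commonly stabilized with a prior; the simplest instance is a Tikhonov (ridge) solve. This is only a sketch of the regularization idea, not the dissertation's Bayesian hierarchical model; the sensitivity matrix and regularization weight are illustrative assumptions:

```python
import numpy as np

def ridge_reconstruct(J, y, lam=0.1):
    """Tikhonov-regularized solution of the under-determined system y = J x:
    x = J^T (J J^T + lam I)^{-1} y, the minimum-norm ridge estimate.
    Solving in measurement space keeps the linear system only m x m."""
    m = J.shape[0]
    return J.T @ np.linalg.solve(J @ J.T + lam * np.eye(m), y)
```

With tens of measurements and thousands of voxels, the prior term (here a plain minimum-norm penalty; anatomical priors would reweight it) is what makes the solution unique and stable against noise.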