319 research outputs found

    Advanced tomographic image reconstruction algorithms for Diffuse Optical Imaging

    Diffuse Optical Imaging is a relatively new family of imaging modalities that use infrared and near-infrared light to characterize the optical properties of biological tissue. The technology is less expensive than other imaging modalities such as X-ray mammography, it is portable, and it can be used to monitor brain activation and to diagnose cancer, besides aiding other imaging modalities and therapy treatments, e.g. X-ray, Magnetic Resonance Imaging and Radio Frequency Ablation, in the characterization of diseased tissue. Because of the optical properties of biological tissue, near-infrared light is highly scattered; as a consequence, only a limited amount of light propagates through the medium, which makes the image reconstruction process very challenging. Typically, diffuse optical image reconstructions require from several minutes to hours to produce an accurate image from the interaction of the photons with the chromophores of the studied medium. To this day, this limitation is still under investigation, and several approaches are close to real-time image reconstruction. Diffuse Optical Imaging includes a variety of techniques such as functional Near-Infrared Spectroscopy (fNIRS), Diffuse Optical Tomography (DOT), Fluorescence Diffuse Optical Tomography (FDOT) and Spatial Frequency Domain Imaging (SFDI). These emerging image reconstruction modalities aim to become routine in clinical applications. Each technique has its own advantages and limitations, but they have been used successfully in clinical trials such as brain activation analysis and breast cancer diagnosis, mapping the vascular response within the tissue through models that relate the tissue to the paths followed by the photons. One way to perform the image reconstruction is to separate it into two stages: the forward problem and the inverse problem; the former describes light propagation inside a medium, and the latter reconstructs the spatio-temporal distribution of the photons through the tissue. Iterative methods are used to solve both problems, but the intrinsic complexity of photon transport in biological tissue makes the problem time-consuming and computationally expensive. The aim of this research is to apply a fast forward solver based on reduced-order models to Fluorescence Diffuse Optical Tomography and Spatial Frequency Domain Imaging, contributing to the use of these modalities in clinical trials. Previous work showed the capability of reduced-order models for real-time reconstruction of the absorption parameters in the mouse brain. Results demonstrated insignificant loss of quantitative and qualitative accuracy, and the reconstruction was performed in a fraction of the time normally required for this kind of study. The forward models proposed in this work offer the capability to run three-dimensional image reconstructions on CPU-based computational systems in a fraction of the time required by image reconstruction methods that use meshes generated with the Finite Element Method. In the case of SFDI, the proposed approach is fused with the virtual-sensor approach for CCD cameras to reduce the computational burden and to generate a three-dimensional map of the distribution of tissue optical properties. In this work, the FDOT use case focused on the thorax of a mouse model with tumors in the lungs as the medium under investigation.
The mouse model was studied under two- and three-dimensional conditions. The two-dimensional case is presented to explain the process of creating the reduced-order models. In this case there is no significant improvement in the reconstruction when NIRFAST is taken as the reference: the proposed approach reduced the reconstruction time to a quarter of that required by NIRFAST, but the latter already performs it in a couple of seconds. In contrast, the three-dimensional case exploited the capabilities of the reduced-order models by reducing the reconstruction time from a couple of hours to several seconds, thus allowing a near real-time reconstruction of the fluorescent properties of the interrogated medium. In the case of Spatial Frequency Domain Imaging, the use case considered a three-dimensional section of a human head analysed with a CCD camera and a spatially modulated light source that illuminates the head section. Using the principle of the virtual sensor, different regions of the CCD camera are clustered, and reduced-order models are then generated to perform the image reconstruction of the absorption distribution in a fraction of the time required by the algorithm implemented in NIRFAST. The ultimate goal of this research is to contribute to the field of Diffuse Optical Imaging and to propose an alternative to the models already used in three-dimensional reconstructions of Fluorescence Diffuse Optical Tomography and Spatial Frequency Domain Imaging, thus offering the possibility of continuously monitoring tissue and obtaining results in a matter of seconds.
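    As a rough illustration of the reduced-order-model idea summarised above (and not the code developed in the thesis), the sketch below projects a toy diffusion-like forward model onto a small basis built from snapshot solutions, so that new forward solves for a different absorption-like parameter become cheap; the operator, parameter range and basis size are all assumptions made for the example.

```python
# Hedged sketch of the offline/online reduced-order-model (ROM) split; the toy
# 1-D operator, parameter range and basis size are illustrative, not thesis code.
import numpy as np

n = 2000                                        # full-order unknowns (e.g. FEM nodes)

# Toy forward model: (K + mu * M) x = q, with mu an absorption-like parameter
main, off = 2.0 * np.ones(n), -1.0 * np.ones(n - 1)
K = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)    # diffusion-like stiffness
M = np.eye(n)                                             # absorption-like mass term
q = np.exp(-0.5 * ((np.arange(n) - n / 2) / 50.0) ** 2)   # smooth source

def full_solve(mu):
    return np.linalg.solve(K + mu * M, q)

# Offline stage: expensive full-order snapshots over training parameter values
train_mu = np.linspace(0.01, 1.0, 15)
snapshots = np.column_stack([full_solve(mu) for mu in train_mu])
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :5]                                   # 5-mode reduced basis (POD)

# Online stage: cheap 5x5 solves for any new parameter value
K_r, M_r, q_r = V.T @ K @ V, V.T @ M @ V, V.T @ q
def rom_solve(mu):
    return V @ np.linalg.solve(K_r + mu * M_r, q_r)

mu_test = 0.37
err = (np.linalg.norm(rom_solve(mu_test) - full_solve(mu_test))
       / np.linalg.norm(full_solve(mu_test)))
print(f"relative ROM error at mu = {mu_test}: {err:.2e}")
```

    The expensive snapshot solves happen once, offline; every subsequent forward evaluation is a small reduced solve, which is the kind of saving that moves three-dimensional FDOT and SFDI reconstructions from hours towards seconds.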

    The use of primitives in the calculation of radiative view factors

    Compilations of radiative view factors (often in closed analytical form) are readily available in the open literature for commonly encountered geometries. For more complex three-dimensional (3D) scenarios, however, the effort required to solve the requisite multi-dimensional integrations needed to estimate a required view factor can be daunting, to say the least. In such cases, a combination of finite element methods (where the geometry in question is sub-divided into a large number of uniform, often triangular, elements) and Monte Carlo Ray Tracing (MC-RT) has been developed, although frequently the software implementation is suitable only for a limited set of geometrical scenarios. Driven initially by a need to calculate the radiative heat transfer occurring within an operational fibre-drawing furnace, this research set out to examine options whereby MC-RT could be used to cost-effectively calculate any generic 3D radiative view factor using current vectorisation technologies.
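    To make the MC-RT idea concrete, the following hedged sketch estimates the view factor between two directly opposed, parallel unit squares by firing cosine-distributed rays from one surface and counting the fraction that reach the other, then checks the result against the standard catalogue formula for that configuration; the geometry and ray count are illustrative, not taken from the research described above.

```python
# Minimal MC-RT view-factor sketch (illustrative only): two directly opposed,
# parallel unit squares separated by a distance h.
import numpy as np

def mc_view_factor(h=1.0, n_rays=2_000_000, seed=1):
    rng = np.random.default_rng(seed)
    # Ray origins: uniform over the emitting square on the plane z = 0
    ox, oy = rng.random(n_rays), rng.random(n_rays)
    # Cosine-weighted directions about the surface normal (+z)
    u1, u2 = rng.random(n_rays), rng.random(n_rays)
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2
    dx, dy, dz = r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)
    # Intersection with the receiving plane z = h
    t = h / dz
    hx, hy = ox + t * dx, oy + t * dy
    hits = (hx >= 0) & (hx <= 1) & (hy >= 0) & (hy <= 1)
    return hits.mean()                 # fraction of rays reaching the receiver = F_1->2

def analytic_parallel_squares(h=1.0):
    # Catalogue formula for identical, directly opposed a x b rectangles separated
    # by c, here with a = b = 1 and c = h (so X = Y = 1/h).
    X = Y = 1.0 / h
    A, B = np.sqrt(1 + X * X), np.sqrt(1 + Y * Y)
    term = (np.log(np.sqrt((1 + X * X) * (1 + Y * Y) / (1 + X * X + Y * Y)))
            + X * B * np.arctan(X / B) + Y * A * np.arctan(Y / A)
            - X * np.arctan(X) - Y * np.arctan(Y))
    return 2.0 / (np.pi * X * Y) * term

print("MC estimate :", mc_view_factor())            # ~0.20 for h = 1
print("analytic    :", analytic_parallel_squares())
```

    With two million rays the estimate typically agrees with the analytic value to about three decimal places; vectorising the whole ray batch, as above, is what makes such sample counts affordable.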

    High-performance time-series quantitative retrieval from satellite images on a GPU cluster

    The quality and accuracy of remote sensing instruments continue to increase, allowing geoscientists to perform various quantitative retrieval applications to observe the geophysical variables of land, atmosphere, ocean, etc. The explosive growth of time-series remote sensing (RS) data over large scales poses great challenges for managing, processing, and interpreting RS "Big Data". To explore these time-series RS data efficiently, in this paper we design and implement a high-performance framework that addresses the time-consuming time-series quantitative retrieval problem on a graphics processing unit cluster, taking aerosol optical depth (AOD) retrieval from satellite images as a study case. The presented framework exploits multilevel parallelism for time-series quantitative RS retrieval to improve efficiency. At the coarse-grained level of parallelism, the AOD time-series retrieval is represented as multiple directed acyclic graph workflows and scheduled with a list-based heuristic algorithm, heterogeneous earliest finish time (HEFT), taking idle slots and the priorities of retrieval jobs into account. At the fine-grained level, the parallel strategies for the major remote sensing image processing algorithms are summarized in three categories: point or pixel-based operations, local operations, and global or irregular operations. The parallel framework was implemented with the message passing interface and the compute unified device architecture, and experimental results for the AOD retrieval case verify the effectiveness of the presented framework.
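    As a hedged illustration of the coarse-grained scheduling step mentioned above, the sketch below applies a simplified HEFT-style list schedule (upward ranks, then earliest-finish-time processor selection, without the insertion policy) to a made-up five-task retrieval workflow on two heterogeneous processors; the DAG, costs and uniform communication delay are assumptions for the example, not values from the paper.

```python
# Simplified HEFT-style list scheduling sketch (illustrative DAG and costs).
from collections import defaultdict
from functools import lru_cache

# Toy retrieval-style workflow: task -> successors
succ = {"read": ["calibrate"], "calibrate": ["cloud_mask", "aod"],
        "cloud_mask": ["aod"], "aod": ["write"], "write": []}
pred = defaultdict(list)
for t, ss in succ.items():
    for s in ss:
        pred[s].append(t)

# comp[task]: computation cost on each of two heterogeneous processors
comp = {"read": [4, 6], "calibrate": [8, 5], "cloud_mask": [6, 4],
        "aod": [12, 7], "write": [3, 3]}
comm = 2.0                                   # cost of crossing a processor boundary

@lru_cache(maxsize=None)
def upward_rank(task):
    # Mean computation cost plus the most expensive downstream path
    w = sum(comp[task]) / len(comp[task])
    return w + max((comm + upward_rank(s) for s in succ[task]), default=0.0)

order = sorted(comp, key=upward_rank, reverse=True)   # HEFT priority order

proc_ready = [0.0, 0.0]                      # time at which each processor is free
finish, placed = {}, {}
for task in order:
    best = None
    for p in range(2):
        # Earliest start: processor free AND every predecessor done (+ comm if remote)
        est = proc_ready[p]
        for q in pred[task]:
            est = max(est, finish[q] + (0.0 if placed[q] == p else comm))
        eft = est + comp[task][p]
        if best is None or eft < best[0]:
            best = (eft, p)
    finish[task], placed[task] = best
    proc_ready[best[1]] = best[0]

for task in order:
    print(f"{task:10s} -> processor {placed[task]}, finish time {finish[task]:.1f}")
```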

    Simulated Annealing

    The book contains 15 chapters presenting recent contributions from top researchers working with Simulated Annealing (SA). Although it represents only a small sample of the research activity on SA, the book will certainly serve as a valuable tool for researchers interested in getting involved in this multidisciplinary field. In fact, one of its salient features is that the book is highly multidisciplinary in terms of application areas, since it assembles experts from the fields of Biology, Telecommunications, Geology, Electronics and Medicine.
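    For readers new to the method, a minimal, generic simulated annealing loop looks like the sketch below; the cooling schedule, neighbourhood move and Rastrigin test function are illustrative choices, not taken from any chapter of the book.

```python
# Compact, generic simulated annealing loop (illustrative only).
import math, random

def simulated_annealing(cost, neighbour, x0, t0=1.0, alpha=0.995, steps=20_000, seed=0):
    rng = random.Random(seed)
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = cost(y)
        # Accept downhill moves always; uphill moves with Boltzmann probability
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha                     # geometric cooling schedule
    return best, fbest

# Toy usage: minimise the 2-D Rastrigin function, which has many local minima
def rastrigin(v):
    return 20 + sum(z * z - 10 * math.cos(2 * math.pi * z) for z in v)

def step(v, rng):
    return [z + rng.gauss(0, 0.3) for z in v]

print(simulated_annealing(rastrigin, step, [4.0, -3.0]))
```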

    Frequency domain high density diffuse optical tomography for functional brain imaging

    Measurements of dynamic near-infrared (NIR) light attenuation across the human head, together with model-based image reconstruction algorithms, allow the recovery of three-dimensional spatial brain activation maps. Previous studies using high-density diffuse optical tomography (HD-DOT) systems have reported improved image quality over sparse arrays. Modulated NIR light, known as Frequency Domain (FD) NIR, enables measurements of phase shift along with amplitude attenuation. It is hypothesised that using these two complementary data sets (phase and amplitude) for brain activity detection will improve reconstructed image quality within HD-DOT. However, parameter recovery in DOT is computationally expensive, especially when FD-HD measurements are required over a large and complex volume, as in the case of functional brain imaging. Therefore, computational tools for modelling light propagation, known as the forward model, and for parameter recovery, known as the inverse problem, have been developed in order to enable FD-HD-DOT. The forward model, within a diffusion-approximation-based finite-element modelling framework, is accelerated by parallelization, achieving a 10-fold speed increase when GPU architectures are available while maintaining high accuracy. For a very high-resolution finite-element model of the adult human head with ∼600,000 nodes, light propagation can be calculated in ∼0.25 s per excitation source. Additionally, a framework for the sparse formulation of the inverse model, incorporating parallel computing, is proposed, achieving a 10-fold speed increase and a 100-fold improvement in memory efficiency whilst maintaining reconstruction quality. Finally, to evaluate image reconstruction with and without the additional phase information, point spread functions have been simulated across a whole-scalp field of view in 24 subject-specific anatomical models using an experimentally derived noise model. The addition of phase information has been shown to improve image quality, reducing localization error by up to 59% and effective resolution by up to 21%, and increasing depth penetration by up to 5 mm, compared to using the intensity attenuation measurements alone. In addition, experimental data collected during a retinotopic experiment reveal that the phase data contain unique information about brain activity and enable images to be resolved for deeper brain regions.
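    As a hedged sketch of the kind of linearised inverse step referred to above (not the actual FD-HD-DOT pipeline), the snippet below stacks log-amplitude and phase sensitivities into one Jacobian and solves an underdetermined, Tikhonov-regularised system for an absorption update; the Jacobian, data and regularisation parameter are random or assumed placeholders.

```python
# Illustrative single linearised FD-DOT update step; J and the residuals are
# random stand-ins for a real Jacobian and measurement set.
import numpy as np

rng = np.random.default_rng(3)
n_meas, n_nodes = 400, 5000             # measurement rows (amplitude + phase), FEM nodes

J_amp = rng.standard_normal((n_meas // 2, n_nodes))    # d(log amplitude) / d(mu_a)
J_phi = rng.standard_normal((n_meas // 2, n_nodes))    # d(phase) / d(mu_a)
J = np.vstack([J_amp, J_phi])           # both data types drive one update

delta_y = rng.standard_normal(n_meas)   # residual: measured minus modelled data

lam = 0.1                               # regularisation parameter (assumed)
JJt = J @ J.T
delta_mu = J.T @ np.linalg.solve(JJt + lam * np.max(np.diag(JJt)) * np.eye(n_meas),
                                 delta_y)
print(delta_mu.shape)                   # one absorption update per FEM node
```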

    Modeling Atmospheric Lines By the Exoplanet Community (MALBEC) version 1.0: A CUISINES radiative transfer intercomparison project

    Radiative transfer (RT) models are critical for interpreting exoplanetary spectra, for simulating exoplanet climates, and for designing the specifications of future flagship observatories. However, most models differ in methodologies and input data, which can lead to significantly different spectra. In this paper, we present the experimental protocol of the MALBEC (Modeling Atmospheric Lines By the Exoplanet Community) project. MALBEC is an exoplanet model intercomparison project (exoMIP) that belongs to the CUISINES (Climates Using Interactive Suites of Intercomparisons Nested for Exoplanet Studies) framework, which aims to provide the exoplanet community with a large and diverse set of model comparisons and validations. The proposed protocol covers a large set of initial participating RT models, a broad range of atmospheres (from Hot Jupiters to temperate terrestrials) and several observation geometries, allowing us to quantify and compare the differences between the RT models used by the exoplanetary community. Two types of tests are proposed: transit spectroscopy and direct imaging modeling, with results from the proposed tests to be published in dedicated follow-up papers. To encourage the community to join this comparison effort, and as an example, we present simulation results for one specific transit case (GJ-1214 b), in which we find notable differences in how the various codes handle the discretization of the atmospheres (e.g., sub-layering), the treatment of molecular opacities (e.g., correlated-k, line-by-line) and the default spectroscopic repositories generally used by each model (e.g., HITRAN, HITEMP, ExoMol).
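    As a toy example of the transit-spectroscopy modelling being intercompared (not the MALBEC protocol itself), the sketch below builds a transmission spectrum for an isothermal atmosphere discretised into annuli, with a single made-up absorption band on top of a grey continuum; all planetary and opacity parameters are assumptions chosen only to give plausible magnitudes.

```python
# Hedged toy transmission-spectrum model: isothermal atmosphere, one Gaussian
# absorption band plus a grey continuum, annulus-by-annulus slant optical depths.
import numpy as np

R_p, R_s, H = 1.7e7, 1.4e8, 2.0e5      # planet radius, stellar radius, scale height [m]
n0 = 1e25                              # number density at the reference radius [m^-3]

wav = np.linspace(1.0, 2.0, 200)       # wavelength grid [micron]
sigma = 1e-29 + 4e-29 * np.exp(-0.5 * ((wav - 1.4) / 0.05) ** 2)   # cross-section [m^2]

b = R_p + np.linspace(0.0, 10.0 * H, 400)            # tangent heights of the annuli
n_b = n0 * np.exp(-(b - R_p) / H)
slant_column = n_b * np.sqrt(2.0 * np.pi * b * H)    # isothermal chord approximation

tau = sigma[:, None] * slant_column[None, :]         # shape (wavelength, annulus)
db = b[1] - b[0]
extra_area = 2.0 * np.pi * np.sum((1.0 - np.exp(-tau)) * b[None, :], axis=1) * db
depth = (np.pi * R_p ** 2 + extra_area) / (np.pi * R_s ** 2)

for w, d in zip(wav[::50], depth[::50]):
    print(f"{w:.2f} um  transit depth = {d * 1e6:.0f} ppm")
```

    Choices such as the number and spacing of the annuli play the role of the sub-layering decisions that the abstract identifies as one source of disagreement between codes.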

    Report from the Tri-Agency Cosmological Simulation Task Force

    The Tri-Agency Cosmological Simulations (TACS) Task Force was formed when Program Managers from the Department of Energy (DOE), the National Aeronautics and Space Administration (NASA), and the National Science Foundation (NSF) expressed an interest in receiving input into the cosmological simulations landscape related to the upcoming DOE/NSF Vera Rubin Observatory (Rubin), NASA/ESA's Euclid, and NASA's Wide Field Infrared Survey Telescope (WFIRST). The Co-Chairs of TACS, Katrin Heitmann and Alina Kiessling, invited community scientists from the USA and Europe who are each subject matter experts and are also members of one or more of the surveys to contribute. The following report represents the input from TACS that was delivered to the Agencies in December 2018.

    Radiation techniques for urban thermal simulation with the Finite Element Method

    Modern societies are increasingly organized in cities. At present, more than half of the world's population lives in urban settlements. In this context, work at the architectural and building scale needs to extend its scope to the urban environment. One of the main challenges of our times is understanding all the thermal exchanges that happen in the city. The radiative part is the least developed; its characterization and its interaction with built structures have gained attention in building physics, architecture and environmental engineering. Providing a link between these areas, the emerging field of urban physics has become important for tackling studies of this nature. Urban thermal studies are intrinsically linked to multidisciplinary work. Performing full-scale measurements is hard, and prototype models are difficult to develop. Therefore, computational simulations are essential in order to understand how the city behaves and to evaluate projected modifications. The methodological and algorithmic improvement of simulation is one of the main lines of work for computational physics and many areas of computer science. The field of computer graphics has addressed the adaptation of rendering algorithms to daylighting, using physically based radiation models on architectural scenes. The Finite Element Method (FEM) has been widely used for thermal analysis. The maturity achieved by FEM software allows for treating very large models with high geometrical detail and complexity. However, computing radiation exchanges in this context is a hard computational challenge that pushes the limits of existing physical models. Computer graphics techniques can be adapted to FEM to estimate solar loads. In the thermal radiation range, the memory required to store the interaction between the elements grows because all the urban surfaces become radiation sources. In this thesis, a FEM-based methodology for urban thermal analysis is presented. A set of radiation techniques (both for solar and thermal radiation) is developed and integrated into the FEM software Cast3m. Radiosity and ray tracing are used as the main algorithms for radiation computations. Several studies are performed for different city scenes. The FEM simulation results are compared with temperature measurements obtained by means of urban thermography. Post-processing techniques are used to obtain rendered thermograms, showing that the proposed methodology produces accurate results for the cases analyzed. Moreover, its good computational performance allows this kind of study to be performed on regular desktop PCs.
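    As a minimal illustration of the radiosity computation mentioned above (not the Cast3m integration developed in the thesis), the sketch below solves the standard grey-diffuse radiosity system for a three-patch toy enclosure with an assumed, reciprocity-consistent view-factor matrix; a real urban scene would involve thousands of FEM faces and view factors obtained by ray tracing.

```python
# Grey-diffuse radiosity sketch for a tiny toy enclosure (illustrative only).
import numpy as np

SIGMA = 5.670e-8                              # Stefan-Boltzmann constant [W m^-2 K^-4]

F = np.array([[0.0, 0.6, 0.4],                # view factors (rows sum to 1 in an enclosure)
              [0.6, 0.0, 0.4],
              [0.2, 0.2, 0.6]])
eps = np.array([0.9, 0.9, 0.7])               # surface emissivities (assumed)
T = np.array([300.0, 290.0, 310.0])           # surface temperatures [K] (assumed)

E = eps * SIGMA * T ** 4                      # emitted power per unit area
A = np.eye(3) - (1.0 - eps)[:, None] * F      # radiosity system: (I - (1-eps) F) J = E
J = np.linalg.solve(A, E)                     # radiosities [W m^-2]

q_net = J - F @ J                             # net radiative flux leaving each patch
print("radiosity :", J)
print("net flux  :", q_net)
```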