
    Efficient sampling strategies for x-ray micro computed tomography with an intensity-modulated beam

    The term "cycloidal CT" refers to a family of efficient sampling strategies that can be applied to x-ray micro-computed tomography (CT) systems which operate with an intensity-modulated beam. Such a beam can be employed to provide access to a phase contrast channel and high spatial resolutions (a few um). Phase contrast can offer better image contrast of samples which have traditionally been "invisible” to x-rays due to their weak attenuation, and high resolutions help view crucial details in samples. Cycloidal sampling strategies provide images more quickly than the gold standard in the field ("dithering”). I conceived and compared four practical implementation strategies for cycloidal CT, three of which are "flyscans” (the sample moves continuously). Flyscans acquire images of similar resolution to dithering with no overheads, reducing acquisition time to exposure time. I also developed a "knife-edge” position tracking method which tracks subpixel motions of the sample stage. This information can be used to facilitate, automate, and improve the reconstruction of cycloidal data. I analysed the effects of different levels of dose on the signal-to-noise ratio (SNR) of an image acquired with cycloidal CT. The results show that cycloidal images yield the same SNR as dithered images with less dose, although a more extensive study is required. Finally, I explored the potential of using cycloidal CT for intraoperative specimen imaging and tissue engineering. My results are encouraging for tissue engineering; for intraoperative imaging, the cycloidal images did not show comparable resolution to the dithered images, although that is possibly linked to issues with the dataset. Overall, my work has provided a benchmark for the implementation and application of cycloidal CT for the first time. Besides a summary of my research, this thesis is meant to be a comprehensive guide for facilitating uptake of cycloidal CT within the scientific community and beyond

    New reconstruction strategies for polyenergetic X-ray computer tomography

    International Mention in the doctoral degree. X-ray computed tomography (CT) provides a 3D representation of the attenuation coefficients of patient tissues, which are roughly decreasing functions of energy in the range of energies used in clinical and preclinical scenarios (from 30 keV to 150 keV). Commercial scanners use polychromatic sources, producing a beam with a range of photon energies, because no X-ray lasers exist as a usable alternative. Due to the energy dependence of the attenuation coefficients, low-energy photons are preferentially absorbed, causing a shift of the mean energy of the X-ray beam to higher values; this effect is known as beam hardening. Classical reconstruction methods assume a monochromatic source and do not take into account the polychromatic nature of the spectrum, producing two artifacts in the reconstructed image: 1) cupping in large homogeneous areas and 2) dark bands between dense objects such as bone. These artifacts hinder a correct visualization of the image and the recovery of the true attenuation coefficient values. A fast correction of the beam-hardening artifacts can be performed with so-called post-processing methods, which use the information of a segmentation obtained from a preliminary reconstruction. Nevertheless, this segmentation may fail in low-dose scenarios, leading to an increase of the artifacts. An alternative for these scenarios is the use of iterative methods that incorporate a beam-hardening model, at the cost of higher computational time compared to post-processing methods. All previously proposed methods require either knowledge of the X-ray spectrum, which is not always available, or the heuristic selection of some parameters, which have been shown not to be optimal for the correction of different slices in heterogeneous studies. This thesis is framed within a research line of the Biomedical Imaging and Instrumentation Group (BiiG) of the Bioengineering and Aerospace Department of Universidad Carlos III de Madrid, focused on improving radiology systems. This research line is carried out in collaboration with the Unidad de Medicina y Cirugía Experimental of Hospital Gregorio Marañón through the Instituto de Investigación Sanitaria Gregorio Marañón, the Electrical Engineering and Computer Science (EECS) department of the University of Michigan, and SEDECAL, a Spanish company ranked among the world's top ten in medical imaging, which exports medical devices to 130 countries. As part of this research line, a high-resolution micro-CT scanner was developed for small-animal samples; it operates at low voltages, leading to strong beam-hardening artifacts. This scanner allows preclinical studies to be carried out, which can be divided into cross-sectional and longitudinal studies. Since cross-sectional studies consist of one acquisition at a specific point in time, radiation dose is not an issue, allowing for the use of standard-dose protocols with good image quality. In contrast, longitudinal studies consist of several acquisitions over time, so it is advisable to use low-dose protocols, despite the reduction in signal-to-noise ratio and the risk of artifacts in the image. This thesis presents a bundle of reconstruction strategies to cope with the beam-hardening effect in different dose scenarios, overcoming the problems of methods previously proposed in the literature. Since image quality is not a limiting factor in standard-dose scenarios, the speed of the strategies becomes a priority, which favours post-processing strategies.
The main advantage of the proposed post-processing strategy is the inclusion of empirical models of the beam-hardening effect, either through a simple calibration phantom or through the information provided by the sample, which eliminates the need for knowledge of the spectrum or for tuning parameters. The evaluation against previously proposed correction methods with real and simulated data showed good artifact compensation in the standard-dose scenario (cross-sectional studies), while, as expected, it was not optimal in the low-dose scenario. For longitudinal studies, where reducing the dose delivered to the sample is advisable, this thesis presents an iterative method that incorporates the aforementioned experimental beam-hardening models. The evaluation with real and simulated data under different dose scenarios showed excellent results, but with the known drawback of high computational time. Finally, a deep-learning approach was explored with the aim of finding a joint solution that would require low computational time and, at the same time, compensate the beam-hardening artifacts regardless of the dose scenario. The chosen architecture is U-Net++, an encoder-decoder network, trained with the mean-squared error as the cost function. Results on real data showed good compensation of the beam-hardening and low-dose artifacts with a considerable reduction in time, raising interest in further exploring this path in the future. The incorporation of these reconstruction strategies in real scanners is straightforward, requiring only a small modification of the calibration step already implemented in commercial scanners. The methods are being transferred to the company SEDECAL for implementation in the new generation of micro-CT scanners for preclinical research and in a multipurpose C-arm for veterinary applications. Doctoral Programme in Multimedia and Communications, Universidad Carlos III de Madrid and Universidad Rey Juan Carlos. Examination committee: President: Jorge Ripoll Lorenzo; Secretary: José Vicente Manjón Herrera; Member: Adam M. Alessi
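    For context, the mismatch that causes these artifacts can be stated compactly: a detector pixel measures a spectrally weighted sum of exponentials, whereas classical reconstruction assumes a single effective energy, so the log-transformed data are no longer linear in the traversed thickness. In generic notation (not the thesis's own):

```latex
% Polychromatic measurement for source spectrum S(E) and attenuation mu(E,x)
% along ray L, and the corresponding log-transformed projection p:
I = \int S(E)\,\exp\!\left(-\int_{L}\mu(E,x)\,\mathrm{d}x\right)\mathrm{d}E,
\qquad
p = -\ln\frac{I}{I_0}.
% Monochromatic assumption made by FBP-type reconstruction:
p \;\approx\; \int_{L}\mu(E_{\mathrm{eff}},x)\,\mathrm{d}x
% Because low-energy photons are preferentially absorbed, p is in reality a
% concave, nonlinear function of traversed thickness, which is what produces
% cupping and dark bands when the linear model above is assumed.
```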

    Deep Learning in Computer-Assisted Maxillofacial Surgery


    Artefact Reduction Methods for Iterative Reconstruction in Full-fan Cone Beam CT Radiotherapy Applications

    A cone-beam CT (CBCT) system acquires two-dimensional projection images of an imaging object from multiple angles in a single rotation and reconstructs the object geometry in three dimensions for volumetric visualization. It is mounted on most modern linear accelerators and is routinely used in radiotherapy to verify patient positioning, monitor patient contour changes throughout the course of treatment, and enable adaptive radiotherapy planning. Iterative image reconstruction algorithms use mathematical methods to iteratively solve the reconstruction problem. Iterative algorithms have demonstrated improvement in image quality and/or reduction in imaging dose over traditional filtered back-projection (FBP) methods. However, despite advances in computer technology and the growing availability of open-source iterative algorithms, clinical implementation of iterative CBCT has been limited. This thesis does not report the development of code for new iterative image reconstruction algorithms. It focuses on bridging the gap between the algorithm and its implementation by addressing artefacts that result from imperfections in the raw projections and in the imaging geometry. Such artefacts can severely degrade image quality and cannot be removed by iterative algorithms alone. Practical solutions to these artefacts will be presented; this in turn will better enable clinical implementation of iterative CBCT reconstruction.
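    For readers unfamiliar with the class of algorithms referred to, a generic gradient-type update illustrates the idea (one of many variants, shown only as background and not as the specific algorithm used in this work):

```latex
% Generic gradient-type iterative reconstruction of the image x from the
% measured projections b, with forward-projection (system) matrix A and
% relaxation parameter lambda:
x^{(k+1)} \;=\; x^{(k)} + \lambda\,A^{\mathsf{T}}\!\left(b - A\,x^{(k)}\right),
\qquad 0 < \lambda < \frac{2}{\|A\|_2^{2}}.
% FBP applies a single analytic filtered back-projection instead; the iterative
% form allows statistical weighting and regularisation at the price of repeated
% forward- and back-projections per iteration.
```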

    Deep Learning in Medical Image Analysis

    The accelerating power of deep learning in diagnosing disease will empower physicians and speed up decision-making in clinical environments. Applications of modern medical instruments and the digitalization of medical care have generated enormous amounts of medical images in recent years. In this big-data arena, new deep learning methods and computational models for efficient data processing, analysis, and modeling of the generated data are crucially important for clinical applications and for understanding the underlying biological processes. This book presents and highlights novel algorithms, architectures, techniques, and applications of deep learning for medical image analysis.

    Contributions to the improvement of image quality in CBCT and CBμCT and application in the development of a CBμCT system

    In recent years, cone-beam x-ray CT (CBCT) has become established as a widespread imaging technique and a feasible alternative to conventional CT for dedicated imaging tasks in which the limited flexibility of conventional CT motivates the development of dedicated designs. CBCT systems are starting to be routinely used in image-guided radiotherapy; image-guided surgery using C-arms; scanning of body parts such as the sinuses, the breast or the extremities; and, especially, preclinical small-animal imaging, often coupled with molecular imaging systems. Despite the research efforts devoted to the advancement of CBCT, the challenges introduced by the use of large cone angles and two-dimensional detectors remain a field of vigorous research towards the improvement of CBCT image quality. Moreover, systems for small-animal imaging add to the challenges posed by clinical CBCT the need for higher resolution to obtain equivalent image quality in much smaller subjects. This thesis contributes to the progress of CBCT imaging by addressing a variety of issues affecting image quality in CBCT in general and in CBCT for small-animal imaging (CBμCT) in particular. As part of this work we have assessed and optimized the performance of CBμCT systems for different imaging tasks. To this end, we have developed a new CBμCT system with variable geometry and all the required software tools for acquisition, calibration and reconstruction. The system served as a tool for the optimization of the imaging process and for the study of image-degradation effects in CBμCT, as well as a platform for biological research using small animals. The set of tools for the accurate study of CBCT was completed by developing a fast Monte Carlo simulation engine based on GPUs, specifically devoted to the realistic estimation of scatter and its effects on image quality in arbitrary CBCT configurations, with arbitrary spectra, detector response, and antiscatter grids. This new Monte Carlo engine outperformed current simulation platforms by more than an order of magnitude. Due to the limited options for simulating spectra of the microfocus x-ray sources used in CBμCT, this thesis also contributes a new spectrum-generation model, based on an empirical model for conventional radiology and mammography sources and modified in accordance with experimental data. The new spectral model showed good agreement with experimental exposure and attenuation data for different materials. The developed tools for CBμCT research were used to study detector performance in terms of dynamic range. The dynamic range of the detector was characterized together with its effect on image quality. As a result, a simple new method for extending the dynamic range of flat-panel detectors was proposed and evaluated. The method is based on a modified acquisition process and a mathematical treatment of the acquired data. Scatter is usually identified as one of the major causes of image-quality degradation in CBCT. For this reason, the developed Monte Carlo engine was applied to an in-depth study of the effects of scatter for a representative range of CBCT embodiments used in clinical and preclinical practice. We estimated the amount and spatial distribution of the total scatter fluence and of its individual components. The effect of antiscatter grids on image quality and noise was also evaluated. We found a close relation between scatter and the air gap of the system, in line with previous results in the literature.
We also observed a non-negligible contribution of forward-directed scatter, which is responsible to a great extent for streak artifacts in CBCT. The spatial distribution of scatter was significantly affected by forward scatter, somewhat challenging the usual assumption that the scatter distribution mostly contains low frequencies. Antiscatter grids proved effective at reducing cupping, but they performed much less well against streaks and shifted the scatter distribution toward higher frequencies.
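    The abstract does not spell out the dynamic-range extension method beyond a modified acquisition and a mathematical treatment of the data. Purely as an illustration of the general idea, one common way to extend flat-panel dynamic range is to merge a short and a long exposure of the same view; the sketch below shows that generic approach, with assumed function names and a hypothetical 14-bit saturation level, and is not the specific method developed in the thesis.

```python
import numpy as np

def combine_exposures(short_frame, long_frame, gain_ratio, saturation_level):
    """Merge two flat-panel frames of the same view into one extended-range frame.

    short_frame, long_frame : 2-D arrays in detector counts
    gain_ratio              : long/short exposure-time ratio
    saturation_level        : counts at which the long exposure clips
    """
    long_scaled = long_frame / gain_ratio           # bring both to a common scale
    saturated = long_frame >= saturation_level      # pixels unusable in the long frame
    # Use the low-noise long exposure where valid, the short exposure where clipped.
    return np.where(saturated, short_frame, long_scaled)

# Example with synthetic 4x4 frames and a hypothetical 14-bit detector:
rng = np.random.default_rng(0)
short = rng.integers(100, 4000, size=(4, 4)).astype(float)
long_ = np.clip(short * 8, 0, 16383)                # long exposure clips at 16383 counts
print(combine_exposures(short, long_, gain_ratio=8.0, saturation_level=16383))
```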

    Simulation study of a wide axial field of view positron emission tomography system based on resistive plate chamber detectors

    Doctoral thesis in Physics, specialty in Technological Physics, presented to the Faculdade de Ciências e Tecnologia da Universidade de Coimbra. The present work aimed to evaluate, through GEANT4 simulations, the performance parameters of a Positron Emission Tomography (PET) scanner with a 2400 mm long Axial Field of View (AFOV), based on Resistive Plate Chamber (RPC) detectors. To establish a baseline for comparison, the dependence of the true-coincidence sensitivity of a scanner based on BGO detectors on the polar acceptance angle and on the AFOV length was investigated, following the NEMA NU2-1994 standards. The scanner was modelled as a tungsten ring in order to obtain the photon entry points, and correction factors were used to account for the packing fraction and the detection efficiency as a function of the segmentation of the GE Advance® scanner. It was concluded that the sensitivity to true events is dominated by the solid angle, increasing significantly with the AFOV length and with the polar acceptance angle, whereas the Scatter Fraction (SF) proved almost independent of the geometry, although it does depend on the polar acceptance angle. The true-coincidence sensitivity obtained for a 2400 mm AFOV and full polar-angle acceptance was about 100 times higher than that of the GE Advance® scanner. In addition, a simple analytical model for the true-coincidence sensitivity was developed, showing reasonable agreement with the simulation data. A similar study was then carried out for a scanner based on RPC detectors, defined in an analogous way. The detection efficiencies were obtained by simulating a stack of 121 glass plates (400 µm thick) separated by 120 gas layers (350 µm thick). The true-coincidence sensitivity was found to follow the same trend as for the BGO-based scanner, reaching, for a 2400 mm AFOV and full polar-angle acceptance, a sensitivity about 20 (5) times higher than that of the GE Advance® scanner with a TOF gain of 4.4 (without TOF gain), and an SF of 46.4%, excluding scatter in the detector. Detailed simulations were then carried out to optimise the RPC detector for use in a box-shaped PET scanner defined by four detection heads with a 2400 mm AFOV. Each detection head contains a stack of 20 RPC detectors, each with two detection modules of 5 gas layers (350 µm thick) delimited by 6 glass plates. The materials and thicknesses of the insulating layers, the high-voltage electrodes and the signal pick-up electrodes were defined, yielding an optimal glass-plate thickness of 200 µm for the detection of 511 keV photons and a fraction of misidentified events of 32% for distances of the Lines of Response (LORs) to the annihilation point of 2 mm or less. The spatial resolution of this scanner was also studied with detailed GEANT4 simulations. The simulation data were processed so as to take into account the readout electronics of the RPC prototypes developed for testing. Coincidences were formed using a simple time-window coincidence sorter, accepting LORs with a polar angle of 9° or less.
The spatial resolution was evaluated according to the recommendations of the NEMA NU2-2001 standard, but considering only a point source 1 µm in diameter located at the centre of a 2 mm diameter polymethyl methacrylate sphere, positioned in the central transaxial plane and offset 100 mm from the axis along the two directions of that plane. The spatial resolution found was 0.9, 1.4 and 2.1 mm in the transaxial and axial directions for detector segmentations of 0, 1 and 2 mm, respectively, and 3.44 mm in the radial direction. The SF, the count rates and the Noise Equivalent Count Rate (NECR) of the scanner described above were also studied, following the NEMA NU2-2001 standards. The simulation data were processed to account for the detector readout electronics. The best processing scheme for optimising the NECR consisted of forming coincidences with a multiple-time-window coincidence sorter, full polar-angle acceptance, rejection of LORs whose point reconstructed directly by TOF falls outside a region of interest with a 2 cm margin relative to the phantom dimensions, and acceptance of all possible coincidence pairs, including all possible combinations taken from multiple coincidences. Considering a dead time of 3.0 µs for the position readout and the NEMA NU2-2001 phantom, an SF of 51.8% and a peak NECR of 167 kcps at ~7.6 kBq/cm³ were obtained. For a similar phantom extended axially to 1800 mm, the SF obtained was 53.7% and the peak NECR was ~164 kcps at ~3.0 kBq/cm³.
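    The figures of merit quoted above (SF, NECR) follow the NEMA definitions; in their generic form (not reproduced from the thesis):

```latex
% Scatter fraction and noise-equivalent count rate in the generic NEMA form,
% with T, S and R the true, scattered and random coincidence rates:
\mathrm{SF} = \frac{S}{T + S},
\qquad
\mathrm{NECR} = \frac{T^{2}}{T + S + kR},
% where k = 1 if randoms are estimated from a low-noise (smoothed) delayed
% estimate and k = 2 if they are subtracted directly from the delayed window.
```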

    Single photon emission computed tomography: performance assessment, development and clinical applications

    This is a general investigation of the SPECT imaging process. The primary aim is to determine how SPECT studies should be performed in order to maximise the relevant clinical information, given the characteristics and limitations of the particular gamma camera imaging system used. Thus the first part of this thesis is concerned with an assessment of the performance characteristics of the SPECT system itself. This involves measuring the fundamental planar imaging properties of the camera, their stability with rotation, the ability of the camera to rotate in a perfect circle, and the accuracy of the transfer of information from the camera to the computing system. Following this, the performance of the SPECT system as a whole is optimised. This is achieved by examining the fundamental aspects of the SPECT imaging process and by optimising the selection of the parameters chosen for the acquisition and reconstruction of the data. As an aid to this, a novel mathematical construct is introduced. By taking the logarithm of the power spectrum of the normalised projection profile data, the relationship between the signal power and the noise power in the detected data can be visualised. From a theoretical consideration of the available options, the Butterworth filter is chosen for use because it provides the best combination of spatial-frequency transfer characteristics and flexibility. The flexibility of the Butterworth filter is an important feature because it means that the form of the actual function used in the reconstruction of a transaxial section can be chosen with regard to the relationship between the signal and the noise in the data. A novel method is developed to match the filter to the projection data. This consists of constructing a mean angular power spectrum from the set of projection profiles required for the reconstruction of the particular transaxial section in question. From this, the spatial frequency at which the signal becomes dominated by the noise is identified. The value which the Butterworth filter should take at this point can then be determined with regard to the requirements of the particular clinical investigation to be performed. The filter-matching procedure can be extended to two dimensions in a practical manner by operating on the projection data after it has been filtered in the y direction. The efficacy of several methods to correct for the effects of scatter, attenuation and camera non-uniformity is also investigated. Having developed the optimised methodology for the acquisition and reconstruction of the SPECT data, the results obtained are applied to the investigation of some specific clinical problems. The assessment of intractable epilepsy using 99mTc-HMPAO is performed first, followed by the investigation of ischaemic heart disease using 99mTc-MIBI and, finally, the diagnosis of avascular necrosis of the femoral head using 99mTc-MDP. The SPECT studies described in this thesis make a significant contribution to patient management.
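    The filter-matching procedure described above lends itself to a compact sketch (a minimal illustration only, assuming the projections for one transaxial section are supplied as an array; the function name, noise-floor estimate and threshold are assumptions, not the thesis's values):

```python
import numpy as np

def matched_butterworth(projections, order=5, noise_margin=1.5):
    """Build a 1-D Butterworth filter whose cut-off sits where the mean
    angular power spectrum of the projection profiles meets the noise floor.

    projections : array of shape (n_angles, n_bins) for one transaxial section
    """
    n_angles, n_bins = projections.shape
    freqs = np.fft.rfftfreq(n_bins)                      # cycles per bin

    # Mean angular power spectrum of the normalised projection profiles.
    profiles = projections / projections.sum(axis=1, keepdims=True)
    mean_power = (np.abs(np.fft.rfft(profiles, axis=1)) ** 2).mean(axis=0)

    # Treat the highest frequencies as a (roughly white) noise floor and find
    # the first frequency at which the signal sinks into it.
    noise_floor = mean_power[-max(n_bins // 8, 2):].mean()
    below = np.where(mean_power[1:] <= noise_margin * noise_floor)[0]
    cutoff = freqs[below[0] + 1] if below.size else freqs[-1]

    butterworth = 1.0 / (1.0 + (freqs / cutoff) ** (2 * order))
    return freqs, butterworth, cutoff
```

    The returned filter can then be applied to the Fourier transform of each profile before back-projection; the order and the margin above the noise floor would in practice be chosen with regard to the clinical task, as the abstract notes.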