363 research outputs found

    Virtual clinical trials in medical imaging: a review

    The accelerating complexity and variety of medical imaging devices and methods have outpaced our ability to evaluate and optimize their design and clinical use. This is a significant and growing challenge for both scientific investigation and clinical application. Ideally, evaluations would be done using clinical imaging trials. These experiments, however, are often impractical due to ethical limitations, expense, time requirements, or lack of ground truth. Virtual clinical trials (VCTs), also known as in silico imaging trials or virtual imaging trials, offer an alternative means to evaluate medical imaging technologies efficiently by simulating the patients, imaging systems, and interpreters. The field of VCTs has advanced steadily over the past decades in multiple areas. We summarize the major developments and current status of the field of VCTs in medical imaging. We review the core components of a VCT: computational phantoms, simulators of different imaging modalities, and interpretation models. We also highlight some of the applications of VCTs across various imaging modalities
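The three core components of a VCT named in this abstract (computational phantom, imaging-system simulator, and interpretation model) can be sketched end-to-end in a few lines. Everything below (phantom size, lesion contrast, the box-blur point-spread function, the dose scaling) is a hypothetical illustration, not a method from the review:

```python
import numpy as np

rng = np.random.default_rng(0)

def blur(obj):
    """Crude 3x3 box blur standing in for the imaging system's PSF (illustrative)."""
    padded = np.pad(obj, 1, mode="edge")
    out = np.zeros_like(obj)
    for di in range(3):
        for dj in range(3):
            out += padded[di:di + obj.shape[0], dj:dj + obj.shape[1]]
    return out / 9.0

# 1. Computational phantom: uniform background plus one small lesion
background = np.full((64, 64), 100.0)
lesion = np.zeros_like(background)
lesion[30:34, 30:34] = 40.0

# 2. Imaging-system simulator: blur plus Poisson counting noise
def acquire(obj, dose=1.0):
    return rng.poisson(blur(obj) * dose) / dose

# 3. Interpreter: non-prewhitening matched filter with the known lesion template
template = blur(lesion)
t_present = [np.sum(template * acquire(background + lesion)) for _ in range(50)]
t_absent = [np.sum(template * acquire(background)) for _ in range(50)]

# Detectability index d' summarizes the virtual trial's outcome
d_prime = (np.mean(t_present) - np.mean(t_absent)) / np.sqrt(
    0.5 * (np.var(t_present) + np.var(t_absent)))
```

Repeating the whole loop over, say, different `dose` values is the VCT idea in miniature: the phantom and the interpreter stay fixed while the simulated imaging system is varied.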

    Evaluation of Developments in PET Methodology


    Incorporating accurate statistical modeling in PET: reconstruction for whole-body imaging

    Doctoral thesis in Biophysics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2007. The thesis is devoted to image reconstruction in 3D whole-body PET imaging. OSEM (Ordered Subsets Expectation Maximization) is a statistical algorithm that assumes Poisson data. However, corrections for physical effects (attenuation, scattered and random coincidences) and detector efficiency remove the Poisson character of these data. Fourier Rebinning (FORE), which combines 3D imaging with fast 2D reconstructions, requires corrected data. Thus, whenever FORE is used, or whenever data are corrected prior to OSEM, the Poisson-like characteristics must be restored. Restoring Poisson-like data, i.e., making the variance equal to the mean, was achieved through the use of weighted OSEM algorithms. One of them is NECOSEM, which relies on the NEC weighting transformation. The distinctive feature of this algorithm is the NEC multiplicative factor, defined as the ratio between the mean and the variance. With real clinical data this is critical, since only one value is collected for each bin: the data value itself. For simulated data, if we keep track of these two statistical moments, the exact NEC weights can be calculated. We compared the performance of five weighted algorithms (FORE+AWOSEM, FORE+NECOSEM, ANWOSEM3D, SPOSEM3D and NECOSEM3D) on the basis of tumor detectability. The comparison was done for both simulated and clinical data. In the former case an analytical simulator was used; this is the ideal situation, since all the weighting factors can be determined exactly. To compare the performance of the algorithms, we used the Non-Prewhitening Matched Filter (NPWMF) numerical observer. With knowledge gained from the simulation study we proceeded to the reconstruction of clinical data, where it was necessary to devise a strategy for estimating the NEC weighting factors.
The comparison between reconstructed images was done by a physician largely familiar with whole-body PET imaging
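The NEC weighting idea described above — scale each bin by its mean-to-variance ratio so that corrected data become Poisson-like again (variance equal to mean) — can be illustrated numerically. The single multiplicative correction factor `c` and the bin means below are hypothetical; in practice the weights come from the actual correction chain:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "true" mean counts for 10 sinogram bins
true_mean = np.linspace(5.0, 50.0, 10)

# Raw PET data are Poisson (variance == mean), but corrections for attenuation,
# scatter, randoms and detector efficiency rescale the data and destroy that
# property. Model this with a single multiplicative factor c:
c = 2.5
corrected = c * rng.poisson(true_mean, size=(20000, 10))

mean = corrected.mean(axis=0)   # ~ c * true_mean
var = corrected.var(axis=0)     # ~ c^2 * true_mean  (variance != mean)

# NEC multiplicative factor: the ratio between the mean and the variance
nec = mean / var                # ~ 1/c in this toy model

# Applying the NEC factor restores Poisson-like statistics: variance ~ mean
restored = nec * corrected
```

The thesis's point about clinical data shows up here too: `mean` and `var` were estimated from 20,000 realizations per bin, whereas a real acquisition provides only one value per bin, so the weights must be estimated indirectly.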

    Modelos de observador aplicados a la detectabilidad de bajo contraste en tomografía computarizada

    Unpublished doctoral thesis, Universidad Complutense de Madrid, Facultad de Medicina, defended 15/01/2016. European-format thesis (compendium of articles). Introduction. Medical imaging has become one of the cornerstones of modern healthcare. Computed tomography (CT) is a widely used imaging modality in radiology worldwide. This technique makes it possible to obtain three-dimensional volume reconstructions of different parts of the patient with isotropic spatial resolution, and to acquire sharp images of moving organs, such as the heart or the lungs, without artifacts. The spectrum of indications that can be tackled with this technique is wide, comprising brain perfusion, cardiology, oncology, vascular radiology, interventionism and traumatology, amongst others. CT is a very popular imaging technique, widely implanted in healthcare services worldwide. The number of CT scans performed per year has grown continuously over the past decades, to great benefit for patients. At the same time, CT exams represent the highest contribution to the collective radiation dose. Patient dose in CT is one order of magnitude higher than in conventional X-ray studies. Regarding patient dose in X-ray imaging, the ALARA criterion is universally accepted: patient images should be obtained using a dose as low as reasonably achievable and compatible with the diagnostic task. Some cases of patients' radiation overexposure, most of them in brain perfusion procedures, have come to the public eye and had a great impact in the USA media. These cases, together with the increasing number of CT scans performed per year, have raised a red flag about the doses imparted to patients in CT. Several guidelines and recommendations for dose optimization in CT have been published by different organizations; these have been included in European and national regulations and adopted by CT manufacturers.
    In CT, the X-ray tube rotates around the patient, emitting photon beams from different angles or projections. These photons interact with the tissues in the patient, depending on their energy and on the tissue composition and density. A fraction of these photons deposit all or part of their energy inside the patient, resulting in absorbed dose to the organs. The images are generated using the data from the projections of the X-ray beam that reach the detectors after passing through the patient. Each projection represents the total integrated attenuation of the X-ray beam along its path. A CT protocol is defined as a collection of settings that can be selected on the CT console and that affect both the image quality and the patient dose. They can be acquisition parameters, such as beam collimation, tube current, rotation time, kV and pitch, or reconstruction parameters, such as slice thickness and spacing, reconstruction filter and method (filtered back projection (FBP) or iterative algorithms). All main CT manufacturers offer default protocols for different indications, depending on the anatomical region. The user can frequently adjust the protocol parameters, selecting among a range of values to adapt them to the clinical indication and patient characteristics, such as size or age. The selected settings greatly affect image quality and dose. Many combinations of scan parameters can render an appropriate image quality for a particular study. Protocol optimization is a complex task in CT because most scan protocol parameters are intertwined and affect both image quality and patient dose...
    Depto. de Radiología, Rehabilitación y Fisioterapia, Fac. de Medicina
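As a rough illustration of how the intertwined acquisition settings feed into dose, the sketch below encodes a protocol and the common first-order rule that CTDIvol scales with effective mAs (tube current x rotation time / pitch) at fixed kV. The class, the parameter values, and the linear scaling are simplifications for illustration, not the optimization methodology of the thesis:

```python
from dataclasses import dataclass

@dataclass
class CTProtocol:
    """A hypothetical subset of the acquisition settings listed above."""
    kv: int               # tube voltage (kV)
    ma: float             # tube current (mA)
    rotation_time: float  # seconds per gantry rotation
    pitch: float          # table feed per rotation / beam collimation

    @property
    def effective_mas(self):
        # Effective mAs: current-time product normalised by pitch
        return self.ma * self.rotation_time / self.pitch

def relative_dose(p: CTProtocol, reference: CTProtocol):
    """First-order dose ratio: CTDIvol scales ~linearly with effective mAs
    at fixed kV (kV changes are strongly non-linear and ignored here)."""
    assert p.kv == reference.kv, "kV change needs a measured conversion factor"
    return p.effective_mas / reference.effective_mas

routine = CTProtocol(kv=120, ma=200, rotation_time=0.5, pitch=1.0)
low_dose = CTProtocol(kv=120, ma=100, rotation_time=0.5, pitch=1.375)
```

Halving the mA while also increasing the pitch compounds multiplicatively, which is exactly why the abstract calls protocol optimization complex: each setting changes both noise and dose, and their effects are not independent.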

    Towards patient-specific dose and image quality analysis in CT imaging


    Objective Assessment of Image Quality: Extension of Numerical Observer Models to Multidimensional Medical Imaging Studies

    Spanning the fields of engineering and medical image quality, this dissertation proposes a novel framework for diagnostic performance evaluation based on objective image-quality assessment, an important step in the development of new imaging devices, acquisitions, and image-processing techniques used by clinicians and researchers. The objective of this dissertation is to develop computational modeling tools that allow comprehensive task-based assessment, including clinical interpretation of images, regardless of image dimensionality. Owing to advances in medical imaging devices, several techniques have improved image quality in settings where the outcome images are multidimensional (e.g., 3D+time or 4D). To evaluate the performance of new imaging devices, or to optimize various design parameters and algorithms, quality should be measured using an appropriate image-quality figure of merit (FOM). Classical FOMs such as bias and variance, or mean-square error, have been broadly used in the past. Unfortunately, they do not reflect the fact that the principal agent in medical decision-making is frequently a human observer, nor do they account for the specific diagnostic task. The standard for image-quality assessment is a task-based approach in which one evaluates human observer performance on a specified diagnostic task (e.g., detection of lesions). However, having a human observer perform the tasks is costly and time-consuming. To facilitate practical task-based assessment of image quality, a numerical observer is required as a surrogate for human observers. Numerical observers for the detection task have been studied both in research and industry; however, little effort has been devoted to developing one for multidimensional imaging studies (e.g., 4D).
    Lacking numerical-observer tools that accommodate all the information embedded in a series of images, the performance assessment of a new technique that generates multidimensional data is complex and limited. Consequently, key questions remain unanswered about how much these new multidimensional images improve image quality for a specific clinical task. To address this gap, this dissertation proposes a new numerical-observer methodology to assess the improvement achieved by newly developed imaging technologies. This numerical-observer approach can be generalized to exploit pertinent statistical information in multidimensional images and to accurately predict the performance of a human observer across image domains of varying complexity. Part I of this dissertation develops a numerical observer that accommodates multidimensional images, processing correlated signal components and appropriately incorporating them into an absolute FOM. Part II applies the model developed in Part I to selected clinical applications with multidimensional images, including: 1) respiratory-gated positron emission tomography (PET) in lung cancer (3D+t), 2) kinetic parametric PET in head-and-neck cancer (3D+k), and 3) spectral computed tomography (CT) in atherosclerotic plaque (3D+e). The author compares the task-based performance of the proposed approach to that of conventional methods, evaluated under the widely used signal-known-exactly/background-known-exactly paradigm, in which the properties of a target object (e.g., a lesion) are specified on highly realistic clinical backgrounds. A realistic target object is generated with specific properties and applied to a set of images to create pathological scenarios for the performance evaluation, e.g., lesions in the lungs or plaques in an artery.
    The regions of interest (ROIs) of the target objects are formed over an ensemble of data measurements acquired under identical conditions and evaluated for the inclusion of useful information from the different complex domains (i.e., 3D+t, 3D+k, 3D+e). This work provides an image-quality assessment metric with no dimensional limitation that could substantially improve the assessment of performance achieved by new imaging developments that make use of high-dimensional data
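A minimal numerical illustration of why pooling the extra dimension helps: a non-prewhitening observer applied to a flattened "frame + time frames" vector versus a single frame, in a signal-known-exactly setting with white Gaussian noise. The sizes, signal profile, and noise model are hypothetical toys; the dissertation's observer targets correlated, realistic multidimensional data:

```python
import numpy as np

rng = np.random.default_rng(2)

n_pix, n_frames, n_trials = 64, 4, 4000

# Known signal: a small 1D lesion profile, repeated in every time frame
frame_signal = np.zeros(n_pix)
frame_signal[28:36] = 1.0
signal = np.tile(frame_signal, n_frames)   # flattened multi-frame vector (3D+t analogue)

def npw_dprime(s, sigma=1.0):
    """NPW observer score t = s . g; detectability from the two score distributions."""
    g_present = s + rng.normal(0.0, sigma, (n_trials, s.size))
    g_absent = rng.normal(0.0, sigma, (n_trials, s.size))
    t1, t0 = g_present @ s, g_absent @ s
    return (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))

d_single = npw_dprime(signal[:n_pix])   # observer sees one frame only
d_multi = npw_dprime(signal)            # observer pools all frames
```

With independent frames the detectability grows like the square root of the number of frames pooled; with realistic inter-frame correlations the gain is smaller, which is precisely what a multidimensional observer must quantify.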

    A dual modality, DCE-MRI and x-ray, physical phantom for quantitative evaluation of breast imaging protocols

    The current clinical standard for breast cancer screening is mammography. However, this technique has a low sensitivity, which results in missed cancers. Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) has recently emerged as a promising technique for breast cancer diagnosis and has been reported as superior to mammography for screening of high-risk women and for evaluating the extent of disease. At the same time, low and variable specificity has been documented in the literature, as well as a rising number of mastectomies possibly due to the increasing use of DCE-MRI. In this study, we developed and characterized a dual-modality, x-ray and DCE-MRI, anthropomorphic breast phantom for the quantitative assessment of breast imaging protocols. X-ray properties of the phantom were quantitatively compared with patient data, including attenuation coefficients, which matched human values to within the measurement error, and tissue structure, assessed using spatial covariance matrices of image data, which were found to be similar in size to patient data. Simulations of the phantom scatter-to-primary ratio (SPR) were produced and experimentally validated, then compared with published SPR predictions for homogeneous phantoms. SPR values were as high as 85% in some areas and were heavily influenced by the heterogeneous tissue structure. MRI properties of the phantom, T1 and T2 relaxation values and tissue structure, were also quantitatively compared with patient data and found to match within two error bars. Finally, a dynamic lesion that mimics lesion border shape and washout curve shape was included in the phantom. High spatial and temporal resolution x-ray measurements of the washout curve shape were performed to determine the true contrast agent concentration as a function of time. DCE-MRI phantom measurements using a clinical imaging protocol were compared against the x-ray truth measurements.
MRI signal intensity curves were shown to be less specific to lesion type than the x-ray derived contrast agent concentration curves. This phantom allows, for the first time, for quantitative evaluation of and direct comparisons between x-ray and MRI breast imaging modalities in the context of lesion detection and characterization
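The washout-curve comparison described above rests on classifying the late-phase behaviour of a contrast-agent concentration curve. A toy version, with a hypothetical wash-in/washout model and the common three-way clinical description (persistent / plateau / washout), might look like:

```python
import numpy as np

def concentration(t, a=1.0, k_in=2.0, k_out=0.3):
    """Hypothetical wash-in/washout model: C(t) = a*(1 - exp(-k_in*t))*exp(-k_out*t)."""
    return a * (1.0 - np.exp(-k_in * t)) * np.exp(-k_out * t)

def curve_type(t, c, t_early=1.0, tol=0.10):
    """Label the late-phase change relative to the early post-injection value,
    the way DCE-MRI enhancement curves are commonly described."""
    early = c[np.argmin(np.abs(t - t_early))]
    late_change = (c[-1] - early) / c.max()
    if late_change < -tol:
        return "washout"
    if late_change > tol:
        return "persistent"
    return "plateau"

t = np.linspace(0.0, 5.0, 50)  # minutes after injection (hypothetical sampling)
washout_label = curve_type(t, concentration(t))             # decaying late phase
persistent_label = curve_type(t, concentration(t, k_out=0.0))  # continuing rise
```

The phantom's contribution is exactly the ground truth this toy takes for granted: the x-ray measurements supply the true concentration-versus-time curve against which the MRI-derived signal-intensity curves, and any classification built on them, can be judged.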