
    Topics in image reconstruction for high resolution positron emission tomography

    Ill-posed problems are a topic of interdisciplinary interest arising in remote sensing and non-invasive imaging. However, several issues remain crucial for the successful application of the theory to a given imaging modality. Positron emission tomography (PET) is a non-invasive imaging technique that allows biochemical processes to be assessed in an organism in vivo. PET is a valuable tool for investigating normal human and animal physiology, for diagnosing and staging cancer, and for studying heart and brain disorders. PET is similar to other tomographic imaging techniques in many ways, but to reach its full potential and to extract maximum information from projection data, PET has to use accurate, yet practical, image reconstruction algorithms. Several topics related to PET image reconstruction are explored in the present dissertation. The following contributions have been made: (1) A system matrix model has been developed using an analytic detector response function based on linear attenuation of γ-rays in a detector array. It has been demonstrated that the use of an oversimplified system model for the computation of the system matrix results in image artefacts. (IEEE Trans. Nucl. Sci., 2000); (2) An analytic model of the dependence on total counts was used to simplify the cross-validation (CV) stopping rule and to accelerate statistical iterative reconstruction. It can be used instead of the original CV procedure for high-count projection data, where CV yields reasonably accurate images. (IEEE Trans. Nucl. Sci., 2001); (3) A regularisation methodology employing singular value decomposition (SVD) of the system matrix was proposed, based on spatial resolution analysis. A characteristic property of the singular value spectrum was found that reveals a relationship between the optimal truncation level for truncated-SVD reconstruction and the optimal reconstructed image resolution. (IEEE Trans. Nucl. Sci., 2001); (4) A novel event-by-event linear image reconstruction technique based on a regularised pseudo-inverse of the system matrix was proposed. The algorithm provides a fast way to update an image, potentially in real time, and allows, in principle, instant visualisation of the radioactivity distribution while the object is still being scanned. The computed image estimate is the minimum-norm least-squares solution of the regularised inverse problem.
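
    To make the truncated-SVD regularisation of contribution (3) and the pseudo-inverse estimate of contribution (4) concrete, here is a minimal NumPy sketch; the system matrix A, the projection data y, the truncation level k, and the toy problem sizes are illustrative assumptions, not the dissertation's actual detector model.

        import numpy as np

        def tsvd_reconstruct(A, y, k):
            # Truncated-SVD solution of y = A x: keep only the k largest
            # singular values, since the small ones mostly amplify noise.
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            s_inv = np.zeros_like(s)
            s_inv[:k] = 1.0 / s[:k]
            # x = V diag(s_inv) U^T y: the regularised pseudo-inverse applied
            # to the projections, i.e. the minimum-norm least-squares
            # solution of the truncated (regularised) problem.
            return Vt.T @ (s_inv * (U.T @ y))

        # Toy problem: 64 lines of response, 16-pixel image (illustrative sizes).
        rng = np.random.default_rng(0)
        A = rng.random((64, 16))                          # stand-in system matrix
        x_true = rng.random(16)                           # stand-in activity map
        y = A @ x_true + 0.01 * rng.standard_normal(64)   # noisy projections
        x_hat = tsvd_reconstruct(A, y, k=12)
        print(float(np.linalg.norm(x_hat - x_true)))

    Because the estimate is linear in the projection data, the column of the regularised pseudo-inverse associated with a line of response can be accumulated into the image as each coincidence event arrives, which is what makes an event-by-event, potentially real-time update feasible.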

    Accelerating Permutation Testing in Voxel-wise Analysis through Subspace Tracking: A new plugin for SnPM

    Permutation testing is a non-parametric method for obtaining the max null distribution used to compute corrected p-values that provide strong control of false positives. In neuroimaging, however, the computational burden of running such an algorithm can be significant. We find that by viewing the permutation testing procedure as the construction of a very large permutation testing matrix, T, one can exploit structural properties derived from the data and the test statistics to reduce the runtime under certain conditions. In particular, we see that T is low-rank plus a low-variance residual. This makes T a good candidate for low-rank matrix completion, where only a very small number of entries of T (~0.35% of all entries in our experiments) have to be computed to obtain a good estimate. Based on this observation, we present RapidPT, an algorithm that efficiently recovers the max null distribution commonly obtained through regular permutation testing in voxel-wise analysis. We present an extensive validation on a synthetic dataset and four datasets of varying size against two baselines: Statistical NonParametric Mapping (SnPM13) and a standard permutation testing implementation (referred to as NaivePT). We find that RapidPT achieves its best runtime performance on medium-sized datasets (50 ≤ n ≤ 200), with speedups of 1.5x-38x (vs. SnPM13) and 20x-1000x (vs. NaivePT). For larger datasets (n ≥ 200), RapidPT outperforms NaivePT (6x-200x) on all datasets, and provides large speedups (2x-15x) over SnPM13 when more than 10,000 permutations are needed. The implementation is a standalone toolbox and is also integrated within SnPM13, able to leverage multi-core architectures when available.
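
    For readers unfamiliar with the max-statistic procedure that RapidPT accelerates, the following sketch shows a naive permutation test in NumPy/SciPy; the two-sample t-statistic setup and the function names (max_null_distribution, corrected_pvalues) are illustrative assumptions, not the RapidPT or SnPM13 API.

        import numpy as np
        from scipy import stats

        def max_null_distribution(data, labels, n_perm=1000, seed=0):
            # data: (n_subjects, n_voxels); labels: binary group membership.
            # Each permutation is one row of the conceptual matrix T; the
            # row-wise maximum builds the max null distribution that gives
            # strong family-wise control of false positives.
            rng = np.random.default_rng(seed)
            max_null = np.empty(n_perm)
            for i in range(n_perm):
                perm = rng.permutation(labels)
                t, _ = stats.ttest_ind(data[perm == 1], data[perm == 0], axis=0)
                max_null[i] = t.max()
            return max_null

        def corrected_pvalues(data, labels, max_null):
            # Corrected p-value per voxel: fraction of permutation maxima
            # that reach or exceed the observed statistic.
            t_obs, _ = stats.ttest_ind(data[labels == 1], data[labels == 0], axis=0)
            return (max_null[None, :] >= t_obs[:, None]).mean(axis=1)

        # Example (hypothetical data): labels is a 0/1 array over subjects.
        # max_null = max_null_distribution(data, labels, n_perm=10000)
        # p_corr = corrected_pvalues(data, labels, max_null)

    Each permutation fills one row of the matrix T; RapidPT's contribution is to avoid computing most of those entries, recovering T from roughly 0.35% of them via low-rank matrix completion.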

    PERICLES Deliverable 4.3:Content Semantics and Use Context Analysis Techniques

    The current deliverable summarises the work conducted within task T4.3 of WP4, focusing on the extraction and subsequent analysis of semantic information from digital content, which is imperative for its preservability. More specifically, the deliverable defines content semantic information from a visual and textual perspective, explains how this information can be exploited in long-term digital preservation, and proposes novel approaches for extracting this information in a scalable manner. Additionally, the deliverable discusses novel techniques for retrieving and analysing the context of use of digital objects. Although this topic has not been extensively studied in the existing literature, we believe use context is vital in augmenting the semantic information and maintaining the usability and preservability of digital objects, as well as their ability to be accurately interpreted as initially intended.