
    Self-Normalization of 3D PET Data by Estimating Scan-Dependent Effective Crystal Efficiencies


    Bootstrap-Optimised Regularised Image Reconstruction for Emission Tomography

    Supporting data and MATLAB code for the paper: A. J. Reader and S. Ellis, "Bootstrap-Optimised Regularised Image Reconstruction for Emission Tomography," in IEEE Transactions on Medical Imaging (2020), DOI: 10.1109/TMI.2019.2956878.
    Instructions for use (tested on MATLAB R2017a):
    - Unzip the file bootstrap_optimised_PET_image_reconstruction.zip.
    - Dependencies: add the utils directory to your path before running the scripts.
    - Figures: there is a directory for each figure, not including those figures which do not contain experimental results. Each directory contains a .m script file and a .mat data file. Running the .m file produces the figure roughly as it appears in the manuscript. Independent exploration of the data can be performed if desired.
    - Sample code: running the example.m file will perform example 2D reconstructions with MLEM, bootstrap optimised guided quadratic MAPEM, and bootstrap optimised unweighted quadratic MAPEM. The reconstruction code is contained in the @reconClass folder.
    This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) [EP/M020142/1]; and the Wellcome/EPSRC Centre for Medical Engineering [WT 203148/Z/16/Z].
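    As a quick orientation, here is a minimal MATLAB sketch of the workflow described above. Only the archive name, the utils directory, example.m, and @reconClass come from the description; anything else (such as how a figure script is invoked) is illustrative.

        % Minimal sketch of the described workflow; names are taken from the description where stated.
        unzip('bootstrap_optimised_PET_image_reconstruction.zip');   % unpack the data and code
        addpath('utils');                                             % dependency: utils on the path

        % Each figure directory contains a .m script and a .mat data file;
        % running the script reproduces the figure roughly as in the manuscript.
        % run(fullfile('<figure_directory>', '<figure_script>.m'));   % placeholder names

        % Example 2D reconstructions: MLEM plus the two bootstrap optimised
        % quadratic MAPEM variants; the reconstruction code lives in @reconClass.
        run('example.m');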

    Patch-based image reconstruction for PET using prior-image derived dictionaries

    This collection contains figures and reconstructed images in .mat format associated with the manuscript titled "Patch-based image reconstruction for PET using prior-image derived dictionaries". The file Data_Fig9-10.zip contains the reconstructed images associated with Figs. 9 and 10 as a function of iteration for different methods. Data_Fig10-12.zip contains reconstructed images of the real data for different methods.
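    A minimal MATLAB sketch for inspecting the data, assuming only the archive name given above; the variable names inside the .mat files are not stated here, so they are listed rather than guessed.

        % Extract the Fig. 9-10 reconstructions and list what each .mat file holds.
        unzip('Data_Fig9-10.zip');
        matFiles = dir(fullfile('**', '*.mat'));           % recursive search (R2016b or later)
        for k = 1:numel(matFiles)
            f = fullfile(matFiles(k).folder, matFiles(k).name);
            fprintf('%s\n', f);
            whos('-file', f)                               % variable names and sizes
        end
        % data = load(f);   % load a file once its variable names are known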

    Mass and UV-visible spectral fingerprints of dissolved organic matter: sources and reactivity

    Advanced analytical techniques have revealed a high degree of complexity in the chemical makeup of dissolved organic matter (DOM). This has opened the door for a deeper understanding of the role of DOM in the aquatic environment. However, the expense, analytical cost, and challenges related to interpretation of the large datasets generated by these methods limit their widespread application. Optical methods, such as absorption and fluorescence spectroscopy, are relatively inexpensive and easy to implement, but lack the detailed information available from more advanced methods. We were able to directly link the analysis of absorption spectra to the mass spectra of DOM using an in-line detector system coupled to multivariate data analysis. Monthly samples were taken from three river mouths in Sweden for one year. One subset of samples was exposed to photochemical degradation and another subset was exposed to long-term (4 months) biological degradation. A principal component analysis was performed on the coupled absorption-mass spectra data. Loading spectra for each principal component show distinct fingerprints for both reactivity (i.e. photochemical, biological degradation) and source (i.e. catchment land cover, temperature, hydrology). The fingerprints reveal mass-to-charge values that contribute to optical signals and characteristics seen in past studies, and emphasise the difficulties in interpreting changes in bulk CDOM characteristics resulting from multiple catchment processes. The approach provides a potentially simple method for using optical indicators as tracers for more complex chemical processes, both with regard to the source material for DOM and the past reactive processing of DOM.
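    As an illustration only (not the authors' code), here is a minimal MATLAB sketch of the kind of coupled analysis described: the absorption spectrum and mass spectrum of each sample are concatenated into one row, the stacked rows are decomposed with principal component analysis, and the loadings are split back into the two detector domains to read the combined fingerprint. The variable names and the block scaling are assumptions.

        % X_abs : samples x wavelengths absorbance matrix      (assumed variable)
        % X_ms  : samples x m/z-bins mass-spectral matrix      (assumed variable)
        Z = [zscore(X_abs), zscore(X_ms)];    % autoscale each block, then couple them

        % Principal component analysis (Statistics and Machine Learning Toolbox).
        [coeff, score, ~, ~, explained] = pca(Z);

        % Split the first-component loadings back into the optical and mass domains:
        nAbs    = size(X_abs, 2);
        loadAbs = coeff(1:nAbs, 1);           % wavelengths contributing to PC1
        loadMz  = coeff(nAbs+1:end, 1);       % m/z values contributing to PC1

    Plotting loadAbs against wavelength and loadMz against m/z gives per-component fingerprints of the kind referred to in the abstract.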

    Impact of axial compression for the mMR simultaneous PET-MR scanner


    Threshold interval indexing techniques for complicated uncertain data

    Uncertain data is an increasingly prevalent topic in database research, given the advance of instruments which inherently generate uncertainty in their data. In particular, the problem of indexing uncertain data for range queries has received considerable attention. To efficiently process range queries, existing approaches mainly focus on reducing the number of disk I/Os. However, due to the inherent complexity of uncertain data, processing a range query may incur high computational cost in addition to the I/O cost. In this paper, I present a novel indexing strategy focusing on one-dimensional uncertain continuous data, called threshold interval indexing. Threshold interval indexing is able to balance I/O cost and computational cost to achieve an optimal overall query performance. A key ingredient of the proposed indexing structure is a dynamic interval tree. The dynamic interval tree is much more resistant to skew than R-trees, which are widely used in other indexing structures. This interval tree optimizes pruning by storing x-bounds, or pre-calculated probability boundaries, at each node. In addition to the basic threshold interval index, I present two variants, called the strong threshold interval index and the hyper threshold interval index, which leverage x-bounds not only for pruning but also for accepting results. Furthermore, I present more efficient memory-loaded versions of these indexes, which reduce the storage size so that the primary interval tree can be loaded into memory. Each index description includes methods for querying, parallelizing, updating, bulk loading, and externalizing. I perform an extensive set of experiments to demonstrate the effectiveness and efficiency of the proposed indexing strategies.
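    This is not the thesis's index structure, but a minimal MATLAB sketch of the query it serves and of the bound-based pruning and acceptance it builds on: each uncertain object is an interval (here with a uniform pdf, an assumption), objects whose intervals miss the query range are pruned, objects fully contained in it are accepted outright, and probabilities are computed only for the remainder. The x-bounds described above play the analogous role at the level of tree nodes, so whole subtrees can be pruned or accepted without examining individual objects.

        % Probabilistic threshold range query over 1-D uncertain objects.
        % obj(i,:) = [lo, hi]: uncertainty interval of object i (uniform pdf assumed).
        obj = [1 4; 3 9; 10 12; 6 7];
        q   = [2 8];        % query range [a, b]
        tau = 0.5;          % probability threshold

        hits = [];
        for i = 1:size(obj, 1)
            lo = obj(i, 1);  hi = obj(i, 2);
            if hi < q(1) || lo > q(2)
                continue;                           % prune: no overlap, P = 0
            elseif lo >= q(1) && hi <= q(2)
                hits(end+1) = i; %#ok<AGROW>        % accept: fully contained, P = 1
            else
                p = (min(hi, q(2)) - max(lo, q(1))) / (hi - lo);  % uniform-pdf overlap
                if p >= tau
                    hits(end+1) = i; %#ok<AGROW>
                end
            end
        end
        disp(hits)   % objects whose probability of lying in [2, 8] is at least 0.5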

    Multitracer Guided PET Image Reconstruction
