
    Improving the angular resolution of coded aperture instruments using a modified Lucy-Richardson algorithm for deconvolution

    A problem with coded-mask telescopes is the achievable angular resolution. For example, with the standard cross-correlation (CC) analysis, the INTEGRAL IBIS/ISGRI angular resolution is about 13'. We are currently investigating an iterative Lucy-Richardson (LR) algorithm. The LR algorithm is effective when the PSF is known but little or no information is available about the noise. It maximizes the probability of the restored image under the assumption that the noise is Poisson distributed, which is appropriate for photon noise in the data, and converges to the maximum-likelihood solution. We have modified the classical LR algorithm by adding non-negativity constraints. The algorithm does not take into account the features that make the PSF depend on position in the field of view (dead pixels, gaps between modules, etc.), which are easily corrected for in the classical CC analysis, so we must correct for these either after the restoration of the image or by modifying the data before the sky reconstruction. We present some results using real IBIS data that indicate the power of the proposed reconstruction algorithm.
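The classical LR iteration described above can be sketched in a few lines. This is a minimal illustration, not the authors' modified algorithm: it assumes a single, spatially invariant PSF (the abstract notes the real IBIS PSF varies across the field of view), and the explicit clamping step stands in for the non-negativity constraint they mention.

```python
import numpy as np
from scipy.signal import fftconvolve

def lucy_richardson(data, psf, n_iter=50, eps=1e-12):
    """Classical Lucy-Richardson deconvolution (illustrative sketch).

    Iteratively maximizes the Poisson likelihood of the restored image,
    assuming a spatially invariant, normalized PSF.
    """
    psf_mirror = psf[::-1, ::-1]                  # flipped PSF = adjoint of convolution
    estimate = np.full_like(data, data.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = data / np.maximum(blurred, eps)   # guard against division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        estimate = np.maximum(estimate, 0.0)      # explicit non-negativity constraint
    return estimate
```

When the PSF is normalized to unit sum, the iteration approximately conserves total flux, which is one reason LR is popular for photon-counting instruments.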

    The ROI CT problem: a shearlet-based regularization approach

    The possibility of significantly reducing the X-ray radiation dose and shortening the scanning time is particularly appealing, especially to the medical imaging community. Region-of-interest Computed Tomography (ROI CT) has this potential and, for this reason, is currently receiving increasing attention. Because the projection images are truncated, ROI CT is a rather challenging problem: the ROI reconstruction problem is severely ill-posed in general, and naive local reconstruction algorithms tend to be very unstable. To obtain a stable and reliable reconstruction under suitable noise conditions, we formulate the ROI CT problem as a convex optimization problem with a regularization term that is based on shearlets and possibly nonsmooth. For the solution, we propose and analyze an iterative approach based on the variable metric inexact line-search algorithm (VMILA). The reconstruction performance of VMILA is compared against different regularization conditions on fan-beam CT simulated data. The numerical tests show that our approach is insensitive to the location of the ROI and remains very stable even when the ROI is rather small.
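The convex problem with a nonsmooth sparsity penalty that the abstract describes is typically attacked with proximal-gradient methods. The sketch below is a plain ISTA iteration, not VMILA (which additionally uses a variable metric and an inexact line search), and for simplicity it applies the l1 penalty directly to the unknowns rather than to shearlet coefficients; `A` stands in for the (truncated) projection operator.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Illustrative stand-in for the shearlet-regularized ROI CT problem:
    the penalty here acts on x itself (orthonormal-transform case).
    """
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)               # gradient of the smooth data-fit term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Replacing the fixed step 1/L with a variable metric and an inexact line search is precisely what distinguishes VMILA from this baseline.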

    Data Mining Tools: From Web to Grid Architectures

    The paradigm of Grid computing is becoming established as a novel, reliable and effective method for exploiting a pool of hardware resources and making them available to users. Data mining benefits from the Grid because it often requires running time-consuming algorithms on large amounts of data that may reside on a resource different from the one hosting the data-mining algorithms. In recent times, machine learning methods have also become available for knowledge discovery, a topic of interest to a large community of users. The present work is an account of the evolution of the ways in which a user can be provided with a data-mining service: from a web interface to a Grid service, the exploitation of a complex resource is considered from both a technical and a user-friendliness point of view. More specifically, the goal is to show the advantage of running data-mining algorithms on the Grid. Such an environment can employ computational and storage resources efficiently, making it possible to open data-mining services to Grid users and to provide services in business contexts.