4,563 research outputs found

    Tests of the photometric accuracy of image restoration using the maximum entropy algorithm

    Simulations are described which test the maximum entropy image restoration algorithm as implemented in the MEMSYS3 code (version 2) of S. F. Gull and J. Skilling [Quantified Maximum Entropy, MEMSYS3 Users' Manual, version 2.0 (1989)]. It is found that at the faintest brightness levels, although the code can recover blurred point sources, it recovers them systematically too faint. The size of the photometric error depends on the brightness of the source (relative to the noise of the background) and on the crowding of the field, and it increases substantially as the crowding increases. At present, the optimum technique for blurred images of crowded fields in which most of the sources are point sources appears to be to use a restored image to generate the list of objects, feed this list into a standard point-spread-function-fitting code (such as DAOPHOT), and run that code on the original blurred frame. In this manner, the most crowded fields can be analyzed without loss of photometric accuracy. Additional simulations were carried out for images with the actual point spread function of the Wide Field Camera of the Hubble Space Telescope. For single isolated sources seen against a background characterized by Gaussian noise, the detection limit is degraded by a factor of about 4 compared to that expected. Reliable photometry, however, cannot be obtained for sources at the detection limit, either in the simulated frames or in those frames passed through the MEMSYS3 image restoration code. If one requires photometry accurate to 10%, the performance of the as-built HST plus Wide Field Camera is degraded near its faint limit by a factor of between 10 and 15 compared to that expected, even for isolated point sources.
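    The entropy-regularized fit that this family of codes performs can be illustrated with a minimal sketch: maximize Q(f) = S(f) − λχ²(f)/2 over a positive image f, where S is the entropy relative to a flat default model and χ² is the misfit to the blurred data. The Gaussian blur operator, step-size rule, and backtracking below are illustrative assumptions, not the MEMSYS3 implementation.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def mem_restore(g, psf_sigma, noise_sigma, lam=10.0, n_iter=300):
        """Maximum-entropy deconvolution sketch: ascend Q(f) = S(f) - (lam/2) chi2(f).

        S(f) = -sum f*log(f/m) with a flat default model m; chi2 measures the
        misfit between the blurred estimate and the data g. The blur H is
        Gaussian here, so H^T = H.
        """
        H = lambda x: gaussian_filter(x, psf_sigma)
        m = np.full_like(g, max(g.mean(), 1e-8))       # flat default model
        f = m.copy()
        step = 0.05

        def objective(f):
            entropy = -np.sum(f * np.log(f / m))
            chi2 = np.sum((g - H(f)) ** 2) / noise_sigma ** 2
            return entropy - 0.5 * lam * chi2

        q = objective(f)
        for _ in range(n_iter):
            resid = g - H(f)
            grad = lam * H(resid) / noise_sigma ** 2 - (np.log(f / m) + 1.0)
            f_new = np.clip(f + step * f * grad, 1e-8, None)  # f-scaled step keeps positivity
            q_new = objective(f_new)
            if q_new > q:                  # accept and grow the step a little
                f, q, step = f_new, q_new, step * 1.1
            else:                          # reject and back off
                step *= 0.5
        return f
    ```

    The f-scaled gradient step is a common positivity-preserving device; MEMSYS3 itself uses a more sophisticated conjugate-gradient scheme with an automatic choice of λ.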

    Mammographic image restoration using maximum entropy deconvolution

    An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification settings of 1.8 magnification/fine focus and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization. Comment: 18 pages, 10 figures
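    The link between focal spot size and geometric blurring that the authors aim to weaken follows from the standard penumbra relation U = F(M − 1), with F the focal spot size and M the magnification. A small sketch for the three acquisition settings above; the focal spot sizes (0.1 mm fine, 0.3 mm broad) are assumed typical mammography values, not figures taken from the paper:

    ```python
    def geometric_unsharpness(focal_spot_mm, magnification):
        """Penumbra width at the detector plane: U = F * (M - 1)."""
        return focal_spot_mm * (magnification - 1.0)

    # Nominal focal spot sizes (assumed): 0.1 mm fine focus, 0.3 mm broad focus
    settings = [("1.8x / fine focus", 0.1, 1.8),
                ("1.8x / broad focus", 0.3, 1.8),
                ("3.0x / fine focus", 0.1, 3.0)]
    for label, F, M in settings:
        print(f"{label}: U = {geometric_unsharpness(F, M):.2f} mm")
    ```

    The two "unacceptable" settings roughly triple the penumbra of the standard one, which is the blur the measured-PSF deconvolution is asked to undo.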

    Extended object reconstruction in adaptive-optics imaging: the multiresolution approach

    We propose the application of multiresolution transforms, such as wavelets (WT) and curvelets (CT), to the reconstruction of images of extended objects that have been acquired with adaptive optics (AO) systems. Such multichannel approaches normally make use of probabilistic tools in order to distinguish significant structures from noise and reconstruction residuals. Furthermore, we aim to check the historical assumption that image-reconstruction algorithms using static PSFs are not suitable for AO imaging. We convolve an image of Saturn taken with the Hubble Space Telescope (HST) with AO PSFs from the 5-m Hale telescope at the Palomar Observatory and add both shot and readout noise. Subsequently, we apply different approaches to the blurred and noisy data in order to recover the original object. The approaches include multi-frame blind deconvolution (with the algorithm IDAC), myopic deconvolution with regularization (with MISTRAL) and wavelet- or curvelet-based static-PSF deconvolution (the AWMLE and ACMLE algorithms). We used the mean squared error (MSE) and the structural similarity index (SSIM) to compare the results, and we discuss the strengths and weaknesses of the two metrics. We found that CT produces better results than WT, as measured in terms of MSE and SSIM. Multichannel deconvolution with a static PSF produces results which are generally better than those obtained with the myopic/blind approaches (for the images we tested), thus showing that the ability of a method to suppress the noise and to track the underlying iterative process is just as critical as the capability of the myopic/blind approaches to update the PSF. Comment: In revision in Astronomy & Astrophysics. 19 pages, 13 figures
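    The two comparison metrics can be sketched as follows. The SSIM shown is the single-window (global-statistics) form of the index; the full SSIM averages this quantity over local windows, as scikit-image's `structural_similarity` does:

    ```python
    import numpy as np

    def mse(x, y):
        """Mean squared error: penalizes pixel-wise intensity differences."""
        return np.mean((x - y) ** 2)

    def global_ssim(x, y, data_range=1.0):
        """Single-window SSIM computed from whole-image statistics.

        Compares luminance (means), contrast (variances), and structure
        (covariance); 1.0 means identical images.
        """
        c1 = (0.01 * data_range) ** 2          # stabilizers from the SSIM paper
        c2 = (0.03 * data_range) ** 2
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cov = np.mean((x - mx) * (y - my))
        return ((2 * mx * my + c1) * (2 * cov + c2)) / \
               ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
    ```

    The contrast the abstract draws between the metrics is visible here: MSE only accumulates per-pixel error, while SSIM rewards agreement of local structure even when absolute intensities drift.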

    Image restoration using the Q-Ising spin glass

    We investigate static and dynamic properties of gray-scale image restoration (GSIR) by making use of the Q-Ising spin glass model, whose ladder symmetry allows us to take into account the distance between two spins. We thus give an explicit expression for the Hamming distance between the original and restored images as a function of the hyperparameters in the mean-field limit. Finally, numerical simulations for real-world pictures are carried out to demonstrate the efficiency of our model. Comment: 27 pages, 13 figures, RevTeX
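    A zero-temperature caricature of this setup: with Q gray levels, a MAP restoration minimizes H(s) = J Σ_&lt;ij&gt; (s_i − s_j)² + h Σ_i (s_i − d_i)², where d is the degraded image. The quadratic neighbor term is what the ladder symmetry buys, since it weighs how far apart two levels are, not merely whether they differ. The iterated-conditional-modes (ICM) minimizer and parameter values below are illustrative choices, not the paper's mean-field treatment:

    ```python
    import numpy as np

    def local_energy(img, i, j, s, data, J, h):
        """Energy contribution of setting pixel (i, j) to level s."""
        e = h * (s - data[i, j]) ** 2                  # fidelity to the degraded image
        H, W = img.shape
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < H and 0 <= nj < W:
                e += J * (s - img[ni, nj]) ** 2        # quadratic smoothness (ladder) term
        return e

    def icm_restore(data, Q=4, J=1.0, h=1.0, max_sweeps=20):
        """Greedy (zero-temperature) minimization of the Q-Ising energy."""
        img = data.copy()
        for _ in range(max_sweeps):
            changed = False
            for i in range(img.shape[0]):
                for j in range(img.shape[1]):
                    best = min(range(Q),
                               key=lambda s: local_energy(img, i, j, s, data, J, h))
                    if best != img[i, j]:
                        img[i, j] = best
                        changed = True
            if not changed:
                break                                  # local minimum reached
        return img
    ```

    The Hamming distance between the restored and original images, which the paper characterizes analytically, can then be read off as the count of mismatched pixels.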

    Back to the Future: Economic Self-Organisation and Maximum Entropy Prediction

    This paper shows that signal restoration methodology is appropriate for predicting the equilibrium state of certain economic systems. A formal justification for this is provided by proving the existence of finite improvement paths in object allocation problems under weak assumptions on preferences, linking any initial condition to a Nash equilibrium. Because a finite improvement path is made up of a sequence of systematic best responses, backwards movement from the equilibrium to the initial condition can be treated as the realisation of a noise process. This underpins the use of signal restoration to predict the equilibrium from the initial condition, and an illustration is provided through an application of maximum entropy signal restoration to the Schelling model of segregation.
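    The improvement-path idea in the Schelling illustration can be sketched directly: repeatedly pick an unhappy agent and let it best-respond by relocating to the vacant cell that maximizes its same-type neighbor fraction, stopping at a Nash equilibrium where no one wants to move. Grid size, vacancy rate, and the 50% tolerance threshold below are illustrative assumptions, not the paper's specification:

    ```python
    import numpy as np

    def frac_same(grid, i, j, t):
        """Fraction of occupied Moore neighbors of cell (i, j) sharing type t."""
        H, W = grid.shape
        same = occupied = 0
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if di == 0 and dj == 0:
                    continue
                ni, nj = i + di, j + dj
                if 0 <= ni < H and 0 <= nj < W and grid[ni, nj] != 0:
                    occupied += 1
                    same += int(grid[ni, nj] == t)
        return same / occupied if occupied else 1.0   # no neighbors: vacuously content

    def best_response_path(grid, threshold=0.5, max_moves=2000, seed=0):
        """Follow an improvement path of best responses (0 = vacant cell)."""
        rng = np.random.default_rng(seed)
        grid = grid.copy()
        for _ in range(max_moves):
            occ = np.argwhere(grid != 0)
            unhappy = [(i, j) for i, j in occ
                       if frac_same(grid, i, j, grid[i, j]) < threshold]
            if not unhappy:
                break                                  # Nash equilibrium reached
            i, j = unhappy[rng.integers(len(unhappy))]
            t = grid[i, j]
            grid[i, j] = 0                             # vacate before scoring destinations
            empties = np.argwhere(grid == 0)
            scores = [frac_same(grid, ei, ej, t) for ei, ej in empties]
            ei, ej = empties[int(np.argmax(scores))]   # best response over vacant cells
            grid[ei, ej] = t
        return grid
    ```

    Run forward, this traces the improvement path from an arbitrary initial allocation toward an equilibrium; the paper's signal-restoration step treats the reverse traversal of such a path as a noise process to be removed.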