35 research outputs found

    An overview of the fundamental approaches that yield several image denoising techniques

    Digital images are a powerful medium for carrying and transmitting information, and they therefore attract the attention of many researchers, among them those interested in preserving image features against any factor that may reduce image quality. One such factor is noise, which degrades the visual quality of an image and makes further image processing more difficult. Solving this noise problem remains a challenge for researchers in the field. Many image-denoising techniques have been introduced to remove noise while preserving image features; in other words, to recover from the noisy image the closest possible approximation of the original. However, the findings are still inconclusive. Besides the enormous number of studies adopting various mathematical concepts (statistics, probability, modelling, PDEs, wavelets, fuzzy logic, etc.), there is a scarcity of review papers, which play an important role in the development and progress of research. This review paper therefore presents an overview of the different fundamental approaches that yield the various image-denoising techniques, organised under a new classification. Furthermore, the paper presents the evaluation tools needed to compare these techniques, in order to facilitate the treatment of the noise problem amid a great diversity of techniques and concepts.
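The evaluation tools the review refers to typically include full-reference fidelity metrics such as MSE and PSNR. As a minimal illustration (not taken from the paper itself), the two measures can be computed as follows:

```python
import numpy as np

def mse(original, denoised):
    """Mean squared error between two images of the same shape."""
    diff = original.astype(np.float64) - denoised.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(original, denoised, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the original."""
    err = mse(original, denoised)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / err)

# Toy example: an 8-bit-range image and a noisy copy
rng = np.random.default_rng(0)
clean = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
noisy = np.clip(clean + rng.normal(0, 10, clean.shape), 0, 255)
print(psnr(clean, noisy))  # PSNR in dB; larger noise sigma gives a lower value
```

SSIM-style structural metrics would be added the same way; libraries such as scikit-image provide reference implementations of all three.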

    A new approach of weighted gradient filter for denoising of medical images in the presence of Poisson noise

    We propose a weighted gradient filter for removing Poisson noise from medical X-ray images. Within a predefined window, the gradient of the centre pixel is computed, and a Gaussian weighting is applied to the calculated gradient values. The proposed method is applied to biomedical X-ray images and then to the standard test images Lena and Peppers. The results show that the proposed weighted gradient filter is efficient and yields better visual quality. Moreover, the method is computationally very efficient and faster than the Non-Local Means (NLM) filter, an advanced technique for Poisson noise removal. In terms of the performance measures correlation, Peak Signal-to-Noise Ratio (PSNR), Maximum Structural Similarity Index Measure (MSSIM) and Mean Square Error (MSE), the proposed method also outperforms the conventional Median, Wiener and NLM filters.
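A loose sketch of the idea described above, under the assumption that the filter replaces each pixel by the centre value plus a Gaussian-weighted average of the window's centre-relative gradients (the authors' exact formulation may differ):

```python
import numpy as np

def gaussian_kernel(size=3, sigma=1.0):
    """Normalised 2D Gaussian weights for a size x size window."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def weighted_gradient_filter(img, size=3, sigma=1.0):
    """Sketch of a weighted gradient filter: Gaussian-weight the gradients
    (neighbour minus centre) in each window and add the averaged gradient
    back to the centre pixel."""
    pad = size // 2
    k = gaussian_kernel(size, sigma)
    padded = np.pad(img.astype(np.float64), pad, mode="reflect")
    out = np.empty_like(img, dtype=np.float64)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            win = padded[i:i + size, j:j + size]
            centre = padded[i + pad, j + pad]
            grad = win - centre           # gradients relative to the centre pixel
            out[i, j] = centre + np.sum(k * grad)
    return out
```

On a constant region the gradients vanish and the pixel is left unchanged, which is consistent with the edge-preserving behaviour the abstract claims.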

    Sparse and Redundant Representations for Inverse Problems and Recognition

    Sparse and redundant representation of data enables the description of signals as linear combinations of a few atoms from a dictionary. In this dissertation, we study applications of sparse and redundant representations to inverse problems and object recognition. Furthermore, we propose two novel imaging modalities based on the recently introduced theory of Compressed Sensing (CS). This dissertation consists of four major parts. In the first part, we study a new type of deconvolution algorithm that estimates the image from a shearlet decomposition. Shearlets provide a multi-directional, multi-scale decomposition that has been mathematically shown to represent distributed discontinuities such as edges better than traditional wavelets. We develop a deconvolution algorithm that allows the approximate inverse operator to be controlled on a multi-scale and multi-directional basis. Furthermore, using generalized cross validation, we develop a method for automatically determining the noise-shrinkage threshold for each scale and direction without explicit knowledge of the noise variance. In the second part, we study a reconstruction method that recovers highly undersampled images assumed to have a sparse representation in a gradient domain, using partial measurement samples collected in the Fourier domain. Our method makes use of a robust generalized Poisson solver that yields significantly improved performance over similar methods. We demonstrate by experiments that this new technique is more flexible than its competitors, working with both random and restricted sampling scenarios.
In the third part, we introduce a novel Synthetic Aperture Radar (SAR) imaging modality that can provide a high-resolution map of the spatial distribution of targets and terrain using a significantly reduced number of transmitted and/or received electromagnetic waveforms. We demonstrate that this new imaging scheme requires no new hardware components and allows the aperture to be compressed. It also offers many new applications and advantages, including strong resistance to countermeasures and interception, imaging of much wider swaths, and reduced on-board storage requirements. The last part of the dissertation deals with object recognition based on learning dictionaries for simultaneous sparse signal approximation and feature extraction. A dictionary is learned for each object class from the given training examples by minimizing the representation error under a sparseness constraint. A novel test image is then projected onto the span of the atoms in each learned dictionary, and the residual vectors, along with the coefficients, are used for recognition. Applications to illumination-robust face recognition and automatic target recognition are presented.
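The recognition scheme in the last part, reduced to its core, classifies a test signal by the residual of its projection onto the span of each class dictionary. A toy sketch with random data (using a plain least-squares projection in place of a learned sparse dictionary, and hypothetical class names) might look like:

```python
import numpy as np

def residual(test, dictionary):
    """Distance from `test` to the span of the dictionary's columns,
    computed via a least-squares projection."""
    coeffs, *_ = np.linalg.lstsq(dictionary, test, rcond=None)
    return float(np.linalg.norm(test - dictionary @ coeffs))

def classify(test, dictionaries):
    """Pick the class whose dictionary reconstructs `test` best."""
    return min(dictionaries, key=lambda c: residual(test, dictionaries[c]))

rng = np.random.default_rng(1)
# Hypothetical toy data: each class's dictionary spans a random 5-dim subspace
dicts = {"faces": rng.normal(size=(20, 5)), "targets": rng.normal(size=(20, 5))}
sample = dicts["faces"] @ rng.normal(size=5)  # lies exactly in the "faces" span
print(classify(sample, dicts))
```

In the dissertation the dictionaries are learned with a sparseness constraint and the sparse coefficients also feed the classifier; the residual comparison above is only the skeleton of that pipeline.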

    Computer-Assisted Algorithms for Ultrasound Imaging Systems

    Ultrasound imaging works by transmitting ultrasound waves into the body and reconstructing images of internal organs from the strength of the echoes. It is considered safe and economical and can image organs in real time, which makes it a widely used diagnostic imaging modality in health care. Ultrasound imaging covers a broad spectrum of medical diagnostics, including examination of the kidney, liver and pancreas, fetal monitoring, etc. Currently, diagnosis through ultrasound scanning is clinic-centred, and patients who need an ultrasound scan have to visit a hospital. Ultrasound systems are thus constrained to hospitals and have not realised their potential in remote health-care and point-of-care diagnostics, owing to their large form factor, the shortage of sonographers, low signal-to-noise ratio, high diagnostic subjectivity, etc. In this thesis, we address these issues with the objective of making ultrasound imaging more reliable for point-of-care and remote health-care applications. To achieve this goal, we propose (i) computer-assisted algorithms to improve diagnostic accuracy and assist semi-skilled persons in scanning, (ii) speckle-suppression algorithms to improve the diagnostic quality of ultrasound images, (iii) a reliable telesonography framework to address the shortage of sonographers, and (iv) a programmable portable ultrasound scanner for point-of-care and remote health-care applications.
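As context for the speckle-suppression goal, a classic baseline is the Lee filter, which blends each pixel with its local mean, trusting the mean in flat regions and the raw pixel near edges. The sketch below is this standard filter, not the thesis's own algorithm:

```python
import numpy as np

def lee_filter(img, size=5, noise_var=0.05):
    """Classic Lee speckle filter (a standard baseline): adaptively blend
    each pixel with its local window mean according to local variance."""
    pad = size // 2
    padded = np.pad(img.astype(np.float64), pad, mode="reflect")
    out = np.empty_like(img, dtype=np.float64)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            win = padded[i:i + size, j:j + size]
            local_mean = win.mean()
            local_var = win.var()
            # Gain is ~0 in flat areas (full smoothing) and ->1 near edges
            gain = local_var / (local_var + noise_var)
            out[i, j] = local_mean + gain * (img[i, j] - local_mean)
    return out
```

Flat regions come out fully averaged while strong local contrast is largely preserved, which is the trade-off any diagnostic speckle filter has to manage.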

    Introduction to the Restoration of Astrophysical Images by Multiscale Transforms and Bayesian Methods

    This book is a collection of 19 articles reflecting the courses given at the Collège de France summer school "Reconstruction d'images − Applications astrophysiques", held in Nice and Fréjus, France, from June 18 to 22, 2012. The articles in this volume address emerging concepts and methods that are useful in the complex process of improving our knowledge of celestial objects, including Earth.

    Improved Wavelet Threshold for Image De-noising

    With the development of communication and network technology, and the rising popularity of digital electronic products, the image has become an important carrier of information about the outside world. However, images are vulnerable to noise interference during acquisition, transmission and storage, which decreases image quality. Image noise reduction is therefore necessary to obtain higher-quality images. Owing to its multi-resolution analysis, decorrelation, low entropy and flexible choice of bases, the wavelet transform has become a powerful tool in the field of image de-noising and has developed rapidly in applied mathematics. De-noising methods based on the wavelet transform have been proposed and have achieved good results, but shortcomings remain. Traditional threshold functions have deficiencies in image de-noising: a hard threshold function is discontinuous, whereas a soft threshold function introduces a constant bias. To address these shortcomings, this paper proposes a method for removing image noise. First, the method decomposes the noisy image to obtain the wavelet coefficients. Second, the high-frequency coefficients are threshold-processed using the improved threshold function. Finally, the de-noised image is reconstructed from the estimated coefficients. Experimental results show that the proposed method outperforms traditional hard-threshold and soft-threshold de-noising in terms of both objective measures and subjective visual quality.
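The threshold functions discussed above can be made concrete. The sketch below shows hard and soft thresholding, plus the non-negative garrote as one well-known compromise that is continuous yet has vanishing bias for large coefficients; the paper's own improved function is not reproduced here, so the garrote only stands in as an illustration:

```python
import numpy as np

def hard_threshold(w, t):
    """Hard thresholding: keep coefficients with |w| > t, zero the rest.
    Discontinuous at +/- t."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Soft thresholding: continuous, but shrinks every kept coefficient
    by t, introducing a constant bias."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def garrote_threshold(w, t):
    """Non-negative garrote: continuous like soft thresholding, but the
    shrinkage t**2 / w vanishes as |w| grows, reducing the bias."""
    out = np.zeros_like(w, dtype=np.float64)
    keep = np.abs(w) > t
    out[keep] = w[keep] - t ** 2 / w[keep]
    return out

w = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])
for f in (hard_threshold, soft_threshold, garrote_threshold):
    print(f.__name__, f(w, 1.0))
```

Comparing the three outputs on the same coefficients makes the discontinuity of the hard rule and the constant offset of the soft rule easy to see.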

    Review: Deep learning in electron microscopy

    Deep learning is transforming most areas of science and technology, including electron microscopy. This review offers a practical perspective aimed at developers with limited familiarity with the field. For context, we review popular applications of deep learning in electron microscopy. We then discuss the hardware and software needed to get started with deep learning and to interface with electron microscopes, before reviewing neural network components, popular architectures, and their optimization. Finally, we discuss future directions of deep learning in electron microscopy.

    All-sky radiative transfer and characterisation for cosmic structures

    This thesis provides a solid theoretical foundation, and the associated methodologies, for studies of cosmic magnetism and cosmological reionisation. It develops covariant formalisms of cosmological radiative transport for (i) polarised continuum radiation and (ii) the 21-cm line of neutral hydrogen, which calculate from first principles, respectively, the polarisation arising from the emergence and evolution of cosmic magnetic fields and the tomographic 21-cm line signals associated with cosmological reionisation. The two formalisms, cosmological polarised radiative transfer (CPRT) and cosmological 21-cm line radiative transfer (C21LRT), self-consistently account for the relevant radiation processes and the relativistic and cosmological effects along a ray transported through an expanding, evolving Universe. Their all-sky algorithms adopt a ray-tracing method and a post-processing approach, by which complex physical models, such as those obtained from cosmological simulations, can be incorporated into the radiative transfer calculations. The power of the CPRT calculations to compute unambiguous point-to-point polarisation of large-scale structures, such as a 3D simulated galaxy cluster and a modelled magnetised universe, is demonstrated. The ability of the C21LRT formulation to calculate 21-cm line spectra across cosmic time, fully accounting for the essential cosmological radiative-transfer effects, is verified. Furthermore, a new spherical curvelet transform is constructed for the efficient extraction of directional, elongated features within spherical data. It is particularly useful in wide-field astronomical research, such as analyses of continuum-polarisation data and of the structured 21-cm line from all-sky surveys or from the CPRT and C21LRT calculations.
The formulations, methodologies and techniques developed in this work together establish a solid framework within which reliable theoretical predictions and robust data characterisation can be made, ultimately laying a foundation for the meaningful physical interpretation of observations and for studying the structural evolution of the magnetic, ionised Universe.