
    Astronomical image manipulation in the transform domain

    It is well known that images are usually stored and transmitted in compressed form to save memory space and I/O bandwidth. Among the many image compression schemes, transform coding is a widely used method. Traditionally, processing a compressed image requires decompression first; after the manipulations, the processed image is compressed again for storage. To reduce computational complexity and processing time, manipulating images in the semi-compressed, or transform, domain is an efficient solution. Many astronomical images are compressed and stored by JPEG and HCOMPRESS, which are based on the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT), respectively. In this thesis, a suite of image processing algorithms in the transform domain, DCT and DWT, is developed. In particular, new methods for edge enhancement and for minimum (MIN)/maximum (MAX) gray-scale intensity estimation in the DCT domain are proposed. Algebraic operations and image interpolation in the DWT domain are also addressed. The superiority of the new algorithms over conventional ones is demonstrated by comparing the time complexity and the quality of images processed in the transform domain with those processed in the spatial domain.
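
    Because the DCT is linear, algebraic operations of the kind described above can be carried out directly on stored coefficients. The following Python sketch (using scipy.fft; the images and blend weights are illustrative stand-ins, not the thesis's data or algorithms) shows a pixel-wise blend of two images computed entirely in the DCT domain:

        import numpy as np
        from scipy.fft import dctn, idctn

        rng = np.random.default_rng(0)
        a = rng.random((64, 64))   # hypothetical decoded frame 1
        b = rng.random((64, 64))   # hypothetical decoded frame 2

        # Coefficients as a DCT-based codec would store them.
        A = dctn(a, norm="ortho")
        B = dctn(b, norm="ortho")

        # Linearity of the DCT: blend on the coefficients directly,
        # skipping the decompress-process-recompress cycle.
        C = 0.5 * A + 0.5 * B
        c = idctn(C, norm="ortho")

        assert np.allclose(c, 0.5 * a + 0.5 * b)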

    Image interpolation and denoising in discrete wavelet transform domain

    Traditionally, processing a compressed image requires decompression first; after the related manipulations, the processed image is compressed again for storage. To reduce computational complexity and processing time, manipulating images directly in the transform domain is an efficient solution. Uniform wavelet thresholding is one of the most widely used methods for image denoising in the Discrete Wavelet Transform (DWT) domain. This method, however, has the drawback of blurring the edges and textures of an image after denoising. A new algorithm is proposed in this thesis for image denoising in the DWT domain without this blurring effect. The algorithm uses a suite of feature extraction and image segmentation techniques to construct filter masks for denoising. Its novelty is that it extracts the edges and texture details of an image directly from the spatial information contained in the LL subband of the DWT domain, rather than detecting edges across multiple scales; an added advantage of this approach is a substantial reduction in computational complexity. Experimental results indicate that the new algorithm yields higher-quality images, both qualitatively and quantitatively, than existing methods. This thesis also presents a new algorithm for image interpolation in the DWT domain. Unlike other interpolation methods, which focus on the Haar wavelet, the new algorithm also investigates other wavelets, such as Daubechies and Bior. Experimental results indicate that the new algorithm is superior to traditional methods in both time complexity and quality of the processed image.
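
    For reference, the baseline this thesis improves on, uniform soft thresholding of all detail subbands, can be sketched in a few lines with PyWavelets. The wavelet name, threshold value and test image below are illustrative assumptions, not values taken from the thesis:

        import numpy as np
        import pywt

        rng = np.random.default_rng(1)
        clean = np.outer(np.hanning(128), np.hanning(128))
        noisy = clean + 0.1 * rng.standard_normal(clean.shape)

        # Three-level 2-D DWT; "bior4.4" stands in for the Bior family
        # mentioned above.
        coeffs = pywt.wavedec2(noisy, "bior4.4", level=3)

        # Uniform soft threshold on every detail subband -- the method
        # whose edge blurring the proposed algorithm avoids.
        thr = 0.1
        coeffs = [coeffs[0]] + [
            tuple(pywt.threshold(d, thr, mode="soft") for d in level)
            for level in coeffs[1:]
        ]
        denoised = pywt.waverec2(coeffs, "bior4.4")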

    Finding faint HI structure in and around galaxies: scraping the barrel

    Soon-to-be-operational HI survey instruments such as APERTIF and ASKAP will produce large datasets. These surveys will provide information about the HI in and around hundreds of galaxies with a typical signal-to-noise ratio of ∼10 in the inner regions and ∼1 in the outer regions. In addition, such surveys will make it possible to probe faint HI structures, typically located in the vicinity of galaxies, such as extra-planar gas, tails and filaments. These structures are crucial for understanding galaxy evolution, particularly when they are studied in relation to the local environment. Our aim is to find optimized kernels for the discovery of faint and morphologically complex HI structures. Therefore, using HI data from a variety of galaxies, we explore state-of-the-art filtering algorithms. We show that the intensity-driven gradient filter, due to its adaptive characteristics, is the optimal choice. In fact, this filter requires only minimal tuning of the input parameters to enhance the signal-to-noise ratio of faint components. In addition, it does not degrade the resolution of the high signal-to-noise component of a source. The filtering process must be fast and embedded in an interactive visualization tool in order to support fast inspection of a large number of sources. To achieve such interactive exploration, we implemented a multi-core CPU (OpenMP) and a GPU (OpenGL) version of this filter in a 3D visualization environment (SlicerAstro). Comment: 17 pages, 9 figures, 4 tables. Astronomy and Computing, accepted.
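
    The intensity-driven gradient filter belongs to the family of edge-preserving diffusion filters. As a rough illustration of the idea (this is the classical Perona-Malik scheme, not the exact filter implemented in SlicerAstro, and all parameter values are arbitrary), smoothing is suppressed wherever the local gradient is strong:

        import numpy as np

        def anisotropic_diffusion(img, n_iter=20, kappa=0.1, gamma=0.2):
            """Perona-Malik diffusion: smooth faint extended emission
            while leaving strong gradients (bright structure) intact."""
            u = img.astype(float).copy()
            for _ in range(n_iter):
                # Nearest-neighbour differences in four directions
                # (periodic boundaries via np.roll, for brevity).
                diffs = [np.roll(u, s, axis=ax) - u
                         for ax in (0, 1) for s in (1, -1)]
                # Edge-stopping conductance: ~1 in flat regions,
                # ~0 across strong gradients.
                u += gamma * sum(np.exp(-(d / kappa) ** 2) * d
                                 for d in diffs)
            return u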

    Robust sparse image reconstruction of radio interferometric observations with purify

    Next-generation radio interferometers, such as the Square Kilometre Array (SKA), will revolutionise our understanding of the universe through their unprecedented sensitivity and resolution. However, to realise these goals, significant challenges in image and data processing need to be overcome. The standard methods in radio interferometry for reconstructing images, such as CLEAN, have served the community well over the last few decades and have survived largely because they are pragmatic. However, they produce reconstructed interferometric images that are limited in quality and scalability for big data. In this work we apply and evaluate alternative interferometric reconstruction methods that make use of state-of-the-art sparse image reconstruction algorithms motivated by compressive sensing, which have been implemented in the PURIFY software package. In particular, we implement and apply the proximal alternating direction method of multipliers (P-ADMM) algorithm presented in a recent article. First, we assess the impact of the interpolation kernel used to perform gridding and degridding on sparse image reconstruction. We find that the Kaiser-Bessel interpolation kernel performs as well as prolate spheroidal wave functions, while providing a computational saving and an analytic form. Second, we apply PURIFY to real interferometric observations from the Very Large Array (VLA) and the Australia Telescope Compact Array (ATCA) and find images recovered by PURIFY are higher quality than those recovered by CLEAN. Third, we discuss how PURIFY reconstructions exhibit additional advantages over those recovered by CLEAN. The latest version of PURIFY, with developments presented in this work, is made publicly available. Comment: 22 pages, 10 figures, PURIFY code available at http://basp-group.github.io/purif
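
    The core idea behind these sparse reconstruction methods can be illustrated with a toy proximal-gradient (ISTA) recovery of a sparse signal from underdetermined measurements. This is a deliberately simplified sketch with a random measurement matrix, not PURIFY's P-ADMM algorithm or its gridded interferometric measurement operator:

        import numpy as np

        def soft(x, t):
            """Proximal operator of the l1 norm (soft thresholding)."""
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        rng = np.random.default_rng(2)
        n, m, k = 256, 96, 8   # signal length, measurements, sparsity
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

        A = rng.standard_normal((m, n)) / np.sqrt(m)  # toy operator
        y = A @ x_true + 0.01 * rng.standard_normal(m)

        # ISTA: gradient step on the data fidelity, soft-threshold
        # step on the l1 sparsity prior.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
        x, lam = np.zeros(n), 0.02
        for _ in range(500):
            x = soft(x - step * (A.T @ (A @ x - y)), step * lam)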

    Techniques for enhancing digital images

    The images obtained from either research studies or optical instruments are often corrupted with noise. Image denoising involves the manipulation of image data to produce a visually high-quality image. This thesis reviews the existing denoising algorithms and the filtering approaches available for enhancing images and/or data transmission. Spatial-domain and transform-domain digital image filtering algorithms have been used in the past to suppress different noise models, which can be either additive or multiplicative. Selection of the denoising algorithm is application dependent: it is necessary to know which noise is present in the image in order to select the appropriate denoising algorithm. Noise models include Gaussian noise, salt-and-pepper noise, speckle noise and Brownian noise. The Wavelet Transform is similar to the Fourier Transform but with a completely different merit function: in the Wavelet Transform, the basis functions (wavelets) are localized in both time and frequency, whereas in the standard Fourier Transform the basis functions (sinusoids) are localized only in frequency. Wavelet analysis consists of breaking up the signal into shifted and scaled versions of the original (or mother) wavelet. The Wiener filter (minimum mean-squared estimation error) finds implementations as an LMS filter (least mean squares), an RLS filter (recursive least squares), or a Kalman filter. Quantitative comparison (metrics) of the denoising algorithms is provided by calculating the Peak Signal-to-Noise Ratio (PSNR), the Mean Square Error (MSE) and the Mean Absolute Error (MAE) evaluation factors. A combination of metrics including the PSNR, MSE, and MAE is often required to clearly assess model performance.
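
    The three metrics named above are straightforward to compute; a minimal sketch follows (assuming 8-bit images, hence a peak value of 255):

        import numpy as np

        def mse(ref, img):
            """Mean Square Error between reference and test image."""
            return float(np.mean((ref.astype(float) - img.astype(float)) ** 2))

        def mae(ref, img):
            """Mean Absolute Error between reference and test image."""
            return float(np.mean(np.abs(ref.astype(float) - img.astype(float))))

        def psnr(ref, img, peak=255.0):
            """Peak Signal-to-Noise Ratio in dB."""
            m = mse(ref, img)
            return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)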

    Multi-scale and Multi-directional VLBI Imaging with CLEAN

    Very long baseline interferometry (VLBI) is a radio-astronomical technique in which the correlated signals from various baselines are combined into an image of the highest angular resolution. Due to the sparsity of the measurements, this imaging procedure constitutes an ill-posed inverse problem. For decades the CLEAN algorithm was the standard choice in VLBI studies, despite serious disadvantages and pathologies that are challenged by the requirements of modern frontline VLBI applications. We develop a novel multi-scale CLEAN deconvolution method (DoB-CLEAN), based on continuous wavelet transforms, that addresses several pathologies of CLEAN imaging. We benchmark this novel algorithm against CLEAN reconstructions on synthetic data and reanalyze BL Lac observations of RadioAstron with DoB-CLEAN. DoB-CLEAN approaches the image with multi-scale and multi-directional wavelet dictionaries. Two different dictionaries are used: first, a dictionary of differences of elliptical spherical Bessel functions, fitted to the uv-coverage of the observation, which is used to sparsely represent the features in the dirty image; second, a dictionary of differences of elliptical Gaussian wavelets, which is well suited to represent relevant image features cleanly. The deconvolution is performed by switching between the dictionaries. DoB-CLEAN achieves super-resolution compared to CLEAN and remedies the spurious regularization properties of CLEAN. In contrast to CLEAN, the representation by basis functions has a physical meaning, so the computed deconvolved image still fits the observed visibilities, as opposed to CLEAN. State-of-the-art multi-scale imaging approaches appear to outperform single-scale standard approaches in VLBI and are well suited to maximize the extraction of information in ongoing frontline VLBI applications. Comment: Accepted for publication in A&A
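
    For contrast with DoB-CLEAN, the single-scale baseline it improves on, Högbom CLEAN, reduces to a short loop: find the residual peak, subtract a scaled, shifted copy of the point spread function, repeat. A minimal sketch (parameters are illustrative; real implementations add windowing, major cycles and restoring beams):

        import numpy as np

        def hogbom_clean(dirty, psf, gain=0.1, n_iter=200, threshold=0.0):
            """Single-scale Hogbom CLEAN on a dirty image."""
            res = dirty.astype(float).copy()
            model = np.zeros_like(res)
            cy, cx = np.unravel_index(np.argmax(psf), psf.shape)
            for _ in range(n_iter):
                py, px = np.unravel_index(np.argmax(np.abs(res)), res.shape)
                peak = res[py, px]
                if abs(peak) <= threshold:
                    break
                model[py, px] += gain * peak
                # Subtract the PSF shifted so its peak sits on (py, px),
                # clipped to the overlap between image and PSF.
                dy, dx = py - cy, px - cx
                y0, y1 = max(0, dy), min(res.shape[0], psf.shape[0] + dy)
                x0, x1 = max(0, dx), min(res.shape[1], psf.shape[1] + dx)
                res[y0:y1, x0:x1] -= gain * peak * psf[y0 - dy:y1 - dy,
                                                       x0 - dx:x1 - dx]
            return model, res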

    WAVELET PACKET POWER SPECTRUM OF THE SDSS LYMAN-ALPHA FOREST: A TOOL FOR LARGE-SCALE STRUCTURE DETECTION

    One of the goals of astrophysics is to obtain a full understanding of how the Universe is organized on large scales and how structure evolved. In this thesis we develop a method of detecting structure on Mpc scales by measuring the one-dimensional power spectrum of the transmitted flux in the Lyman-alpha forest. The method is based on the wavelet packet transform (WPT), which has several advantages over the Fourier transform. These include reduced noise, resulting in less data manipulation and scrubbing in the early stages of analysis, and localization of outliers in the data, which allows the general trend of the power spectrum to be revealed despite potentially problematic data. We apply the method to the set of 54,468 quasar spectra from the third phase of the Sloan Digital Sky Survey (SDSS-III) Baryon Oscillation Spectroscopic Survey (BOSS) data release 9 (DR9) catalog. This is intended as a proof of concept to determine whether the wavelet packet power spectrum is a valid technique for extracting the power spectrum in order to detect matter density fluctuations. Results are in good agreement with previous studies that used conventional Fourier techniques: the power spectrum versus velocity plots show increasing power at smaller scales both in our results and in the earlier studies of [21] and [6]. We conclude that the wavelet packet power spectrum is a tool for detecting structure from the transmitted flux in the Lyman-alpha forest. Its advantages over the Fourier transform method are that it requires less data manipulation and that it minimizes noise and the propagation of errors and outliers in the data. As a next step we propose applying the tool to the larger, more recent SDSS-IV eBOSS dataset.
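
    A wavelet packet power spectrum of a one-dimensional flux array can be computed with PyWavelets: decompose to a fixed level and average the squared coefficients in each equal-width frequency band. The signal, wavelet and level below are illustrative assumptions, not the thesis's choices:

        import numpy as np
        import pywt

        rng = np.random.default_rng(3)
        t = np.arange(2048)
        flux = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(t.size)

        # Full wavelet packet decomposition to a fixed level; each node
        # at that level covers an equal-width frequency band.
        level = 6
        wp = pywt.WaveletPacket(flux, wavelet="db4",
                                mode="symmetric", maxlevel=level)
        nodes = wp.get_level(level, order="freq")  # sorted by frequency

        # Mean squared coefficient per node ~ band-integrated power.
        power = np.array([np.mean(node.data ** 2) for node in nodes])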

    A study of wavelet-based noise reduction techniques in mammograms

    Breast cancer is one of the most common cancers and claims over one thousand lives every day. Breast cancer turns fatal only when diagnosed in late stages, but can be cured when diagnosed early. Over the last two decades, digital mammography has served the diagnosis of breast cancer and is a very powerful aid for its early detection. However, the images produced by mammography typically contain a great amount of noise arising from the inherent characteristics of the imaging system and the radiation involved. Shot noise, or quantum noise, is the most significant of these; it emerges from the uneven distribution of incident photons on the receptor. Because of the risk of exposure, the X-ray dose given to patients must be minimized, and this noise manifests itself more strongly when the X-ray dose is lower, so it needs to be treated before the mammogram is enhanced for contrast and clarity. Several approaches have been taken to reduce the amount of noise in mammograms. This thesis presents a study of the wavelet-based techniques employed for noise reduction in mammograms --Abstract, page iii
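
    One standard ingredient of such wavelet-based schemes is estimating the noise level from the finest diagonal (HH) subband before choosing a threshold. Below is a minimal sketch of the classical Donoho-Johnstone estimate and the universal (VisuShrink) threshold, shown as representative techniques rather than the thesis's specific methods:

        import numpy as np
        import pywt

        def estimate_sigma(img):
            """Noise sigma from the median absolute deviation of the
            finest diagonal wavelet coefficients (MAD / 0.6745)."""
            _, (_, _, hh) = pywt.dwt2(img.astype(float), "db2")
            return np.median(np.abs(hh)) / 0.6745

        def universal_threshold(img):
            # VisuShrink: sigma * sqrt(2 ln N), N = number of pixels.
            return estimate_sigma(img) * np.sqrt(2 * np.log(img.size))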