
    Binary sampling ghost imaging: add random noise to fight quantization caused image quality decline

    When the sampling data of ghost imaging are recorded with fewer bits, i.e., quantized more coarsely, a decline in image quality is observed: the fewer bits used, the worse the image. Dithering, which adds suitable random noise to the raw data before quantization, is shown to compensate effectively for this decline, even in the extreme binary (1-bit) sampling case. A brief explanation and a parameter optimization of dithering are given. Comment: 8 pages, 7 figures
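The core idea described above — adding zero-mean noise before a coarse quantizer so that averaging recovers the lost precision — can be sketched in a few lines of Python. This is a generic dithered-quantization demonstration, not the paper's ghost-imaging reconstruction; the ramp signal, noise range, and repeat count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_1bit(x, threshold):
    # Extreme binary (1-bit) quantization against a fixed threshold.
    return (x >= threshold).astype(float)

# Hypothetical smooth measurement, standing in for raw sampling data.
signal = np.linspace(0.0, 1.0, 1000)
threshold = 0.5

# Plain binary sampling: every value collapses to 0 or 1.
plain = quantize_1bit(signal, threshold)

# Dithered binary sampling: add zero-mean uniform noise before quantizing,
# then average many quantized realizations; the mean tracks the signal.
n_repeats = 2000
dithered = np.mean(
    [quantize_1bit(signal + rng.uniform(-0.5, 0.5, signal.shape), threshold)
     for _ in range(n_repeats)],
    axis=0,
)

mse_plain = np.mean((plain - signal) ** 2)
mse_dithered = np.mean((dithered - signal) ** 2)
print(f"MSE without dither: {mse_plain:.4f}")
print(f"MSE with dither:    {mse_dithered:.4f}")
```

With uniform dither spanning the threshold, the probability that a sample exceeds the threshold equals the sample's value, so averaging repeated 1-bit measurements reconstructs the signal — the same mechanism that lets dithering fight quantization-caused quality decline.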

    Digital enhancement of multispectral MSS data for maximum image visibility

    A systematic approach to the enhancement of images has been developed. It exploits the two principal features involved in observing images: the properties of human vision and the statistics of the images being observed. The rationale of the enhancement procedure is as follows: when observing features of interest in an image, the range of objective luminance-chrominance values displayed is generally limited and does not fill the whole perceptual range of the observer's vision. The purpose of the enhancement technique is to expand and distort, in a systematic way, the grey-scale values of each of the multispectral bands making up a color composite, so as to enhance the average visibility of the features being observed.
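A minimal sketch of the "expand the grey scale to fill the display range" idea, applied per band as the abstract describes. This uses a simple linear percentile stretch; the paper's method also distorts the scale according to vision statistics, which is not modeled here, and the band data and percentile choices are assumptions.

```python
import numpy as np

def stretch_band(band, low_pct=2, high_pct=98):
    """Expand a band's grey values so the chosen percentile range fills
    the full 0-255 display range (linear stretch; outliers are clipped)."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    stretched = (band.astype(float) - lo) / max(hi - lo, 1e-9)
    return np.clip(stretched * 255.0, 0, 255).astype(np.uint8)

# Hypothetical narrow-range MSS band: values occupy only 60..90,
# a small slice of the available 0..255 display range.
rng = np.random.default_rng(1)
band = rng.integers(60, 91, size=(64, 64)).astype(np.uint8)
out = stretch_band(band)
print(band.min(), band.max(), "->", out.min(), out.max())
```

Applying the same stretch independently to each multispectral band before compositing increases the perceptual separation of features, which is the effect the enhancement procedure targets.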

    Blockwise Transform Image Coding Enhancement and Edge Detection

    The goal of this thesis is high-quality image coding, enhancement and edge detection. A unified approach using novel fast transforms is developed to achieve all three objectives. The requirements are low bit rate, low implementation complexity and parallel processing. The last requirement is achieved by processing the image in small blocks such that all blocks can be processed simultaneously; this is similar to biological vision. A major issue is to minimize the resulting block effects, which is done by using proper transforms and possibly an overlap-save technique. The bit rate in image coding is minimized by developing new results in optimal adaptive multistage transform coding. Newly developed fast trigonometric transforms are also utilized and compared for transform coding, image enhancement and edge detection. Both image enhancement and edge detection involve generalised bandpass filtering with fast transforms. The algorithms have been developed with special attention to the properties of biological vision systems.
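The blockwise processing the thesis describes — each small block transformed independently, so all blocks can be processed in parallel — can be sketched as follows. The standard orthonormal DCT-II stands in here for the thesis's own fast trigonometric transforms, and the block size and coefficient-retention scheme are illustrative assumptions; the overlap-save step against block effects is not shown.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix (rows are frequencies).
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(
        np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def blockwise_code(img, block=8, keep=3):
    """Transform each block independently (parallelizable), zero all but
    the keep x keep low-order coefficients, and inverse transform."""
    c = dct_matrix(block)
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            coef = c @ img[i:i+block, j:j+block] @ c.T
            mask = np.zeros_like(coef)
            mask[:keep, :keep] = 1.0
            out[i:i+block, j:j+block] = c.T @ (coef * mask) @ c
    return out

rng = np.random.default_rng(2)
img = rng.random((32, 32))
recon = blockwise_code(img, keep=8)  # keeping all coefficients is lossless
assert np.allclose(recon, img)
```

Because the transform is orthonormal, keeping all coefficients reconstructs the block exactly, while discarding high-order coefficients trades bit rate against block-boundary artifacts — the block effects the thesis works to minimize.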

    Noise-Enhanced Information Systems

    Noise, traditionally defined as an unwanted signal or disturbance, has been shown to play an important constructive role in many information processing systems and algorithms. This noise enhancement has been observed and employed in many physical, biological, and engineered systems. Indeed, stochastic facilitation (SF) has been found critical for certain biological information functions, such as the detection of weak subthreshold stimuli or suprathreshold signals, through both experimental verification and analytical model simulations. In this paper, we present a systematic noise-enhanced information processing framework to analyze and optimize the performance of engineered systems. System performance is evaluated not only in terms of signal-to-noise ratio but also in terms of other, more relevant metrics such as probability of error for signal detection or mean square error for parameter estimation. As an important new instance of SF, we also discuss the constructive effect of noise in associative memory recall. Potential enhancement of image processing systems via the addition of noise is discussed, with important applications in biomedical image enhancement, image denoising, and classification.
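The stochastic-facilitation effect mentioned above — noise making a subthreshold stimulus detectable — can be demonstrated with a toy threshold detector. This is a generic stochastic-resonance illustration, not the paper's framework; the stimulus amplitude, threshold, and noise level are assumptions chosen so the signal is invisible without noise.

```python
import numpy as np

rng = np.random.default_rng(3)

# Subthreshold stimulus: amplitude 0.8 against a hard threshold at 1.0,
# so a noiseless threshold detector never fires.
t = np.linspace(0, 4 * np.pi, 400)
signal = 0.8 * (np.sin(t) > 0)  # weak binary stimulus
threshold = 1.0

def detector_output(noise_std, trials=500):
    """Average firing rate of the threshold detector over noisy trials."""
    fires = [(signal + rng.normal(0, noise_std, signal.shape)) >= threshold
             for _ in range(trials)]
    return np.mean(fires, axis=0)

def correlation_with_signal(noise_std):
    out = detector_output(noise_std)
    return np.corrcoef(out, signal)[0, 1]

# Without noise the detector output is flat and carries no information;
# with a moderate amount of added noise, the firing rate tracks the stimulus.
quiet = np.std(detector_output(0.0))
print("output variation without noise:", quiet)
print("correlation with moderate noise:",
      round(correlation_with_signal(0.5), 3))
```

Noise lifts the stimulus across the threshold more often when the stimulus is high than when it is low, so the average firing rate becomes an informative readout — the constructive role of noise the paper formalizes with detection- and estimation-theoretic metrics.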

    Study and simulation of low rate video coding schemes

    The semiannual report is included. Topics covered include communication, information science, data compression, remote sensing, color-mapped images, a robust coding scheme for packet video, recursively indexed differential pulse code modulation, an image compression technique for use on token ring networks, and joint source/channel coder design.

    Picture coding in viewdata systems

    Viewdata systems in commercial use at present offer the facility to transmit alphanumeric text and graphic displays via the public switched telephone network. An enhancement to the system would be to transmit true video images instead of graphics. Such a system, under development in Britain at present, uses Differential Pulse Code Modulation (DPCM) and a transmission rate of 1200 bits/sec. Error protection is achieved by the use of error protection codes, which increase the channel requirement. In this thesis, error detection and correction of DPCM-coded video signals without the use of channel error protection is studied. The scheme operates entirely at the receiver, examining the local statistics of the received data to determine the presence of errors. Error correction is then undertaken by interpolation from adjacent correct or previously corrected data. DPCM coding of pictures has the inherent disadvantages of a slow build-up of the displayed picture at the receiver and difficulty with image size manipulation. In order to fit the pictorial information into a viewdata page, its size has to be reduced. Unitary transforms, typically the discrete Fourier transform (DFT), the discrete cosine transform (DCT) and the Hadamard transform (HT), enable lowpass filtering and decimation to be carried out in a single operation in the transform domain. Size reductions of different orders are considered, and the merits of the DFT, DCT and HT are investigated. With limited channel capacity, it is desirable to remove the redundancy present in the source picture in order to reduce the bit rate. Orthogonal transformation decorrelates the spatial sample distribution and packs most of the image energy into the low-order coefficients. This property is exploited in bit-reduction schemes which adapt to the local statistics of the different source pictures used. In some cases, bit rates of less than 1.0 bit/pel are achieved with satisfactory received picture quality.
Unlike DPCM systems, transform coding has the advantage of being able to display rapidly a low-resolution picture by initially inverse-transforming the low-order coefficients only. Picture resolution is then progressively built up as more coefficients are received and decoded. Different sequences of picture update are investigated to find the one that achieves the best subjective quality with the fewest coefficients transmitted.
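The DPCM scheme at the heart of the system above — transmit quantized prediction errors, with the encoder predicting from its own reconstructed values so encoder and decoder stay in lockstep — can be sketched for a single scan line. This is a minimal textbook DPCM codec, not the thesis's system; the previous-sample predictor, quantizer step, and sample values are assumptions, and the receiver-side error detection and interpolation are not shown.

```python
import numpy as np

def dpcm_encode(line, quantizer_step=4):
    """1-D DPCM along a scan line: transmit quantized differences from
    the previous *reconstructed* sample, so quantization error does not
    accumulate at the decoder."""
    pred = 0.0
    codes, recon = [], []
    for x in line:
        diff = x - pred
        q = int(np.round(diff / quantizer_step))  # quantized prediction error
        codes.append(q)
        pred = pred + q * quantizer_step          # decoder-matched prediction
        recon.append(pred)
    return codes, np.array(recon)

def dpcm_decode(codes, quantizer_step=4):
    pred = 0.0
    out = []
    for q in codes:
        pred = pred + q * quantizer_step
        out.append(pred)
    return np.array(out)

line = np.array([10, 12, 15, 20, 26, 30, 29, 27], dtype=float)
codes, recon = dpcm_encode(line)
decoded = dpcm_decode(codes)
assert np.allclose(decoded, recon)  # encoder and decoder agree exactly
print("transmitted codes:", codes)
print("max reconstruction error:", np.max(np.abs(decoded - line)))
```

Because only small differences are transmitted, the codes need few bits per sample; but a single corrupted code desynchronizes every following sample in the line, which is why the thesis's receiver-only error detection and interpolation matter for an unprotected 1200 bits/sec channel.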

    Implementation issues in source coding

    An edge-preserving image coding scheme that can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars Observer spectral data, and it can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator; the coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.

    Geologic studies of Yellowstone National Park imagery using an electronic image enhancement system

    The image enhancement system is described, as are the kinds of enhancement attained. Results were obtained from various kinds of remote sensing imagery (mainly black-and-white multiband, color, color infrared, thermal infrared, and side-looking K-band radar) of parts of Yellowstone National Park. Possible additional fields of application of these techniques are considered.