96,750 research outputs found
Contrast Enhancement of Brightness-Distorted Images by Improved Adaptive Gamma Correction
As an efficient image contrast enhancement (CE) tool, adaptive gamma
correction (AGC) was previously proposed by relating the gamma parameter to
the cumulative distribution function (CDF) of the pixel gray levels within an
image. AGC deals well with most dimmed images, but fails for globally bright
images and for dimmed images with local bright regions. These two categories
of brightness-distorted images are common in real scenarios, arising for
example from improper exposure or white object regions. To attenuate such
deficiencies, we propose an improved AGC algorithm. The novel strategy of
negative images is used to realize CE of the bright images, and gamma
correction modulated by a truncated CDF is employed to enhance the dimmed
ones. As such, local over-enhancement and structure distortion can be
alleviated. Both qualitative and quantitative experimental results show that
the proposed method yields consistently good CE results.
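The CDF-modulated gamma and the negative-image strategy described above can be sketched as follows. This is a minimal illustration assuming the simple form gamma = 1 - CDF(l), a mean-intensity threshold for deciding an image is "bright", and a truncation factor `tau`; it is not the authors' exact algorithm.

```python
import numpy as np

def adaptive_gamma_correction(img, tau=3.0):
    """Sketch of CDF-modulated gamma correction with a negative-image
    strategy for bright inputs (assumed form, not the paper's exact
    method). `img` is a float array with values in [0, 1]."""
    # Negative-image strategy: a globally bright image is enhanced by
    # correcting its negative (a dimmed image), then inverting back.
    if img.mean() > 0.5:
        return 1.0 - adaptive_gamma_correction(1.0 - img, tau)

    # Histogram and truncated CDF of the pixel gray levels.
    hist, _ = np.histogram(img, bins=256, range=(0.0, 1.0))
    pdf = hist / hist.sum()
    pdf = np.minimum(pdf, tau * pdf.mean())  # clip dominant peaks
    cdf = np.cumsum(pdf / pdf.sum())

    # Per-level gamma modulated by the CDF: l -> l ** (1 - CDF(l)).
    levels = np.linspace(0.0, 1.0, 256)
    mapped = levels ** (1.0 - cdf)
    idx = np.clip((img * 255).astype(int), 0, 255)
    return mapped[idx]
```

Inverting a globally bright image turns it into a dimmed one that the CDF-modulated gamma handles well; inverting the corrected result recovers the enhanced bright image.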
Image quality assessment based on harmonics gain/loss information
We present an objective reduced-reference image quality assessment method based on harmonic gain/loss information obtained through a discriminative analysis of the local harmonic strength (LHS). The LHS is computed from the gradient of the image, and its value indicates the relative degree of blockiness in the image when it is related to energy gain within the image. Furthermore, comparing local harmonic strength values from an original, distortion-free image and a degraded, processed, or compressed version of that image shows that the LHS can also indicate other types of degradation, such as blurriness, which corresponds to energy loss. Our simulations show that we can develop a single metric based on this gain/loss information and use it to rate the quality of images encoded by various encoders, such as DCT-based JPEG or wavelet-based JPEG 2000, as well as variously processed images. We show that our method can overcome some limitations of the traditional PSNR.
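As a rough illustration of how a gradient-based blockiness measure of this kind can be computed, the sketch below sums spectral energy at the harmonics of the 8-pixel coding-block period in the gradient image. The paper's exact LHS formulation may differ; the analysis-block size, the horizontal-gradient choice, and the row-averaged profile are assumptions.

```python
import numpy as np

def local_harmonic_strength(img, block=32, period=8):
    """Hedged sketch of a gradient-based local harmonic strength:
    blockiness from period-8 coding blocks appears as spectral peaks
    at multiples of block // period in each analysis block."""
    # Horizontal gradient highlights vertical block boundaries.
    grad = np.abs(np.diff(img.astype(float), axis=1))
    h, w = grad.shape
    step = block // period  # spacing of harmonic bins in the FFT

    strengths = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            # Average rows into one profile, then inspect its spectrum.
            profile = grad[y:y + block, x:x + block].mean(axis=0)
            spec = np.abs(np.fft.rfft(profile))
            harmonics = np.arange(step, len(spec), step)
            strengths.append(spec[harmonics].sum())
    return float(np.mean(strengths))
```

On an image with constant 8x8 blocks, the gradient is an impulse train with period 8, so the summed harmonic bins are large; on a smooth gradient ramp, the spectrum concentrates at DC and the harmonic sum is near zero.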
Image Quality Assessment Based on Detail Differences
This paper presents a novel Full Reference method for
image quality assessment based on two indices that measure,
respectively, detail loss and spurious detail addition. These
indices define a two-dimensional (2D) state in a Virtual
Cognitive State (VCS) space. The quality estimate is
obtained as a 2D function of the VCS, empirically
determined via polynomial fitting of the DMOS values of
training images. The method simultaneously provides
highly accurate DMOS estimates and a quantitative
account of the causes of quality degradation.
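The polynomial fit of DMOS over the 2D VCS space can be sketched with ordinary least squares. The polynomial degree and monomial basis below are assumptions for illustration, not the paper's exact choice.

```python
import numpy as np

def fit_vcs_polynomial(loss, addition, dmos, degree=2):
    """Hedged sketch: fit DMOS as a 2D polynomial of the two VCS
    coordinates (detail loss, spurious detail addition) by least
    squares over a set of training images."""
    # Design matrix with all monomials x^i * y^j for i + j <= degree.
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]
    A = np.column_stack([loss**i * addition**j for i, j in terms])
    coef, *_ = np.linalg.lstsq(A, dmos, rcond=None)

    def predict(l, a):
        # Evaluate the fitted 2D polynomial at a new VCS point.
        return sum(c * l**i * a**j for c, (i, j) in zip(coef, terms))
    return predict
```

Given the fitted function, estimating quality for a new image reduces to computing its two detail indices and evaluating the polynomial at that VCS point.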
No-reference Image Denoising Quality Assessment
A wide variety of image denoising methods are available now. However, the
performance of a denoising algorithm often depends on the individual input
noisy image as well as on its parameter setting. In this paper, we present a
no-reference image denoising quality assessment method that can be used to
select, for an input noisy image, the right denoising algorithm with the
optimal parameter setting. This is a challenging task, as no ground truth is
available. This paper presents a data-driven approach that learns to predict
image denoising quality. Our method is based on the observation that while
individual existing quality metrics and denoising models alone cannot
robustly rank denoising results, they often complement each other. We
accordingly design denoising quality features based on these existing metrics
and models and then use Random Forests regression to aggregate them into a
more powerful unified metric. Our experiments on images with various types
and levels of noise show that our no-reference denoising quality assessment
method significantly outperforms state-of-the-art quality metrics. This paper
also provides a method that leverages our quality assessment method to
automatically tune the parameter settings of a denoising algorithm for an
input noisy image to produce an optimal denoising result.
Comment: 17 pages, 41 figures, accepted by Computer Vision Conference (CVC)
201
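The aggregation step (combining scores from several complementary metrics and models into one unified quality prediction with Random Forests regression) can be sketched as follows. The features here are synthetic stand-ins for the paper's actual feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in features: each row holds three noisy "metric"
# readings of a latent quality score for one denoised image.
rng = np.random.default_rng(0)
quality = rng.uniform(0.0, 1.0, 200)             # latent denoising quality
features = quality[:, None] + rng.normal(0.0, 0.1, (200, 3))

# Aggregate the weak, complementary features into one unified metric.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(features, quality)

pred = model.predict(features[:5])               # unified quality scores
```

In the paper's setting, the features would instead be responses of existing quality metrics and denoising models on a candidate result, and the learned predictor could then drive the search over parameter settings for the optimal denoising output.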