Gaussian mixture model-based contrast enhancement
In this study, a method for enhancing low-contrast images is proposed. This method, called Gaussian mixture model-based contrast enhancement (GMMCE), brings into play the Gaussian mixture modelling of histograms to model the content of the images. On the basis of the fact that each homogeneous area in natural images has a Gaussian-shaped histogram, it decomposes the narrow histogram of low-contrast images into a set of scaled and shifted Gaussians. The individual histograms are then stretched by increasing their variance parameters, and are diffused over the entire histogram by scattering their mean parameters, to build a broad version of the histogram. The number of Gaussians as well as their parameters are optimised to set up a Gaussian mixture model with the lowest approximation error and highest similarity to the original histogram. Compared with existing histogram-based methods, the experimental results show that GMMCE-enhanced pictures are mostly consistent in quality and outperform other benchmark methods. Additionally, the computational complexity analysis shows that GMMCE is a low-complexity method.
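The decompose-then-stretch idea described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: it fits a two-component mixture (via scikit-learn's `GaussianMixture`) to a synthetic low-contrast intensity distribution, then scatters the component means over a wider range and stretches each component's deviations; the stretch factor and target range are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hedged sketch of the GMMCE idea (not the paper's code): fit a Gaussian
# mixture to a narrow intensity distribution, then widen it by scaling
# each component's spread and scattering the means over [0, 255].

rng = np.random.default_rng(0)
# Synthetic low-contrast "image": two homogeneous regions near mid-gray.
pixels = np.concatenate([rng.normal(110, 5, 5000),
                         rng.normal(140, 5, 5000)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
comp = gmm.predict(pixels)      # most likely component of each pixel
mu = gmm.means_.ravel()

# Scatter the component means over a wider range and stretch each
# component's deviation from its mean (factor 3.0 is illustrative).
stretch = 3.0
new_mu = (mu - mu.min()) / np.ptp(mu) * 195.0 + 30.0
enhanced = np.clip((pixels.ravel() - mu[comp]) * stretch + new_mu[comp],
                   0.0, 255.0)
```

In a full implementation the number of components and the per-component shifts would be optimised against the histogram, as the abstract describes; here both are fixed for brevity.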
Contrast Enhancement of Brightness-Distorted Images by Improved Adaptive Gamma Correction
As an efficient image contrast enhancement (CE) tool, adaptive gamma
correction (AGC) was previously proposed by relating gamma parameter with
cumulative distribution function (CDF) of the pixel gray levels within an
image. AGC deals well with most dimmed images, but fails for globally bright
images and for dimmed images with local bright regions. Both categories of
brightness-distorted images are common in real scenarios, arising from
improper exposure or white object regions. To mitigate these deficiencies,
here we propose an improved AGC algorithm. The novel strategy of negative
images is used to realize CE of the bright images, and the gamma correction
modulated by truncated CDF is employed to enhance the dimmed ones. As such,
local over-enhancement and structure distortion can be alleviated. Both
qualitative and quantitative experimental results show that our proposed method
yields consistently good CE results.
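The two-branch strategy above can be sketched as follows. This is a simplified, hedged illustration rather than the paper's algorithm: it uses the plain CDF-based gamma (`gamma = 1 - CDF`) rather than the truncated-CDF modulation the paper proposes, and the brightness threshold is an assumption.

```python
import numpy as np

def agc(img):
    # Plain CDF-modulated adaptive gamma correction (sketch):
    # gamma(l) = 1 - CDF(l), applied via a 256-entry lookup table.
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    cdf = np.cumsum(hist / hist.sum())
    gamma = 1.0 - cdf
    levels = np.arange(256) / 255.0
    lut = np.round(255.0 * levels ** gamma).astype(np.uint8)
    return lut[img]

def improved_agc(img, bright_thresh=0.5):
    # Negative-image strategy for globally bright inputs: enhance the
    # negative, then invert back. Threshold 0.5 is an assumed value.
    if img.mean() / 255.0 > bright_thresh:
        return 255 - agc(255 - img)
    return agc(img)
```

For a dim image the gamma values fall below 1 for most levels, so the mapping brightens; for a bright image the negative branch darkens over-exposed regions instead of pushing them further toward white.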
Sliced Wasserstein Distance for Learning Gaussian Mixture Models
Gaussian mixture models (GMMs) are powerful parametric tools with many
applications in machine learning and computer vision. Expectation maximization
(EM) is the most popular algorithm for estimating the GMM parameters. However,
EM guarantees only convergence to a stationary point of the log-likelihood
function, which could be arbitrarily worse than the optimal solution. Inspired
by the relationship between the negative log-likelihood function and the
Kullback-Leibler (KL) divergence, we propose an alternative formulation for
estimating the GMM parameters using the sliced Wasserstein distance, which
gives rise to a new algorithm. Specifically, we propose minimizing the
sliced-Wasserstein distance between the mixture model and the data distribution
with respect to the GMM parameters. In contrast to the KL-divergence, the
energy landscape of the sliced-Wasserstein distance is better behaved and
therefore more suitable for a stochastic gradient descent scheme to obtain the
optimal GMM parameters. We show that our formulation results in parameter
estimates that are more robust to random initializations and demonstrate that
it can estimate high-dimensional data distributions more faithfully than the EM
algorithm.
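The sliced Wasserstein distance itself reduces to sorting: project both sample sets onto random directions on the unit sphere and average the 1-D Wasserstein-2 distances between the projections. The sketch below is a generic Monte-Carlo estimator under that definition, not the paper's implementation; the number of projections is an assumed hyperparameter.

```python
import numpy as np

def sliced_wasserstein(x, y, n_proj=64, seed=0):
    # Monte-Carlo sliced Wasserstein-2 distance between two equal-size
    # sample sets x, y of shape (n, d). For 1-D distributions the optimal
    # coupling matches sorted samples, so each slice needs only a sort.
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, x.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # unit directions
    px = np.sort(x @ theta.T, axis=0)   # sorted projections, one column
    py = np.sort(y @ theta.T, axis=0)   # per random direction
    return np.sqrt(np.mean((px - py) ** 2))
```

Because this estimator is differentiable in the sample locations (away from ties), it is the kind of objective that can be minimised over GMM parameters with stochastic gradient descent, as the abstract proposes.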