Reduced-reference metric design for objective perceptual quality assessment in wireless imaging
The rapid growth of third-generation and the development of future-generation mobile systems have led to an increased demand for image and video services. However, the hostile nature of the wireless channel makes the deployment of such services much more challenging than in a wireline system. In this context, the importance of attending to user satisfaction with service provisioning as a whole has been recognized. The related user-oriented quality concepts cover end-to-end quality of service and subjective factors such as experiences with the service. To monitor quality and adapt system resources, performance indicators that represent service integrity have to be selected and related to objective measures that correlate well with quality as perceived by humans. Such objective perceptual quality metrics can then be utilized to optimize the quality perception associated with applications in technical systems.
In this paper, we focus on the design of reduced-reference objective perceptual image quality metrics for use in wireless imaging. Specifically, the normalized hybrid image quality metric (NHIQM) and a perceptual-relevance-weighted Lp-norm are designed. The main idea behind both feature-based metrics relates to the fact that the human visual system (HVS) is trained to extract structural information from the viewing area. Accordingly, NHIQM and the Lp-norm are designed to account for different structural artifacts that have been observed in our distortion model of a wireless link. The extent to which individual artifacts are present in a given image is obtained by measuring related image features. The overall quality measure is then computed as a weighted sum of the features, with the respective perceptual relevance weights obtained from subjective experiments. The proposed metrics differ mainly in the pooling of the features and the amount of reduced-reference information produced. While NHIQM performs the pooling at the transmitter of the system to produce a single value as reduced reference, the Lp-norm requires all involved feature values from the transmitted and received images to perform the pooling on the feature differences at the receiver. In addition, non-linear mapping functions are developed that relate the metric values to predicted mean opinion scores (MOS) and account for saturations in the HVS. The evaluation of the prediction performance of NHIQM and the Lp-norm reveals their excellent correlation with human perception in terms of accuracy, monotonicity, and consistency.
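The two pooling strategies described above can be sketched in a few lines. The feature names, feature vectors, and relevance weights below are illustrative placeholders, not the values obtained from the paper's subjective experiments.

```python
import numpy as np

def nhiqm(features, weights):
    """Transmitter-side pooling: a weighted sum of normalized feature
    values collapses the whole feature vector into a single scalar that
    is sent as reduced reference (illustrative sketch)."""
    return float(np.dot(weights, features))

def lp_norm_metric(f_tx, f_rx, weights, p=2):
    """Receiver-side pooling: the Lp-norm is taken over per-feature
    differences between the transmitted and received images, so all
    feature values must be available at the receiver (sketch)."""
    diffs = np.abs(np.asarray(f_tx, float) - np.asarray(f_rx, float))
    return float(np.sum(weights * diffs ** p) ** (1.0 / p))

# Hypothetical feature vectors (e.g., blocking, blur, ringing measures)
w = np.array([0.5, 0.3, 0.2])       # perceptual relevance weights
f_tx = np.array([0.1, 0.2, 0.05])   # features of the transmitted image
f_rx = np.array([0.4, 0.5, 0.2])    # features of the received image

delta_nhiqm = abs(nhiqm(f_tx, w) - nhiqm(f_rx, w))
lp = lp_norm_metric(f_tx, f_rx, w, p=2)
```

The operational difference described in the abstract is visible in the signatures: `nhiqm` needs only one feature vector at a time, so a single scalar travels as reduced reference, whereas `lp_norm_metric` needs both feature vectors at the receiver before pooling.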
Convolutional Deblurring for Natural Imaging
In this paper, we propose a novel design of image deblurring in the form of one-shot convolution filtering that can directly convolve with naturally blurred images for restoration. Optical blurring is a common drawback in many imaging applications that suffer from optical imperfections. Despite the numerous deconvolution methods that blindly estimate blurring in either inclusive or exclusive forms, they are practically challenging due to high computational cost and low image reconstruction quality. Both high accuracy and high speed are prerequisites for high-throughput imaging platforms in digital archiving. In such platforms, deblurring is required after image acquisition and before the images are stored, previewed, or processed for high-level interpretation. Therefore, on-the-fly correction of such images is important to avoid possible time delays, mitigate computational expenses, and increase perceived image quality. We bridge this gap by synthesizing a deconvolution kernel as a linear combination of Finite Impulse Response (FIR) even-derivative filters that can be directly convolved with blurry input images to boost the frequency fall-off of the Point Spread Function (PSF) associated with the optical blur. We employ a Gaussian low-pass filter to decouple the image denoising problem from image edge deblurring. Furthermore, we propose a blind approach to estimate the PSF statistics for two models, Gaussian and Laplacian, that are common in many imaging pipelines. Thorough experiments are designed to test and validate the efficiency of the proposed method using 2054 naturally blurred images across six imaging applications and seven state-of-the-art deconvolution methods.

Comment: 15 pages, for publication in IEEE Transactions on Image Processing
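The one-shot idea above can be illustrated with a minimal sketch: a single 3x3 kernel built as the identity minus a scaled discrete Laplacian (a second-order, even-derivative FIR filter), applied in one convolution pass to boost high frequencies attenuated by the blur. The kernel, the weight `alpha`, and the naive convolution routine are simplifying assumptions for illustration, not the paper's actual FIR even-derivative synthesis.

```python
import numpy as np

def deblur_kernel(alpha=0.8):
    """One-shot deblurring kernel: identity minus a scaled discrete
    Laplacian (an even-order derivative filter). A minimal sketch of
    the linear-combination idea; alpha is an assumed tuning weight."""
    identity = np.zeros((3, 3))
    identity[1, 1] = 1.0
    laplacian = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], float)
    return identity - alpha * laplacian

def convolve2d_same(img, k):
    """Naive 'same'-size 2-D convolution with zero padding."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    kf = k[::-1, ::-1]  # flip kernel for true convolution
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kf)
    return out

# Example: a blurred step edge regains contrast after one-shot filtering
edge = np.tile(np.array([0.0] * 4 + [1.0] * 4), (8, 1))
box = np.ones((3, 3)) / 9.0                       # mild synthetic blur
blurred = convolve2d_same(edge, box)
restored = convolve2d_same(blurred, deblur_kernel(alpha=0.8))
```

Because the Laplacian sums to zero, the kernel's coefficients sum to one, so flat regions pass through unchanged while edges are steepened in a single pass.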
Objective evaluation of the influence of the wireless channel on image quality (Evaluación objetiva de la influencia del canal inalámbrico en la calidad de la imagen)
The main part of this project is the simulation of a radio communication channel, which includes a JPEG coder/decoder, a BPSK modulator/demodulator, an AWGN channel, and additional MATLAB blocks. The purpose of this project was to see how the parameters and characteristics of the radio communication channel affect the image. Image assessment with objective and subjective metrics was performed on a random image database. These results were validated on another image database, WIQ, whose images exhibit distortions typical of a radio communication channel. It was shown that objective metrics do not always correlate with subjective metrics. The human visual system remains largely unexplored, and it is still not possible to build a mathematical assessment model that works and judges like the human visual system. Objective methods cost less and are easier to perform, while subjective methods take more time and their results cannot be predicted. It is not possible to say which method yields more effective results, because both are very important for the evaluation of image quality.

Escuela Técnica Superior de Ingeniería de Telecomunicación, Universidad Politécnica de Cartagena
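The core of the simulated link (BPSK symbols through an AWGN channel with hard-decision detection) can be sketched as below. The SNR definition and detector are standard textbook choices, the JPEG coding stage is omitted, and the bit counts and seed are arbitrary.

```python
import numpy as np

def bpsk_awgn(bits, snr_db, rng):
    """Map bits to +/-1 BPSK symbols, add white Gaussian noise at the
    given Es/N0 (dB), and hard-detect at the receiver. A minimal
    sketch of the simulated link; JPEG coding/decoding is omitted."""
    symbols = 1.0 - 2.0 * bits                       # 0 -> +1, 1 -> -1
    snr_linear = 10.0 ** (snr_db / 10.0)
    noise_std = np.sqrt(1.0 / (2.0 * snr_linear))    # unit symbol energy
    received = symbols + rng.normal(0.0, noise_std, size=symbols.shape)
    return (received < 0).astype(int)                # hard decision

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=10000)
ber_high_snr = np.mean(bits != bpsk_awgn(bits, 10.0, rng))
ber_low_snr = np.mean(bits != bpsk_awgn(bits, 0.0, rng))
```

Lowering the SNR raises the bit-error rate, and in the full chain those bit errors are what produce the channel-typical image distortions that the objective and subjective metrics are then asked to grade.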