114,544 research outputs found
An efficient rate control algorithm for a wavelet video codec
Rate control plays an essential role in video coding and transmission to provide the best video quality at the receiver's end given the constraints of the network conditions. In this paper, a rate control algorithm using the Quality Factor (QF) optimization method is proposed for a wavelet-based video codec and implemented on the open-source Dirac video encoder. A mathematical model, which we call the Rate-QF (R-QF) model, is derived to generate the optimum QF for the current coding frame according to the target bitrate. The proposed algorithm is a complete one-pass process and does not require complex mathematical calculation: computing the QF is simple, and no further calculation is required for each coded frame. The experimental results show that the proposed algorithm controls the bitrate precisely (within 1% of the target bitrate on average). Moreover, the variation of bitrate over each Group of Pictures (GOP) is lower than that of H.264, which is an advantage in preventing buffer overflow and underflow in real-time multimedia streaming.
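A minimal sketch of such a one-pass controller is shown below. It assumes a simple power-law R-QF model R(QF) = a * QF^b; the model form, the parameter values, and the names RateQFController, quality_factor, and update are illustrative assumptions, not the paper's actual model or Dirac's API.

```python
# Minimal sketch of a one-pass rate controller in the spirit of the abstract.
# The power-law R-QF model R(QF) = a * QF**b, its parameters, and all names
# here are assumptions for illustration, not the paper's actual model.
class RateQFController:
    def __init__(self, target_kbps, frame_rate, a=50.0, b=2.0):
        self.target_bits = target_kbps * 1000.0 / frame_rate  # bit budget per frame
        self.a = a  # assumed model scale, refined after every coded frame
        self.b = b  # assumed model exponent (kept fixed in this sketch)

    def quality_factor(self):
        """Invert the assumed R-QF model to get the QF for the next frame."""
        return (self.target_bits / self.a) ** (1.0 / self.b)

    def update(self, actual_bits):
        """One-pass update: rescale the model to match the frame just coded."""
        qf = self.quality_factor()
        self.a = actual_bits / (qf ** self.b)


# Example: drive the controller with the encoder's per-frame bit counts.
ctrl = RateQFController(target_kbps=1000, frame_rate=25)
for coded_bits in (42_000, 38_500, 41_200):   # stand-in encoder output
    qf = ctrl.quality_factor()                # QF used to encode the next frame
    ctrl.update(coded_bits)                   # refine the model, no second pass
```

Because the model is refitted from the bits actually produced by the previous frame, the controller needs no second encoding pass, which is the property the abstract emphasizes.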
Fusing Censored Dependent Data for Distributed Detection
In this paper, we consider a distributed detection problem for a censoring sensor network in which each sensor's communication rate is significantly reduced by transmitting only "informative" observations to the Fusion Center (FC) and censoring those deemed "uninformative". While previous research has often assumed that the data from censoring sensors are independent, we explore spatial dependence among observations. Our focus is on designing the fusion rule under the Neyman-Pearson (NP) framework so that it accounts for this spatial dependence. Two transmission scenarios are considered: one where uncensored observations are transmitted directly to the FC, and a second where they are first quantized and then transmitted to further improve transmission efficiency. A copula-based Generalized Likelihood Ratio Test (GLRT) for censored data is proposed for both the continuous and the discrete messages received at the FC under the two transmission strategies. We address the computational issues of the copula-based GLRTs, which involve multidimensional integrals, by presenting more efficient fusion rules based on the key idea of injecting controlled noise at the FC before fusion. Although the signal-to-noise ratio (SNR) is reduced by introducing controlled noise at the receiver, simulation results demonstrate that the resulting noise-aided fusion approach performs very close to the exact copula-based GLRTs. By exploiting the spatial dependence, the copula-based GLRTs and their noise-aided counterparts greatly improve detection performance compared with the fusion rule derived under the independence assumption.
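As a rough illustration of the noise-aided idea, the sketch below fuses data from two censoring sensors with a bivariate Gaussian copula. The censoring rule, the uniform "controlled noise" drawn over the censoring interval, the Gaussian marginals with a known mean shift, and every function name are assumptions made for this example rather than the paper's exact construction.

```python
# Hypothetical sketch (not the paper's exact algorithm): noise-aided,
# copula-based GLRT fusion for two censoring sensors at the FC.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def gaussian_copula_logpdf(u, rho):
    """Log-density of a bivariate Gaussian copula at points u (n x 2)."""
    z = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))
    z1, z2 = z[:, 0], z[:, 1]
    return (-0.5 * np.log(1 - rho**2)
            + (2 * rho * z1 * z2 - rho**2 * (z1**2 + z2**2)) / (2 * (1 - rho**2)))

def noise_aided_glrt(received, censor_low, censor_high, theta=1.0, sigma=1.0, rng=None):
    """
    received: n x 2 array; np.nan marks a censored (untransmitted) observation.
    Censored entries are replaced by artificial noise drawn uniformly over the
    censoring interval [censor_low, censor_high] -- the "controlled noise"
    injected at the FC so that a continuous copula density can be evaluated
    instead of multidimensional integrals over the censoring region.
    Returns the GLRT statistic maximized over the copula correlation rho.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = received.copy()
    mask = np.isnan(x)
    x[mask] = rng.uniform(censor_low, censor_high, size=mask.sum())

    def loglik_h1(rho):
        # H1: mean-shifted Gaussian marginals tied together by a Gaussian copula.
        u = norm.cdf(x, loc=theta, scale=sigma)
        marg = norm.logpdf(x, loc=theta, scale=sigma).sum()
        return marg + gaussian_copula_logpdf(u, rho).sum()

    # H0: independent, zero-mean Gaussian marginals (copula density is 1).
    ll_h0 = norm.logpdf(x, loc=0.0, scale=sigma).sum()
    res = minimize_scalar(lambda r: -loglik_h1(r), bounds=(-0.99, 0.99), method="bounded")
    return -res.fun - ll_h0
```

In the paper's framework the censoring region, the noise distribution, and any quantization would be designed jointly; here they are fixed placeholders meant only to show how injecting controlled noise restores a continuous likelihood that the copula-based test can evaluate directly.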
- …