University of Warwick, Department of Computer Science
Abstract
Modeling the operational rate-distortion characteristics of a signal can significantly reduce the computational complexity of an optimal bit allocation algorithm. In this report, such models are studied.