
    Universal Scanning and Sequential Decision Making for Multidimensional Data

    Abstract — We investigate several problems in the scanning of multidimensional data arrays, such as universal scanning and prediction (“scandiction”, for short) and scandiction of noisy data arrays. These problems arise in several areas of image and video processing, such as predictive coding, filtering, and denoising. In predictive coding of images, for example, an image is compressed by coding the prediction error sequence that results from scandicting it. It is therefore natural to ask what the optimal method to scan and predict a given image is, what the resulting minimum prediction loss is, and whether there exist specific scandiction schemes that are universal in some sense. More specifically, we investigate the following problems. First, given a random field, we examine whether there exists a scandiction scheme that is independent of the field’s distribution, yet asymptotically achieves the same performance as if this distribution were known. This question is answered in the affirmative for the set of all spatially stationary random fields, under mild conditions on the loss function. We then discuss the scenario where a non-optimal scanning order is used, yet accompanied by an optimal predictor, and derive a bound on the excess loss compared to optimal scandiction. Finally, we examine the scenario where the random field is corrupted by noise, but the scanning and prediction (or filtering) scheme is judged with respect to the underlying noiseless field.
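The predictive-coding setup the abstract describes can be sketched minimally: pick a scan order, predict each sample from what has already been scanned, and code the resulting error sequence. The sketch below uses a raster (row-major) scan and a trivial previous-sample predictor as a stand-in for the optimal predictor studied in the work; the function names and the zero initial prediction are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def raster_scandict(image, predictor=lambda prev: prev):
    """Scan a 2-D array in raster (row-major) order, predict each
    entry from the previously scanned one, and return the sequence
    of prediction errors (the object a predictive coder would encode).
    The previous-sample predictor is a placeholder for an optimal one."""
    flat = image.ravel()            # raster scan order
    errors = np.empty_like(flat)
    prev = 0                        # assumed initial prediction
    for i, x in enumerate(flat):
        errors[i] = x - predictor(prev)
        prev = x
    return errors

def reconstruct(errors, predictor=lambda prev: prev):
    """Invert the scandiction: rebuild the raster sequence from the
    error sequence, showing the coding is lossless for this predictor."""
    out = np.empty_like(errors)
    prev = 0
    for i, e in enumerate(errors):
        out[i] = predictor(prev) + e
        prev = out[i]
    return out
```

A different scan order (e.g. a Hilbert curve) would visit the same samples in a different sequence and generally yield a different error sequence, which is exactly why the choice of scanner matters for the achievable prediction loss.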