2 research outputs found

    An Automated Algorithm for Approximation of Temporal Video Data Using Linear Bézier Fitting

    This paper presents an efficient method for approximating temporal video data using linear Bézier fitting. For a given sequence of frames, the proposed method estimates the intensity variation of each pixel along the temporal dimension using linear Bézier fitting in Euclidean space. The fit of each segment is constrained by an upper bound on a specified mean squared error. A break-and-fit criterion is employed to minimize the number of segments required to fit the data. The proposed method is well suited to lossy compression of temporal video data and automates the fitting process for each pixel. Experimental results show that the proposed method yields good results in terms of both objective and subjective quality measures without introducing blocking artifacts.
    Comment: 14 Pages, IJMA 201
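    The fitting idea can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' implementation: it approximates one pixel's temporal intensity trace with linear Bézier segments, B(t) = (1 − t)·P0 + t·P1, and recursively breaks any segment whose mean squared error exceeds the bound. The function names, the midpoint split rule, and the example data are assumptions made for illustration.

```python
import numpy as np

def fit_segment(values, a, b):
    # Fit values[a..b] with a linear Bezier curve between the two endpoint
    # intensities, B(t) = (1 - t) * P0 + t * P1, and return the MSE of the fit.
    n = b - a
    t = np.arange(n + 1) / n                      # curve parameter in [0, 1]
    approx = (1 - t) * values[a] + t * values[b]
    return float(np.mean((values[a:b + 1] - approx) ** 2))

def break_and_fit(values, a, b, mse_bound):
    # Keep the segment if its MSE is within the bound; otherwise break it and
    # fit the two halves recursively (the midpoint split rule is an assumption,
    # used here only to illustrate the break-and-fit idea).
    if b - a < 2 or fit_segment(values, a, b) <= mse_bound:
        return [a, b]
    mid = (a + b) // 2
    left = break_and_fit(values, a, mid, mse_bound)
    right = break_and_fit(values, mid, b, mse_bound)
    return left[:-1] + right                      # merge without duplicating the midpoint

# One pixel's intensity trace over 60 frames; only the returned keyframe
# indices (Bezier control points) would need to be stored.
pixel_trace = 100 + 30 * np.sin(np.linspace(0, 4, 60)) + np.random.randn(60)
print(break_and_fit(pixel_trace, 0, len(pixel_trace) - 1, mse_bound=4.0))
```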

    On Spatial Adaptation of Motion Field Smoothness in Video Coding

    Most motion compensation methods dealt with in the literature make strong assumptions about the smoothness of the underlying motion field. For instance, block-matching algorithms assume a block-wise constant motion field and are adequate for translational motion models; control grid interpolation assumes a block-wise bilinear motion field and captures zooming and warping fairly well. Time-varying imagery, however, often contains both types of motion (as well as others) and hence exhibits a high degree of spatial variability in its motion field smoothness properties. We develop a simple method to spatially adapt the smoothness of the motion field. The proposed method demonstrates substantial improvements in video quality over a wide range of bit rates. To this end, we introduce the notion of a motion field that is characterized by a set of labels. The labels provide the flexibility to adaptively switch between two different motion models locally. The individual motion models have very diffe..
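    The label mechanism can also be sketched briefly. The snippet below is a simplified sketch rather than the paper's algorithm: given two candidate predictions of the current frame, one from a block-matching (block-wise constant) model and one from a control-grid (block-wise bilinear) model, it assigns each block the label of the model with the lower absolute prediction error. All names, the block size, and the cost measure are assumptions for illustration; a real encoder would also weigh the rate cost of signalling the labels.

```python
import numpy as np

BLOCK = 16
CONSTANT, BILINEAR = 0, 1   # hypothetical label values

def choose_labels(current, pred_constant, pred_bilinear):
    # For each block of the current frame, pick the motion model whose
    # prediction gives the lower sum of absolute differences: pred_constant
    # from a block-matching (block-wise constant) model, pred_bilinear from a
    # control-grid (block-wise bilinear) model.  Assumes frame dimensions are
    # multiples of BLOCK.
    h, w = current.shape
    labels = np.zeros((h // BLOCK, w // BLOCK), dtype=np.uint8)
    for by in range(0, h, BLOCK):
        for bx in range(0, w, BLOCK):
            cur = current[by:by + BLOCK, bx:bx + BLOCK].astype(np.int64)
            cost_c = np.abs(cur - pred_constant[by:by + BLOCK, bx:bx + BLOCK]).sum()
            cost_b = np.abs(cur - pred_bilinear[by:by + BLOCK, bx:bx + BLOCK]).sum()
            labels[by // BLOCK, bx // BLOCK] = CONSTANT if cost_c <= cost_b else BILINEAR
    return labels

# Usage with random 64x64 frames standing in for real predictions.
cur = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
pc = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
pb = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
print(choose_labels(cur, pc, pb))   # one label per 16x16 block
```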