10 research outputs found

    Multiple Description Quantization via Gram-Schmidt Orthogonalization

    The multiple description (MD) problem has received considerable attention as a model of information transmission over unreliable channels. A general framework for designing efficient multiple description quantization schemes is proposed in this paper. We provide a systematic treatment of the El Gamal-Cover (EGC) achievable MD rate-distortion region, and show that any point in the EGC region can be achieved via a successive quantization scheme along with quantization splitting. For the quadratic Gaussian case, the proposed scheme has an intrinsic connection with Gram-Schmidt orthogonalization, which implies that the whole Gaussian MD rate-distortion region is achievable with a sequential dithered lattice-based quantization scheme as the dimension of the (optimal) lattice quantizers becomes large. Moreover, this scheme is shown to be universal for all i.i.d. smooth sources, with performance no worse than that for an i.i.d. Gaussian source with the same variance, and asymptotically optimal at high resolution. A class of low-complexity MD scalar quantizers in the proposed general framework is also constructed and illustrated geometrically; its performance is analyzed in the high-resolution regime, where it exhibits a noticeable improvement over existing MD scalar quantization schemes. Comment: 48 pages; submitted to IEEE Transactions on Information Theory.
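
    For context on the quadratic Gaussian case mentioned above, the two-description Gaussian MD rate-distortion region is usually quoted in Ozarow's form. The statement below, for a unit-variance source, is a standard paraphrase added here for reference (the symbols $R_1, R_2, D_0, D_1, D_2$ are introduced for illustration and are not taken from the abstract):

        % Two-description quadratic Gaussian MD region (Ozarow, 1980), unit-variance source
        \begin{align*}
          D_i &\ge 2^{-2R_i}, \qquad i = 1, 2,\\
          D_0 &\ge
          \begin{cases}
            \dfrac{2^{-2(R_1+R_2)}}{1 - \bigl(\sqrt{\Pi} - \sqrt{\Delta}\bigr)^{2}}, & D_1 + D_2 < 1 + 2^{-2(R_1+R_2)},\\[1.5ex]
            2^{-2(R_1+R_2)}, & \text{otherwise,}
          \end{cases}
        \end{align*}

    where $\Pi = (1-D_1)(1-D_2)$ and $\Delta = D_1 D_2 - 2^{-2(R_1+R_2)}$. In the two-description Gaussian case, this is the region that the successive quantization scheme described in the abstract is claimed to achieve.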

    Optimal Filter Banks for Multiple Description Coding: Analysis and Synthesis

    Multiple description (MD) coding is a source coding technique for information transmission over unreliable networks. In MD coding, the coder generates several different descriptions of the same signal, and the decoder can produce a useful reconstruction of the source from any received subset of these descriptions. In this paper, we study the problem of MD coding of stationary Gaussian sources with memory. First, we compute an approximate MD rate-distortion region for these sources, which we prove to be asymptotically tight at high rates. This region generalizes the MD rate-distortion region of El Gamal and Cover (1982) and Ozarow (1980) for memoryless Gaussian sources. Then, we develop an algorithm for the design of optimal two-channel biorthogonal filter banks for MD coding of Gaussian sources. We show that optimal filters are obtained by allocating the redundancy over frequency with a reverse "water-filling" strategy. Finally, we present experimental results that show the effectiveness of our filter banks in the low-complexity, low-rate regime.
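
    The reverse "water-filling" allocation mentioned above builds on the classical rate-distortion solution for stationary Gaussian sources, where frequencies with spectral density above a water level are coded and the rest are discarded. The sketch below is only a rough illustration of that classical single-description rule, not the paper's filter-bank design; the function name, the AR(1) example spectrum, and the grid size are made up for the example.

        import numpy as np

        def reverse_waterfill(S, target_rate_bits, tol=1e-9):
            """Classical reverse water-filling for a stationary Gaussian source.

            S: samples of the power spectral density on a uniform grid over [-pi, pi).
            target_rate_bits: desired rate in bits per source sample.
            Returns (theta, D), where R(theta) = mean(max(0, 0.5*log2(S/theta)))
            matches the target rate and D(theta) = mean(min(theta, S)).
            """
            S = np.asarray(S, dtype=float)

            def rate(theta):
                return np.mean(np.maximum(0.0, 0.5 * np.log2(S / theta)))

            # rate(theta) decreases from very large (theta -> 0) to 0 (theta = max(S)),
            # so bisect for the water level that meets the target rate.
            lo, hi = 1e-12, float(S.max())
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if rate(mid) > target_rate_bits:
                    lo = mid          # too much rate: raise the water level
                else:
                    hi = mid
            theta = 0.5 * (lo + hi)
            return theta, float(np.mean(np.minimum(theta, S)))

        # Example: spectrum of an AR(1) source with pole at 0.9, unit-variance innovations.
        w = np.linspace(-np.pi, np.pi, 1024, endpoint=False)
        S = 1.0 / (1.0 + 0.9**2 - 2 * 0.9 * np.cos(w))
        theta, D = reverse_waterfill(S, target_rate_bits=1.0)
        print(f"water level = {theta:.4f}, distortion = {D:.4f}")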

    Index Assignment for N Balanced Multiple Description Scalar Quantization

    In this paper, we address the design of any number of balanced multiple descriptions using the multiple description scalar quantization (MDSQ) technique. The proposed scheme has the advantages of low complexity, easy extension to any number of descriptions, and the ability to trade off between the side, partial, and central distortions. Unlike existing schemes, it can produce balanced descriptions at low rates, at the price, however, of a slightly higher distortion. At high rates, the behavior of the proposed index assignment is at the same time similar to that of state-of-the-art schemes. The proposed scheme can adapt to loss probability and rate constraints by adjusting both the number of descriptions and the rate of each of them, in order to minimize the average distortion. A comparison with the systematic FEC (N, k) scheme shows that the FEC scheme in general gives a smaller average distortion, but that our scheme appears to be more robust to sudden changes in network conditions, and that receiving all the descriptions in general gives smaller distortions.
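
    To make the MDSQ idea concrete, the sketch below implements the classic two-description "staircase" index assignment (in the spirit of Vaishampayan's MDSQ), not the N-description scheme of this paper: a central index is mapped to a pair of side indices taken from the two main diagonals of the index-assignment matrix, so each side decoder alone resolves the source to two central cells, while both descriptions together recover the central index exactly. The step size and test value are arbitrary.

        import numpy as np

        DELTA = 0.25  # central quantizer step size (illustrative value)

        def encode(x):
            """Central quantization followed by the index assignment k -> (i, j)."""
            k = int(np.round(x / DELTA))   # central index
            i, j = k // 2, (k + 1) // 2    # two-diagonal staircase assignment
            return i, j

        def decode(i=None, j=None):
            """Reconstruct from whichever descriptions arrived."""
            if i is not None and j is not None:
                k = i + j                   # inverse index assignment (central decoder)
                return k * DELTA
            if i is not None:               # side decoder 1: central cells {2i, 2i+1}
                return (2 * i + 0.5) * DELTA
            if j is not None:               # side decoder 2: central cells {2j-1, 2j}
                return (2 * j - 0.5) * DELTA
            raise ValueError("no description received")

        x = 1.37
        i, j = encode(x)
        print("both:", decode(i, j), " side 1 only:", decode(i=i), " side 2 only:", decode(j=j))

    Widening the staircase band (using more diagonals of the index-assignment matrix) lowers the side rates at the cost of larger side distortion, which is the side/central trade-off discussed in the abstract.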

    n-Channel Asymmetric Entropy-Constrained Multiple-Description Lattice Vector Quantization

    This paper is about the design and analysis of an index-assignment (IA) based multiple-description coding scheme for the n-channel asymmetric case. We use entropy constrained lattice vector quantization and restrict attention to simple reconstruction functions, which are given by the inverse IA function when all descriptions are received or otherwise by a weighted average of the received descriptions. We consider smooth sources with finite differential entropy rate and MSE fidelity criterion. As in previous designs, our construction is based on nested lattices which are combined through a single IA function. The results are exact under high-resolution conditions and asymptotically as the nesting ratios of the lattices approach infinity. For any n, the design is asymptotically optimal within the class of IA-based schemes. Moreover, in the case of two descriptions and finite lattice vector dimensions greater than one, the performance is strictly better than that of existing designs. In the case of three descriptions, we show that in the limit of large lattice vector dimensions, points on the inner bound of Pradhan et al. can be achieved. Furthermore, for three descriptions and finite lattice vector dimensions, we show that the IA-based approach yields, in the symmetric case, a smaller rate loss than the recently proposed source-splitting approach. Comment: 49 pages, 4 figures. Accepted for publication in IEEE Transactions on Information Theory, 201
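
    The reconstruction rule described above (invert the index assignment when all n descriptions arrive, otherwise form a weighted average of the received side reconstructions) can be summarized schematically as below. The function names, the equal-weight default, and the toy maps are placeholders for illustration only; the paper derives the actual maps and weights from nested lattices and the asymmetric distortion targets.

        def reconstruct(n, received, inverse_ia, side_reconstruction, weights=None):
            """Schematic n-channel decoder.

            n: total number of descriptions.
            received: dict {description index: side index} for descriptions that arrived.
            inverse_ia: maps the full tuple of n side indices back to the central codeword
                        (only usable when every description is present).
            side_reconstruction: maps (description index, side index) to a point estimate.
            weights: per-description weights for the partial case; equal weights are used
                     here as a placeholder assumption.
            """
            if len(received) == n:
                return inverse_ia(tuple(received[m] for m in range(n)))
            if not received:
                raise ValueError("no description received")
            if weights is None:
                weights = {m: 1.0 / len(received) for m in received}
            total = sum(weights[m] for m in received)
            return sum(weights[m] * side_reconstruction(m, received[m]) for m in received) / total

        # Toy usage with placeholder maps (a two-description staircase with step 0.25).
        inv_ia = lambda idx: 0.25 * (idx[0] + idx[1])
        side_rec = lambda m, s: 0.25 * (2 * s + (0.5 if m == 0 else -0.5))
        print(reconstruct(2, {0: 2, 1: 3}, inv_ia, side_rec))   # all descriptions received
        print(reconstruct(2, {1: 3}, inv_ia, side_rec))         # only the second description received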

    n-Channel Entropy-Constrained Multiple-Description Lattice Vector Quantization
