
    Multiple-Description Coding by Dithered Delta-Sigma Quantization

    We address the connection between the multiple-description (MD) problem and Delta-Sigma quantization. The inherent redundancy due to oversampling in Delta-Sigma quantization, and the simple linear-additive noise model resulting from dithered lattice quantization, allow us to construct a symmetric and time-invariant MD coding scheme. We show that the use of a noise shaping filter makes it possible to trade off central distortion for side distortion. Asymptotically, as the dimension of the lattice vector quantizer and the order of the noise shaping filter approach infinity, the entropy rate of the dithered Delta-Sigma quantization scheme approaches the symmetric two-channel MD rate-distortion function for a memoryless Gaussian source and MSE fidelity criterion, at any side-to-central distortion ratio and any resolution. In the optimal scheme, the infinite-order noise shaping filter must be minimum phase and have a piecewise flat power spectrum with a single jump discontinuity. An important advantage of the proposed design is that it is symmetric in rate and distortion by construction, so the coding rates of the descriptions are identical and there is no need for source splitting.
    Comment: Revised, restructured, significantly shortened, and minor typos have been fixed. Accepted for publication in the IEEE Transactions on Information Theory.
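
    The scheme itself is built on lattice vector quantizers and high-order noise shaping filters; the sketch below is only a minimal scalar stand-in under assumed parameters (a first-order error-feedback loop, a subtractively dithered uniform quantizer, 2x oversampling, even/odd output samples as the two descriptions). It illustrates the qualitative point of the abstract: the central decoder, which averages both descriptions, sees less quantization noise than a side decoder.

```python
# Minimal scalar sketch, not the paper's lattice construction: a first-order
# dithered error-feedback (noise-shaping) quantizer run on a 2x-oversampled
# source, with even/odd output samples used as the two descriptions.
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, step, rng):
    """Subtractively dithered uniform quantizer (error uniform, independent of x)."""
    dither = rng.uniform(-step / 2, step / 2, size=np.shape(x))
    return step * np.round((x + dither) / step) - dither

n, step = 2**14, 0.25                       # illustrative block length and step size
source = rng.standard_normal(n)
x = np.repeat(source, 2)                    # 2x oversampling via sample-and-hold

y = np.empty_like(x)
feedback = 0.0
for k in range(x.size):
    u = x[k] - feedback                     # subtract the previous quantization error
    y[k] = dithered_quantize(u, step, rng)
    feedback = y[k] - u                     # first-order noise shaping: N(z) = 1 - z^-1

desc0, desc1 = y[0::2], y[1::2]             # the two descriptions
central = 0.5 * (desc0 + desc1)             # central decoder: average both descriptions

print("side MSE   :", np.mean((desc0 - source) ** 2))
print("central MSE:", np.mean((central - source) ** 2))
```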

    Sampling versus Random Binning for Multiple Descriptions of a Bandlimited Source

    Random binning is an efficient, yet complex, coding technique for the symmetric L-description source coding problem. We propose an alternative approach that uses the quantized samples of a bandlimited source as "descriptions". By the Nyquist condition, the source can be reconstructed if enough samples are received. We examine a coding scheme that combines sampling and noise-shaped quantization for a scenario in which only K < L descriptions or all L descriptions are received. Some of the received K-sets of descriptions correspond to uniform sampling, while others correspond to non-uniform sampling. This scheme achieves the optimum rate-distortion performance for uniform-sampling K-sets, but suffers noise amplification for non-uniform-sampling K-sets. We then show that by increasing the sampling rate and adding a random-binning stage, the optimal operation point is achieved for any K-set.
    Comment: Presented at ITW'13. 5 pages, two-column mode, 3 figures.
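
    As a rough illustration of the sampling-as-descriptions idea, the sketch below (block length, L, K, and the least-squares reconstruction are my own assumptions, not the paper's scheme) builds L = 4 interleaved descriptions of a quantized band-limited block and reconstructs it from K = 2 of them. The subset {0, 2} forms a uniform sampling pattern, while the bunched subset {0, 1} is non-uniform and amplifies the quantization noise.

```python
# Toy sketch (illustrative parameters, not the paper's scheme): quantized
# samples of a band-limited block are split into L interleaved descriptions;
# reconstruction from K = 2 of them is done by band-limited least squares.
import numpy as np

rng = np.random.default_rng(1)
N, L, B = 512, 4, 120                    # block length, descriptions, bandwidth in DFT bins

# Band-limited test signal: keep only DFT bins |m| < B.
spec = np.zeros(N, dtype=complex)
spec[1:B] = rng.standard_normal(B - 1) + 1j * rng.standard_normal(B - 1)
spec[-(B - 1):] = np.conj(spec[1:B][::-1])
x = np.fft.ifft(spec).real
x /= x.std()

step = 0.1
y = step * np.round(x / step)            # quantized samples; description j = y[j::L]

def reconstruct(received_offsets):
    """Least-squares band-limited reconstruction from the received sample positions."""
    idx = np.sort(np.concatenate([np.arange(o, N, L) for o in received_offsets]))
    bins = np.r_[np.arange(B), np.arange(N - B + 1, N)]       # in-band DFT bins
    A = np.exp(2j * np.pi * np.outer(idx, bins) / N) / N      # x[idx] = A @ X[bins]
    c, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    full = np.zeros(N, dtype=complex)
    full[bins] = c
    return np.fft.ifft(full).real

for offsets in [(0, 2), (0, 1)]:          # uniform vs. non-uniform K-set
    mse = np.mean((reconstruct(offsets) - x) ** 2)
    print(f"received descriptions {offsets}: MSE = {mse:.2e}")
```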

    Index assignment for multiple description repair in distributed storage systems

    Distributed storage systems have been receiving increasing attention lately due to developments in cloud and grid computing. Furthermore, a major part of the stored information consists of multimedia, whose content can be communicated even with a lossy (non-perfect) reconstruction. In this context, Multiple Description Lattice Quantizers (MDLQ) can be employed to encode such sources for distributed storage and store them across distributed nodes. Their inherent properties ensure that access to all nodes gives a perfect reconstruction of the source, while the reconstruction quality degrades gracefully as fewer nodes are available. If a set of nodes fails, lossy repair techniques can be applied to reconstruct the failed nodes from the available ones. This problem has mostly been studied under the assumption of lossless (perfect) reconstruction. In this work, a general model, the Multiple Description Lattice Quantizer with Repairs (MDLQR), is introduced that encompasses the lossy repair problem for distributed storage applications. New performance measures and repair techniques are introduced for the MDLQR, and a non-trivial identity is derived that relates to other results in the literature. This enables us to find the optimal encoder for a certain repair technique used in the MDLQR. Furthermore, simulation results are used to evaluate the performance of the different repair techniques. © 2014 IEEE
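
    As background for the MDLQR model, the following toy sketch (scalar source, two descriptions, the classic two-diagonal index assignment; the step size is arbitrary) shows the basic MDLQ structure the paper builds on: a fine central index is split into two side indices, the inverse index assignment recovers it exactly when both are available, and a single side index gives a coarser reconstruction. The paper's repair techniques and performance measures are not reproduced here.

```python
# Minimal scalar index-assignment sketch (two descriptions, two-diagonal IA),
# illustrating central vs. side decoding only; not the paper's MDLQR scheme.
import numpy as np

rng = np.random.default_rng(2)
q = 0.1                                    # fine (central) quantizer step

def encode(x):
    """Central index -> pair of side indices via the two-diagonal index assignment."""
    n = np.floor(x / q + 0.5).astype(int)  # fine quantizer index
    a = np.floor_divide(n + 1, 2)          # description stored on node 0
    b = np.floor_divide(n, 2)              # description stored on node 1
    return a, b

def decode_central(a, b):
    return q * (a + b)                     # inverse index assignment: n = a + b

def decode_side0(a):
    return q * (2 * a - 0.5)               # midpoint of the two candidate fine cells

def decode_side1(b):
    return q * (2 * b + 0.5)

x = rng.standard_normal(100_000)
a, b = encode(x)
print("central MSE:", np.mean((decode_central(a, b) - x) ** 2))   # ~ q^2 / 12
print("side-0 MSE :", np.mean((decode_side0(a) - x) ** 2))        # ~ (2q)^2 / 12
print("side-1 MSE :", np.mean((decode_side1(b) - x) ** 2))
```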

    Multi-Tenant C-RAN With Spectrum Pooling: Downlink Optimization Under Privacy Constraints

    Spectrum pooling allows multiple operators, or tenants, to share the same frequency bands. This work studies the optimization of spectrum pooling for the downlink of a multi-tenant Cloud Radio Access Network (C-RAN) system in the presence of inter-tenant privacy constraints. The spectrum available for downlink transmission is partitioned into private and shared subbands, and the participating operators cooperate to serve the user equipments (UEs) on the shared subband. The network of each operator consists of a cloud processor (CP) that is connected to proprietary radio units (RUs) by means of finite-capacity fronthaul links. In order to enable inter-operator cooperation, the CPs of the participating operators are also connected by finite-capacity backhaul links. Inter-operator cooperation may hence result in a loss of privacy. Fronthaul and backhaul links are used to transfer quantized baseband signals. Standard quantization is considered first. Then, a novel approach based on the idea of correlating the quantization noise signals across the RUs of different operators is proposed to control the trade-off between distortion at the UEs and inter-operator privacy. The problem of optimizing the bandwidth allocation, precoding, and fronthaul/backhaul compression strategies is tackled under constraints on backhaul and fronthaul capacity, as well as on per-RU transmit power and inter-operator privacy. For both cases, the optimization problems are addressed using the concave-convex procedure (CCCP), and extensive numerical results are provided.
    Comment: Submitted, 24 pages, 7 figures.
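
    The CCCP mentioned at the end is a generic tool for difference-of-convex problems; the toy single-variable example below (all constants are made up and unrelated to the paper's C-RAN model) shows its basic mechanics: linearize the subtracted concave term around the current iterate, maximize the resulting concave surrogate in closed form, and project onto the feasible set.

```python
# Toy CCCP illustration: maximize log(1 + h1*p) - log(1 + h2*p) over 0 <= p <= P
# by successively linearizing the subtracted concave term. Constants are made up.
import numpy as np

h1, h2, P = 4.0, 0.5, 10.0                # assumed gains and power budget

def objective(p):
    return np.log(1 + h1 * p) - np.log(1 + h2 * p)

p = P / 2                                 # feasible starting point
for _ in range(50):
    slope = h2 / (1 + h2 * p)             # gradient of the subtracted concave term at p
    # Concave surrogate: log(1 + h1*p) - slope*p; its stationary point solves
    # h1 / (1 + h1*p) = slope; then project back onto the feasible set [0, P].
    p_new = float(np.clip(1 / slope - 1 / h1, 0.0, P))
    if abs(p_new - p) < 1e-9:
        break
    p = p_new

print(f"CCCP iterate p = {p:.4f}, objective = {objective(p):.4f} nats")
# With these constants the objective is increasing in p, so CCCP climbs to the
# boundary p = P; the same linearize-and-solve loop applies to interior optima.
```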

    Colored-Gaussian Multiple Descriptions: Spectral and Time-Domain Forms

    It is well known that Shannon's rate-distortion function (RDF) in the colored quadratic Gaussian (QG) case can be parametrized via a single Lagrangian variable (the "water level" in the reverse water-filling solution). In this work, we show that the symmetric colored QG multiple-description (MD) RDF in the case of two descriptions can be parametrized in the spectral domain via two Lagrangian variables, which control the trade-off between the side distortion, the central distortion, and the coding rate. This spectral-domain analysis is complemented by a time-domain scheme-design approach: we show that the symmetric colored QG MD RDF can be achieved by combining ideas from delta-sigma modulation and differential pulse-code modulation. Specifically, two source prediction loops, one for each description, are embedded within a common noise shaping loop, whose parameters are found explicitly from the spectral-domain characterization.
    Comment: Accepted for publication in the IEEE Transactions on Information Theory. The title has been shortened, the abstract clarified, and the paper significantly restructured.
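
    For reference, the single-Lagrangian baseline mentioned in the first sentence is straightforward to evaluate; the sketch below computes Shannon's reverse water-filling RDF for an arbitrary AR(1) spectrum at a few water levels. The two-variable spectral-domain MD parametrization derived in the paper is not reproduced here.

```python
# Sketch of the single-description baseline: reverse water-filling for a
# colored Gaussian source, parametrized by a single water level theta.
# The AR(1) spectrum below is an arbitrary example.
import numpy as np

w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
a = 0.9
S = 1.0 / (1 + a**2 - 2 * a * np.cos(w))        # AR(1) power spectrum, unit innovation variance

def reverse_waterfill(theta):
    """Return (distortion, rate in bits per sample) at water level theta."""
    D = np.mean(np.minimum(theta, S))                       # distortion spectrum, averaged
    R = np.mean(0.5 * np.log2(np.maximum(S / theta, 1.0)))  # rate only where S(w) > theta
    return D, R

for theta in (0.05, 0.2, 1.0):
    D, R = reverse_waterfill(theta)
    print(f"theta = {theta:4.2f}:  D = {D:.3f},  R = {R:.3f} bit/sample")
```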

    Zero-Delay Multiple Descriptions of Stationary Scalar Gauss-Markov Sources

    In this paper, we introduce the zero-delay multiple-description problem, where an encoder constructs two descriptions and the decoders receive a subset of these descriptions. The encoder and decoders are causal and operate under the restriction of zero delay, which implies that, at each time instance, the encoder must generate codewords that can be decoded by the decoders using only the current and past codewords. For the case of discrete-time stationary scalar Gauss-Markov sources and quadratic distortion constraints, we present information-theoretic lower bounds on the average sum-rate in terms of the directed and mutual information rate between the source and the decoder reproductions. Furthermore, we show that the optimum test channel is Gaussian in this case, and that it can be realized by a feedback coding scheme that utilizes prediction and correlated Gaussian noises. Operational achievability is considered in the high-rate scenario using a simple differential pulse-code modulation scheme with staggered quantizers. Using this scheme, we achieve operational rates within 0.415 bits/sample/description of the theoretical lower bounds for varying description rates.
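
    A rough sketch of the building blocks named here (DPCM plus two uniform quantizers staggered by half a step) is given below. It is not the paper's exact coding scheme: the encoder's prediction loop is assumed to be closed around the central reconstruction, each side decoder runs its own loop, and the source parameter and step size are arbitrary. It only illustrates that the central decoder, which averages the two staggered descriptions, achieves a lower distortion than either side decoder.

```python
# Rough sketch: DPCM with two staggered uniform quantizers as two descriptions
# of a stationary scalar Gauss-Markov source (not the paper's exact scheme).
import numpy as np

rng = np.random.default_rng(3)
a, n, step = 0.9, 200_000, 0.05

# Stationary scalar Gauss-Markov source x[k] = a x[k-1] + w[k].
w = rng.standard_normal(n) * np.sqrt(1 - a**2)
x = np.empty(n)
x[0] = rng.standard_normal()
for k in range(1, n):
    x[k] = a * x[k - 1] + w[k]

def q(u, offset):
    """Uniform quantizer; `offset` shifts the decision thresholds by offset*step."""
    return step * (np.floor(u / step + offset) - offset + 0.5)

xc = xs0 = xs1 = 0.0                         # central / side-decoder states
dc = ds0 = ds1 = 0.0                         # accumulated squared errors
for k in range(n):
    e = x[k] - a * xc                        # prediction error w.r.t. the central loop
    y0, y1 = q(e, 0.0), q(e, 0.5)            # the two staggered descriptions
    xc = a * xc + 0.5 * (y0 + y1)            # central decoder: average of both
    xs0 = a * xs0 + y0                       # side decoders track their own loops
    xs1 = a * xs1 + y1
    dc += (x[k] - xc) ** 2
    ds0 += (x[k] - xs0) ** 2
    ds1 += (x[k] - xs1) ** 2

print("central MSE:", dc / n)
print("side MSEs  :", ds0 / n, ds1 / n)
```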

    n-Channel Asymmetric Entropy-Constrained Multiple-Description Lattice Vector Quantization

    This paper is about the design and analysis of an index-assignment (IA) based multiple-description coding scheme for the n-channel asymmetric case. We use entropy-constrained lattice vector quantization and restrict attention to simple reconstruction functions, which are given by the inverse IA function when all descriptions are received, or otherwise by a weighted average of the received descriptions. We consider smooth sources with finite differential entropy rate and an MSE fidelity criterion. As in previous designs, our construction is based on nested lattices which are combined through a single IA function. The results are exact under high-resolution conditions and asymptotically as the nesting ratios of the lattices approach infinity. For any n, the design is asymptotically optimal within the class of IA-based schemes. Moreover, in the case of two descriptions and finite lattice vector dimensions greater than one, the performance is strictly better than that of existing designs. In the case of three descriptions, we show that in the limit of large lattice vector dimensions, points on the inner bound of Pradhan et al. can be achieved. Furthermore, for three descriptions and finite lattice vector dimensions, we show that the IA-based approach yields, in the symmetric case, a smaller rate loss than the recently proposed source-splitting approach.
    Comment: 49 pages, 4 figures. Accepted for publication in IEEE Transactions on Information Theory, 201
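
    To make the reconstruction rule concrete, the toy scalar sketch below (a simple modulo-style labeling with K = 3 descriptions; an illustrative assumption, not the paper's optimized nested-lattice index assignment) shows the two regimes described above: receiving all descriptions inverts the index assignment exactly, while any proper subset is decoded by a bias-corrected average of the received indices and degrades gracefully.

```python
# Toy scalar stand-in for an n-channel index assignment: description j stores
# floor((n + j)/K) of the fine index n; all K together recover n exactly, and
# any subset is decoded by averaging. Not the paper's construction.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
K, q = 3, 0.1                                        # number of descriptions, fine step

x = rng.standard_normal(100_000)
n = np.floor(x / q + 0.5).astype(int)                # central (fine) index
m = np.stack([np.floor_divide(n + j, K) for j in range(K)])   # description j

def decode(received):
    """Bias-corrected average of the received indices; exact inverse IA when all K arrive."""
    j = np.array(received)
    est = K * m[j].mean(axis=0) - j.mean() + (K - 1) / 2
    return q * est

for r in range(1, K + 1):
    mses = [np.mean((decode(s) - x) ** 2) for s in combinations(range(K), r)]
    print(f"{r} of {K} descriptions received: worst-case MSE = {max(mses):.4f}")
```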

    Incremental Refinements and Multiple Descriptions with Feedback

    It is well known that independent (separate) encoding of K correlated sources may incur some rate loss compared to joint encoding, even if the decoding is done jointly. This loss is particularly evident in the multiple-description problem, where the sources are repetitions of the same source, but each description must be individually good. We observe that, under mild conditions on the source and distortion measure, the rate ratio R_independent(K) / R_joint goes to one in the limit of small rate / high distortion. Moreover, we consider the excess rate with respect to the rate-distortion function, R_independent(K, M) - R(D), in M rounds of K independent encodings with a final distortion level D. We provide two examples - a Gaussian source with mean-squared error and an exponential source with one-sided error - for which the excess rate vanishes in the limit as the number of rounds M goes to infinity, for any fixed D and K. This result has an interesting interpretation for a multi-round variant of the multiple-description problem, where after each round the encoder receives (block) feedback indicating which of the descriptions arrived: in the limit as the number of rounds M goes to infinity (i.e., many incremental rounds), the total rate of the received descriptions approaches the rate-distortion function. We provide theoretical and experimental evidence showing that this phenomenon is in fact more general than in the two examples above.
    Comment: 62 pages. Accepted for publication in the IEEE Transactions on Information Theory.
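
    The multi-round protocol with feedback can be mocked up in a few lines. The sketch below illustrates only the mechanism, not the paper's rate results: the arrival probability, step-size rule, and number of rounds are arbitrary assumptions. Each round, the encoder sends K independently dithered quantizations of the current residual, learns from feedback which descriptions arrived, and refines the reconstruction with those.

```python
# Mechanism-only sketch of multi-round descriptions with block feedback
# (arbitrary parameters; does not verify the paper's rate results).
import numpy as np

rng = np.random.default_rng(5)
N, K, M, p_arrive = 50_000, 2, 8, 0.7

x = rng.standard_normal(N)          # memoryless Gaussian source
xhat = np.zeros(N)                  # decoder reconstruction, known to the encoder via feedback

for m in range(M):
    step = 1.5 * np.sqrt(np.mean((x - xhat) ** 2))   # coarse step relative to current residual
    residual = x - xhat
    received = []
    for k in range(K):
        dither = rng.uniform(-step / 2, step / 2, size=N)
        desc = step * np.round((residual + dither) / step) - dither
        if rng.random() < p_arrive:                  # block feedback: did description k arrive?
            received.append(desc)
    if received:                                     # decoder averages whatever arrived
        xhat = xhat + np.mean(received, axis=0)
    print(f"round {m + 1}: distortion = {np.mean((x - xhat) ** 2):.4f}")
```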