
    Successive Wyner-Ziv Coding Scheme and its Application to the Quadratic Gaussian CEO Problem

    Full text link
    We introduce a distributed source coding scheme called successive Wyner-Ziv coding. We show that any point in the rate region of the quadratic Gaussian CEO problem can be achieved via successive Wyner-Ziv coding. The concept of successive refinement in single-source coding is generalized to the distributed source coding scenario, which we refer to as distributed successive refinement. For the quadratic Gaussian CEO problem, we establish a necessary and sufficient condition for distributed successive refinement, in which the successive Wyner-Ziv coding scheme plays an important role. Comment: 28 pages, submitted to the IEEE Transactions on Information Theory

    Successive Refinement of Abstract Sources

    Get PDF
    In successive refinement of information, the decoder refines its representation of the source progressively as it receives more encoded bits. The rate-distortion region of successive refinement describes the minimum rates required to attain the target distortions at each decoding stage. In this paper, we derive a parametric characterization of the rate-distortion region for successive refinement of abstract sources. Our characterization extends Csiszár's result to successive refinement, and generalizes a result by Tuncel and Rose, applicable to finite-alphabet sources, to abstract sources. This characterization spawns a family of outer bounds to the rate-distortion region. It also enables an iterative algorithm for computing the rate-distortion region, which generalizes Blahut's algorithm to successive refinement. Finally, it leads to a new nonasymptotic converse bound. In all the scenarios where the dispersion is known, this bound is second-order optimal. In our proof technique, we avoid Karush-Kuhn-Tucker conditions of optimality, and we use basic tools of probability theory. We leverage the Donsker-Varadhan lemma for the minimization of relative entropy on abstract probability spaces. Comment: Extended version of a paper presented at ISIT 201
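The iterative algorithm mentioned above generalizes Blahut's classic procedure for computing a single rate-distortion function. As a point of reference, here is a minimal sketch of the classic single-stage iteration (not the successive-refinement extension from the paper; the function name, parameters, and the binary example are illustrative assumptions):

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=200):
    """Classic Blahut iteration for one point on the rate-distortion curve.

    p_x  : source distribution over a finite alphabet, shape (nx,)
    dist : distortion matrix d(x, xhat), shape (nx, nxhat)
    beta : trade-off slope (larger beta -> lower distortion, higher rate)
    Returns (rate_in_bits, expected_distortion).
    """
    _, nxhat = dist.shape
    q = np.full(nxhat, 1.0 / nxhat)              # reproduction marginal q(xhat)
    for _ in range(n_iter):
        # optimal test channel given q: p(xhat|x) proportional to q(xhat) exp(-beta d)
        w = q[None, :] * np.exp(-beta * dist)
        w /= w.sum(axis=1, keepdims=True)
        # re-optimise the reproduction marginal against the current test channel
        q = p_x @ w
    rate = float(np.sum(p_x[:, None] * w * np.log2(w / q[None, :])))
    distortion = float(np.sum(p_x[:, None] * w * dist))
    return rate, distortion
```

For a binary symmetric source under Hamming distortion, for instance, the fixed point of this iteration recovers points on the textbook curve R(D) = 1 − h(D).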

    Exponential Strong Converse for Successive Refinement with Causal Decoder Side Information

    Full text link
    We consider the k-user successive refinement problem with causal decoder side information and derive an exponential strong converse theorem. The rate-distortion region for the problem can be derived as a straightforward extension of the two-user case by Maor and Merhav (2008). We show that for any rate-distortion tuple outside the rate-distortion region of the k-user successive refinement problem with causal decoder side information, the joint excess-distortion probability approaches one exponentially fast. Our proof follows by judiciously adapting the recently proposed strong converse technique by Oohama using the information spectrum method, the variational form of the rate-distortion region, and Hölder's inequality. The lossy source coding problem with causal decoder side information considered by El Gamal and Weissman is a special case (k = 1) of the current problem. Therefore, the exponential strong converse theorem for the El Gamal and Weissman problem follows as a corollary of our result.

    The sequence of conceptual information in instruction and its effect on retention

    Get PDF
    Two experiments were carried out to study the effect of the sequencing of information in an instructional program. In both experiments, two different ordering principles were used, based on the relation between the to-be-learned concepts. The ordering of the information could be successive or simultaneous, and the relationship between concepts is categorized as either successive or coordinate. It was hypothesized that a simultaneous presentation would produce better learning results than a successive presentation when a coordinate relationship exists between the to-be-learned concepts, whereas a successive presentation would lead to better results in the case of a successive relationship. The results suggest that the definition of both types of relationship needs refinement. Further, the results show that a simultaneous presentation is preferable for coordinately related concepts.

    The rate-distortion function for successive refinement of abstract sources

    Get PDF
    In successive refinement of information, the decoder refines its representation of the source progressively as it receives more encoded bits. The rate-distortion region of successive refinement describes the minimum rates required to attain the target distortions at each decoding stage. In this paper, we derive a parametric characterization of the rate-distortion region for successive refinement of abstract sources. Our characterization extends Csiszár's result [1] to successive refinement, and generalizes a result by Tuncel and Rose [2], applicable to finite-alphabet sources, to abstract sources. The new characterization leads to a family of outer bounds to the rate-distortion region. It also enables new nonasymptotic converse bounds.

    Multiuser Successive Refinement and Multiple Description Coding

    Full text link
    We consider the multiuser successive refinement (MSR) problem, where the users are connected to a central server via links with different noiseless capacities, and each user wishes to reconstruct in a successive-refinement fashion. An achievable region is given for the two-user two-layer case, and it provides the complete rate-distortion region for the Gaussian source under the MSE distortion measure. The key observation is that this problem includes the multiple description (MD) problem (with two descriptions) as a subsystem, and the techniques useful in the MD problem can be extended to this case. We show that the coding scheme based on the universality of random binning is sub-optimal, because multiple Gaussian side informations only at the decoders do incur a performance loss, in contrast to the case of a single side information at the decoder. We further show that, unlike the single-user case, when there are multiple users the loss of performance incurred by a multistage coding approach can be unbounded for the Gaussian source. The result suggests that in such a setting, the benefit of using successive refinement is not likely to justify the accompanying performance loss. The MSR problem is also related to the source coding problem where each decoder has its individual side information, while the encoder has the complete set of the side informations. The MSR problem further includes several variations of the MD problem, for which the specialization of the general result is investigated and its implications are discussed. Comment: 10 pages, 5 figures. To appear in IEEE Transactions on Information Theory. References updated and typos corrected.

    On rate-distortion with mixed types of side information

    Get PDF
    In this correspondence, we consider rate-distortion examples in the presence of side information. For a system with some side information known at both the encoder and decoder, and some known only at the decoder, we evaluate the rate-distortion function for both Gaussian and binary sources. While the Gaussian example is a straightforward generalization of the corresponding result by Wyner, the binary example proves more difficult and is solved using a multidimensional optimization approach. Leveraging the insights gained from the binary example, we then solve the more complicated binary Heegard and Berger problem of decoding when side information may be present. The results demonstrate the existence of a new type of successive refinement in which the refinement information is decoded together with side information that is not available for the initial description.
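For orientation, Wyner's Gaussian result referenced above takes the familiar conditional rate-distortion form; a sketch in standard notation (the symbols here are assumptions, not taken from the correspondence):

```latex
R_{X \mid Y}(D) \;=\; \frac{1}{2}\,\log^{+}\!\left(\frac{\sigma^{2}_{X \mid Y}}{D}\right),
\qquad \log^{+} t \;=\; \max(\log t,\, 0),
```

where \(\sigma^{2}_{X \mid Y}\) is the conditional variance of the source \(X\) given the side information \(Y\); the rate is zero once the target distortion \(D\) exceeds that conditional variance.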

    Exact and Soft Successive Refinement of the Information Bottleneck

    Get PDF
    © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/

    The information bottleneck (IB) framework formalises the essential requirement for efficient information processing systems to achieve an optimal balance between the complexity of their representation and the amount of information extracted about relevant features. However, since the representation complexity affordable by real-world systems may vary in time, the processing cost of updating the representations should also be taken into account. A crucial question is thus the extent to which adaptive systems can leverage the information content of already existing IB-optimal representations for producing new ones, which target the same relevant features but at a different granularity. We investigate the information-theoretic optimal limits of this process by studying and extending, within the IB framework, the notion of successive refinement, which describes the ideal situation where no information needs to be discarded for adapting an IB-optimal representation’s granularity. Thanks in particular to a new geometric characterisation, we analytically derive the successive refinability of some specific IB problems (for binary variables, for jointly Gaussian variables, and for the relevancy variable being a deterministic function of the source variable), and provide a linear-programming-based tool to numerically investigate, in the discrete case, the successive refinement of the IB. We then soften this notion into a quantification of the loss of information optimality induced by several-stage processing through an existing measure of unique information. Simple numerical experiments suggest that this quantity is typically low, though not entirely negligible.
These results could have important implications for (i) the structure and efficiency of incremental learning in biological and artificial agents, (ii) the comparison of IB-optimal observation channels in statistical decision problems, and (iii) the IB theory of deep neural networks. Peer reviewed
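For reference, the single-stage IB problem that the successive-refinement analysis above builds on trades off compression against relevance; a standard formulation (notation assumed, not taken from the article) is:

```latex
\min_{p(t \mid x)} \;\; I(X;T) \;-\; \beta\, I(T;Y),
\qquad \text{subject to the Markov chain } T \to X \to Y,
```

where larger \(\beta \ge 0\) favours representations \(T\) that retain more information about the relevant variable \(Y\) at the cost of a higher complexity \(I(X;T)\); successive refinability then asks whether the optimal representations at two different granularities can themselves be arranged in a Markov chain without discarding information.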

    On networks with side information

    Get PDF
    In this paper, we generalize the lossless coded side information problem from the three-node network of Ahlswede and Körner to more general network scenarios. We derive inner and outer bounds on the achievable rate region in the general network scenario and show that they are tight for some families of networks. Our approach demonstrates how solutions to canonical source coding problems can be used to derive bounds for more complex networks, and reveals an interesting connection between networks with side information, successive refinement, and network coding.

    On Multistage Successive Refinement for Wyner-Ziv Source Coding with Degraded Side Informations

    Get PDF
    We provide a complete characterization of the rate-distortion region for the multistage successive refinement of the Wyner-Ziv source coding problem with degraded side informations at the decoder. Necessary and sufficient conditions for a source to be successively refinable along a distortion vector are subsequently derived. A source-channel separation theorem is provided for the multistage case when the descriptions are sent over independent channels. Furthermore, we introduce the notion of generalized successive refinability with multiple degraded side informations. This notion captures whether progressive encoding to satisfy multiple distortion constraints for different side informations is as good as encoding without the progressive requirement. Necessary and sufficient conditions for generalized successive refinability are given. It is shown that the following two sources are generalized successively refinable: (1) the Gaussian source with degraded Gaussian side informations, and (2) the doubly symmetric binary source when the worse side information is a constant. Thus, in both cases, the failure to be successively refinable is due only to the inherent uncertainty about which side information will occur at the decoder, not to the progressive encoding requirement. Comment: Submitted to IEEE Trans. Information Theory Apr. 200