Multiple-Description Coding by Dithered Delta-Sigma Quantization
We address the connection between the multiple-description (MD) problem and
Delta-Sigma quantization. The inherent redundancy due to oversampling in
Delta-Sigma quantization, and the simple linear-additive noise model resulting
from dithered lattice quantization, allow us to construct a symmetric and
time-invariant MD coding scheme. We show that the use of a noise shaping filter
makes it possible to trade off central distortion for side distortion.
Asymptotically as the dimension of the lattice vector quantizer and order of
the noise shaping filter approach infinity, the entropy rate of the dithered
Delta-Sigma quantization scheme approaches the symmetric two-channel MD
rate-distortion function for a memoryless Gaussian source and MSE fidelity
criterion, at any side-to-central distortion ratio and any resolution. In the
optimal scheme, the infinite-order noise shaping filter must be minimum phase
and have a piece-wise flat power spectrum with a single jump discontinuity. An
important advantage of the proposed design is that it is symmetric in rate and
distortion by construction, so the coding rates of the descriptions are
identical and there is therefore no need for source splitting.
Comment: Revised, restructured, significantly shortened, and minor typos have been fixed. Accepted for publication in the IEEE Transactions on Information Theory.
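The noise-shaping loop the abstract describes can be illustrated with a scalar, first-order toy version. Note the paper's optimal scheme uses lattice vector quantizers and an infinite-order minimum-phase filter; the oversampling factor of 2, the step size, and the shaping coefficient `h` below are illustrative assumptions, not the paper's design:

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(u, dither, step):
    # Subtractive dither: the decoder subtracts the shared dither, so the
    # effective error is additive, uniform, and independent of the input.
    return step * np.round((u + dither) / step) - dither

def md_delta_sigma(source, step=0.5, h=0.5):
    """Two-description sketch: oversample by 2, run a first-order
    Delta-Sigma loop, split even/odd samples into two descriptions."""
    x = np.repeat(source, 2)  # crude 2x oversampling (sample-and-hold)
    dither = rng.uniform(-step / 2, step / 2, size=x.size)
    y = np.empty_like(x)
    e_prev = 0.0              # fed-back quantization error
    for n in range(x.size):
        u = x[n] - h * e_prev                     # noise shaping
        y[n] = dithered_quantize(u, dither[n], step)
        e_prev = y[n] - u                         # realized error
    d0, d1 = y[0::2], y[1::2]   # description 0 / description 1
    central = 0.5 * (d0 + d1)   # central decoder combines both
    return d0, d1, central
```

Because the shaped error spectrum is pushed toward high frequencies, averaging the two descriptions (a lowpass operation) yields a lower central distortion than either side description alone, which is the central/side trade-off the abstract refers to.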
Source Coding Optimization for Distributed Average Consensus
Consensus is a common method for computing a function of the data distributed
among the nodes of a network. Of particular interest is distributed average
consensus, whereby the nodes iteratively compute the sample average of the data
stored at all the nodes of the network using only near-neighbor communications.
In real-world scenarios, these communications must undergo quantization, which
introduces distortion to the internode messages. In this thesis, a model for
the evolution of the network state statistics at each iteration is developed
under the assumptions of Gaussian data and additive quantization error. It is
shown that minimization of the communication load in terms of aggregate source
coding rate can be posed as a generalized geometric program, for which an
equivalent convex optimization can efficiently solve for the global minimum.
Optimization procedures are developed for rate-distortion-optimal vector
quantization, uniform entropy-coded scalar quantization, and fixed-rate uniform
quantization. Numerical results demonstrate the performance of these
approaches. For small numbers of iterations, the fixed-rate optimizations are
verified using exhaustive search. Comparison to the prior art suggests
competitive performance under certain circumstances but strongly motivates the
incorporation of more sophisticated coding strategies, such as differential,
predictive, or Wyner-Ziv coding.Comment: Master's Thesis, Electrical Engineering, North Carolina State
Universit
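The quantized-consensus iteration the abstract models can be sketched in a few lines. This is a generic toy (ring topology, Metropolis weights, fixed-step uniform quantization standing in for the additive-error model), not the thesis's optimized rate allocation:

```python
import numpy as np

def quantized_consensus(x0, neighbors, step, iters):
    """Distributed average consensus where each node broadcasts a
    uniformly quantized copy of its state (additive-error model).
    Symmetric Metropolis weights preserve the network average exactly."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        q = step * np.round(x / step)       # quantized broadcasts
        upd = np.zeros_like(x)
        for i, nbrs in enumerate(neighbors):
            for j in nbrs:
                # Metropolis weight: 1 / (1 + max degree of the edge's ends)
                w = 1.0 / (1 + max(len(nbrs), len(neighbors[j])))
                upd[i] += w * (q[j] - q[i])
        x = x + upd                         # near-neighbor update only
    return x

# 6-node ring: each node talks only to its two neighbors
n = 6
neighbors = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
x0 = np.array([3.0, -1.0, 4.0, 1.0, -5.0, 9.0])
x = quantized_consensus(x0, neighbors, step=1e-3, iters=300)
```

With a fine quantizer the states settle into a small neighborhood of the true average; coarser steps save rate but enlarge that neighborhood, which is exactly the rate/distortion trade the thesis optimizes.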
Semantic networks
A semantic network is a graph of the structure of meaning. This article introduces semantic network systems and their importance in Artificial Intelligence, followed by I. the early background; II. a summary of the basic ideas and issues, including link types, frame systems, case relations, link valence, abstraction, inheritance hierarchies and logic extensions; and III. a survey of ‘world-structuring’ systems, including ontologies, causal link models, continuous models, relevance, formal dictionaries, semantic primitives and intersecting inference hierarchies. Speed and practical implementation are briefly discussed. The conclusion argues for a synthesis of relational graph theory, graph-grammar theory and order theory based on semantic primitives and multiple intersecting inference hierarchies.
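Two of the basic ideas the abstract lists, typed links and property inheritance over an ISA hierarchy, can be shown with a minimal sketch. The nodes and properties below are the classic textbook example, not taken from any particular system the article surveys:

```python
# Minimal semantic network: nodes carry local properties and typed
# links; lookup() inherits missing properties along "isa" links,
# with local values overriding inherited ones (exception handling).
class Node:
    def __init__(self, name):
        self.name = name
        self.links = {}   # link type -> list of target Nodes
        self.props = {}   # locally asserted properties

    def add_link(self, link_type, target):
        self.links.setdefault(link_type, []).append(target)

    def lookup(self, prop):
        """Return a local property, else search up the ISA hierarchy."""
        if prop in self.props:
            return self.props[prop]
        for parent in self.links.get("isa", []):
            found = parent.lookup(prop)
            if found is not None:
                return found
        return None

animal = Node("animal"); animal.props["breathes"] = True
bird = Node("bird"); bird.props["flies"] = True
bird.add_link("isa", animal)
penguin = Node("penguin"); penguin.props["flies"] = False  # local exception
penguin.add_link("isa", bird)
```

Here `penguin` inherits `breathes` from `animal` two levels up, while its local `flies = False` overrides the value inherited from `bird`, illustrating why inheritance with exceptions complicates the logical semantics the article discusses.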