Rate and distortion redundancies for universal source coding with respect to a fidelity criterion
Rissanen has shown that there exist universal noiseless codes for {Xi} with per-letter rate redundancy as low as (K log N)/2N, where N is the blocklength and K is the number of source parameters. We derive an analogous result for universal source coding with respect to the squared-error fidelity criterion: there exist codes with per-letter rate redundancy as low as (K log N)/2N and per-letter distortion (averaged over X^N and θ) at most D(R)[1 + K/N], where D(·) is an average distortion-rate function and K is now the number of parameters in the code.
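As a quick numerical illustration of how the (K log N)/2N redundancy vanishes with blocklength (natural logarithms are assumed here; the abstract does not fix the base, which only rescales the units):

```python
import math

def rate_redundancy(K, N):
    """Per-letter rate redundancy (K * log N) / (2 * N) from Rissanen's bound.
    K = number of source parameters, N = blocklength."""
    return K * math.log(N) / (2 * N)

# The per-letter overhead shrinks as the blocklength grows:
for N in (100, 1000, 10000):
    print(f"N = {N:6d}  redundancy = {rate_redundancy(2, N):.6f}")
```

For K = 2 the overhead is already below 0.05 nats per letter at N = 100 and keeps falling roughly like (log N)/N.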
Application of Kolmogorov complexity and universal codes to identity testing and nonparametric testing of serial independence for time series
We show that Kolmogorov complexity and its estimators, such as universal codes
(or data-compression methods), can be applied to hypothesis testing within the
framework of classical mathematical statistics. Methods for identity testing
and nonparametric testing of serial independence for time series are suggested.
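The core idea of compression-based testing can be sketched with an off-the-shelf universal code. The snippet below uses zlib's compressed length as a crude complexity estimate and compares a sequence against shuffled copies of itself; shuffling preserves the symbol frequencies but destroys serial structure, so a clearly smaller original length hints at serial dependence. This is an illustrative sketch, not the paper's exact test statistic.

```python
import random
import zlib

def compressed_len(seq):
    """Length of the zlib compression of a byte sequence -- a crude,
    computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(bytes(seq), 9))

def dependence_score(seq, trials=20, seed=0):
    """Compare the compressed length of seq with the average compressed
    length of random permutations of it."""
    rng = random.Random(seed)
    orig = compressed_len(seq)
    total = 0
    for _ in range(trials):
        s = list(seq)
        rng.shuffle(s)
        total += compressed_len(s)
    return orig, total / trials

# A serially dependent (perfectly alternating) binary sequence compresses
# far better than its shuffled versions:
orig, shuffled_avg = dependence_score([0, 1] * 500)
```

A formal test would turn this gap into a p-value; here the raw lengths are enough to show the effect.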
Handwritten digit recognition by bio-inspired hierarchical networks
The human brain processes information with learning and prediction abilities,
but the underlying neuronal mechanisms remain unknown. Recently, many studies
have shown that neuronal networks are capable of both generalization and
association of sensory inputs. In this paper, following a set of
neurophysiological evidence, we propose a learning framework with strong
biological plausibility that mimics prominent functions of cortical
circuitries. We developed the Inductive Conceptual Network (ICN), a
hierarchical bio-inspired network able to learn invariant patterns through
Variable-order Markov Models implemented in its nodes. The outputs of the
top-most node of the ICN hierarchy, representing the highest input
generalization, allow for automatic classification of inputs. We found that
the ICN clustered MNIST images with an error of 5.73% and USPS images with an
error of 12.56%.
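A minimal variable-order Markov model of the kind the ICN nodes rely on can be sketched as follows: count every context up to a maximum order during training, then predict with the longest context actually seen, falling back to shorter ones. This simple escape-to-shorter-context rule is an assumption for illustration, not the ICN's exact scheme.

```python
from collections import defaultdict

class VOMM:
    """Minimal variable-order Markov model: counts all contexts up to
    max_order and predicts using the longest context seen in training."""

    def __init__(self, max_order=3):
        self.max_order = max_order
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, seq):
        for i, sym in enumerate(seq):
            for k in range(self.max_order + 1):
                if i - k < 0:
                    break
                ctx = tuple(seq[i - k:i])       # context of length k
                self.counts[ctx][sym] += 1

    def predict(self, context):
        """Most frequent successor of the longest known suffix of context."""
        for k in range(min(self.max_order, len(context)), -1, -1):
            ctx = tuple(context[len(context) - k:])
            if ctx in self.counts:
                successors = self.counts[ctx]
                return max(successors, key=successors.get)
        return None
```

Trained on "abracadabra", the model predicts "r" after the context "ab", since that bigram is always followed by "r" in the training string.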
Empirical Bayes and Full Bayes for Signal Estimation
We consider signals that follow a parametric distribution where the parameter
values are unknown. To estimate such signals from noisy measurements in scalar
channels, we study the empirical performance of an empirical Bayes (EB)
approach and a full Bayes (FB) approach. We then apply EB and FB to solve
compressed sensing (CS) signal estimation problems by successively denoising a
scalar Gaussian channel within an approximate message passing (AMP) framework.
Our numerical results show that FB achieves better performance than EB in
scalar channel denoising problems when the signal dimension is small. In the CS
setting, the signal dimension must be large enough for AMP to work well; for
large signal dimensions, AMP achieves similar performance with FB and EB.
Comment: This work was presented at the Information Theory and Applications
workshop (ITA), San Diego, CA, Feb. 201
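The empirical Bayes idea for a scalar Gaussian channel can be sketched in a few lines: the unknown signal variance is estimated from the observations themselves (method of moments) and plugged into the Bayes-optimal linear shrinkage. This is an illustrative sketch under a Gaussian-signal assumption, not the paper's AMP-based CS algorithm.

```python
import random

def eb_denoise(y, noise_var):
    """Empirical-Bayes denoising for y = x + noise with Gaussian noise.
    The signal variance is unknown; estimate it as E[y^2] - noise_var
    and apply the Wiener shrinkage x_hat = s/(s + n) * y."""
    m = len(y)
    second_moment = sum(v * v for v in y) / m
    sig_var = max(second_moment - noise_var, 0.0)   # EB variance estimate
    gain = sig_var / (sig_var + noise_var) if sig_var + noise_var > 0 else 0.0
    return [gain * v for v in y]

# Toy check: x ~ N(0, 1) observed through unit-variance Gaussian noise.
rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(2000)]
y = [xi + rng.gauss(0, 1) for xi in x]
x_hat = eb_denoise(y, noise_var=1.0)
mse_raw = sum((yi - xi) ** 2 for yi, xi in zip(y, x)) / len(x)
mse_eb = sum((hi - xi) ** 2 for hi, xi in zip(x_hat, x)) / len(x)
```

With equal signal and noise variance the estimated gain is near 1/2, roughly halving the mean squared error relative to the raw observations. A full Bayes approach would instead place a prior on the unknown variance and integrate over it.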
Results on the Redundancy of Universal Compression for Finite-Length Sequences
In this paper, we investigate the redundancy of universal coding schemes for
smooth parametric sources in the finite-length regime. We derive an upper bound
on the probability of the event that a sequence of length N, chosen using
Jeffreys' prior from the family of parametric sources with K unknown
parameters, is compressed with a redundancy smaller than a given threshold.
Our results also confirm that for large enough N and K, the average minimax
redundancy provides a good estimate for the redundancy of most sources. Our
result may be used to evaluate the performance of universal source-coding
schemes on finite-length sequences. Additionally, we precisely characterize
the minimax redundancy for two-stage codes. We demonstrate that the two-stage
assumption incurs a negligible redundancy, especially when the number of
source parameters is large.
Finally, we show that the redundancy is significant in the compression of short
sequences.
Comment: accepted at the 2011 IEEE International Symposium on Information
Theory (ISIT 2011).
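The significance of redundancy at short lengths can be made concrete by expressing the (K/2) log N minimax redundancy (the total-bits counterpart of the per-letter (K log N)/2N figure above) as a fraction of the ideal codelength N·H. The functional form and base-2 units here are assumptions consistent with the classical result, not values taken from the paper.

```python
import math

def redundancy_fraction(K, N, entropy_rate_bits):
    """Average minimax redundancy (K/2) * log2(N) bits, as a fraction of
    the ideal codelength N * H for a source with entropy rate H bits/symbol."""
    return (K / 2) * math.log2(N) / (N * entropy_rate_bits)

# K = 2 parameters, 1 bit/symbol entropy rate:
short = redundancy_fraction(2, 64, 1.0)         # ~9.4% overhead at N = 64
long = redundancy_fraction(2, 1_000_000, 1.0)   # negligible at N = 10^6
```

At N = 64 the redundancy is nearly a tenth of the whole codelength, while at a million symbols it is vanishing, which is exactly the finite-length effect the abstract highlights.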