Reordering Rows for Better Compression: Beyond the Lexicographic Order
Sorting database tables before compressing them improves the compression
rate. Can we do better than the lexicographical order? For minimizing the
number of runs in a run-length encoding compression scheme, the best approaches
to row-ordering are derived from traveling salesman heuristics, although there
is a significant trade-off between running time and compression. A new
heuristic, Multiple Lists, a variant of Nearest Neighbor that trades
off compression for a major running-time speedup, is a good option for very
large tables. However, for some compression schemes, it is more important to
generate long runs rather than few runs. For this case, another novel
heuristic, Vortex, is promising. We find that we can improve run-length
encoding by up to a factor of 3, and prefix coding by up to 80%;
these gains are on top of the gains due to lexicographically sorting the table.
In a few cases, we prove that the new row reordering is within 10% of optimal at
minimizing the runs of identical values within columns.
Comment: to appear in ACM TOD
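The core idea above can be illustrated with a small sketch (not the paper's implementation): counting column runs under two row orders, lexicographic sort versus a greedy Nearest-Neighbor ordering that repeatedly appends the remaining row closest in Hamming distance to the last chosen row. The toy table and the `count_runs` / `nearest_neighbor_order` helpers are illustrative assumptions, not the authors' code.

```python
def count_runs(table):
    """Total number of runs across all columns of a row-major table.
    A run is a maximal block of equal values within one column; fewer
    runs means better run-length compression."""
    if not table:
        return 0
    ncols = len(table[0])
    runs = ncols  # each column starts with one run
    for prev, row in zip(table, table[1:]):
        # a run boundary appears wherever consecutive rows differ
        runs += sum(prev[c] != row[c] for c in range(ncols))
    return runs

def nearest_neighbor_order(table):
    """Greedy TSP-style heuristic: repeatedly append the remaining row
    closest (in Hamming distance) to the last chosen row. O(n^2 * cols)."""
    remaining = list(table)
    ordered = [remaining.pop(0)]
    while remaining:
        last = ordered[-1]
        best = min(range(len(remaining)),
                   key=lambda i: sum(a != b for a, b in zip(last, remaining[i])))
        ordered.append(remaining.pop(best))
    return ordered

rows = [(1, 0, 2), (0, 1, 2), (1, 0, 3), (0, 1, 3)]
print(count_runs(sorted(rows)))                 # runs under lexicographic order
print(count_runs(nearest_neighbor_order(rows)))  # runs under the NN heuristic
```

Even on this tiny table the Nearest-Neighbor ordering yields fewer runs than the lexicographic sort, which is the trade-off space the heuristics in the abstract explore at scale.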
Subset-lex: did we miss an order?
We generalize a well-known algorithm for the generation of all subsets of a
set in lexicographic order with respect to the sets as lists of elements
(subset-lex order). We obtain algorithms for various combinatorial objects such
as the subsets of a multiset, compositions and partitions represented as lists
of parts, and for certain restricted growth strings. The algorithms are often
loopless and require at most one extra variable for the computation of the next
object. The performance of the algorithms is very competitive even when not
loopless. A Gray code corresponding to the subset-lex order and a Gray code for
compositions that was found during this work are described.
Comment: Two obvious errors corrected (indicated by "Correction:" in the LaTeX source).
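To make the order concrete, here is a minimal recursive sketch of subset-lex enumeration for subsets of {1, …, n}: the subsets, written as sorted element lists, appear in lexicographic order on those lists, with each prefix preceding its extensions. This is only an illustration of the order itself; the paper's algorithms are iterative and often loopless, which this recursive sketch is not.

```python
def subset_lex(n):
    """All subsets of {1, ..., n} as lists, in subset-lex order:
    lexicographic on the subsets written as sorted element lists,
    with a prefix listed before its extensions."""
    def rec(prefix, start):
        yield list(prefix)              # emit the current subset
        for x in range(start, n + 1):   # extend with each larger element
            prefix.append(x)
            yield from rec(prefix, x + 1)
            prefix.pop()
    return list(rec([], 1))

for s in subset_lex(3):
    print(s)
# [], [1], [1, 2], [1, 2, 3], [1, 3], [2], [2, 3], [3]
```

Note how [1, 2, 3] is immediately followed by [1, 3]: the next object differs from the previous one in only a few trailing positions, which is what makes loopless generation feasible.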
Prediction error image coding using a modified stochastic vector quantization scheme
The objective of this paper is to provide an efficient yet simple method to encode the prediction error image of video sequences, based on a stochastic vector quantization (SVQ) approach that has been modified to cope with the intrinsically decorrelated nature of the prediction error image of video signals. In the SVQ scheme, the codewords are generated by stochastic techniques instead of being derived from a training set representative of the expected input image, as is usual in VQ. The performance of the scheme is shown for the particular case of segmentation-based video coding, although the technique can also be applied to motion-compensated hybrid coding schemes.
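The stochastic-versus-trained distinction can be sketched as follows. This is a hypothetical toy (the codebook size, dimension, and Gaussian source are assumptions, not the paper's scheme): the codebook is drawn from a random source rather than trained, and encoding maps each input block to the index of its nearest codeword.

```python
import random

def make_stochastic_codebook(size, dim, seed=0):
    """Generate `size` random codewords of dimension `dim` from a
    Gaussian source -- stochastic generation, no training set."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(size)]

def encode(block, codebook):
    """Index of the codeword with minimal squared Euclidean distance;
    only this index needs to be transmitted."""
    def dist2(cw):
        return sum((a - b) ** 2 for a, b in zip(block, cw))
    return min(range(len(codebook)), key=lambda i: dist2(codebook[i]))

codebook = make_stochastic_codebook(size=8, dim=4)
idx = encode([0.1, -0.2, 0.0, 0.3], codebook)
print(idx)  # the decoder reconstructs the block as codebook[idx]
```

Because both encoder and decoder can regenerate the codebook from the same seed, no codebook needs to be transmitted, which is one practical appeal of the stochastic approach.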
Morphological operators for very low bit rate video coding
This paper deals with the use of some morphological tools for video coding at very low bit rates. Rather than describing a complete coding algorithm, the purpose of this paper is to focus on morphological connected operators and segmentation tools that have proved to be attractive for compression.
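As one concrete example of a morphological connected operator, here is a sketch of a binary area opening, which removes 4-connected foreground components whose area falls below a threshold. The pure-Python flood fill and the toy image are illustrative assumptions; the point is that connected operators simplify an image without creating new contours, which is what makes them attractive for segmentation-based coding.

```python
def area_opening(img, min_area):
    """Binary area opening: img is a list of rows of 0/1 pixels.
    4-connected components with fewer than min_area pixels vanish;
    all other pixels are left exactly as they were (no new contours)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    seen = [[False] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            if img[i][j] and not seen[i][j]:
                # flood-fill one 4-connected component
                stack, comp = [(i, j)], []
                seen[i][j] = True
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(comp) >= min_area:  # keep only large components
                    for y, x in comp:
                        out[y][x] = 1
    return out

img = [[1, 1, 0, 0],
       [1, 1, 0, 1],
       [0, 0, 0, 0]]
print(area_opening(img, min_area=2))  # the isolated pixel disappears
```

Unlike an erosion with a structuring element, this operator either keeps a component whole or removes it entirely, so surviving region boundaries are preserved exactly.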