Conditional weighted universal source codes: second order statistics in universal coding
We consider the use of second order statistics in two-stage universal source coding. Examples of two-stage universal codes include the weighted universal vector quantization (WUVQ), weighted universal bit allocation (WUBA), and weighted universal transform coding (WUTC) algorithms. The second order statistics are incorporated in two-stage universal source codes in a manner analogous to the method by which second order statistics are incorporated in entropy constrained vector quantization (ECVQ) to yield conditional ECVQ (CECVQ). In this paper, we describe an optimal two-stage conditional entropy constrained universal source code along with its associated optimal design algorithm and a fast (but nonoptimal) variation of the original code. The design technique and coding algorithm presented here result in a new family of conditional entropy constrained universal codes including but not limited to the conditional entropy constrained WUVQ (CWUVQ), the conditional entropy constrained WUBA (CWUBA), and the conditional entropy constrained WUTC (CWUTC). The fast variation of the conditional entropy constrained universal codes allows the designer to trade off performance gains against storage and delay costs. We demonstrate the performance of the proposed codes on a collection of medical brain scans. On the given data set, the CWUVQ achieves up to 7.5 dB performance improvement over variable-rate WUVQ and up to 12 dB performance improvement over ECVQ. On the same data set, the fast variation of the CWUVQ achieves identical performance to that achieved by the original code at all but the lowest rates (less than 0.125 bits per pixel).
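The core idea can be illustrated with a toy sketch. This is a hypothetical illustration, not the paper's algorithm: two first-stage "codebooks" (a coarse and a fine scalar quantizer) stand in for the family of codes, and the first-stage index is entropy-coded conditioned on the previous block's index, so the selection cost is a Lagrangian combining distortion with the conditional index rate. The codebooks and conditional probabilities below are invented for the example.

```python
import math

# Two hypothetical first-stage codebooks: a coarse and a fine scalar quantizer.
CODEBOOKS = {0: [0.0, 4.0], 1: [0.0, 1.0, 2.0, 3.0, 4.0]}
# Hypothetical conditional probabilities P(codebook | previous codebook),
# capturing second order statistics of the first-stage index stream.
P_COND = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.2, 1: 0.8}}

def quantize(block, codebook):
    """Map each sample to its nearest codeword; return output and distortion."""
    out = [min(codebook, key=lambda c: (x - c) ** 2) for x in block]
    dist = sum((x - y) ** 2 for x, y in zip(block, out))
    return out, dist

def encode(blocks, lam=1.0):
    """Pick, per block, the codebook minimizing distortion + lam * index rate,
    where the index rate is conditioned on the previous block's choice."""
    prev, choices = 0, []
    for block in blocks:
        best = None
        for s, cb in CODEBOOKS.items():
            _, dist = quantize(block, cb)
            rate = -math.log2(P_COND[prev][s])  # conditional index rate in bits
            cost = dist + lam * rate
            if best is None or cost < best[0]:
                best = (cost, s)
        prev = best[1]
        choices.append(prev)
    return choices
```

A block well matched to the coarse codebook keeps the cheap "stay" index, while a block needing the fine codebook pays the higher conditional rate only when the distortion saving justifies it.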
Rewriting Codes for Joint Information Storage in Flash Memories
Memories whose storage cells transit irreversibly between states have been common since the start of data storage technology. In recent years, flash memories have become a very important family of such memories. A flash memory cell has q states, denoted 0, 1, ..., q-1, and can only transit from a lower state to a higher state before the expensive erasure operation takes place. We study rewriting codes that enable the data stored in a group of cells to be rewritten by only shifting the cells to higher states. Since the considered state transitions are irreversible, the number of rewrites is bounded. Our objective is to maximize the number of times the data can be rewritten. We focus on the joint storage of data in flash memories, and study two rewriting codes for two different scenarios. The first code, called a floating code, is for the joint storage of multiple variables, where every rewrite changes one variable. The second code, called a buffer code, is for remembering the most recent data in a data stream. Many of the codes presented here are either optimal or asymptotically optimal. We also present bounds on the performance of general codes. The results show that rewriting codes can integrate a flash memory's rewriting capabilities for different variables to a high degree.
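The "cells only move up" constraint can be sketched with a minimal hypothetical example in the spirit of write-once-memory and floating codes (this is an illustration, not a code from the paper): a single binary variable is stored as the parity of the total cell level, so each rewrite needs only one +1 shift of one cell, and n cells with q levels each support n(q-1) rewrites before erasure.

```python
def write_bit(cells, bit, q):
    """Raise one cell so the parity of sum(cells) equals bit.
    Cells never decrease; returns False when an erasure would be needed."""
    if sum(cells) % 2 == bit:
        return True                       # current state already encodes the bit
    for i, level in enumerate(cells):
        if level < q - 1:
            cells[i] += 1                 # irreversible upward shift
            return True
    return False                          # all cells saturated: erase needed

# Alternate the stored bit until the cells saturate.
cells, q = [0, 0, 0], 4
writes, bit = 0, 1
while write_bit(cells, bit, q):
    writes += 1
    bit ^= 1
# With n = 3 cells and q = 4 levels, this supports n*(q-1) = 9 rewrites.
```

The actual floating and buffer codes of the paper store multiple variables jointly and achieve far better trade-offs; this sketch only shows why irreversible upward transitions bound the rewrite count.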
Mutually Uncorrelated Primers for DNA-Based Data Storage
We introduce the notion of weakly mutually uncorrelated (WMU) sequences, motivated by applications in DNA-based data storage systems and for synchronization of communication devices. WMU sequences are characterized by the property that no sufficiently long suffix of one sequence is the prefix of the same or another sequence. WMU sequences used for primer design in DNA-based data storage systems are also required to be at large mutual Hamming distance from each other, have balanced compositions of symbols, and avoid primer-dimer byproducts. We derive bounds on the size of WMU and various constrained WMU codes and present a number of constructions for balanced, error-correcting, primer-dimer-free WMU codes using Dyck paths, prefix-synchronized codes, and cyclic codes.
Comment: 14 pages, 3 figures, 1 table. arXiv admin note: text overlap with arXiv:1601.0817
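The defining property above can be checked directly. The sketch below is a hypothetical brute-force verifier (not a construction from the paper): it tests that no suffix of length at least k of any sequence in a set is a prefix of the same or another sequence, with k standing in for the "sufficiently long" threshold.

```python
def is_wmu(seqs, k):
    """Return True if no suffix of length >= k of any sequence in seqs
    is a prefix of the same or another sequence (proper suffixes only)."""
    for s in seqs:
        for t in seqs:
            for L in range(k, len(s)):        # proper suffixes of s, length >= k
                if L <= len(t) and s[-L:] == t[:L]:
                    return False
    return True
```

For example, {"ACGT", "TTAC"} fails for k = 2 because the suffix "AC" of "TTAC" is a prefix of "ACGT", while {"ACGT", "CCAG"} passes. In primer design this property prevents a read from being misaligned at a spurious overlap between primers.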
Weighted universal image compression
We describe a general coding strategy leading to a family of universal image compression systems designed to give good performance in applications where the statistics of the source to be compressed are not available at design time or vary over time or space. The basic approach considered uses a two-stage structure in which the single source code of traditional image compression systems is replaced with a family of codes designed to cover a large class of possible sources. To illustrate this approach, we consider the optimal design and use of two-stage codes containing collections of vector quantizers (weighted universal vector quantization), bit allocations for JPEG-style coding (weighted universal bit allocation), and transform codes (weighted universal transform coding). Further, we demonstrate the benefits to be gained from the inclusion of perceptual distortion measures and optimal parsing. The strategy yields two-stage codes that significantly outperform their single-stage predecessors. On a sequence of medical images, weighted universal vector quantization outperforms entropy coded vector quantization by over 9 dB. On the same data sequence, weighted universal bit allocation outperforms a JPEG-style code by over 2.5 dB. On a collection of mixed text and image data, weighted universal transform coding outperforms a single, data-optimized transform code (which gives performance almost identical to that of JPEG) by over 6 dB.
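The two-stage structure itself is simple to sketch. The toy below is a hypothetical illustration, not the paper's design: a family of scalar codebooks (invented here, one tuned to low-amplitude and one to high-amplitude sources) replaces the single fixed codebook, and the encoder picks, per block, the family member with least distortion, transmitting its index followed by the quantized block.

```python
# Hypothetical two-member codebook family standing in for the learned family.
FAMILY = [
    [0.0, 0.1, 0.2, 0.3],      # tuned to low-amplitude blocks
    [0.0, 3.0, 6.0, 9.0],      # tuned to high-amplitude blocks
]

def encode_block(block):
    """First stage: choose the codebook with least distortion on this block.
    Second stage: quantize the block with the chosen codebook."""
    best = None
    for idx, cb in enumerate(FAMILY):
        q = [min(cb, key=lambda c: (x - c) ** 2) for x in block]
        d = sum((x - y) ** 2 for x, y in zip(block, q))
        if best is None or d < best[0]:
            best = (d, idx, q)
    return best[1], best[2]     # (first-stage index, second-stage output)
```

A single codebook would have to compromise across both source types; the family lets each block use the member matched to its local statistics, at the cost of transmitting the first-stage index.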