
    Zerotree design for image compression: toward weighted universal zerotree coding

    We consider the problem of optimal, data-dependent zerotree design for use in weighted universal zerotree codes for image compression. A weighted universal zerotree code (WUZC) is a data compression system that replaces the single, data-independent zerotree of Said and Pearlman (see IEEE Transactions on Circuits and Systems for Video Technology, vol.6, no.3, p.243-50, 1996) with an optimal collection of zerotrees for good image coding performance across a wide variety of possible sources. We describe the weighted universal zerotree encoding and design algorithms but focus primarily on the problem of optimal, data-dependent zerotree design. We demonstrate the performance of the proposed algorithm by comparing, at a variety of target rates, the performance of a Said-Pearlman-style code using the standard zerotree to the performance of the same code using a zerotree designed with our algorithm. The comparison is made without entropy coding. The proposed zerotree design algorithm achieves, on a collection of combined text and gray-scale images, up to a 4 dB performance improvement over the Said-Pearlman zerotree.
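The zerotree idea behind such coders can be illustrated with a toy significance test: a coefficient is a zerotree root when it and all of its descendants in the wavelet hierarchy are insignificant at the current threshold. This is a minimal sketch in the spirit of Said-Pearlman-style coding; the tree layout, coefficient values, and function name are illustrative, not taken from the paper.

```python
# Toy zerotree significance test. A node is a zerotree root if it and
# every descendant are insignificant (magnitude below the threshold).

def is_zerotree_root(tree, node, coeffs, threshold):
    if abs(coeffs[node]) >= threshold:
        return False
    return all(is_zerotree_root(tree, c, coeffs, threshold)
               for c in tree.get(node, []))

# Parent -> children map for a tiny wavelet coefficient tree.
tree = {0: [1, 2], 1: [3, 4], 2: [5, 6]}
coeffs = {0: 3, 1: 1, 2: 40, 3: 2, 4: 1, 5: 9, 6: 0}

print(is_zerotree_root(tree, 1, coeffs, threshold=8))  # → True: subtree {1, 3, 4} all below 8
print(is_zerotree_root(tree, 0, coeffs, threshold=8))  # → False: descendant 2 is significant
```

A data-dependent design, as proposed above, would amount to optimizing the parent-child map itself over a training set rather than fixing it a priori.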

    Weighted universal transform coding: universal image compression with the Karhunen-Loève transform

    We introduce a two-stage universal transform code for image compression. The code combines Karhunen-Loève transform coding with weighted universal bit allocation (WUBA) in a two-stage algorithm analogous to the algorithm for weighted universal vector quantization (WUVQ). The encoder uses a collection of transform/bit allocation pairs rather than a single transform/bit allocation pair (as in JPEG) or a single transform with a variety of bit allocations (as in WUBA). We describe both an encoding algorithm for achieving optimal compression using a collection of transform/bit allocation pairs and a technique for designing locally optimal collections of transform/bit allocation pairs. We demonstrate the performance using the mean squared error distortion measure. On a sequence of combined text and gray-scale images, the algorithm achieves up to a 2 dB improvement over a JPEG-style coder using the discrete cosine transform (DCT) and an optimal collection of bit allocations, up to a 3 dB improvement over a JPEG-style coder using the DCT and a single (optimal) bit allocation, up to a 6 dB improvement over an entropy constrained WUVQ with first- and second-stage vector dimensions equal to 16 and 4, respectively, and up to a 10 dB improvement over an entropy constrained vector quantizer (ECVQ) with a vector dimension of 4.
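The Karhunen-Loève transform at the heart of this coder is the eigenbasis of the data covariance: transforming block vectors into that basis decorrelates the coefficients and compacts energy into the leading ones. The sketch below shows only that first stage, using numpy on synthetic correlated vectors; the block dimension and data are illustrative.

```python
# Minimal KLT sketch: estimate the covariance of block vectors, take its
# eigenbasis as the transform, and check that coefficients decorrelate.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
X = rng.normal(size=(500, 4)) @ A           # fake 4-dim "image blocks", correlated

mu = X.mean(axis=0)
C = np.cov((X - mu).T)                      # 4x4 sample covariance
eigvals, eigvecs = np.linalg.eigh(C)        # KLT basis = eigenvectors of C
order = np.argsort(eigvals)[::-1]           # sort by decreasing energy
T = eigvecs[:, order]                       # orthonormal transform matrix

Y = (X - mu) @ T                            # KLT coefficients
Cy = np.cov(Y.T)
off_diag = Cy - np.diag(np.diag(Cy))
print(np.max(np.abs(off_diag)) < 1e-8)      # → True: coefficients are decorrelated
```

A weighted universal scheme, as described above, would maintain a collection of such transforms (each paired with a bit allocation) and let the encoder choose per block.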

    Universal Product Code (Barcode) and Retailing Performance of Supermarkets in Nigeria

    This study examined the relationship between the Universal Product Code and the retailing performance of supermarkets in Nigeria. The specific objectives of the study were to investigate the influence of barcode formatting and barcode encoding on retailing performance. The study adopted a cross-sectional research design. This approach was chosen because it examines the situation on the ground and analyzes it empirically to obtain results that can be generalized to the accessible population. Based on the research questions, a questionnaire was designed and one hundred (100) copies were distributed to the sample population, which was determined by convenience sampling. After data cleaning, a total of ninety-two (92) copies of the distributed questionnaire were retrieved. These copies were analyzed and the hypotheses were tested using Pearson's Product Moment Correlation Coefficient with the aid of SPSS Version 22.0. The results revealed significant relationships between barcode formatting and sales volume, barcode formatting and profitability, barcode encoding and sales volume, and barcode encoding and profitability. The study recommends that supermarkets should endeavour to regularly update their product barcode fields to ensure that their wide range of products is always captured and enlisted, easing the purchase and sales activities of the supermarkets, and that regular training exercises should be organized to educate supermarket staff on how to manage barcode fields and efficiently handle issues relating to their formatting and encoding. Keywords: Universal, Product, Code, Retailing, Performance, Barcode Formatting, Barcode Encoding, Sales Volume and Profitability. DOI: 10.7176/EJBM/12-29-03 Publication date: October 31st 202
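The hypothesis tests above rest on the Pearson product-moment correlation coefficient, which measures linear association between two variables as their covariance divided by the product of their standard deviations. A minimal sketch, with made-up Likert-scale scores standing in for the survey data:

```python
# Pearson product-moment correlation from first principles.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores: barcode-formatting practice vs. sales volume.
formatting = [4, 5, 3, 4, 5, 2, 4, 3]
sales      = [4, 5, 3, 5, 4, 2, 4, 3]
print(round(pearson_r(formatting, sales), 3))  # → 0.867
```

A value near +1, as here, is the kind of strong positive relationship the study reports between formatting practice and sales volume.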

    Encoding the ℓ_p ball from limited measurements

    We address the problem of encoding signals which are sparse, i.e., signals concentrated on a set of small support. Mathematically, such signals are modeled as elements in the ℓ_p ball for some p ≤ 1. We describe a strategy for encoding elements of the ℓ_p ball which is universal in that 1) the encoding procedure is completely generic, and does not depend on p (the sparsity of the signal), and 2) it achieves near-optimal minimax performance simultaneously for all p < 1. What makes our coding procedure unique is that it requires only a limited number of nonadaptive measurements of the underlying sparse signal; we show that near-optimal performance can be obtained with a number of measurements that is roughly proportional to the number of bits used by the encoder. We end by briefly discussing these results in the context of image compression.
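The encoder described here is strikingly simple in outline: take a small number of nonadaptive random measurements of the signal and quantize them, with no knowledge of p. The sketch below shows that encoding step only (no decoder); the measurement count, quantizer step, and bit budget are illustrative choices, not values from the paper.

```python
# Sketch of nonadaptive encoding of a sparse signal: random Gaussian
# measurements, uniformly quantized. The measurements do not depend on p.
import numpy as np

rng = np.random.default_rng(1)
n, k = 256, 5                                # signal length, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

m = 40                                       # few measurements, m << n
Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # generic sensing matrix
y = Phi @ x                                  # nonadaptive measurements

step = 0.05
code = np.round(y / step).astype(int)        # the encoding: quantized measurements
bits_per_meas = 8                            # assumed bit budget per value
print(m * bits_per_meas, "bits for an n =", n, "signal")
```

The paper's result is that, with a decoder based on sparse recovery, roughly this many bits suffice for near-minimax distortion over the whole ℓ_p ball.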

    Weighted universal bit allocation: optimal multiple quantization matrix coding

    We introduce a two-stage bit allocation algorithm analogous to the algorithm for weighted universal vector quantization (WUVQ). The encoder uses a collection of possible bit allocations (typically in the form of a collection of quantization matrices) rather than a single bit allocation (or single quantization matrix). We describe both an encoding algorithm for achieving optimal compression using a collection of bit allocations and a technique for designing locally optimal collections of bit allocations. We demonstrate performance on a JPEG-style coder using the mean squared error (MSE) distortion measure. On a sequence of medical brain scans, the algorithm achieves up to a 2.5 dB improvement over a single bit allocation system, up to a 5 dB improvement over a WUVQ with first- and second-stage vector dimensions equal to 16 and 4, respectively, and up to a 12 dB improvement over an entropy constrained vector quantizer (ECVQ) using 4-dimensional vectors.
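The two-stage encoding can be sketched as an exhaustive per-block search: quantize each block with every matrix in the collection and keep the one minimizing a Lagrangian cost, distortion plus lambda times rate. The matrices, the lambda value, and the crude nonzero-count rate proxy below are illustrative, not the paper's design.

```python
# Two-stage encoder sketch: per block, pick the quantization matrix from
# the collection with the lowest Lagrangian cost (MSE + lambda * rate).
import numpy as np

def encode_block(block, matrices, lam):
    best = None
    for idx, Q in enumerate(matrices):
        q = np.round(block / Q)                  # quantize
        rec = q * Q                              # dequantize
        dist = np.mean((block - rec) ** 2)       # MSE distortion
        rate = np.count_nonzero(q)               # crude rate proxy: nonzero count
        cost = dist + lam * rate
        if best is None or cost < best[0]:
            best = (cost, idx, q)
    return best[1], best[2]                      # chosen matrix index, indices

coarse = np.full((4, 4), 16.0)                   # two-matrix "collection"
fine = np.full((4, 4), 2.0)
smooth = np.full((4, 4), 1.2)                    # low-activity block
busy = np.arange(16, dtype=float).reshape(4, 4)  # high-activity block

print(encode_block(smooth, [coarse, fine], lam=0.1)[0])  # → 0 (coarse matrix wins)
print(encode_block(busy, [coarse, fine], lam=0.1)[0])    # → 1 (fine matrix wins)
```

The index of the chosen matrix is the first-stage (side) information; the design question the paper addresses is how to pick the collection itself so these per-block choices pay off on average.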

    Adversarial Network Bottleneck Features for Noise Robust Speaker Verification

    In this paper, we propose a noise robust bottleneck feature representation which is generated by an adversarial network (AN). The AN includes two cascade-connected networks, an encoding network (EN) and a discriminative network (DN). Mel-frequency cepstral coefficients (MFCCs) of clean and noisy speech are used as input to the EN, and the output of the EN is used as the noise robust feature. The EN and DN are trained in turn: when training the DN, noise types are used as the training labels, and when training the EN, all labels are set to the same clean-speech label, which aims to make the AN features invariant to noise and thus achieve noise robustness. We evaluate the performance of the proposed feature on a Gaussian Mixture Model-Universal Background Model based speaker verification system, and compare it to MFCC features of speech enhanced by short-time spectral amplitude minimum mean square error (STSA-MMSE) and deep neural network-based speech enhancement (DNN-SE) methods. Experimental results on the RSR2015 database show that the proposed AN bottleneck feature (AN-BN) dramatically outperforms the STSA-MMSE and DNN-SE based MFCCs for different noise types and signal-to-noise ratios. Furthermore, the AN-BN feature is able to improve the speaker verification performance under the clean condition.
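The key adversarial trick in the alternating scheme above is entirely in how labels are assigned per phase: the DN sees true noise-type labels, while the EN is trained as if every utterance were clean, pushing its output toward noise invariance. This is a minimal sketch of that label scheme only (no networks); the label strings are illustrative.

```python
# Label assignment for the two alternating training phases described above.
CLEAN = "clean"

def training_labels(noise_types, phase):
    if phase == "DN":                  # discriminator learns to tell noise types apart
        return list(noise_types)
    if phase == "EN":                  # encoder is trained as if everything were clean
        return [CLEAN] * len(noise_types)
    raise ValueError(f"unknown phase: {phase}")

batch = ["clean", "babble", "car", "white"]
print(training_labels(batch, "DN"))    # → ['clean', 'babble', 'car', 'white']
print(training_labels(batch, "EN"))    # → ['clean', 'clean', 'clean', 'clean']
```

In the full system, each phase would backpropagate a classification loss against these labels through the DN or EN respectively, with the other network frozen.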