3,399 research outputs found

    Maximal codeword lengths in Huffman codes

    The following question about Huffman coding, which is an important technique for compressing data from a discrete source, is considered. If p is the smallest source probability, how long, in terms of p, can the longest Huffman codeword be? It is shown that if p is in the range 0 < p ≤ 1/2, and if K is the unique index such that 1/F_{K+3} < p ≤ 1/F_{K+2}, where F_K denotes the Kth Fibonacci number, then the longest Huffman codeword for a source whose least probability is p is at most K, and no better bound is possible. Asymptotically, this implies the surprising fact that for small values of p, a Huffman code's longest codeword can be as much as 44 percent larger than that of the corresponding Shannon code.
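
    A small sketch of the bound above (the function and its name are illustrative, not from the paper): given p, it grows the Fibonacci sequence until it brackets 1/p and returns the unique K with 1/F_{K+3} < p ≤ 1/F_{K+2}.

        def max_huffman_codeword_length(p: float) -> int:
            """Return K such that 1/F(K+3) < p <= 1/F(K+2), for 0 < p <= 1/2."""
            assert 0.0 < p <= 0.5
            fib = [1, 1]                      # fib[i] = F(i+1): F(1)=1, F(2)=1, ...
            while 1.0 / fib[-1] >= p:         # grow until F exceeds 1/p
                fib.append(fib[-1] + fib[-2])
            for K in range(1, len(fib) - 2):
                if 1.0 / fib[K + 2] < p <= 1.0 / fib[K + 1]:
                    return K
            raise ValueError("p out of expected range")

        # Example: p = 0.04 gives K = 6, since 1/F(9) = 1/34 < 0.04 <= 1/F(8) = 1/21.
        print(max_huffman_codeword_length(0.04))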

    Real-time transmission of digital video using variable-length coding

    Huffman coding is a variable-length lossless compression technique where data with a high probability of occurrence is represented with short codewords, while 'not-so-likely' data is assigned longer codewords. Compression is achieved when the high-probability levels occur so frequently that their benefit outweighs any penalty paid when a less likely input occurs. One instance where Huffman coding is extremely effective occurs when data is highly predictable and differential coding can be applied (as with a digital video signal). For that reason, it is desirable to apply this compression technique to digital video transmission; however, special care must be taken in order to implement a communication protocol utilizing Huffman coding. This paper addresses several of the issues relating to the real-time transmission of Huffman-coded digital video over a constant-rate serial channel. Topics discussed include data rate conversion (from a variable to a fixed rate), efficient data buffering, channel coding, recovery from communication errors, decoder synchronization, and decoder architectures. A description of the hardware developed to execute Huffman coding and serial transmission is also included. Although this paper focuses on matters relating to Huffman-coded digital video, the techniques discussed can easily be generalized for a variety of applications which require transmission of variable-length data.
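
    As a concrete illustration of the variable-length assignment described above (a minimal software sketch, not the paper's hardware design), the classic greedy construction repeatedly merges the two least-probable subtrees:

        import heapq
        from collections import Counter

        def huffman_code(freqs):
            """Build a Huffman code table {symbol: bitstring} from frequencies."""
            heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
            heapq.heapify(heap)
            tie = len(heap)                       # tiebreaker so equal weights compare
            while len(heap) > 1:
                w1, _, t1 = heapq.heappop(heap)   # two least-probable subtrees
                w2, _, t2 = heapq.heappop(heap)
                merged = {s: "0" + c for s, c in t1.items()}
                merged.update({s: "1" + c for s, c in t2.items()})
                heapq.heappush(heap, (w1 + w2, tie, merged))
                tie += 1
            return heap[0][2]

        data = "aaaabbc"                          # 'a' is the most frequent symbol
        table = huffman_code(Counter(data))
        encoded = "".join(table[s] for s in data)
        print(table, encoded)                     # 'a' receives the shortest codeword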

    Binary tree untuk masalah Huffman dan masalah Yosephus (Binary Trees for the Huffman Problem and the Josephus Problem)

    A binary tree is a tree satisfying a specific condition: exactly one vertex has degree 2, while every other vertex has degree 1 or 3. This work discusses the use of binary trees to solve the Huffman problem and the Josephus problem.
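
    The thesis itself solves the second problem with binary trees; the classic O(n) recurrence below (included only to make the problem concrete, not the thesis's method) gives the Josephus survivor directly.

        def josephus(n: int, k: int) -> int:
            """0-indexed survivor when every k-th of n people in a circle is removed."""
            pos = 0                        # J(1) = 0
            for m in range(2, n + 1):
                pos = (pos + k) % m        # J(m) = (J(m-1) + k) mod m
            return pos

        print(josephus(7, 3))              # -> 3, i.e. the 4th of 7 people survives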

    Analisis Perbandingan Kinerja Algoritma Kompresi LZW, Huffman dan Deflate pada Berbagai Jenis File (Performance Comparison Analysis of LZW, Huffman, and Deflate Compression Algorithms on Various File Types)

    As the kinds of data processed on computers have grown, so have data sizes. Large files consume considerable storage space and take a long time to exchange with other users over a computer network, so data compression is needed to minimize the size of the data in use. This final project tests and compares the performance of three compression algorithms: Huffman, LZW (Lempel-Ziv-Welch), and Deflate. The three algorithms were implemented in two programs, a file compressor and a server-to-client image viewer, and evaluated on several groups of test cases; performance was measured as compression speed, decompression speed, and the compression ratio (original file size relative to compressed file size). The results show that Deflate produces the best average compression ratio at 401.2%, followed by LZW at 240.2% and Huffman at 146.9%. In compression speed, Huffman is fastest with an average of 6,560.3 Kbyte/s, followed by Deflate at 1,833.7 Kbyte/s and LZW at 483.6 Kbyte/s. In decompression speed, Deflate averages 17,653.5 Kbyte/s, followed by Huffman at 7,790.8 Kbyte/s and LZW at 622.4 Kbyte/s.
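
    A sketch of the kind of measurement the study performs, using Python's built-in zlib (a Deflate implementation); the LZW and Huffman codecs compared in the paper would be plugged into the same harness.

        import time
        import zlib

        def benchmark(data: bytes) -> None:
            t0 = time.perf_counter()
            compressed = zlib.compress(data, 6)
            t1 = time.perf_counter()
            restored = zlib.decompress(compressed)
            t2 = time.perf_counter()
            assert restored == data
            ratio = 100.0 * len(data) / len(compressed)  # original/compressed, in %
            kb = len(data) / 1024
            print(f"ratio {ratio:.1f}%  "
                  f"compress {kb / (t1 - t0):,.1f} Kbyte/s  "
                  f"decompress {kb / (t2 - t1):,.1f} Kbyte/s")

        with open(__file__, "rb") as f:       # any sample file serves as test data
            benchmark(f.read() * 100)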

    A NOVEL APPROACH FOR REDUCTION OF HUFFMAN COST TABLE IN IMAGE COMPRESSION

    Huffman codes are widely used as an efficient technique for image compression. To achieve a high compression ratio, the cost table must be reduced. This paper proposes a new approach that reduces the cost table of the traditional binary Huffman compression algorithm, presented as a minor modification to its coding stage; the traditional Huffman algorithm was studied and implemented for comparison. Compared with traditional Huffman coding, the proposed method yields the best results in the generation of cost tables: the space required to store the new binary Huffman table and the time required to transmit the image are both reduced significantly.
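
    The abstract does not spell out the proposed reduction, but one standard way to shrink a Huffman table (shown here purely as background, not as the paper's method) is canonical coding: only the code lengths are stored or transmitted, and the decoder rebuilds identical codewords from them.

        def canonical_codes(lengths):
            """Rebuild canonical Huffman codewords from code lengths alone."""
            out, code, prev_len = {}, 0, 0
            # Assign codes in (length, symbol) order; lengths must satisfy Kraft.
            for sym, ln in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
                code <<= (ln - prev_len)       # left-shift whenever lengths grow
                out[sym] = format(code, f"0{ln}b")
                code += 1
                prev_len = ln
            return out

        print(canonical_codes({"a": 1, "b": 2, "c": 3, "d": 3}))
        # -> {'a': '0', 'b': '10', 'c': '110', 'd': '111'}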

    Image Compression using an efficient hybrid algorithm

    This research paper proposes a method for compressing medical images with an efficient hybrid algorithm. The objective of this hybrid scheme (DWT, DCT, and Huffman quantization) is to measure the compression ratio, peak signal-to-noise ratio, and mean square error as the DWT level and the Huffman quantization factor are varied. The goal is to achieve a higher compression ratio by applying a different compression threshold to the wavelet coefficients of each DWT band and then DCT with a varying Huffman quantization factor, while preserving the quality of the reconstructed image. First, the DWT and DCT are applied to the individual RGB components; the result is then quantized using Huffman quantization, which computes a probability index for each unique quantity so that a unique binary code can be assigned to each unique symbol for encoding.
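
    The three figures of merit named above are standard; a small NumPy sketch (function and variable names here are illustrative, not from the paper):

        import numpy as np

        def mse(original, reconstructed):
            diff = original.astype(np.float64) - reconstructed.astype(np.float64)
            return float(np.mean(diff ** 2))

        def psnr(original, reconstructed, peak=255.0):
            e = mse(original, reconstructed)
            return float("inf") if e == 0 else 10.0 * np.log10(peak ** 2 / e)

        def compression_ratio(original_bytes, compressed_bytes):
            return original_bytes / compressed_bytes

        rng = np.random.default_rng(0)
        img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
        noisy = np.clip(img.astype(int) + rng.integers(-3, 4, img.shape), 0, 255)
        print(f"MSE {mse(img, noisy):.2f}, PSNR {psnr(img, noisy):.1f} dB")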

    Spartan Daily September 27, 2012

    Volume 139, Issue 16. https://scholarworks.sjsu.edu/spartandaily/1332/thumbnail.jp

    Real-time demonstration hardware for enhanced DPCM video compression algorithm

    The lack of available wideband digital links as well as the complexity of implementation of bandwidth efficient digital video CODECs (encoder/decoder) has worked to keep the cost of digital television transmission too high to compete with analog methods. Terrestrial and satellite video service providers, however, are now recognizing the potential gains that digital video compression offers and are proposing to incorporate compression systems to increase the number of available program channels. NASA is similarly recognizing the benefits of and trend toward digital video compression techniques for transmission of high quality video from space and therefore has developed a digital television bandwidth compression algorithm to process standard National Television Systems Committee (NTSC) composite color television signals. The algorithm is based on differential pulse code modulation (DPCM), but additionally utilizes a non-adaptive predictor, non-uniform quantizer and multilevel Huffman coder to reduce the data rate substantially below that achievable with straight DPCM. The non-adaptive predictor and multilevel Huffman coder combine to set this technique apart from other DPCM encoding algorithms. All processing is done on an intra-field basis to prevent motion degradation and minimize hardware complexity. Computer simulations have shown the algorithm will produce broadcast quality reconstructed video at an average transmission rate of 1.8 bits/pixel. Hardware implementation of the DPCM circuit, non-adaptive predictor and non-uniform quantizer has been completed, providing real-time demonstration of the image quality at full video rates. Video sampling/reconstruction circuits have also been constructed to accomplish the analog video processing necessary for the real-time demonstration. Performance results for the completed hardware compare favorably with simulation results. Hardware implementation of the multilevel Huffman encoder/decoder is currently under development along with implementation of a buffer control algorithm to accommodate the variable data rate output of the multilevel Huffman encoder. A video CODEC of this type could be used to compress NTSC color television signals where high quality reconstruction is desirable (e.g., Space Station video transmission, direct-to-the-home transmission via direct broadcast satellite systems, or cable television distribution to system headends and direct to the home).
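
    A minimal sketch of the DPCM core described above, with a previous-pixel predictor and a uniform quantizer standing in for the paper's non-uniform one (the Huffman stage and all hardware detail are omitted):

        def dpcm_encode(pixels, step=4):
            """Quantize prediction errors, tracking the decoder's reconstruction
            so encoder and decoder predictions never drift apart."""
            codes, prediction = [], 0
            for p in pixels:
                q = round((p - prediction) / step)    # quantized prediction error
                codes.append(q)
                prediction = max(0, min(255, prediction + q * step))
            return codes

        def dpcm_decode(codes, step=4):
            out, prediction = [], 0
            for q in codes:
                prediction = max(0, min(255, prediction + q * step))
                out.append(prediction)
            return out

        line = [100, 102, 105, 110, 140, 150, 150, 149]
        codes = dpcm_encode(line)                      # small errors cluster near 0,
        print(codes, dpcm_decode(codes))               # ideal for short Huffman codes
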
    • …