    Cumulant Generating Function of Codeword Lengths in Variable-Length Lossy Compression Allowing Positive Excess Distortion Probability

    This paper considers the problem of variable-length lossy source coding. The performance criteria are the excess distortion probability and the cumulant generating function of codeword lengths. We derive a non-asymptotic fundamental limit of the cumulant generating function of codeword lengths allowing positive excess distortion probability. It is shown that the achievability and converse bounds are characterized by a Rényi entropy-based quantity. In the proof of the achievability result, an explicit code construction is provided. Further, we investigate an asymptotic single-letter characterization of the fundamental limit for a stationary memoryless source.
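    For orientation, the two criteria named above have standard textbook forms (a sketch in assumed notation, which may differ from the paper's exact definitions and normalization): for a source X and a code with length function \ell, the cumulant generating function of codeword lengths and the Rényi entropy of order α can be written as

        % Cumulant generating function of the codeword length \ell(X) at t > 0:
        \Lambda(t) = \log \mathbb{E}\!\left[ e^{t\,\ell(X)} \right]
        % R\'enyi entropy of order \alpha (\alpha > 0, \alpha \neq 1) of X ~ P_X:
        H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_{x} P_X(x)^{\alpha}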

    Complexity and second moment of the mathematical theory of communication

    The performance of an error correcting code is evaluated by its block error probability, code rate, and encoding and decoding complexity. The performance of a series of codes is evaluated by, as the block lengths approach infinity, whether their block error probabilities decay to zero, whether their code rates converge to channel capacity, and whether their growth in complexities stays under control. Over any discrete memoryless channel, I build codes such that: for one, their block error probabilities and code rates scale like random codes'; and for two, their encoding and decoding complexities scale like polar codes'. Quantitatively, for any constants π, ρ > 0 such that π + 2ρ < 1, I construct a series of error correcting codes with block length N approaching infinity, block error probability exp(−N^π), code rate N^(−ρ) less than the channel capacity, and encoding and decoding complexity O(N log N) per code block. Over any discrete memoryless channel, I also build codes such that: for one, they achieve channel capacity rapidly; and for two, their encoding and decoding complexities outperform all known codes over non-BEC channels. Quantitatively, for any constants τ, ρ > 0 such that 2ρ < 1, I construct a series of error correcting codes with block length N approaching infinity, block error probability exp(−(log N)^τ), code rate N^(−ρ) less than the channel capacity, and encoding and decoding complexity O(N log(log N)) per code block. The two aforementioned results are built upon two pillars: a versatile framework that generates codes on the basis of channel polarization, and a calculus-probability machinery that evaluates the performances of codes. The framework that generates codes and the machinery that evaluates codes can be extended to many other scenarios in network information theory. To name a few: lossless compression with side information, lossy compression, the Slepian-Wolf problem, the Wyner-Ziv problem, the multiple access channel, the wiretap channel of type I, and the broadcast channel. In each scenario, the adapted notions of block error probability and code rate approach their limits at the same paces as specified above.
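    The O(N log N) encoding complexity cited above reflects the butterfly recursion of channel polarization. A minimal sketch of the underlying transform (the standard Arıkan kernel over GF(2); a simplification for illustration, not the thesis's actual construction, and the bit-ordering convention here is one of several in use):

        def polar_transform(u):
            """Apply x = u * (F kron ... kron F), the n-fold Kronecker power
            of the Arikan kernel F = [[1, 0], [1, 1]], over GF(2).

            Recursion: T(N) = 2 T(N/2) + O(N), hence O(N log N) for N = 2^n.
            """
            N = len(u)
            if N == 1:
                return list(u)
            # Butterfly step: XOR adjacent pairs, then recurse on the two halves.
            top = polar_transform([u[2 * i] ^ u[2 * i + 1] for i in range(N // 2)])
            bot = polar_transform([u[2 * i + 1] for i in range(N // 2)])
            return top + bot

        # Example with block length N = 4 (all arithmetic is mod 2):
        print(polar_transform([1, 0, 1, 1]))  # -> [1, 0, 1, 1] under this ordering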

    Journal of Telecommunications and Information Technology, 2001, no. 3

    Quarterly journal.

    Entropy in Image Analysis II

    Image analysis is a fundamental task for any application where extracting information from images is required. The analysis requires highly sophisticated numerical and analytical methods, particularly for applications in medicine, security, and other fields where the results of the processing consist of data of vital importance. This is evident from the articles composing the Special Issue "Entropy in Image Analysis II", in which the authors used widely tested methods to verify their results. In reading the present volume, the reader will appreciate the richness of the methods and applications, in particular for medical imaging and image security, and a remarkable cross-fertilization among the proposed research areas.
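    As a concrete instance of the entropy measures such articles build on, a minimal sketch of the Shannon entropy of an image's intensity histogram (the helper name image_entropy is hypothetical, and an 8-bit grayscale image stored as a NumPy array is assumed):

        import numpy as np

        def image_entropy(img: np.ndarray) -> float:
            """Shannon entropy (bits/pixel) of an 8-bit grayscale image's histogram."""
            hist = np.bincount(img.ravel(), minlength=256).astype(float)
            p = hist / hist.sum()      # empirical pixel-intensity distribution
            p = p[p > 0]               # drop empty bins so log2 is defined
            return float(-(p * np.log2(p)).sum())

        # A uniform-noise image sits near the 8-bit maximum of 8 bits/pixel.
        rng = np.random.default_rng(0)
        noise = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
        print(image_entropy(noise))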

    Information Theory and Machine Learning

    The recent successes of machine learning, especially regarding systems based on deep neural networks, have encouraged further research activities and raised a new set of challenges in understanding and designing complex machine learning algorithms. New applications require learning algorithms to be distributed, have transferable learning results, use computation resources efficiently, converge quickly in online settings, have performance guarantees, satisfy fairness or privacy constraints, incorporate domain knowledge on model structures, etc. A new wave of developments in statistical learning theory and information theory has set out to address these challenges. This Special Issue, "Machine Learning and Information Theory", aims to collect recent results in this direction, reflecting a diverse spectrum of visions and efforts to extend conventional theories and develop analysis tools for these complex machine learning systems.

    STK /WST 795 Research Reports

    These documents contain the honours research reports for each year for the Department of Statistics. Honours Research Reports - University of Pretoria, 20XX. Statistics. BSc (Hons) Mathematical Statistics, BCom (Hons) Statistics, BCom (Hons) Mathematical Statistics. Unrestricted.