805 research outputs found

    Translating the EAH Data Compression Algorithm into Automata Theory

    Adaptive codes have been introduced in [Dragos Trinca, cs.DS/0505007] as a new class of non-standard variable-length codes. These codes associate variable-length codewords to the symbols being encoded, depending on the previous symbols in the input data string. A new data compression algorithm, called EAH, has been introduced in [Dragos Trinca, cs.DS/0505061], where we have shown experimentally that for a large class of input data strings, this algorithm substantially outperforms the well-known Lempel-Ziv universal data compression algorithm. In this paper, we translate the EAH encoder into automata theory.
    Comment: 9 pages

    Special Cases of Encodings by Generalized Adaptive Codes

    Adaptive (variable-length) codes associate variable-length codewords to the symbols being encoded, depending on the previous symbols in the input data string. This class of codes has been presented in [Dragos Trinca, cs.DS/0505007] as a new class of non-standard variable-length codes. Generalized adaptive codes (GA codes, for short) have also been presented in [Dragos Trinca, cs.DS/0505007], not only as a new class of non-standard variable-length codes, but also as a natural generalization of adaptive codes of any order. This paper continues the development of the theory of variable-length codes by establishing several interesting connections between adaptive codes and other classes of codes. The connections are discussed not only from a theoretical point of view (by proving new results), but also from a practical one (by proposing several applications). First, we prove that adaptive Huffman encodings and Lempel-Ziv encodings are particular cases of encodings by GA codes. Second, we show that any (n,1,m) convolutional code satisfying certain conditions can be modelled as an adaptive code of order m. Third, we describe a cryptographic scheme based on the connection between adaptive codes and convolutional codes, and present an insightful analysis of this scheme. Finally, we conclude by generalizing adaptive codes to (p,q)-adaptive codes, and discussing connections between adaptive codes and time-varying codes.
    Comment: 17 pages

    Modelling the Eulerian Path Problem using a String Matching Framework

    The well-known Eulerian path problem can be solved in polynomial time (more exactly, there exists a linear-time algorithm for this problem). In this paper, we model the problem using a string matching framework, and then initiate an algorithmic study of a variant of this problem, called the (2,1)-STRING-MATCH problem (which is actually a generalization of the Eulerian path problem). Then, we present a polynomial-time algorithm for the (2,1)-STRING-MATCH problem, which is the most important result of this paper. Specifically, we get a lower bound of Omega(n) and an upper bound of O(n^{2}).
    Comment: 10 pages
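The linear-time algorithm alluded to in the abstract is classically Hierholzer's. As a point of reference, here is a minimal sketch for directed graphs; the function name and edge-list representation are illustrative, not taken from the paper:

```python
from collections import defaultdict

def eulerian_path(edges):
    # Hierholzer's linear-time algorithm for a directed Eulerian path.
    # `edges` is a list of (u, v) pairs; assumes a path exists.
    graph = defaultdict(list)
    out_deg = defaultdict(int)
    in_deg = defaultdict(int)
    for u, v in edges:
        graph[u].append(v)
        out_deg[u] += 1
        in_deg[v] += 1
    # Start at a vertex whose out-degree exceeds its in-degree, if any;
    # otherwise the graph has an Eulerian circuit and any vertex works.
    start = edges[0][0]
    for v in graph:
        if out_deg[v] - in_deg[v] == 1:
            start = v
    stack, path = [start], []
    while stack:
        v = stack[-1]
        if graph[v]:
            stack.append(graph[v].pop())
        else:
            path.append(stack.pop())
    return path[::-1]  # vertices of the path, in traversal order
```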

    Adaptive Codes: A New Class of Non-standard Variable-length Codes

    We introduce a new class of non-standard variable-length codes, called adaptive codes. This class of codes associates a variable-length codeword to the symbol being encoded depending on the previous symbols in the input data string. An efficient algorithm for constructing adaptive codes of order one is presented. Then, we introduce a natural generalization of adaptive codes, called GA codes.
    Comment: 10 pages
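To illustrate the defining property of an adaptive code of order one (the codeword chosen for a symbol depends on the single preceding symbol), here is a toy sketch; the codeword table is invented for illustration and is not the construction from the paper:

```python
def adaptive_encode(s: str, table: dict) -> str:
    # Encode each symbol using a codeword selected by the
    # (previous symbol, current symbol) context pair.
    prev = None  # the first symbol has no preceding context
    bits = []
    for ch in s:
        bits.append(table[(prev, ch)])
        prev = ch
    return "".join(bits)

# Hypothetical order-one table over the alphabet {a, b}.
toy_table = {
    (None, "a"): "0", (None, "b"): "1",
    ("a", "a"): "0", ("a", "b"): "10",
    ("b", "a"): "11", ("b", "b"): "0",
}
```

Note that the same symbol ("a", say) receives different codewords under different contexts, which is exactly what distinguishes adaptive codes from ordinary variable-length codes.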

    High-performance BWT-based Encoders

    In 1994, Burrows and Wheeler developed a data compression algorithm which performs significantly better than Lempel-Ziv based algorithms. Since then, much work has been done to improve their algorithm, which is based on a reversible transformation of the input string, called BWT (the Burrows-Wheeler transform). In this paper, we propose a compression scheme based on BWT, MTF (move-to-front coding), and a version of the algorithms presented in [Dragos Trinca, ITCC-2004].
    Comment: 12 pages
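For reference, the two transforms named in the abstract can be sketched naively as follows: a rotation-sort BWT (quadratic, with a sentinel byte assumed absent from the input) followed by move-to-front coding over the byte alphabet. Production encoders use suffix-array-based BWT instead; this sketch only shows the transforms' behavior:

```python
def bwt(s: str) -> str:
    # Sort all rotations of s (with a sentinel) and keep the last column.
    s += "\0"  # sentinel marks the original rotation for inversion
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def inverse_bwt(last: str) -> str:
    # Repeatedly sort-and-prepend until the rotation table is rebuilt.
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    row = next(r for r in table if r.endswith("\0"))
    return row[:-1]

def mtf_encode(s: str):
    # Emit each symbol's current position, then move it to the front;
    # runs of equal symbols (which BWT tends to produce) become zeros.
    alphabet = list(map(chr, range(256)))
    out = []
    for ch in s:
        idx = alphabet.index(ch)
        out.append(idx)
        alphabet.insert(0, alphabet.pop(idx))
    return out
```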

    EAH: A New Encoder based on Adaptive Variable-length Codes

    Adaptive variable-length codes associate a variable-length codeword to the symbol being encoded, depending on the previous symbols in the input string. This class of codes has recently been presented in [Dragos Trinca, arXiv:cs.DS/0505007] as a new class of non-standard variable-length codes. New algorithms for data compression, based on adaptive variable-length codes of order one and Huffman's algorithm, have recently been presented in [Dragos Trinca, ITCC 2004]. In this paper, we extend the work done so far with the following contributions: first, we propose an improved generalization of these algorithms, called EAHn. Second, we compute the entropy bounds for EAHn, using the well-known bounds for Huffman's algorithm. Third, we discuss implementation details and report experimental results obtained on some well-known corpora. Finally, we describe a parallel version of EAHn using the PRAM model of computation.
    Comment: 16 pages
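The entropy bounds mentioned above rest on Huffman's algorithm, whose expected codeword length L satisfies the textbook bounds H(X) <= L < H(X) + 1. Below is a standard heap-based Huffman construction, not the EAHn encoder itself, as a self-contained reference:

```python
import heapq

def huffman_code(freqs: dict) -> dict:
    # Build a prefix-free Huffman code from a symbol -> frequency map.
    # Each heap entry carries a partial codebook for its subtree;
    # the integer tiebreaker keeps heap comparisons well-defined.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]
```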

    Modelling the EAH Data Compression Algorithm using Graph Theory

    Adaptive codes associate variable-length codewords to the symbols being encoded, depending on the previous symbols in the input data string. This class of codes has been introduced in [Dragos Trinca, cs.DS/0505007] as a new class of non-standard variable-length codes. New algorithms for data compression, based on adaptive codes of order one, have been presented in [Dragos Trinca, ITCC-2004], where we have shown experimentally that for a large class of input data strings, these algorithms substantially outperform the Lempel-Ziv universal data compression algorithm. EAH has been introduced in [Dragos Trinca, cs.DS/0505061] as an improved generalization of these algorithms. In this paper, we present a translation of the EAH algorithm into graph theory.
    Comment: 10 pages

    Randomized Iterative Reconstruction for Sparse View X-ray Computed Tomography

    With the availability of more powerful computers, iterative reconstruction algorithms are the subject of ongoing work on the design of more efficient reconstruction algorithms for X-ray computed tomography. In this work, we show how two analytical reconstruction algorithms can be improved by correcting the corresponding reconstructions using a randomized iterative reconstruction algorithm. The analytical reconstruction followed by randomized iterative reconstruction can also be viewed as a combined reconstruction algorithm which, in the experiments we have conducted, uses up to 35% fewer projection angles than the analytical reconstruction algorithms while producing the same results in terms of reconstruction quality, without significantly increasing the execution time.
    Comment: 23 pages
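The abstract does not specify the randomized iterative algorithm used. A representative member of this family is randomized Kaczmarz applied to the linear model Ax = b (A the projection matrix, b the measured sinogram); the sketch below uses those assumed names and is not the paper's method:

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=1000, seed=0):
    # Solve the consistent system Ax = b by projecting the current
    # iterate onto one randomly chosen row's hyperplane per step.
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    # Sample rows with probability proportional to their squared norm,
    # as in the Strohmer-Vershynin randomized Kaczmarz scheme.
    norms = np.sum(A * A, axis=1)
    probs = norms / norms.sum()
    for _ in range(iters):
        i = rng.choice(A.shape[0], p=probs)
        x += (b[i] - A[i] @ x) / norms[i] * A[i]
    return x
```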

    IRXCT: Iterative Reconstruction and visualization application for X-ray Computed Tomography

    This report describes IRXCT, a Windows application for the iterative reconstruction and visualization of X-ray computed tomography data.
    Comment: 12 pages

    Comparison of Sinogram-based Iterative Reconstruction with Compressed Sensing Techniques in X-ray CT

    Performing X-ray computed tomography (CT) examinations with less radiation has recently received increasing interest: in medical imaging, this means less (potentially harmful) radiation for the patient; in non-destructive testing of materials/objects, such as testing jet engines, reducing the number of projection angles (which for large objects is in general high) substantially decreases the experiment time. In practice, less radiation is usually achieved by either (1) reducing the radiation dose used at each projection angle or (2) using sparse view X-ray CT, in which significantly fewer projection angles are used during the examination. In this work, we study the performance of the recently proposed sinogram-based iterative reconstruction algorithm in sparse view X-ray CT and show that, in some cases, it provides better reconstruction accuracy than some of the Total Variation regularization techniques, with computation times comparable to those techniques. An important feature of the sinogram-based iterative reconstruction algorithm is that it has no parameters to be set.
    Comment: 18 pages