    Quantifying hidden order out of equilibrium

    While the equilibrium properties, states, and phase transitions of interacting systems are well described by statistical mechanics, the lack of suitable state parameters has hindered the understanding of non-equilibrium phenomena in diverse settings, from glasses to driven systems to biology. The length of a losslessly compressed data file is a direct measure of its information content: the more ordered the data, the lower its information content and the shorter its encoding can be made. Here, we describe how data compression enables the quantification of order in non-equilibrium and equilibrium many-body systems, both discrete and continuous, even when the underlying form of order is unknown. We consider absorbing-state models on and off lattice, as well as a system of active Brownian particles undergoing motility-induced phase separation. The technique reliably identifies non-equilibrium phase transitions, determines their character, quantitatively predicts certain critical exponents without prior knowledge of the order parameters, and reveals previously unknown ordering phenomena. This technique should provide a quantitative measure of organization in condensed matter and other systems exhibiting collective phase transitions in and out of equilibrium.
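    The core idea lends itself to a very small experiment. The sketch below is a minimal illustration, assuming a binary lattice stored as a NumPy array and the general-purpose zlib compressor; the abstract does not specify which compressor or serialization the authors actually use.

```python
# Minimal sketch: order estimated from compressibility.
# Assumptions (not from the paper): binary lattice, bit-packing, zlib.
import zlib
import numpy as np

def compressed_fraction(config: np.ndarray, level: int = 9) -> float:
    """Ratio of compressed to raw length of a bit-packed configuration.

    Values near 1 mean the compressor found no structure (disorder);
    values well below 1 mean the configuration is ordered.
    """
    raw = np.packbits(config.astype(np.uint8)).tobytes()
    return len(zlib.compress(raw, level)) / len(raw)

rng = np.random.default_rng(0)
ordered = np.zeros((256, 256), dtype=np.uint8)                    # ordered phase
disordered = rng.integers(0, 2, size=(256, 256), dtype=np.uint8)  # random phase
print(compressed_fraction(ordered))     # close to 0
print(compressed_fraction(disordered))  # close to 1
```

    Tracking this ratio as a control parameter is varied is the kind of signal the technique uses to locate phase transitions.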

    Operator/System Communication: An Optimizing Decision Tool

    In this paper we address the problem of operator/system communication. In particular, we discuss the issue of efficient and adaptive transmission mechanisms over the available physical links. We develop a tool for making decisions regarding the flow of control sequences and data to and from the operator. The issue of compression is discussed in detail, and a decision box and an optimizing tool for finding the appropriate decision thresholds are developed. Physical parameters such as the data rate, the bandwidth of the communication medium, the distance between the operator and the system, the baud rate, the levels of discretization, the signal-to-noise ratio, and the propagation speed of the signal are taken into consideration in developing our decision system. Theoretical analysis is performed to develop mathematical models for the optimization algorithm, and simulation models are developed for testing both the optimization and the decision tool box. http://dx.doi.org/10.1117/12.2549

    A Comparison of Compressed and Uncompressed Transmission Modes

    In this paper we address the problem of host-to-host communication. In particular, we discuss the issue of efficient and adaptive transmission mechanisms over the available physical links. We develop a tool for making decisions regarding the flow of control sequences and data to and from a host. The issue of compression is discussed in detail, and a decision box and an optimizing tool for finding the appropriate decision thresholds are developed. Physical parameters such as the data rate, the bandwidth of the communication medium, the distance between the hosts, the baud rate, the levels of discretization, the signal-to-noise ratio, and the propagation speed of the signal are taken into consideration in developing our decision system. Theoretical analysis is performed to develop mathematical models for the optimization algorithm, and simulation models are developed for testing both the optimization and the decision tool box.
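    The decision at the heart of these two papers can be illustrated with a minimal latency model. The sketch below is an assumption-laden simplification (the papers' actual optimization models are not reproduced in the abstracts): compress only when the transmission time saved exceeds the codec overhead.

```python
# Hedged sketch of a compress-vs-send decision rule.
# All parameter names are assumptions, not taken from the papers.

def end_to_end_latency(size_bits: float, bandwidth_bps: float,
                       distance_m: float, prop_speed_mps: float) -> float:
    """Transmission delay plus propagation delay for one message."""
    return size_bits / bandwidth_bps + distance_m / prop_speed_mps

def should_compress(size_bits: float, ratio: float, codec_overhead_s: float,
                    bandwidth_bps: float, distance_m: float,
                    prop_speed_mps: float) -> bool:
    """Compress only if total latency with compression is lower."""
    t_plain = end_to_end_latency(size_bits, bandwidth_bps,
                                 distance_m, prop_speed_mps)
    t_comp = codec_overhead_s + end_to_end_latency(size_bits * ratio,
                                                   bandwidth_bps,
                                                   distance_m, prop_speed_mps)
    return t_comp < t_plain

# Example: 1 MB over a 56 kbit/s link, 2:1 compression, 0.5 s codec cost.
print(should_compress(8e6, 0.5, 0.5, 56e3, 1e6, 2e8))  # True: saves ~71 s
```

    On fast links or with slow codecs the inequality flips, which is exactly why a threshold-finding tool of the kind the papers describe is needed.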

    Opportunistic Source Coding for Data Gathering in Wireless Sensor Networks

    We propose a joint opportunistic source coding and opportunistic routing (OSCOR) protocol for correlated data gathering in wireless sensor networks. OSCOR improves data-gathering efficiency by exploiting opportunistic data compression and the cooperative diversity associated with the wireless broadcast advantage. The design of OSCOR involves several challenging issues across different network protocol layers. At the MAC layer, sensor nodes need to coordinate wireless transmission and packet forwarding to exploit multiuser diversity in packet reception. At the network layer, in order to achieve high diversity and compression gains, routing must be based on a metric that depends not only on link quality but also on compression opportunities. At the application layer, sensor nodes need a distributed source coding algorithm that has low coordination overhead and does not require the source distributions to be known. OSCOR provides practical solutions to these challenges, incorporating a slightly modified 802.11 MAC, a distributed source coding scheme based on network coding and Lempel-Ziv coding, and a node-compression-ratio-dependent metric combined with a modified Dijkstra's algorithm for path selection. We evaluate the performance of OSCOR through simulations and show that it can potentially reduce power consumption by over 30% compared with an existing greedy scheme, routing-driven compression, in a 4×4 grid network.
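    To make the routing idea concrete, here is a minimal sketch of compression-aware path selection. OSCOR's actual metric is not given in the abstract, so the cost model below, link cost scaled by the forwarding node's expected compression ratio, is an illustrative assumption:

```python
# Sketch of compression-aware Dijkstra path selection (illustrative cost
# model; not OSCOR's actual metric, which the abstract does not specify).
import heapq

def cheapest_path(links, compression_ratio, src, sink):
    """links: node -> [(neighbor, link_cost)]; cost of an edge is the
    link cost scaled by the fraction of data left after compression at
    the forwarding node, so good compressors attract traffic."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == sink:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, link_cost in links.get(u, []):
            nd = d + link_cost * compression_ratio.get(v, 1.0)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], sink
    while node != src:          # assumes sink is reachable from src
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

# Node "a" compresses data to 50%, so the route prefers it over "b".
links = {"s": [("a", 1.0), ("b", 1.0)], "a": [("t", 1.0)], "b": [("t", 1.0)]}
ratios = {"a": 0.5, "b": 1.0, "t": 1.0}
print(cheapest_path(links, ratios, "s", "t"))  # ['s', 'a', 't']
```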

    On Prediction Using Variable Order Markov Models

    This paper is concerned with algorithms for the prediction of discrete sequences over a finite alphabet using variable order Markov models. The class of such algorithms is large and in principle includes any lossless compression algorithm. We focus on six prominent prediction algorithms, including Context Tree Weighting (CTW), Prediction by Partial Match (PPM), and Probabilistic Suffix Trees (PSTs). We discuss the properties of these algorithms and compare their performance using real-life sequences from three domains: proteins, English text, and music pieces. The comparison is made with respect to prediction quality as measured by the average log-loss. We also compare classification algorithms based on these predictors on a number of large protein classification tasks. Our results indicate that a "decomposed" CTW (a variant of the CTW algorithm) and PPM outperform all other algorithms in sequence prediction tasks. Somewhat surprisingly, a different algorithm, a modification of the Lempel-Ziv compression algorithm, significantly outperforms all the others on the protein classification problems.
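    As a toy illustration of the model class, the sketch below implements a far simpler variable-order predictor than CTW or PPM (back-off from the longest matching context with add-one smoothing is our simplification, not one of the paper's six algorithms) and evaluates it with the paper's metric, average log-loss in bits per symbol:

```python
# Toy variable-order Markov predictor with context back-off,
# evaluated by average log-loss (the paper's comparison metric).
import math
from collections import defaultdict

class VOMPredictor:
    def __init__(self, alphabet, max_order=3):
        self.alphabet = list(alphabet)
        self.max_order = max_order
        # counts[context][symbol] = occurrences of symbol after context
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, seq):
        for i, sym in enumerate(seq):
            for k in range(self.max_order + 1):
                if i - k >= 0:
                    self.counts[seq[i - k:i]][sym] += 1

    def prob(self, context, sym):
        # Back off from the longest seen context to shorter ones,
        # with add-one smoothing over the alphabet.
        for k in range(min(self.max_order, len(context)), -1, -1):
            c = self.counts.get(context[len(context) - k:])
            if c:
                total = sum(c.values())
                return (c[sym] + 1) / (total + len(self.alphabet))
        return 1.0 / len(self.alphabet)

    def avg_log_loss(self, seq):
        losses = [-math.log2(self.prob(seq[:i], s)) for i, s in enumerate(seq)]
        return sum(losses) / len(losses)

m = VOMPredictor("ab", max_order=2)
m.train("abababababab")
print(m.avg_log_loss("ababab"))  # low: the model has learned the pattern
```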

    Optimum Implementation of Compound Compression of a Computer Screen for Real-Time Transmission in Low Network Bandwidth Environments

    Remote working has become increasingly prevalent, and a large part of it involves sharing computer screens between servers and clients. The image content presented when sharing a computer screen consists of both natural camera-captured image data and computer-generated graphics and text. The attributes of natural camera-captured image data differ greatly from those of computer-generated image data; an image containing a mixture of the two is known as a compound image. The research presented in this thesis focuses on the challenge of constructing a compound compression strategy that applies the 'best fit' compression algorithm to the mixed content found in a compound image. The research also involves analysing and classifying the types of data a given compound image may contain. When researching optimal types of compression, consideration is given to the computational overhead of each algorithm, because the research targets real-time systems such as cloud computing services, where latency has a detrimental impact on the end-user experience. Previous and current state-of-the-art video codecs have been studied, along with many recent academic publications, to design and implement a novel low-complexity compound compression algorithm suitable for real-time transmission. The compound compression algorithm utilises a mixture of lossless and lossy compression algorithms, with parameters that can be used to control its performance. An objective image quality assessment is needed to determine whether the proposed algorithm can produce an acceptable-quality image after processing. A traditional metric, Peak Signal-to-Noise Ratio, is used alongside a more modern approach designed for compound images, the Structural Similarity Index, to quantify the quality of the decompressed image. Finally, the compression strategy is tested on a set of generated compound images. Using open-source software, the same images are compressed with previous and current state-of-the-art video codecs to compare the three main metrics: compression ratio, computational complexity, and objective image quality.
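    The 'best fit' idea can be sketched in a few lines. The thesis's actual classifier is not described in the abstract, so the block size and the distinct-colour threshold below are assumptions: blocks with few colours are treated as computer-generated content (candidates for a lossless coder), the rest as natural content (candidates for a lossy coder). A plain PSNR helper of the kind used in the quality assessment is included.

```python
# Hypothetical block classifier for compound images; the 16-pixel block
# size and 32-colour threshold are assumptions, not the thesis's values.
# Assumes an 8-bit image of shape (height, width, channels).
import numpy as np

def classify_blocks(image: np.ndarray, block: int = 16, max_colors: int = 32):
    """Label each block 'synthetic' (few distinct colours: text/graphics)
    or 'natural' (many distinct colours: photographic content)."""
    h, w = image.shape[:2]
    labels = {}
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = image[y:y + block, x:x + block].reshape(-1, image.shape[-1])
            n_colors = len(np.unique(tile, axis=0))
            labels[(y, x)] = "synthetic" if n_colors <= max_colors else "natural"
    return labels

def psnr(ref: np.ndarray, test: np.ndarray) -> float:
    """Peak Signal-to-Noise Ratio in dB for 8-bit images."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(1)
photo = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)  # many colours
ui = np.zeros((32, 32, 3), dtype=np.uint8)                      # flat colour
print(classify_blocks(photo)[(0, 0)], classify_blocks(ui)[(0, 0)])
# -> natural synthetic
```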

    Automated Visual Basic Application for Zipping and Backup

    This study focuses on a Visual Basic zipping and backup application intended to make everyday development work more convenient. The approach can be applied across different working platforms and environments to build more secure projects. The main goal is to design an application, VBPZip, that interfaces with Visual Basic project files on Microsoft Visual Basic 6.0 and Microsoft Access 2002, providing data compression and backup functionality. A dictionary-based data compression technique, the LZW method, is used to zip the project files before backup, and the application acts as a link between the user and the server hosting the overall application. The development of VBPZip involves several stages; the methodology defines four phases: analysis, design, coding, and testing. The core of the project is the data compression and backup functionality in the VBPZip source code, the database development, and the interface design. The data structure of the application was obtained through research and detailed assessment, and the back end of the application concerns the source code and its interface. Tools were required throughout the process to achieve good results, and the outcome is evaluated against the objectives set. The application comprises several forms that act as the interface between the user and the database, covering VBZipping, file type options, auto zipping, date and time settings, and related project file data. All of them share the same purpose: to simplify the Visual Basic platform as applied in most development areas. Suggested work for further enhancement and realization is also stated.
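    The abstract names LZW, a dictionary coder, as the compression method. The application itself is written in Visual Basic 6; the textbook LZW encoder below is a Python rendering for illustration only and makes no claim about the application's actual source code.

```python
# Textbook LZW encoder of the kind the abstract describes.
# (Python rendering for illustration; the application is VB6.)
def lzw_compress(data: bytes) -> list[int]:
    """Emit a list of dictionary codes for the input byte string."""
    dictionary = {bytes([i]): i for i in range(256)}  # seed: all single bytes
    next_code = 256
    phrase = b""
    codes = []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in dictionary:
            phrase = candidate              # extend the current phrase
        else:
            codes.append(dictionary[phrase])
            dictionary[candidate] = next_code  # learn the new phrase
            next_code += 1
            phrase = bytes([byte])
    if phrase:
        codes.append(dictionary[phrase])
    return codes

print(lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT"))
# Repeated phrases come out as single codes >= 256.
```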