
    Lossy data compression with random gates

    We introduce a new protocol for a lossy data compression algorithm based on constraint satisfaction gates. We show that the theoretical capacity of algorithms built from standard parity-check gates converges exponentially fast to the Shannon bound as the number of variables seen by each gate increases. We then generalize this approach by introducing random gates. Their theoretical performance is nearly as good as that of parity checks, but they offer the great advantage that encoding can be done in linear time using the Survey Inspired Decimation algorithm, a powerful algorithm for constraint satisfaction problems derived from statistical physics.

    Anderson Localization in Euclidean Random Matrices

    We study the spectra and localization properties of Euclidean random matrices. The problem is approximately mapped onto that of a matrix defined on a random graph. We introduce a powerful method to find the density of states and the localization threshold. We solve numerically, with a population dynamics algorithm, an exact equation for the probability distribution function of the diagonal element of the resolvent matrix, and we show how this can be used to find the localization threshold. An application of the method in the context of the Instantaneous Normal Modes of a liquid system is given. Comment: 4 pages.
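    The population-dynamics approach described above can be sketched for the simpler case of a random regular graph with Gaussian on-site disorder; all parameters below are illustrative, not the Euclidean-random-matrix setting of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def population_dynamics(E, k=2, t=1.0, eps_std=0.5,
                        pop_size=5000, sweeps=200, eta=1e-3):
    """Iterate the cavity recursion for the diagonal resolvent element,
    G = 1 / (z - eps - t^2 * sum_{j=1..k} G_j), on a random regular
    graph with branching ratio k, where z = E - i*eta lies just below
    the real axis and eps is Gaussian on-site disorder (illustrative)."""
    z = E - 1j * eta
    pop = np.full(pop_size, 1j)  # start with Im G > 0, which is preserved
    for _ in range(sweeps):
        idx = rng.integers(0, pop_size, size=(pop_size, k))  # k random cavity neighbours
        eps = rng.normal(0.0, eps_std, size=pop_size)
        pop = 1.0 / (z - eps - t**2 * pop[idx].sum(axis=1))
    return pop

def density_of_states(pop, E, k=2, t=1.0, eps_std=0.5, eta=1e-3,
                      samples=5000):
    """rho(E) = (1/pi) <Im G_ii>; a real site couples to k + 1 neighbours."""
    z = E - 1j * eta
    idx = rng.integers(0, len(pop), size=(samples, k + 1))
    eps = rng.normal(0.0, eps_std, size=samples)
    G = 1.0 / (z - eps - t**2 * pop[idx].sum(axis=1))
    return G.imag.mean() / np.pi

pop = population_dynamics(E=0.0)
rho = density_of_states(pop, E=0.0)  # density of states at the band centre
```

    Tracking whether the typical (e.g. geometric-mean) imaginary part of the population survives as eta goes to zero is the standard way such a scheme locates the localization threshold.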

    On the origin of the Boson peak in globular proteins

    We study the Boson Peak phenomenology experimentally observed in globular proteins by means of elastic network models. These models are suitable for an analytic treatment in the framework of Euclidean Random Matrix theory, whose predictions can be numerically tested on real protein structures. We find that the emergence of the Boson Peak is strictly related to an intrinsic mechanical instability of the protein, in close similarity to what is thought to happen in glasses. The biological implications of this conclusion are also discussed by focusing on a representative case study. Comment: Proceedings of the X International Workshop on Disordered Systems, Molveno (2006).

    Analysis of CDMA systems that are characterized by eigenvalue spectrum

    An approach is provided for analyzing the performance of the code division multiple access (CDMA) scheme, a core technology used in modern wireless communication systems. The approach characterizes the system under study by the eigenvalue spectrum of a cross-correlation matrix composed of the signature sequences used in CDMA communication, which enables us to handle a wider class of CDMA systems beyond the basic model reported by Tanaka. The utility of the novel scheme is shown by analyzing a system in which the generation of signature sequences is designed to enhance orthogonality. Comment: 7 pages, 2 figures.
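    For the basic case of independent random binary signatures (a baseline setup, not the orthogonality-enhanced design analyzed in the paper), the eigenvalue spectrum of the cross-correlation matrix follows the Marchenko-Pastur law; a quick numerical check, with illustrative system sizes:

```python
import numpy as np

rng = np.random.default_rng(1)

N, K = 2000, 1000          # chips per signature, number of users (illustrative)
beta = K / N               # system load

# Random binary signature sequences, each column normalized to unit power.
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)
R = S.T @ S                # K x K cross-correlation matrix
eigs = np.linalg.eigvalsh(R)

# Marchenko-Pastur support edges for load beta: the empirical spectrum
# should concentrate on [lam_min, lam_max] for large N, K.
lam_min = (1 - np.sqrt(beta)) ** 2
lam_max = (1 + np.sqrt(beta)) ** 2
```

    Since each signature has unit power, the mean eigenvalue equals 1 exactly; a signature design that enhances orthogonality would instead compress the spectrum toward that mean.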

    The theoretical capacity of the Parity Source Coder

    The Parity Source Coder is a protocol for data compression based on a set of parity checks organized in a sparse random network. We consider here the case of memoryless unbiased binary sources. We show that the theoretical capacity saturates the Shannon limit at large K. We also find that the first corrections to the leading behavior are exponentially small, so that the behavior at finite K is very close to the optimal one. Comment: Added references, minor changes.
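    The Shannon limit referred to above is the rate-distortion function of a memoryless unbiased binary source, R(D) = 1 - h2(D); a small helper for evaluating that bound (the finite-K capacity of the coder itself is not reproduced here):

```python
import numpy as np

def h2(p):
    """Binary entropy in bits, clipped to avoid log(0) at the endpoints."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def shannon_rate(D):
    """Rate-distortion function of a memoryless unbiased binary source:
    the lowest compression rate achieving average Hamming distortion D."""
    return 1.0 - h2(np.asarray(D, dtype=float))

# Example: at distortion D ~ 0.11, the optimal rate is about 0.5 bits/symbol.
```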

    Modeling the scaling properties of human mobility

    While the fat-tailed jump size and waiting time distributions characterizing individual human trajectories strongly suggest the relevance of the continuous time random walk (CTRW) models of human mobility, no one seriously believes that human traces are truly random. Given the importance of human mobility, from epidemic modeling to traffic prediction and urban planning, we need quantitative models that can account for the statistical characteristics of individual human trajectories. Here we use empirical data on human mobility, captured by mobile phone traces, to show that the predictions of the CTRW models are in systematic conflict with the empirical results. We introduce two principles that govern human trajectories, allowing us to build a statistically self-consistent microscopic model for individual human mobility. The model not only accounts for the empirically observed scaling laws but also allows us to analytically predict most of the pertinent scaling exponents.
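    A minimal CTRW baseline of the kind the paper tests against can be simulated with Pareto-distributed jump lengths and waiting times; the exponents below are illustrative placeholders, not the values fitted to the mobile-phone data:

```python
import numpy as np

rng = np.random.default_rng(2)

def ctrw_trajectory(n_jumps=1000, alpha=1.55, beta=0.8, r0=1.0, t0=1.0):
    """Continuous-time random walk in 2D: fat-tailed jump lengths
    P(r) ~ r^-(1+alpha) and waiting times P(t) ~ t^-(1+beta), drawn
    as shifted Pareto variables, with isotropic jump directions."""
    r = r0 * (1.0 + rng.pareto(alpha, n_jumps))       # jump lengths
    w = t0 * (1.0 + rng.pareto(beta, n_jumps))        # waiting times
    theta = rng.uniform(0.0, 2.0 * np.pi, n_jumps)    # jump directions
    steps = r[:, None] * np.column_stack((np.cos(theta), np.sin(theta)))
    xy = np.cumsum(steps, axis=0)                     # positions after each jump
    t = np.cumsum(w)                                  # time of each jump
    return t, xy

t, xy = ctrw_trajectory()
```

    Comparing statistics of such synthetic traces (e.g. the number of distinct locations visited, or the radius of gyration over time) with real trajectories is what exposes the systematic conflicts the abstract mentions.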

    A Fuzzy Criticality Assessment System of Process Equipment for Optimized Maintenance Management

    In modern chemical plants, it is essential to establish an effective maintenance strategy which delivers financially driven results under optimised conditions, that is, at minimum cost and time, by means of a criticality review of the equipment under maintenance. In this article, a fuzzy logic-based criticality assessment system (FCAS) for the management of a local company's equipment maintenance is introduced. This fuzzy system is shown to improve on the conventional crisp criticality assessment system (CCAS). Results from case studies show that the fuzzy logic-based system not only does what the conventional crisp system does, but also outputs more criticality classifications, with improved reliability and a greater number of distinct ratings that account for fuzziness and the individual voices of the decision-makers.
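    A Mamdani-style fuzzy criticality score of this general kind can be sketched as follows; the membership functions, rule base, 0-10 input scales, and 1-3 output ratings are all hypothetical, not the company-specific ones used in the article:

```python
def tri(x, a, b, c):
    """Triangular membership function with shoulder handling (a == b or b == c)."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic terms on a 0-10 scale, shared by both inputs.
MF = {'low': (0, 0, 5), 'med': (2, 5, 8), 'high': (5, 10, 10)}

# Hypothetical rule base: (failure-frequency term, impact term) -> rating 1..3.
RULES = [
    ('low', 'low', 1), ('low', 'med', 1), ('low', 'high', 2),
    ('med', 'low', 1), ('med', 'med', 2), ('med', 'high', 3),
    ('high', 'low', 2), ('high', 'med', 3), ('high', 'high', 3),
]

def criticality(failure_freq, impact):
    """Min for rule firing strength, weighted average for defuzzification."""
    num = den = 0.0
    for f, i, rating in RULES:
        w = min(tri(failure_freq, *MF[f]), tri(impact, *MF[i]))
        num += w * rating
        den += w
    return num / den if den else 0.0
```

    Unlike a crisp lookup table, overlapping memberships let intermediate inputs fire several rules at once, producing graded ratings between the discrete classes.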

    Fast photon detection for the COMPASS RICH detector

    The COMPASS experiment at the SPS accelerator at CERN uses a large-scale Ring Imaging CHerenkov detector (RICH) to identify pions, kaons and protons over a wide momentum range. For the data taking in 2006, the central photon detection area of the COMPASS RICH (25% of the surface) was upgraded with a new technology to detect Cherenkov photons at very high count rates of several 10^6 per second per channel, and with a new dead-time-free read-out system that allows trigger rates of up to 100 kHz. The Cherenkov photons are detected by an array of 576 visible- and ultraviolet-sensitive multi-anode photomultipliers with 16 channels each. The upgraded detector showed excellent performance during the 2006 data taking. Comment: Proceedings of the IPRD06 conference (Siena, Oct. 2006).

    The Fast Read-out System for the MAPMTs of COMPASS RICH-1

    A fast read-out system for the upgrade of the COMPASS RICH detector has been developed and successfully used for data taking in 2006 and 2007. The new read-out system for the multi-anode PMTs in the central part of the RICH photon detector is based on the high-sensitivity MAD4 preamplifier-discriminator and the high-resolution, dead-time-free F1-TDC chip. The read-out electronics has been designed to cope with the high photon flux in the central part of the detector and with the requirement to run at trigger rates of up to 100 kHz with negligible dead time. The system is designed as a very compact setup and is mounted directly behind the multi-anode photomultipliers. The data are digitized on the front-end boards and transferred via optical links to the read-out system. The read-out electronics system is described in detail together with its measured performance. Comment: Proceedings of the RICH2007 Conference, Trieste, Oct. 2007. v2: minor changes.