
    Information Storage in the Stochastic Ising Model

    Most information systems store data by modifying the local state of matter, in the hope that atomic (or sub-atomic) local interactions will stabilize the state for a sufficiently long time, thereby allowing later recovery. In this work we initiate the study of information retention in locally interacting systems. The evolution in time of the interacting particles is modeled via the stochastic Ising model (SIM). The initial spin configuration $X_0$ serves as the user-controlled input. The output configuration $X_t$ is produced by running $t$ steps of the Glauber chain. Our main goal is to evaluate the information capacity $I_n(t)\triangleq\max_{p_{X_0}}I(X_0;X_t)$ when the time $t$ scales with the size of the system $n$. For the zero-temperature SIM on the two-dimensional $\sqrt{n}\times\sqrt{n}$ grid with free boundary conditions, it is easy to show that $I_n(t)=\Theta(n)$ for $t=O(n)$. In addition, we show that on the order of $\sqrt{n}$ bits can be stored for infinite time in striped configurations. The $\sqrt{n}$ achievability is optimal when $t\to\infty$ and $n$ is fixed. One of the main results of this work is an achievability scheme that stores more than $\sqrt{n}$ bits (by orders of magnitude) for superlinear (in $n$) times. The analysis of the scheme decomposes the system into $\Omega(\sqrt{n})$ independent Z-channels whose crossover probability is found via the (recently rigorously established) Lifshitz law of phase boundary movement. We also provide results for the positive but small temperature regime. We show that an initial configuration drawn according to the Gibbs measure cannot retain more than a single bit for $t\geq e^{cn^{1/4+\epsilon}}$. On the other hand, when scaling time with $\beta$, the stripe-based coding scheme (which stores for infinite time at zero temperature) is shown to retain its bits for time that is exponential in $\beta$.
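    As a rough illustration of the dynamics involved (not the paper's coding scheme or analysis), the sketch below runs zero-temperature Glauber updates on a $\sqrt{n}\times\sqrt{n}$ grid with free boundary conditions, starting from a striped input configuration of the kind the abstract mentions. The grid size, number of sweeps, and stripe encoding are all illustrative assumptions.

```python
import numpy as np

def glauber_zero_temp_step(spins, rng):
    """One asynchronous zero-temperature Glauber update on a grid with free
    boundary conditions: pick a random site, align it with the majority of
    its neighbours, and break ties with a fair coin."""
    L = spins.shape[0]
    i, j = rng.integers(0, L, size=2)
    field = 0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < L and 0 <= nj < L:      # free boundaries: skip missing neighbours
            field += spins[ni, nj]
    if field > 0:
        spins[i, j] = 1
    elif field < 0:
        spins[i, j] = -1
    else:
        spins[i, j] = rng.choice((-1, 1))    # tie broken uniformly at random

def striped_input(L, bits):
    """Encode bits in horizontal stripes (one stripe per bit), the type of
    configuration the abstract says survives forever at zero temperature."""
    spins = np.full((L, L), 1 if bits[-1] else -1, dtype=int)
    stripe_height = L // len(bits)
    for k, b in enumerate(bits):
        spins[k * stripe_height:(k + 1) * stripe_height, :] = 1 if b else -1
    return spins

rng = np.random.default_rng(0)
L = 32                                       # grid side, so n = L * L
X0 = striped_input(L, bits=[1, 0, 1, 1])
Xt = X0.copy()
for _ in range(10 * L * L):                  # t = O(n) single-site updates
    glauber_zero_temp_step(Xt, rng)
print("fraction of sites unchanged:", np.mean(Xt == X0))
```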

    Sparse cross-products of metadata in scientific simulation management

    Managing scientific data is by no means a trivial task even in a single-site environment with a small number of researchers involved. We discuss some issues concerned with posing well-specified experiments in terms of parameters or instrument settings and the metadata framework that arises from doing so. We are particularly interested in parallel computer simulation experiments, where very large quantities of warehouse-able data are involved. We consider SQL databases and other framework technologies for manipulating experimental data. Our framework manages the outputs from parallel runs that arise from large cross-products of parameter combinations. Considerable useful experiment planning and analysis can be done with the sparse metadata without fully expanding the parameter cross-products. Extra value can be obtained from simulation output that can subsequently be data-mined. We have particular interests in running large-scale Monte Carlo physics model simulations. Finding ourselves overwhelmed by the problems of managing data and compute resources, we have built a prototype tool using Java and MySQL that addresses these issues. We use this example to discuss type-space management and other fundamental ideas for implementing a laboratory information management system.
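    A minimal sketch of the sparse-metadata idea, using an in-memory SQLite table in place of the authors' Java/MySQL prototype; the parameter names, values, and output paths are hypothetical. Only runs that actually exist are stored, and the pending part of the parameter cross-product is derived on demand rather than materialised.

```python
import sqlite3
from itertools import product
from math import prod

# Hypothetical parameter axes of a simulation campaign (illustrative names).
axes = {
    "temperature": [0.5, 1.0, 2.0],
    "lattice_size": [32, 64],
    "seed": [1, 2, 3],
}

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE runs (
    temperature REAL, lattice_size INTEGER, seed INTEGER,
    output_path TEXT,
    PRIMARY KEY (temperature, lattice_size, seed))""")

# Register only the runs that were actually performed: the metadata stays
# sparse no matter how large the full cross-product is.
completed = [(0.5, 32, 1, "runs/T0.5_L32_s1.dat"),
             (1.0, 64, 2, "runs/T1.0_L64_s2.dat")]
conn.executemany("INSERT INTO runs VALUES (?, ?, ?, ?)", completed)

# Plan remaining work by checking combinations against the sparse table,
# without ever expanding the full cross-product on disk.
done = {row for row in conn.execute(
    "SELECT temperature, lattice_size, seed FROM runs")}
pending = [combo for combo in product(*axes.values()) if combo not in done]
total = prod(len(v) for v in axes.values())
print(f"{len(pending)} of {total} parameter combinations still to run")
```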

    Retrieval behavior and thermodynamic properties of symmetrically diluted Q-Ising neural networks

    The retrieval behavior and thermodynamic properties of symmetrically diluted Q-Ising neural networks are derived and studied in replica-symmetric mean-field theory, generalizing earlier works on either the fully connected or the symmetrically extremely diluted network. Capacity-gain-parameter phase diagrams are obtained for the Q=3, Q=4, and Q=∞ state networks with uniformly distributed patterns of low activity, in order to search for the effects of a gradual dilution of the synapses. It is shown that enlarged regions of continuous changeover into a region of optimal performance are obtained for finite stochastic noise and small but finite connectivity. The de Almeida-Thouless lines of stability are obtained for arbitrary connectivity, and the resulting phase diagrams are used to draw conclusions on the behavior of symmetrically diluted networks with other pattern distributions of either high or low activity.
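    For orientation, a commonly used setup for a symmetrically diluted Q-Ising network is sketched below in LaTeX; the normalization, gain-parameter, and dilution conventions are assumptions and may differ from the ones adopted in the paper.

```latex
% Illustrative conventions only; the paper's normalisation may differ.
% Neurons take Q equidistant states in [-1, 1]:
%   \sigma_i \in \{-1,\, -1 + \tfrac{2}{Q-1},\, \dots,\, +1\}, \quad i = 1,\dots,N.
\begin{align}
  H &= -\tfrac{1}{2}\sum_{i \neq j} c_{ij} J_{ij}\,\sigma_i\sigma_j
       \;+\; b\sum_i \sigma_i^2, \\
  J_{ij} &= \frac{1}{c\,a}\sum_{\mu=1}^{p}\xi_i^{\mu}\xi_j^{\mu},
\end{align}
% where b is the gain parameter, c_{ij} = c_{ji} \in \{0,1\} are symmetric
% dilution variables with \Pr[c_{ij}=1] = c/N, and a is the activity of the
% stored patterns \xi^{\mu}.
```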

    Neural Networks retrieving Boolean patterns in a sea of Gaussian ones

    Restricted Boltzmann Machines are key tools in Machine Learning and are described by the energy function of bipartite spin glasses. From a statistical mechanical perspective, they share the same Gibbs measure as Hopfield networks for associative memory. In this equivalence, weights in the former play the role of patterns in the latter. Since Boltzmann machines usually require real-valued weights in order to be trained with gradient-descent-like methods, while Hopfield networks typically store binary patterns in order to retrieve them, the investigation of a mixed Hebbian network, equipped with both real (e.g., Gaussian) and discrete (e.g., Boolean) patterns, naturally arises. We prove that, in the challenging regime of a high storage of real patterns, where retrieval is forbidden, an extra load of Boolean patterns can still be retrieved, as long as the ratio between the overall load and the network size does not exceed a critical threshold, which turns out to be the same as in the standard Amit-Gutfreund-Sompolinsky theory. Assuming replica symmetry, we study the case of a low load of Boolean patterns by combining the stochastic stability and Hamilton-Jacobi interpolation techniques. The result can be extended to the high-load case by a non-rigorous but standard replica computation argument.
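    A minimal numerical sketch of the mixed-pattern setup described above (not the paper's proof technique): a Hebbian coupling matrix is built from both Gaussian and Boolean patterns, and zero-temperature asynchronous dynamics are used to check that a stored Boolean pattern remains retrievable. The network size, loads, and corruption level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 400           # neurons
P_real = 40       # Gaussian ("weight-like") patterns: a real load of 0.1
P_bool = 3        # a low load of Boolean patterns, the ones we try to retrieve

xi_real = rng.standard_normal((P_real, N))
xi_bool = rng.choice((-1, 1), size=(P_bool, N))

# Hebbian couplings built from both pattern families, zero self-coupling.
J = (xi_real.T @ xi_real + xi_bool.T @ xi_bool) / N
np.fill_diagonal(J, 0.0)

# Start from a corrupted copy of the first Boolean pattern and run
# zero-temperature asynchronous dynamics.
sigma = xi_bool[0] * np.where(rng.random(N) < 0.15, -1, 1)
for _ in range(20 * N):
    i = rng.integers(N)
    h = J[i] @ sigma
    sigma[i] = 1 if h >= 0 else -1

overlap = (sigma @ xi_bool[0]) / N
print(f"overlap with the stored Boolean pattern: {overlap:.3f}")
```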

    Thouless-Anderson-Palmer equation for analog neural network with temporally fluctuating white synaptic noise

    Effects of synaptic noise on the retrieval process of associative memory neural networks are studied from the viewpoint of neurobiological and biophysical understanding of information processing in the brain. We investigate the statistical mechanical properties of stochastic analog neural networks with temporally fluctuating synaptic noise, which is assumed to be white noise. Such networks, in general, defy the use of the replica method, since they have no energy concept. The self-consistent signal-to-noise analysis (SCSNA), an alternative to the replica method for deriving a set of order-parameter equations, requires no energy concept and thus remains applicable to networks without energy functions. Applying the SCSNA to stochastic networks requires knowledge of the Thouless-Anderson-Palmer (TAP) equation, which defines the deterministic networks equivalent to the original stochastic ones. Studies of the TAP equation in the case without an energy concept, which is of particular interest here, are scarce, although the equation is closely related to the SCSNA when an energy concept is available. This paper derives the TAP equation for networks with synaptic noise, together with a set of order-parameter equations, by a hybrid use of the cavity method and the SCSNA.
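    For context, the TAP equation for a standard Ising-spin network with symmetric couplings has the familiar Onsager-corrected form shown below; the paper's result for analog neurons with white synaptic noise replaces the transfer function and the reaction term, so this is only an orientation sketch of the structure being generalized.

```latex
% Standard TAP equation for Ising spins with symmetric couplings J_{ij}
% (SK-type form); shown only as a reference point for the structure the
% paper extends to analog networks with temporally fluctuating noise.
m_i \;=\; \tanh\!\Big[\beta\Big(h_i \;+\; \sum_{j} J_{ij}\, m_j
      \;-\; \beta\, m_i \sum_{j} J_{ij}^{2}\,\big(1 - m_j^{2}\big)\Big)\Big]
```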

    How Quantum Computers Fail: Quantum Codes, Correlations in Physical Systems, and Noise Accumulation

    The feasibility of computationally superior quantum computers is one of the most exciting and clear-cut scientific questions of our time. The question touches on fundamental issues regarding probability, physics, and computability, as well as on exciting problems in experimental physics, engineering, computer science, and mathematics. We propose three related directions towards a negative answer. The first is a conjecture about physical realizations of quantum codes, the second has to do with correlations in stochastic physical systems, and the third proposes a model for quantum evolutions when noise accumulates. The paper is dedicated to the memory of Itamar Pitowsky.