
    Network Inference with Hidden Units

    We derive learning rules for finding the connections between units in stochastic dynamical networks from the recorded history of a "visible" subset of the units. We consider two models. In both of them, the visible units are binary and stochastic. In one model the "hidden" units are continuous-valued, with sigmoidal activation functions, and in the other they are binary and stochastic like the visible ones. We derive exact learning rules for both cases. For the stochastic case, performing the exact calculation requires, in general, repeated summations over a number of configurations that grows exponentially with the size of the system and the data length, which is not feasible for large systems. We derive a mean-field theory, based on a factorized ansatz for the distribution of hidden-unit states, which offers an attractive alternative for large systems. We present the results of some numerical calculations that illustrate key features of the two models and, for the stochastic case, of the exact and approximate calculations.
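    For the fully visible part of such a network, the exact learning rule has a simple gradient form. The following is an illustrative sketch, not the paper's hidden-unit derivation: it assumes a kinetic Ising model in which every unit is visible, simulates it, and recovers the couplings by gradient ascent on the exact log-likelihood. All parameter choices (`N`, `T`, learning rate, iteration count) are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 5, 20000  # illustrative network size and data length

# Ground-truth couplings for a fully visible kinetic Ising model.
J_true = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

# Simulate: P(s_i(t+1) = +1) = exp(h_i) / (2 cosh(h_i)), with h = J @ s(t).
S = np.empty((T, N))
s = rng.choice([-1.0, 1.0], size=N)
for t in range(T):
    S[t] = s
    h = J_true @ s
    s = np.where(rng.random(N) < 1.0 / (1.0 + np.exp(-2.0 * h)), 1.0, -1.0)

# Gradient ascent on the exact log-likelihood (all units visible):
# dL/dJ_ij = sum_t [s_i(t+1) - tanh(h_i(t))] s_j(t)
J = np.zeros((N, N))
lr = 0.1
for _ in range(200):
    H = S[:-1] @ J.T
    grad = (S[1:] - np.tanh(H)).T @ S[:-1] / (T - 1)
    J += lr * grad
```

    With hidden units, a sum over hidden-state configurations replaces the explicit `tanh` term, which is what makes the exact rule exponentially costly and motivates the factorized mean-field ansatz.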

    Cell cycle progression


    Stochastic activation in a genetic switch model

    We study a biological autoregulation process, involving a protein that enhances its own transcription, in a parameter region where bistability would be present in the absence of fluctuations. We calculate the rate of fluctuation-induced rare transitions between locally stable states using a path-integral formulation and Master and Chapman-Kolmogorov equations. As in simpler models for rare transitions, the rate has the form of the exponential of a quantity S_0 (a "barrier") multiplied by a prefactor η. We calculate S_0 and η first in the bursting limit (where the ratio γ of the protein and mRNA lifetimes is very large). In this limit, the calculation can be done almost entirely analytically, and the results are in good agreement with simulations. For finite γ, numerical calculations are generally required. However, S_0 can be calculated analytically to first order in 1/γ, and the result agrees well with the full numerical calculation for all γ > 1. Employing a method used previously on other problems, we find we can account qualitatively for the way the prefactor η varies with γ, but its value is 15-20% higher than that inferred from simulations.
    Comment: 26 pages, 9 figures; revised version: corrected a few typos, added a little new text at the beginning and the end, made small changes in some figures and captions
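    A concrete, hypothetical instance of this kind of bistable autoregulation is a protein-only birth-death reduction (not the paper's full protein-mRNA model), with production given by a Hill function of the protein copy number. The Hill parameters below are invented for illustration; the sketch locates the deterministic fixed points and runs a short Gillespie trajectory of the fluctuating dynamics.

```python
import numpy as np

# Hypothetical parameters (not the paper's): basal rate a0, activated
# rate a1, Hill constant K, Hill exponent h, degradation rate d.
a0, a1, K, h, d = 1.0, 16.0, 8.0, 4.0, 1.0

def production(n):
    """Self-activating production rate as a Hill function of copy number."""
    return a0 + a1 * n**h / (K**h + n**h)

# Deterministic fixed points solve production(n) = d * n; bistability
# means two stable fixed points separated by an unstable one.
grid = np.linspace(0.0, 30.0, 30001)
f = production(grid) - d * grid
roots = grid[:-1][np.sign(f[:-1]) != np.sign(f[1:])]

# A short Gillespie (stochastic simulation) trajectory of the same
# birth-death process, started near the low state.
rng = np.random.default_rng(1)
n, t, traj = 1, 0.0, []
while t < 200.0:
    up, down = production(n), d * n
    total = up + down
    t += rng.exponential(1.0 / total)
    n += 1 if rng.random() < up / total else -1
    traj.append(n)
```

    In such a model, fluctuation-induced switching between the two stable fixed points is the rare event whose rate the barrier S_0 and prefactor η characterize.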

    The Effect of Nonstationarity on Models Inferred from Neural Data

    Neurons subject to a common non-stationary input may exhibit correlated firing behavior. Correlations in the statistics of neural spike trains also arise from interactions between neurons. Here we show that these two situations can be distinguished, with machine learning techniques, provided the data are rich enough. To do this, we study the problem of inferring a kinetic Ising model, stationary or non-stationary, from the available data. We apply the inference procedure to two data sets: one from salamander retinal ganglion cells and the other from a realistic computational cortical network model. We show that many aspects of the concerted activity of the salamander retinal neurons can be traced simply to the external input. A model of non-interacting neurons subject to a non-stationary external field outperforms a model with stationary input with couplings between neurons, even accounting for the differences in the number of model parameters. When couplings are added to the non-stationary model, for the retinal data, little is gained: the inferred couplings are generally not significant. Likewise, the distribution of the sizes of sets of neurons that spike simultaneously and the frequency of spike patterns as a function of their rank (Zipf plots) are well explained by an independent-neuron model with time-dependent external input, and adding connections to such a model does not offer significant improvement. For the cortical model data, robust couplings, well correlated with the real connections, can be inferred using the non-stationary model. Adding connections to this model slightly improves the agreement with the data for the probability of synchronous spikes but hardly affects the Zipf plot.
    Comment: version in press in J Stat Mech
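    The independent-neuron model with a time-dependent external field has a particularly simple maximum-likelihood solution when trials are repeated: the field in each time bin is the inverse hyperbolic tangent of the trial-averaged activity. A minimal sketch, with an invented sinusoidal drive and no couplings (all sizes and the drive are assumptions, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, R = 3, 100, 2000  # neurons, time bins, repeated trials (illustrative)

# Common non-stationary external field driving each neuron (hypothetical).
t = np.arange(T)
h_true = np.vstack([np.sin(2 * np.pi * t / T + k) for k in range(N)])

# Independent ±1 spins: P(s_i(t) = +1) = exp(h_i(t)) / (2 cosh(h_i(t))),
# sampled independently on each trial.
p = 1.0 / (1.0 + np.exp(-2.0 * h_true))            # shape (N, T)
S = np.where(rng.random((R, N, T)) < p, 1.0, -1.0)

# Maximum-likelihood field for the independent-neuron model:
# <s_i(t)> = tanh(h_i(t)), so invert the trial-averaged activity.
m = S.mean(axis=0).clip(-0.999, 0.999)
h_hat = np.arctanh(m)
```

    Comparing the likelihood of this coupling-free, non-stationary model against a stationary model with couplings (with a penalty for parameter count) is the kind of model comparison the abstract describes.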

    Mixture models for analysis of melting temperature data

    Background: In addition to their use in detecting undesired real-time PCR products, melting temperatures are useful for detecting variations in the desired target sequences. Methodological improvements in recent years allow the generation of high-resolution melting-temperature (T_m) data. However, there is currently no convention on how to statistically analyze such high-resolution T_m data.
    Results: Mixture model analysis was applied to T_m data. Models were selected based on Akaike's information criterion. Mixture model analysis correctly identified categories in T_m data obtained for known plasmid targets. Using simulated data, we investigated the number of observations required for model construction. The precision of the reported mixing proportions from data fitted to a preconstructed model was also evaluated.
    Conclusion: Mixture model analysis of T_m data allows the minimum number of different sequences in a set of amplicons, and their relative frequencies, to be determined. This approach allows T_m data to be analyzed, classified, and compared in an unbiased manner.
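    As a minimal sketch of the approach (simulated T_m values and a hand-rolled one-dimensional Gaussian EM, not the authors' pipeline), the following fits mixtures with one to three components and selects among them by AIC. The T_m means, standard deviations, and 70/30 mixing proportion are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical high-resolution melting temperatures: two amplicon
# species with distinct T_m, mixed 70/30.
tm = np.concatenate([rng.normal(78.0, 0.15, 700),
                     rng.normal(80.5, 0.15, 300)])

def fit_gmm(x, k, iters=200):
    """Fit a 1-D Gaussian mixture by EM; return (log-likelihood, weights, means)."""
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var())
    for _ in range(iters):
        # E-step: component responsibilities for each observation.
        d = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * d
        ll = np.log(r.sum(axis=1)).sum()
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return ll, w, mu

# Model selection by Akaike's information criterion: AIC = 2p - 2 ln L,
# with p = 3k - 1 free parameters for a k-component mixture.
aic = {}
for k in (1, 2, 3):
    ll, _, _ = fit_gmm(tm, k)
    aic[k] = 2 * (3 * k - 1) - 2 * ll
best = min(aic, key=aic.get)
```

    The recovered mixing proportions estimate the relative frequencies of the amplicon species, and the AIC-selected component count estimates the minimum number of distinct sequences.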