333 research outputs found

    Learning and predicting time series by neural networks

    Artificial neural networks which are trained on a time series are expected to achieve two abilities: first, to predict the series many time steps ahead, and second, to learn the rule which has produced the series. It is shown that prediction and learning are not necessarily related to each other. Chaotic sequences can be learned but not predicted, while quasiperiodic sequences can be well predicted but not learned. Comment: 5 pages

    Generation of unpredictable time series by a Neural Network

    A perceptron that learns the opposite of its own output is used to generate a time series. We analyse properties of the weight vector and of the generated sequence, such as the cycle length and the probability distribution of generated sequences. A remarkable suppression of the autocorrelation function is explained, and connections to the Bernasconi model are discussed. If a continuous transfer function is used, the system displays chaotic and intermittent behaviour, with the product of the learning rate and the amplification as a control parameter. Comment: 11 pages, 14 figures; slightly expanded and clarified, mistakes corrected; accepted for publication in PR
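    The generation scheme described in this abstract can be sketched as a minimal simulation: a perceptron produces an output bit from a window of its own past outputs and then updates its couplings toward the opposite of that bit. This is a hedged illustration only; the function name, parameters, and the window-update rule are assumptions, not the paper's code.

```python
import numpy as np

def antipredictable_sequence(n_weights=20, n_steps=1000, eta=1.0, seed=0):
    """Bit sequence from a perceptron that learns the OPPOSITE of its
    own output (illustrative sketch; names and update rule are assumed)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=n_weights)               # perceptron couplings
    s = rng.choice([-1.0, 1.0], size=n_weights)  # window of past output bits
    seq = []
    for _ in range(n_steps):
        out = 1.0 if w @ s >= 0 else -1.0        # perceptron output bit
        w -= (eta / n_weights) * out * s         # anti-learning step: move away from own output
        s = np.roll(s, 1)                        # shift the window ...
        s[0] = out                               # ... and feed the new bit back in
        seq.append(out)
    return np.array(seq)
```

    Because the weights are repeatedly pushed away from whatever the perceptron just produced, the generated bit stream resists prediction by a perceptron of the same architecture, which is the sense in which the sequence is "unpredictable".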

    Correlations between hidden units in multilayer neural networks and replica symmetry breaking

    We consider feed-forward neural networks with one hidden layer, tree architecture and a fixed hidden-to-output Boolean function. Focusing on the saturation limit of the storage problem, the influence of replica symmetry breaking on the distribution of local fields at the hidden units is investigated. These field distributions determine the probability of finding a specific activation pattern of the hidden units as well as the corresponding correlation coefficients, and therefore quantify the division of labor among the hidden units. We find that, although replica symmetry breaking markedly modifies the storage capacity and the distribution of local fields, it has only a minor effect on the correlation coefficients. Detailed numerical results are provided for the PARITY, COMMITTEE and AND machines with K=3 hidden units and nonoverlapping receptive fields. Comment: 9 pages, 3 figures, RevTex, accepted for publication in Phys. Rev.

    Storage capacity of correlated perceptrons

    We consider an ensemble of K single-layer perceptrons exposed to random inputs and investigate the conditions under which the couplings of these perceptrons can be chosen such that prescribed correlations between the outputs occur. A general formalism is introduced using a multi-perceptron cost function that allows one to determine the maximal number of random inputs as a function of the desired values of the correlations. Replica-symmetric results for K=2 and K=3 are compared with properties of two-layer networks of tree structure and fixed Boolean function between hidden units and output. The results show which correlations in the hidden layer of multilayer neural networks are crucial for the value of the storage capacity. Comment: 16 pages, Latex2

    Finite size scaling in neural networks

    We demonstrate that the fraction of pattern sets that can be stored in single- and hidden-layer perceptrons exhibits finite size scaling. This feature allows one to estimate the critical storage capacity \alpha_c from simulations of relatively small systems. We illustrate this approach by determining \alpha_c, together with the finite size scaling exponent \nu, for storing Gaussian patterns in committee and parity machines with binary couplings and up to K=5 hidden units. Comment: 4 pages, RevTex, 5 figures, uses multicol.sty and psfig.st
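    The small-system estimate described in this abstract can be illustrated with a toy computation: for a single perceptron with binary couplings, exhaustively enumerate all 2^N coupling vectors and measure the fraction of random Gaussian pattern sets that can be stored at load alpha = P/N. This is a hedged sketch with illustrative names and parameters, not the authors' committee/parity-machine code.

```python
import itertools
import numpy as np

def storable_fraction(n, alpha, n_sets=200, seed=0):
    """Fraction of random Gaussian pattern sets storable by a perceptron
    with binary couplings w_i = +/-1, via exhaustive enumeration
    (small-scale illustrative sketch)."""
    rng = np.random.default_rng(seed)
    p = max(1, int(round(alpha * n)))  # number of patterns at load alpha
    # all 2^n binary coupling vectors, one per row
    couplings = np.array(list(itertools.product([-1.0, 1.0], repeat=n)))
    ok = 0
    for _ in range(n_sets):
        xi = rng.normal(size=(p, n))             # Gaussian input patterns
        sigma = rng.choice([-1.0, 1.0], size=p)  # random target outputs
        fields = couplings @ xi.T                # (2^n, p) local fields
        # a set is storable if some coupling vector aligns all fields
        if np.any(np.all(sigma * fields > 0, axis=1)):
            ok += 1
    return ok / n_sets
```

    Plotting this fraction against alpha for several N gives curves that steepen and cross as N grows; rescaling the abscissa with the scaling variable (alpha - alpha_c) N^{1/\nu} should collapse them onto a single curve, which is how \alpha_c and \nu are extracted from small-system data.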

    Multilayer neural networks with extensively many hidden units

    The information processing abilities of a multilayer neural network with a number of hidden units scaling as the input dimension are studied using statistical mechanics methods. The mapping from the input layer to the hidden units is performed by general symmetric Boolean functions, whereas the hidden layer is connected to the output by either discrete or continuous couplings. Introducing an overlap in the space of Boolean functions as an order parameter, the storage capacity is found to scale with the logarithm of the number of implementable Boolean functions. The generalization behaviour is smooth for continuous couplings and shows a discontinuous transition to perfect generalization for discrete ones. Comment: 4 pages, 2 figures

    C/EBPα, C/EBPα Oncoproteins, or C/EBPβ Preferentially Bind NF-κB p50 Compared with p65, Focusing Therapeutic Targeting on the C/EBP:p50 Interaction

    Canonical NF-κB activation signals stimulate nuclear translocation of p50:p65, replacing inhibitory p50:p50 with activating complexes on chromatin. C/EBP interaction with p50 homodimers provides an alternative pathway for NF-κB target gene activation, and interaction with p50:p65 may enhance gene activation. We previously found that C/EBPα cooperates with p50 but not p65 to induce Bcl-2 transcription and that C/EBPα induces Nfkb1/p50 but not RelA/p65 transcription. Using p50 and p65 variants containing the FLAG epitope at their N- or C-termini, we now demonstrate that C/EBPα, C/EBPα myeloid oncoproteins, or the LAP1, LAP2, or LIP isoforms of C/EBPβ have markedly higher affinity for p50 than for p65. Deletion of the p65 trans-activation domain did not increase p65 affinity for C/EBPs, suggesting that unique residues in p50 account for specificity, and clustered mutation of HSDL in the "p50 insert" lacking in p65 weakens interaction. Also, in contrast to Nfkb1 gene deletion, absence of the RelA gene does not reduce Bcl-2 or Cebpa RNA in unstimulated cells or prevent interaction of C/EBPα with the Bcl-2 promoter. Saturating mutagenesis of the C/EBPα basic region identifies R300 and nearby residues, identical in C/EBPβ, as critical for interaction with p50. These findings support the conclusion that C/EBPs activate NF-κB target genes via contact with p50 even in the absence of canonical NF-κB activation, and indicate that targeting the C/EBP:p50 rather than the C/EBP:p65 interaction in the nucleus will prove effective for inflammatory or malignant conditions, alone or synergistically with agents acting in the cytoplasm to reduce canonical NF-κB activation.

    Comment on "On the subtleties of searching for dark matter with liquid xenon detectors"

    In a recent manuscript (arXiv:1208.5046) Peter Sorensen claims that XENON100's upper limits on spin-independent WIMP-nucleon cross sections for WIMP masses below 10 GeV "may be understated by one order of magnitude or more". Having performed a similar, though more detailed, analysis prior to the submission of our new result (arXiv:1207.5988), we do not confirm these findings. We point out the rationale for not considering the described effect in our final analysis and list several potential problems with his study. Comment: 3 pages, no figures

    Neural cytoskeleton capabilities for learning and memory

    This paper proposes a physical model involving the key structures within the neural cytoskeleton as major players in molecular-level processing of the information required for learning and memory storage. In particular, actin filaments and microtubules are macromolecules with highly charged surfaces that enable them to conduct electric signals. The biophysical properties of these filaments relevant to the conduction of ionic current include a condensation of counterions on the filament surface and a nonlinear, complex physical structure conducive to the generation of modulated waves. Cytoskeletal filaments are often directly connected with both ionotropic and metabotropic types of membrane-embedded receptors, thereby linking synaptic inputs to intracellular functions. Possible roles for cable-like, conductive filaments in neurons include intracellular information processing, regulating developmental plasticity, and mediating transport. The cytoskeletal proteins form a complex network capable of emergent information processing, positioned between the inputs to and outputs from neurons. In this manner, the cytoskeletal matrix is proposed to work with the neuronal membrane and its intrinsic components (e.g., ion channels, scaffolding proteins, and adaptor proteins), especially at sites of synaptic contacts and spines. An information processing model based on cytoskeletal networks is proposed that may underlie certain types of learning and memory.