
    Modelling conditional probabilities with Riemann-Theta Boltzmann Machines

    The probability density function for the visible sector of a Riemann-Theta Boltzmann machine can be taken conditional on a subset of the visible units. We derive that the corresponding conditional density function is given by a reparameterization of the Riemann-Theta Boltzmann machine modelling the original probability density function. Therefore the conditional densities can be directly inferred from the Riemann-Theta Boltzmann machine. Comment: 7 pages, 3 figures, in proceedings of the 19th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2019)
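    A minimal numerical sketch of what "taking the visible density conditional on a subset of the visible units" means, assuming a generic bivariate density in place of the RTBM visible-sector density: the conditional is obtained by slicing the joint density at the fixed value and renormalizing. The toy Gaussian joint and all function names here are illustrative stand-ins, not the paper's closed-form reparameterization.

        import numpy as np

        def conditional_density(joint_pdf, v2_fixed, grid):
            # p(v1 | v2 = v2_fixed) = p(v1, v2_fixed) / integral over v1 of p(v1, v2_fixed)
            vals = np.array([joint_pdf(v1, v2_fixed) for v1 in grid])
            return vals / (vals.sum() * (grid[1] - grid[0]))

        def toy_joint(v1, v2, rho=0.6):
            # correlated bivariate Gaussian standing in for an RTBM visible-sector pdf
            z = (v1**2 - 2.0*rho*v1*v2 + v2**2) / (1.0 - rho**2)
            return np.exp(-0.5*z) / (2.0*np.pi*np.sqrt(1.0 - rho**2))

        grid = np.linspace(-4.0, 4.0, 401)
        p_cond = conditional_density(toy_joint, v2_fixed=1.0, grid=grid)

    The result stated in the abstract is that for an RTBM this slicing need not be done numerically: the conditional density is again an RTBM density with reparameterized couplings, so it can be evaluated in closed form.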

    Riemann-Theta Boltzmann Machine

    A general Boltzmann machine with continuous visible and discrete integer valued hidden states is introduced. Under mild assumptions about the connection matrices, the probability density function of the visible units can be solved for analytically, yielding a novel parametric density function involving a ratio of Riemann-Theta functions. The conditional expectation of a hidden state for given visible states can also be calculated analytically, yielding a derivative of the logarithmic Riemann-Theta function. The conditional expectation can be used as activation function in a feedforward neural network, thereby increasing the modelling capacity of the network. Both the Boltzmann machine and the derived feedforward neural network can be successfully trained via standard gradient- and non-gradient-based optimization techniques. Comment: 29 pages, 11 figures, final version published in Neurocomputing
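    A schematic sketch of the activation described above, assuming a real-valued truncated lattice sum in place of the paper's exact Riemann-Theta parameterization: the gradient of the log of the sum equals the Boltzmann-weighted expectation of the integer hidden vector, which is the quantity proposed as an activation function. The cutoff, the quadratic form Q and the function names are assumptions for illustration only.

        import itertools
        import numpy as np

        def log_theta_grad(v, Q, cutoff=5):
            # Truncated sum  S(v) = sum_n exp(-0.5 n^T Q n + n^T v)  over n in [-cutoff, cutoff]^h.
            # Returns grad_v log S(v) = sum_n n * w_n / sum_n w_n, i.e. the expectation of the
            # integer hidden vector n under the Boltzmann weights w_n.
            h = len(v)
            total = 0.0
            grad = np.zeros(h)
            for n in itertools.product(range(-cutoff, cutoff + 1), repeat=h):
                n = np.asarray(n, dtype=float)
                w = np.exp(-0.5 * n @ Q @ n + n @ v)
                total += w
                grad += w * n
            return grad / total

        # toy use as an elementwise activation with a single hidden unit
        Q = np.array([[2.0]])
        xs = np.linspace(-3.0, 3.0, 7)
        acts = np.array([log_theta_grad(np.array([x]), Q)[0] for x in xs])

    In a feedforward layer such a quantity would take the place of a sigmoid or tanh, with the quadratic form and the couplings folded into v learned by the optimization techniques the abstract mentions.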