
    Approximate Learning in Temporal Hidden Hopfield Models

    Abstract. Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low-dimensional discrete variables. For higher-dimensional discrete hidden variables, recourse is often made to approximate mean field theories, which to date have been applied to models with only simple hidden unit dynamics. We consider a class of models in which the discrete hidden space is defined by parallel dynamics of densely connected high-dimensional stochastic Hopfield networks. For these Hidden Hopfield Models (HHMs), mean field methods are derived for learning discrete and continuous temporal sequences. We discuss applications of HHMs to classification and reconstruction of nonstationary time series. We also demonstrate a few problems (e.g. learning of incomplete binary sequences and reconstruction of 3D occupancy graphs) where a distributed discrete hidden space representation may be useful.

    1 Markovian Dynamics for Temporal Sequences

    Dynamic Bayesian networks are popular tools for modeling temporally correlated patterns. Included in this class of models are Hidden Markov Models (HMMs), auto-regressive HMMs (see e.g. Rabiner, 1989), and Factorial HMMs (Ghahramani and Jordan, 1995). These models are special cases of a generalized Markov chain

        p({h}, {v}) = p(h^(0)) p(v^(0)|h^(0)) ∏_{t=1}^{T−1} p(h^(t)|h^(t−1)) p(v^(t)|h^(t)).
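As a concrete illustration of this factorization combined with Hopfield-style hidden dynamics, the sketch below samples a binary sequence from such a chain: the hidden state evolves by parallel (synchronous) stochastic updates of a symmetric Hopfield network, and visible units are emitted from the current hidden state. The layer sizes, Gaussian weights, uniform initial prior, and sigmoid Bernoulli transition/emission models are illustrative assumptions, not the paper's construction.

```python
import math
import random

random.seed(0)
H, V, T = 8, 4, 10  # hidden units, visible units, time steps (illustrative)

# Symmetric, zero-diagonal Hopfield weights W (hidden-to-hidden) and
# emission weights U (hidden-to-visible), drawn at random for the demo.
W = [[0.0] * H for _ in range(H)]
for i in range(H):
    for j in range(i + 1, H):
        W[i][j] = W[j][i] = random.gauss(0, 0.5)
U = [[random.gauss(0, 0.5) for _ in range(H)] for _ in range(V)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bernoulli_layer(weights, state):
    # One parallel stochastic update: every output unit is sampled
    # independently given the *entire* previous state vector.
    return [1 if random.random() < sigmoid(sum(w * s for w, s in zip(row, state)))
            else 0 for row in weights]

h = [random.randint(0, 1) for _ in range(H)]  # p(h^(0)): uniform prior
seq_h, seq_v = [h], []
for t in range(T):
    seq_v.append(bernoulli_layer(U, h))       # p(v^(t) | h^(t))
    if t < T - 1:
        h = bernoulli_layer(W, h)             # p(h^(t+1) | h^(t)), parallel dynamics
        seq_h.append(h)
```

Multiplying the sampled factors in the order they are drawn here reproduces exactly the chain factorization above; each call to `bernoulli_layer` corresponds to one conditional in the product.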