
    Spectral dimensionality reduction for HMMs

    Hidden Markov Models (HMMs) can be accurately approximated using co-occurrence frequencies of pairs and triples of observations via a fast spectral method, in contrast to the usual slow methods like EM or Gibbs sampling. We provide a new spectral method which significantly reduces the number of model parameters that need to be estimated, and yields a sample complexity that does not depend on the size of the observation vocabulary. We present an elementary proof giving bounds on the relative accuracy of probability estimates from our model. (Corollaries show our bounds can be weakened to provide either L1 bounds or KL bounds, which allow easier direct comparisons to previous work.) Our theorem uses conditions that are checkable from the data, instead of putting conditions on the unobservable Markov transition matrix.
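The setup behind such spectral methods can be sketched in a few lines: estimate the pair co-occurrence matrix from data and read off the number of hidden states from its numerical rank. This is a minimal illustration of the general spectral-HMM ingredients, not the paper's specific parameter-reduced estimator; all names below are illustrative.

```python
import numpy as np

def pair_cooccurrence(seq, vocab_size):
    """Empirical matrix P2[i, j] ~ Pr(x_t = i, x_{t+1} = j) of adjacent-pair frequencies."""
    P2 = np.zeros((vocab_size, vocab_size))
    for a, b in zip(seq[:-1], seq[1:]):
        P2[a, b] += 1.0
    return P2 / P2.sum()

def estimated_rank(P2, tol=1e-8):
    """Number of significant singular values, an estimate of the number of hidden states."""
    s = np.linalg.svd(P2, compute_uv=False)
    return int((s > tol * s[0]).sum())
```

In the full spectral algorithm, the singular vectors of this matrix (together with triple co-occurrences) are what recover the model parameters; only the first step is shown here.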

    Parameter identifiability of discrete Bayesian networks with hidden variables

    Identifiability of parameters is an essential property for a statistical model to be useful in most settings. However, establishing parameter identifiability for Bayesian networks with hidden variables remains challenging. In the context of finite state spaces, we give algebraic arguments establishing identifiability of some special models on small DAGs. We also establish that, for fixed state spaces, generic identifiability of parameters depends only on the Markov equivalence class of the DAG. To illustrate the use of these results, we investigate identifiability for all binary Bayesian networks with up to five variables, one of which is hidden and parental to all observable ones. Surprisingly, some of these models have parameterizations that are generically 4-to-one, and not 2-to-one as label swapping of the hidden states would suggest. This leads to interesting difficulties in interpreting causal effects. Comment: 23 pages

    A statistical multiresolution approach for face recognition using structural hidden Markov models

    This paper introduces a novel methodology that combines the multiresolution feature of the discrete wavelet transform (DWT) with the local interactions of the facial structures expressed through the structural hidden Markov model (SHMM). A range of wavelet filters, such as Haar, biorthogonal 9/7, and Coiflet, as well as Gabor, have been implemented in order to search for the best performance. SHMMs perform a thorough probabilistic analysis of any sequential pattern by revealing both its inner and outer structures simultaneously. Unlike traditional HMMs, SHMMs do not make the assumption that the visible observation sequence is conditionally independent given the states. This is achieved via the concept of local structures introduced by the SHMMs. Therefore, the long-range dependency problem inherent to traditional HMMs has been drastically reduced. SHMMs have not previously been applied to the problem of face identification. The results reported in this application have shown that the SHMM outperforms the traditional hidden Markov model with a 73% increase in accuracy.
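The multiresolution feature-extraction step can be illustrated with a single-level 2-D Haar transform on an image block, the simplest of the filters the paper compares (Haar vs. biorthogonal 9/7, Coiflet, Gabor). This is a hedged sketch of the idea only; the paper's pipeline and filter implementations are not reproduced here.

```python
import numpy as np

def haar_dwt2(block):
    """One level of the 2-D Haar transform: returns (LL, LH, HL, HH) subbands."""
    a = block.astype(float)
    # transform rows: averages (low-pass) and differences (high-pass)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    # transform columns of each half the same way
    LL = (lo[0::2] + lo[1::2]) / 2.0  # coarse approximation
    LH = (lo[0::2] - lo[1::2]) / 2.0  # horizontal detail
    HL = (hi[0::2] + hi[1::2]) / 2.0  # vertical detail
    HH = (hi[0::2] - hi[1::2]) / 2.0  # diagonal detail
    return LL, LH, HL, HH
```

The subband coefficients (typically from LL, recursively decomposed) form the observation vectors fed to the HMM or SHMM.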

    Algebraic Geometry of Matrix Product States

    We quantify the representational power of matrix product states (MPS) for entangled qubit systems by giving polynomial expressions in a pure quantum state's amplitudes which hold if and only if the state is a translation invariant matrix product state or a limit of such states. For systems with few qubits, we give these equations explicitly, considering both periodic and open boundary conditions. Using the classical theory of trace varieties and trace algebras, we explain the relationship between MPS and hidden Markov models and exploit this relationship to derive useful parameterizations of MPS. We make four conjectures on the identifiability of MPS parameters.
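The central object can be made concrete: for a translation-invariant MPS with periodic boundary, each amplitude is the trace of a product of matrices indexed by the qubit values, which is exactly the trace-algebra structure the paper relates to hidden Markov models. A minimal sketch, with arbitrary example matrices rather than any from the paper:

```python
import numpy as np

def mps_amplitude(bits, A):
    """Amplitude psi(b1, ..., bn) = tr(A[b1] @ ... @ A[bn]) for a
    translation-invariant MPS with periodic boundary conditions."""
    M = np.eye(A[0].shape[0])
    for b in bits:
        M = M @ A[b]
    return np.trace(M)
```

Replacing the complex matrices by nonnegative ones (rows of a transition matrix weighted by emission probabilities) turns the same trace formula into the probability an HMM assigns to a string, which is the bridge the paper exploits.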

    Quantum Hidden Markov Models based on Transition Operation Matrices

    In this work, we extend the idea of Quantum Markov chains [S. Gudder. Quantum Markov chains. J. Math. Phys., 49(7), 2008] in order to propose Quantum Hidden Markov Models (QHMMs). For that, we use the notions of Transition Operation Matrices (TOM) and Vector States, which are an extension of classical stochastic matrices and probability distributions. Our main result is the Mealy QHMM formulation and proofs of the algorithms needed for application of this model: the Forward algorithm for the general case and the Viterbi algorithm for a restricted class of QHMMs. Comment: 19 pages, 2 figures
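For orientation, the classical Viterbi recursion that the TOM-based construction generalizes can be stated compactly; the toy parameters below are illustrative, not from the paper, and the quantum (TOM) version is not reproduced here.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for an observation sequence `obs`,
    given initial distribution pi, transitions A, and emissions B."""
    n_states = len(pi)
    T = len(obs)
    delta = np.zeros((T, n_states))       # best path score ending in each state
    psi = np.zeros((T, n_states), dtype=int)  # backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(n_states):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = int(np.argmax(scores))
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    # backtrack from the best final state
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]
```

In the quantum setting, the stochastic matrices A and B are replaced by completely positive maps collected in a TOM, which is why Viterbi only carries over for a restricted class of QHMMs.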

    Tropical Geometry of Statistical Models

    This paper presents a unified mathematical framework for inference in graphical models, building on the observation that graphical models are algebraic varieties. From this geometric viewpoint, observations generated from a model are coordinates of a point in the variety, and the sum-product algorithm is an efficient tool for evaluating specific coordinates. The question addressed here is how the solutions to various inference problems depend on the model parameters. The proposed answer is expressed in terms of tropical algebraic geometry. A key role is played by the Newton polytope of a statistical model. Our results are applied to the hidden Markov model and to the general Markov model on a binary tree. Comment: 14 pages, 3 figures. Major revision. Applications now in companion paper, "Parametric Inference for Biological Sequence Analysis".
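One concrete face of the tropical viewpoint: running the forward (sum-product) recursion in the (max, +) semiring on log-parameters computes the log-probability of the best hidden path, i.e. tropicalizing the HMM polynomial yields Viterbi. A minimal sketch with a pluggable "addition" operation; the 2-state parameters in the usage below are illustrative only.

```python
import numpy as np

def forward_semiring(obs, logpi, logA, logB, oplus):
    """HMM forward recursion on log-parameters with a pluggable
    semiring 'addition' `oplus` (acting on a 1-D array)."""
    alpha = logpi + logB[:, obs[0]]
    for o in obs[1:]:
        alpha = np.array([
            oplus(alpha + logA[:, j]) + logB[j, o]
            for j in range(len(logpi))
        ])
    return oplus(alpha)

# oplus = np.max evaluates the tropicalized polynomial (the Viterbi value);
# a log-sum-exp oplus would instead give the ordinary log-likelihood.
```

The paper's Newton-polytope machinery describes how this tropical value, and the optimal path behind it, varies as the log-parameters range over all values.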