Deep Tree Transductions - A Short Survey
The paper surveys recent extensions of Long Short-Term Memory networks to
handle tree structures from the perspective of learning non-trivial forms of
isomorph structured transductions. It provides a discussion of modern TreeLSTM
models, showing the effect of the bias induced by the direction of tree
processing. An empirical analysis is performed on real-world benchmarks,
highlighting that no single model is adequate to effectively approach all
transduction problems.
Comment: To appear in the Proceedings of the 2019 INNS Big Data and Deep
Learning (INNSBDDL 2019). arXiv admin note: text overlap with
arXiv:1809.0909
Flexible and practical modeling of animal telemetry data: hidden Markov models and extensions
We discuss hidden Markov-type models for fitting a variety of multistate random walks to wildlife movement data. Discrete-time hidden Markov models (HMMs) achieve considerable computational gains by focusing on observations that are regularly spaced in time, and for which the measurement error is negligible. These conditions are often met, in particular for data related to terrestrial animals, so that a likelihood-based HMM approach is feasible. We describe a number of extensions of HMMs for animal movement modeling, including more flexible state transition models and individual random effects (fitted in a non-Bayesian framework). In particular we consider so-called hidden semi-Markov models, which may substantially improve the goodness of fit and provide important insights into the behavioral state switching dynamics. To showcase the expediency of these methods, we consider an application of a hierarchical hidden semi-Markov model to multiple bison movement paths.
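As a minimal sketch of the likelihood computation underlying such a discrete-time HMM approach (the function and all parameter values below are illustrative placeholders, not taken from the paper), the scaled forward algorithm can be written as:

```python
import numpy as np

def hmm_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm for a discrete-time HMM.

    obs: sequence of observation indices
    pi:  (K,) initial state distribution
    A:   (K, K) row-stochastic transition matrix
    B:   (K, M) emission probabilities, B[k, m] = p(obs=m | state=k)
    Returns log p(obs); scaling avoids underflow on long sequences.
    """
    alpha = pi * B[:, obs[0]]          # unnormalized forward vector at t=0
    c = alpha.sum()
    log_lik = np.log(c)
    alpha /= c                         # rescale to sum to 1
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate, then weight by emission
        c = alpha.sum()
        log_lik += np.log(c)           # accumulate log of scaling constants
        alpha /= c
    return log_lik
```

Hidden semi-Markov variants replace the implicit geometric dwell time of this recursion with an explicit state duration distribution, which is the flexibility the abstract refers to.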
Binary hidden Markov models and varieties
The technological applications of hidden Markov models have been extremely
diverse and successful, including natural language processing, gesture
recognition, gene sequencing, and Kalman filtering of physical measurements.
HMMs are highly non-linear statistical models, and just as linear models are
amenable to linear algebraic techniques, non-linear models are amenable to
commutative algebra and algebraic geometry.
This paper closely examines HMMs in which all the hidden random variables are
binary. Its main contributions are (1) a birational parametrization for every
such HMM, with an explicit inverse for recovering the hidden parameters in
terms of observables, (2) a semialgebraic model membership test for every such
HMM, and (3) minimal defining equations for the 4-node fully binary model,
comprising 21 quadrics and 29 cubics, which were computed using Gröbner bases
in the cumulant coordinates of Sturmfels and Zwiernik. The new model parameters
in (1) are rationally identifiable in the sense of Sullivant, Garcia-Puente,
and Spielvogel, and each model's Zariski closure is therefore a rational
projective variety of dimension 5. Gröbner basis computations for the model and
its graph are found to be considerably faster using these parameters. In the
case of two hidden states, item (2) supersedes a previous algorithm of
Schönhuth, which is only generically defined, and the defining equations (3)
yield new invariants for HMMs of all lengths. Such invariants have
been used successfully in model selection problems in phylogenetics, and one
can hope for similar applications in the case of HMMs.
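To make the observable side of such models concrete: a fully binary HMM (two hidden states, binary outputs) induces a distribution over binary strings, computable exactly by summing over hidden paths. The invariants discussed above are polynomial relations satisfied by these string probabilities (equivalently, by the moments or cumulants derived from them). A brief sketch, with purely illustrative parameter values:

```python
import numpy as np
from itertools import product

def binary_hmm_string_probs(pi, A, B, n):
    """Exact probabilities of all 2**n binary output strings of a
    fully binary HMM: 2 hidden states (rows of A, B), 2 output symbols.

    pi: (2,) initial distribution; A: (2, 2) transitions; B: (2, 2) emissions.
    """
    probs = {}
    for ys in product((0, 1), repeat=n):
        alpha = pi * B[:, ys[0]]           # unscaled forward vector
        for y in ys[1:]:
            alpha = (alpha @ A) * B[:, y]
        probs[ys] = float(alpha.sum())     # marginalize over hidden paths
    return probs
```

For n = 4 this yields the 16 string probabilities on which the 21 quadric and 29 cubic defining equations of the 4-node model would be evaluated.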