Hidden Markov models: Estimation theory and economic applications
In this thesis, maximum likelihood estimation of hidden Markov models in several settings is investigated. Nonparametric estimation of state-dependent general mixtures and log-concave densities is discussed theoretically and algorithmically. Penalized estimation for parametric hidden Markov models, comparing several penalty functions, is studied. In addition, various models based on mixture models and hidden Markov models, differing in their dependency structure and in the inclusion of covariates, are applied to a panel data set containing the GDP of several countries.
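All of these estimation settings rest on evaluating the HMM likelihood, which the forward algorithm does in O(Tn²) time. A minimal sketch in Python (the scaled forward recursion for a discrete-emission HMM; the parameter names and the toy example are illustrative, not taken from the thesis):

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward recursion.
    pi: initial state probabilities, A: transition matrix,
    B[i][k]: probability of emitting symbol k in state i."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    c = sum(alpha)
    log_lik = math.log(c)
    alpha = [a / c for a in alpha]          # rescale to avoid underflow
    for t in range(1, len(obs)):
        alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
        c = sum(alpha)
        log_lik += math.log(c)
        alpha = [a / c for a in alpha]
    return log_lik
```

The scaling constants accumulate the log-likelihood directly, so the recursion stays numerically stable even for long sequences.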
Spectral Methods for Learning Multivariate Latent Tree Structure
This work considers the problem of learning the structure of multivariate
linear tree models, which include a variety of directed tree graphical models
with continuous, discrete, and mixed latent variables such as linear-Gaussian
models, hidden Markov models, Gaussian mixture models, and Markov evolutionary
trees. The setting is one where we only have samples from certain observed
variables in the tree, and our goal is to estimate the tree structure (i.e.,
the graph of how the underlying hidden variables are connected to each other
and to the observed variables). We propose the Spectral Recursive Grouping
algorithm, an efficient and simple bottom-up procedure for recovering the tree
structure from independent samples of the observed variables. Our finite sample
size bounds for exact recovery of the tree structure reveal certain natural
dependencies on statistical and structural properties of the underlying
joint distribution. Furthermore, our sample complexity guarantees
have no explicit dependence on the dimensionality of the observed variables,
making the algorithm applicable to many high-dimensional settings. At the heart
of our algorithm is a spectral quartet test for determining the relative
topology of a quartet of variables from second-order statistics.
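In the scalar case, a quartet test reduces to comparing products of pairwise correlations, since for tree-structured variables the true split {a,b | c,d} maximizes |ρ_ab·ρ_cd|. A toy sketch of that scalar analogue (the actual spectral test operates on singular values of cross-covariance matrices and accounts for sampling error; this simplification is ours):

```python
def quartet_pairing(corr):
    """Toy scalar analogue of a quartet test: among the three ways to
    split variables 1..4 into two pairs, return the pairing {a,b | c,d}
    maximizing |rho_ab * rho_cd|. For correlations generated by a tree,
    this recovers the true split."""
    pairings = [((1, 2), (3, 4)), ((1, 3), (2, 4)), ((1, 4), (2, 3))]
    def score(p):
        (a, b), (c, d) = p
        return abs(corr[frozenset((a, b))] * corr[frozenset((c, d))])
    return max(pairings, key=score)
```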
Markov-switching generalized additive models
We consider Markov-switching regression models, i.e. models for time series
regression analyses where the functional relationship between covariates and
response is subject to regime switching controlled by an unobservable Markov
chain. Building on the powerful hidden Markov model machinery and the methods
for penalized B-splines routinely used in regression analyses, we develop a
framework for nonparametrically estimating the functional form of the effect of
the covariates in such a regression model, assuming an additive structure of
the predictor. The resulting class of Markov-switching generalized additive
models is immensely flexible, and contains as special cases the common
parametric Markov-switching regression models and also generalized additive and
generalized linear models. The feasibility of the suggested maximum penalized
likelihood approach is demonstrated by simulation and further illustrated by
modelling how the energy price in Spain depends on the Euro/Dollar exchange rate.
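The penalized likelihood approach evaluates the Markov-switching likelihood with the standard HMM forward recursion, plugging in state-dependent regression densities. A sketch of the parametric linear special case (a full MS-GAM would replace the linear predictor with penalized B-spline terms; the parameter names here are illustrative):

```python
import math

def ms_regression_loglik(y, x, pi, A, beta, sigma):
    """Forward-algorithm log-likelihood for a Markov-switching linear
    regression: in regime j, y_t ~ N(beta[j][0] + beta[j][1]*x_t,
    sigma[j]**2), with regimes following a Markov chain (pi, A)."""
    n = len(pi)
    def dens(j, t):
        mu = beta[j][0] + beta[j][1] * x[t]
        z = (y[t] - mu) / sigma[j]
        return math.exp(-0.5 * z * z) / (sigma[j] * math.sqrt(2.0 * math.pi))
    alpha = [pi[j] * dens(j, 0) for j in range(n)]
    c = sum(alpha)
    ll = math.log(c)
    alpha = [a / c for a in alpha]
    for t in range(1, len(y)):
        alpha = [dens(j, t) * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
        c = sum(alpha)
        ll += math.log(c)
        alpha = [a / c for a in alpha]
    return ll
```

Maximizing this quantity minus a roughness penalty on the spline coefficients gives the maximum penalized likelihood estimator.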
Bayesian Nonparametric Hidden Semi-Markov Models
There is much interest in the Hierarchical Dirichlet Process Hidden Markov
Model (HDP-HMM) as a natural Bayesian nonparametric extension of the ubiquitous
Hidden Markov Model for learning from sequential and time-series data. However,
in many settings the HDP-HMM's strict Markovian constraints are undesirable,
particularly if we wish to learn or encode non-geometric state durations. We
can extend the HDP-HMM to capture such structure by drawing upon
explicit-duration semi-Markovianity, which has been developed mainly in the
parametric frequentist setting, to allow construction of highly interpretable
models that admit natural prior information on state durations.
In this paper we introduce the explicit-duration Hierarchical Dirichlet
Process Hidden semi-Markov Model (HDP-HSMM) and develop sampling algorithms for
efficient posterior inference. The methods we introduce also provide new
methods for sampling inference in the finite Bayesian HSMM. Our modular Gibbs
sampling methods can be embedded in samplers for larger hierarchical Bayesian
models, adding semi-Markov chain modeling as another tool in the Bayesian
inference toolbox. We demonstrate the utility of the HDP-HSMM and our inference
methods in both synthetic and real-data experiments.
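The explicit-duration construction the HDP-HSMM builds on is easy to state generatively: each visited state draws its own duration from a state-specific distribution, and self-transitions are removed from the transition matrix. A minimal sketch of that state process (not the paper's Gibbs sampler; `duration_sampler` is a stand-in for whichever duration distribution one chooses):

```python
import random

def sample_hsmm_states(T, pi, A, duration_sampler, seed=0):
    """Generate T time steps of an explicit-duration semi-Markov chain.
    pi: initial state distribution; A: transition matrix with zero
    diagonal (no self-transitions); duration_sampler(s, rng) returns an
    integer duration for state s."""
    rng = random.Random(seed)
    states = []
    s = rng.choices(range(len(pi)), weights=pi)[0]
    while len(states) < T:
        states.extend([s] * duration_sampler(s, rng))   # dwell in state s
        s = rng.choices(range(len(A[s])), weights=A[s])[0]
    return states[:T]
```

Replacing `duration_sampler` with a geometric draw recovers an ordinary HMM state sequence, which is exactly the constraint the HSMM relaxes.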
Autoregressive hidden Markov model with application in an El Niño study
Hidden Markov models extend Markov models in that each observation is generated by a stochastic process in one of several unobserved states. Though favored by many scientists for their tractable and widely applicable mathematical structure, their assumption of conditional independence between consecutive observations has hampered wider application. An autoregressive hidden Markov model combines autoregressive time series with a hidden Markov chain: observations are generated by a few autoregressive time series, while the switches between them are controlled by the hidden Markov chain. In this thesis, we present the basic concepts, theory and associated approaches and algorithms for hidden Markov models, time series and autoregressive hidden Markov models. We have also built a bivariate autoregressive hidden Markov model on temperature data from the Pacific Ocean to understand the mechanism of El Niño. The parameters and the state path of the model are estimated with the segmental K-means algorithm, and the state estimates of the autoregressive hidden Markov model are compared with those from a conventional hidden Markov model. Overall, the results confirm the strength of autoregressive hidden Markov models in the El Niño study, and the research sets an example of the ARHMM's application in meteorology.
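The generative mechanism just described, a hidden chain switching between a few autoregressive recursions, can be sketched as follows (an AR(1) special case with Gaussian noise; the parameter names are illustrative, not taken from the thesis):

```python
import random

def simulate_arhmm(T, A, phi, sigma, y0=0.0, seed=1):
    """Simulate an AR(1) hidden Markov model: a hidden chain with
    transition matrix A selects which AR(1) recursion
    y_t = phi[s_t] * y_{t-1} + eps_t,  eps_t ~ N(0, sigma[s_t]**2),
    generates each observation. Returns the state path and the series."""
    rng = random.Random(seed)
    s, states, y = 0, [0], [y0]
    for _ in range(1, T):
        s = rng.choices(range(len(A[s])), weights=A[s])[0]
        states.append(s)
        y.append(phi[s] * y[-1] + rng.gauss(0.0, sigma[s]))
    return states, y
```

Unlike a conventional HMM, consecutive observations here depend on each other directly through the AR term, not only through the hidden state.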
Financial signal processing: a self calibrating model
Previous work on multifactor term structure models has proposed that the short rate process is a function of some unobserved diffusion process. We consider a model in which the short rate process is a function of a Markov chain which represents the 'state of the world'. This enables us to obtain explicit expressions for the prices of zero-coupon bonds and other securities. Discretizing our model allows the use of signal processing techniques from hidden Markov models. This means we can estimate not only the unobserved Markov chain but also the parameters of the model, so the model is self-calibrating. The estimation procedure is tested on a selection of U.S. Treasury bills and bonds.
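In a discretized model of this kind, zero-coupon bond prices follow from a simple backward recursion over the Markov chain: discount one period at the current state's rate, then average over the next state. A sketch under illustrative parameters (this shows the pricing recursion only, not the self-calibrating filter):

```python
import math

def zcb_prices(r, P, T):
    """Time-0 prices of a zero-coupon bond maturing in T periods when
    the one-period short rate equals r[i] in state i of a Markov chain
    with transition matrix P. Returns one price per current state."""
    n = len(r)
    v = [1.0] * n                       # unit payoff at maturity
    for _ in range(T):
        # discount one period back: v_i = exp(-r_i) * E[v_next | state i]
        v = [math.exp(-r[i]) * sum(P[i][j] * v[j] for j in range(n))
             for i in range(n)]
    return v
```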