    Maximum Likelihood Estimation of Parameters in a Mixture Model

    The estimation of the parameters of the log-normal distribution from complete and censored samples has been considered in the literature. In this article, the problem of estimating the parameters of a log-normal mixture model is considered. The Expectation-Maximization (EM) algorithm is used to obtain maximum likelihood estimates of the parameters, since the likelihood equations do not yield closed-form expressions. The standard errors of the estimates are obtained, and the methodology is illustrated through simulation studies. A confidence interval based on large-sample theory is also obtained.
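The EM recursion for such a model can be sketched for a two-component case: since the log of a log-normal variate is normal, the E- and M-steps reduce to those of a normal mixture on the log scale. The simulated data, initial values, and component parameters below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated two-component log-normal mixture (illustrative parameters).
n = 2000
z = rng.random(n) < 0.4
x = np.where(z, rng.lognormal(0.0, 0.5, n), rng.lognormal(2.0, 0.3, n))

# EM on y = log(x), which follows a two-component normal mixture.
y = np.log(x)
w, mu, sd = np.array([0.5, 0.5]), np.array([0.0, 1.0]), np.array([1.0, 1.0])

def normal_pdf(v, m, s):
    return np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(200):
    # E-step: posterior responsibility of each component for each point.
    dens = w * normal_pdf(y[:, None], mu, sd)          # shape (n, 2)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted updates; no joint closed form exists,
    # which is exactly why EM is used.
    nk = r.sum(axis=0)
    w = nk / n
    mu = (r * y[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (y[:, None] - mu) ** 2).sum(axis=0) / nk)

print(w.round(2), mu.round(2), sd.round(2))
```

With well-separated components, the recovered log-scale means should sit near the simulated values 0.0 and 2.0.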

    A Multivariate Generalized Orthogonal Factor GARCH Model

    The paper studies a factor GARCH model and develops test procedures that can be used to determine the number of factors needed to model the conditional heteroskedasticity in the considered vector of time series. Assuming normally distributed errors, the parameters of the model can be straightforwardly estimated by the method of maximum likelihood. Inefficient but computationally simple preliminary estimates are first obtained and used as initial values to maximize the likelihood function. Maximum likelihood estimation with non-normal errors is also straightforward. Motivated by the empirical application of the paper, a mixture of normal distributions is considered. An interesting feature of the implied factor GARCH model is that some parameters of the conditional covariance matrix which are not identifiable in the case of normal errors become identifiable when the mixture distribution is used. As an empirical example, we consider a system of four exchange rate return series.
    Keywords: multivariate GARCH model; mixture of normal distributions; exchange rate
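The two-step idea (cheap preliminary values used to initialize a numerical likelihood maximization) can be sketched for a univariate Gaussian GARCH(1,1); the factor structure and mixture errors of the paper are omitted, and all parameter values below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulate a univariate GARCH(1,1) series (illustrative parameters).
omega, alpha, beta, n = 0.1, 0.1, 0.8, 3000
eps = np.empty(n)
s2 = omega / (1 - alpha - beta)            # unconditional variance
for t in range(n):
    eps[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega + alpha * eps[t] ** 2 + beta * s2

def neg_loglik(theta, eps):
    """Gaussian GARCH(1,1) negative log-likelihood (up to a constant)."""
    w, a, b = theta
    s2 = np.empty_like(eps)
    s2[0] = eps.var()                      # crude initialization
    for t in range(1, len(eps)):
        s2[t] = w + a * eps[t - 1] ** 2 + b * s2[t - 1]
    return 0.5 * np.sum(np.log(s2) + eps ** 2 / s2)

# Simple preliminary values serve as the starting point for numerical ML.
res = minimize(neg_loglik, x0=[0.05, 0.05, 0.9], args=(eps,),
               bounds=[(1e-6, None), (1e-6, 1), (1e-6, 1)], method="L-BFGS-B")
print(res.x.round(3))
```

The bounds keep the conditional variance recursion positive; the estimates should land near the simulated (omega, alpha, beta).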

    A Tight Convex Upper Bound on the Likelihood of a Finite Mixture

    The likelihood function of a finite mixture model is a non-convex function with multiple local maxima, and commonly used iterative algorithms such as EM will converge to different solutions depending on the initial conditions. In this paper we ask: is it possible to assess how far we are from the global maximum of the likelihood? Since the likelihood of a finite mixture model can grow unboundedly by centering a Gaussian on a single data point and shrinking the covariance, we constrain the problem by assuming that the parameters of the individual models are members of a large discrete set (e.g. estimating a mixture of two Gaussians where the means and variances of both Gaussians are members of a set of a million possible means and variances). For this setting we show that a simple upper bound on the likelihood can be computed using convex optimization, and we analyze conditions under which the bound is guaranteed to be tight. This bound can then be used to assess the quality of solutions found by EM (where the final result is projected onto the discrete set) or by any other mixture estimation algorithm. For any dataset, our method allows us to find a finite mixture model together with a dataset-specific bound on how far the likelihood of this mixture is from the global optimum of the likelihood.
    Comment: ICPR 201
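The core observation can be illustrated directly: once the component parameters are restricted to a fixed discrete set, the log-likelihood is concave in the mixture weights alone, so maximizing over the weights is a convex problem whose optimum upper-bounds any mixture built from that set. A minimal sketch with an assumed Gaussian dictionary (a grid of means and standard deviations, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 0.5, 150), rng.normal(2, 0.5, 150)])

# Discrete dictionary of candidate Gaussian components (means x stds grid).
means = np.linspace(-4, 4, 17)
stds = np.array([0.25, 0.5, 1.0])
M, S = np.meshgrid(means, stds)
M, S = M.ravel(), S.ravel()

# Density of every data point under every candidate component.
F = np.exp(-0.5 * ((x[:, None] - M) / S) ** 2) / (S * np.sqrt(2 * np.pi))

# Maximizing sum_i log(F_i . w) over the weight simplex is concave, so
# EM updates on the weights alone converge to its global optimum; that
# optimum upper-bounds the likelihood of any mixture built from the set.
w = np.full(F.shape[1], 1.0 / F.shape[1])
for _ in range(500):
    r = F * w
    r /= r.sum(axis=1, keepdims=True)
    w = r.mean(axis=0)

upper_bound = np.sum(np.log(F @ w))
print(round(upper_bound, 2))
```

Any specific mixture whose components sit on the grid (e.g. the generating one) has a log-likelihood no greater than this bound, up to the remaining optimization tolerance.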

    Finite Impulse Response Errors-in-Variables system identification utilizing Approximated Likelihood and Gaussian Mixture Models

    In this paper, a maximum likelihood estimation algorithm for Finite Impulse Response (FIR) Errors-in-Variables systems is developed. We assume that the noise-free input signal is Gaussian-mixture distributed. We propose an Expectation-Maximization-based algorithm to estimate the system model parameters, the input and output noise variances, and the parameters of the Gaussian-mixture noise-free input. The benefits of our proposal are illustrated via numerical simulation.
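A small simulation makes the motivation concrete: with noise on both the measured input and the measured output, naive least squares on an FIR model is biased (attenuated) towards zero, which is why a dedicated ML/EM treatment is needed. The FIR coefficients, mixture input, and noise levels below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n, b = 5000, np.array([1.0, 0.5, -0.25])       # assumed true FIR coefficients

# Noise-free input drawn from a two-component Gaussian mixture.
comp = rng.random(n) < 0.3
u = np.where(comp, rng.normal(-1, 0.4, n), rng.normal(1, 0.7, n))

# Regressor matrix of lagged inputs, then the noise-free output.
U = np.column_stack([np.roll(u, k) for k in range(3)])[3:]   # drop wrap-around rows
y = U @ b

# Errors-in-variables: both the measured input and output are noisy.
U_meas = U + rng.normal(0, 0.3, U.shape)
y_meas = y + rng.normal(0, 0.3, y.shape)

# Naive least squares on the noisy regressors is attenuated towards zero.
b_ls = np.linalg.lstsq(U_meas, y_meas, rcond=None)[0]
print(b_ls.round(3))
```

The first coefficient estimate falls noticeably below its true value 1.0, roughly by the classical attenuation factor var(u) / (var(u) + input-noise variance).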

    High dimensional Sparse Gaussian Graphical Mixture Model

    This paper considers the problem of network reconstruction from heterogeneous data using a Gaussian Graphical Mixture Model (GGMM). It is well known that parameter estimation in this context is challenging due to the large number of variables coupled with the degeneracy of the likelihood. We propose as a solution a penalized maximum likelihood technique that imposes an l1 penalty on the precision matrix. Our approach shrinks the parameters, thereby resulting in better identifiability and variable selection. We use the Expectation-Maximization (EM) algorithm, with the graphical LASSO employed to estimate the mixing coefficients and the precision matrices. We show that under certain regularity conditions the Penalized Maximum Likelihood (PML) estimates are consistent. We demonstrate the performance of the PML estimator through simulations, and we show the utility of our method for high-dimensional data analysis in a genomic application.
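The key M-step ingredient, an l1-penalized precision estimate, can be sketched with the graphical LASSO on a single Gaussian component; in an EM algorithm of the kind described, this step would be applied per mixture component with responsibility-weighted covariances. The sparse precision matrix and penalty level below are illustrative, and scikit-learn's GraphicalLasso is assumed as the solver:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(4)

# Sparse ground-truth precision matrix (illustrative, 5 variables).
p = 5
prec = np.eye(p)
prec[0, 1] = prec[1, 0] = 0.4
prec[2, 3] = prec[3, 2] = -0.3
cov = np.linalg.inv(prec)

x = rng.multivariate_normal(np.zeros(p), cov, size=500)

# l1-penalized ML estimate of the precision matrix: the penalty shrinks
# absent edges towards zero, giving identifiability and edge selection.
model = GraphicalLasso(alpha=0.1).fit(x)
print(np.round(model.precision_, 2))
```

Entries corresponding to absent edges in the generating precision (e.g. between variables 0 and 4) are shrunk towards zero by the penalty.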

    A mixture logistic model for panel data with a Markov structure

    In this study, we propose a mixture logistic regression model with a Markov structure, and consider the estimation of the model parameters by maximum likelihood. We also provide a forward-type variable selection algorithm that chooses the important explanatory variables, reducing the number of parameters in the proposed model.
    Comment: Some results of this study have been included in the report of a research project of Professor Yu-Hsiang Cheng, and the report is now available. Thus we add this information in this version.
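A forward-type variable selection loop of the kind described can be sketched for a plain (non-mixture, non-Markov) logistic regression, greedily adding the variable that most improves a BIC-penalized likelihood; the data-generating design and the stopping rule below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n, p = 800, 6
X = rng.standard_normal((n, p))
beta_true = np.array([1.5, -1.0, 0.0, 0.0, 0.0, 0.0])   # only x0, x1 matter
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ beta_true)))).astype(float)

def fit_nll(cols):
    """Fit logistic regression on the given columns; return the minimized NLL."""
    Z = np.column_stack([np.ones(n), X[:, cols]]) if cols else np.ones((n, 1))
    def nll(b):
        eta = Z @ b
        return np.sum(np.logaddexp(0, eta) - y * eta)   # numerically stable NLL
    return minimize(nll, np.zeros(Z.shape[1]), method="BFGS").fun

# Forward selection: add the variable that most improves the BIC-type
# score NLL + 0.5 * (#params) * log(n); stop when no candidate helps.
selected, current = [], fit_nll([]) + 0.5 * np.log(n)
while len(selected) < p:
    scores = {j: fit_nll(selected + [j]) + 0.5 * (len(selected) + 2) * np.log(n)
              for j in range(p) if j not in selected}
    best = min(scores, key=scores.get)
    if scores[best] >= current:
        break
    selected.append(best)
    current = scores[best]

print(sorted(selected))
```

With strong true effects, the loop should pick out the informative variables and stop before admitting noise variables, since each addition must beat the log(n) penalty.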