
    Flexible Mixture Modeling with the Polynomial Gaussian Cluster-Weighted Model

    Within the mixture modeling framework, this paper presents the polynomial Gaussian cluster-weighted model (CWM). It extends the linear Gaussian CWM for bivariate data in two ways. First, it allows for possible nonlinear dependencies in the mixture components by considering a polynomial regression. Second, it is not restricted to model-based clustering but is set in the more general model-based classification framework. Maximum likelihood parameter estimates are derived using the EM algorithm, and model selection is carried out using the Bayesian information criterion (BIC) and the integrated completed likelihood (ICL). The paper also investigates the conditions under which the posterior probabilities of component membership from a polynomial Gaussian CWM coincide with those of other well-established mixture models related to it. Compared with these models, the polynomial Gaussian CWM is shown to give excellent clustering and classification results on the artificial and real data considered in the paper.
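    As a rough illustration of the estimation procedure the abstract describes (not the authors' implementation), the sketch below fits a polynomial Gaussian CWM to bivariate data by EM: the joint density is p(x, y) = sum_g pi_g N(y | poly_g(x), sigma2_g) N(x | mu_g, tau2_g). The function name, the random initialization, and the fixed iteration budget are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def fit_polynomial_cwm(x, y, G=2, degree=2, n_iter=200, seed=0):
    """EM sketch for a polynomial Gaussian CWM on bivariate data (x, y)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    X = np.vander(x, degree + 1, increasing=True)   # design matrix [1, x, ..., x^degree]
    resp = rng.dirichlet(np.ones(G), size=n)        # random initial responsibilities
    for _ in range(n_iter):
        # M-step: weighted updates of each component's parameters
        pi = resp.mean(axis=0)
        params = []
        for g in range(G):
            w = resp[:, g]
            beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))  # weighted LS
            sigma2 = (w * (y - X @ beta) ** 2).sum() / w.sum()
            mu = (w * x).sum() / w.sum()
            tau2 = (w * (x - mu) ** 2).sum() / w.sum()
            params.append((beta, sigma2, mu, tau2))
        # E-step: posterior component-membership probabilities
        logp = np.column_stack([
            np.log(pi[g])
            + norm.logpdf(y, X @ params[g][0], np.sqrt(params[g][1]))
            + norm.logpdf(x, params[g][2], np.sqrt(params[g][3]))
            for g in range(G)
        ])
        logp -= logp.max(axis=1, keepdims=True)     # stabilize before exponentiating
        resp = np.exp(logp)
        resp /= resp.sum(axis=1, keepdims=True)
    return pi, params, resp
```

    Model selection over the number of components G and the polynomial degree could then proceed by computing BIC or ICL from the final log-likelihood, as the paper does.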

    Geodesics on the manifold of multivariate generalized Gaussian distributions with an application to multicomponent texture discrimination

    We consider the Rao geodesic distance (GD) based on the Fisher information as a similarity measure on the manifold of zero-mean multivariate generalized Gaussian distributions (MGGDs). The MGGD is shown to be an adequate model for the heavy-tailed wavelet statistics in multicomponent images, such as color or multispectral images. We discuss the estimation of MGGD parameters using various methods. We apply the GD between MGGDs to color texture discrimination in several classification experiments, taking into account the correlation structure between the spectral bands in the wavelet domain. We compare the performance, both in terms of texture discrimination capability and computational load, of the GD and the Kullback-Leibler divergence (KLD). In addition, both univariate and multivariate generalized Gaussian models, with either a fixed or a variable shape parameter, are evaluated. Modeling the interband correlation significantly improves classification efficiency, while the GD consistently outperforms the KLD as a similarity measure.
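    For illustration, the sketch below computes both similarity measures in the zero-mean Gaussian special case of the MGGD (shape parameter fixed at 1), where each has a simple closed form in terms of the two covariance matrices; the general MGGD expressions used in the paper are more involved, and the function names and toy matrices here are assumptions.

```python
import numpy as np
from scipy.linalg import eigvalsh

def rao_gd_gaussian(S1, S2):
    """Rao geodesic distance between N(0, S1) and N(0, S2):
    sqrt(0.5 * sum_i log(lam_i)^2), lam_i the eigenvalues of S1^{-1} S2."""
    lam = eigvalsh(S2, S1)                 # generalized eigenvalues of (S2, S1)
    return np.sqrt(0.5 * np.sum(np.log(lam) ** 2))

def kld_gaussian(S1, S2):
    """KL(N(0, S1) || N(0, S2)) = 0.5 * (tr(S2^{-1} S1) - k + log det(S2)/det(S1))."""
    k = S1.shape[0]
    sol = np.linalg.solve(S2, S1)          # S2^{-1} S1
    _, logdet = np.linalg.slogdet(sol)     # log det(S2^{-1} S1)
    return 0.5 * (np.trace(sol) - k - logdet)

# toy usage: two hypothetical 3-band wavelet-subband covariance estimates
S1 = np.array([[2.0, 0.3, 0.1],
               [0.3, 1.0, 0.2],
               [0.1, 0.2, 0.5]])
S2 = np.eye(3)
print(rao_gd_gaussian(S1, S2), kld_gaussian(S1, S2))
```

    Note that the GD is symmetric in its arguments while the KLD is not, which is one practical difference between the two measures compared in the paper.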